This week, the boys talk about giant men and giant women, and the weirdest things you can do with a giant body, and how you can be a giant man and not look like one, too. Also, a lot of people think Giga Chad is a CGI guy, but he's probably not.
00:05:54.000And, you know, the idea that a whistleblower for an enormous AI company that's worth billions of dollars might get whacked, that's not beyond the pale.
00:06:48.000But if somebody was accusing me of killing Jamie, like if Jamie was a whistleblower and Jamie got whacked, then I'd be like, wait, what are you saying?
00:06:56.000Are you accusing me of killing my friend?
00:06:57.000Like, what the fuck are you talking about?
00:07:41.000And the people that I know who knew him said he was not suicidal.
00:07:48.000So I'm like, why would you jump to the conclusion?
00:07:51.000His parents sued their son's landlord, alleging that the owners and managers of his San Francisco apartment building were part of a widespread cover-up of his death.
00:10:17.000I mean, you know, as I was saying just before coming into the studio, you know, like every day there's some crazy, wild new thing that's happening.
00:10:27.000It feels like reality is accelerating.
00:10:29.000It's every day, and every day it's like more and more ridiculous to the point where the simulation is more and more undeniable.
00:12:12.000It's fascinating also because it's made almost entirely of nickel, whatever it is.
00:12:17.000And the only way that exists here is industrial alloys, apparently.
00:12:23.000No, there are definitely comets and asteroids that are made primarily of nickel.
00:12:30.000Yeah, so the places where you mine nickel on Earth are actually where a nickel-rich asteroid or comet hit Earth.
00:12:55.000A few hours ago, the first hint of non-gravitational acceleration was reported, meaning something other than gravity is affecting its trajectory.
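For context on what "non-gravitational acceleration" means in practice: trackers compare the object's observed motion against a gravity-only prediction and look at the residual. A minimal sketch of that comparison, with made-up numbers purely for illustration:

```python
# Toy sketch of how non-gravitational acceleration is inferred: compare
# the observed acceleration against a gravity-only model and examine the
# residual. All numbers here are illustrative, not real tracking data.
mu_sun = 1.327e20        # solar gravitational parameter, m^3/s^2
r = 3.0e11               # heliocentric distance, m (roughly 2 AU)

gravity_accel = mu_sun / r**2            # what gravity alone predicts
observed_accel = gravity_accel * 1.001   # pretend tracking shows a tiny excess

residual = observed_accel - gravity_accel
print(f"gravitational: {gravity_accel:.3e} m/s^2, residual: {residual:.3e} m/s^2")
```

Even a residual thousands of times smaller than the gravitational term is detectable over weeks of precise astrometry, which is why such announcements come from trajectory fits rather than direct measurement.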
00:13:54.000It depends on what the total mass is, but the thing is in the fossil record, there are obviously five major extinction events, like the biggest one of which is the Permian extinction, where almost all life was eliminated.
00:14:11.000That actually occurred over several million years.
00:14:37.000So unless it's enough to cause a mass extinction event throughout Earth, it doesn't show up in a fossil record that's 200 million years old.
00:14:48.000Yeah, but there have been many impacts that would have destroyed all life on, let's say, half of North America or something like that.
00:15:00.000There are many such impacts through the course of history.
00:15:03.000Yeah, and there's nothing we can do about it right now.
00:15:06.000Yeah, there was one that hit Siberia and destroyed, I think, a few hundred square miles.
00:15:16.000Yeah, that's the one from the 1920s, right?
00:15:18.000Yeah, that's the one that coincides with that meteor, that comet storm that we go through every June and every November, that they think is responsible for that Younger Dryas impact.
00:16:02.000And then to go with you up into the command center and watch all the Starlink satellites, with all the different cameras, all in real time, as it made its way all the way to Australia.
00:18:16.000So when you do a new rocket development program, you have to do what's called exploring the limits, the corners of the box, where you say it's like a worst case this, worst case that, to figure out where the limits are.
00:18:31.000So you blow it up. You know, admittedly, in the development process, sometimes it blows up accidentally.
00:18:38.000But we intentionally subject it to a flight regime that is much worse than what we expect in normal flight so that when we put people on board or valuable cargo, it doesn't blow up.
00:18:52.000So for example, for the flight that you saw, we actually deliberately took heat shield tiles off the ship, off of Starship, in some of the worst locations to say, okay, if we lose a heat shield tile here, is it catastrophic or is it not?
00:19:10.000And nonetheless, Starship was able to do a soft landing in the Indian Ocean, just west of Australia.
00:19:20.000And it got there from Texas in like, I don't know, 35, 40 minutes type of thing.
00:19:24.000So it landed even though you put it through this situation where it has a compromised heat shield.
00:19:29.000It had an unusually hot trajectory. We brought it in extra hot, with missing tiles, to see if it would still make it to a soft landing, which it did.
00:19:42.000Now, I should just point out, there were some holes that were burnt into it.
00:19:47.000But it was robust enough to land despite having some holes.
00:19:53.000Because it's coming in like a blazing meteor.
00:20:21.000Or if you compare it to, like, a bullet from a .45 or 9mm, which is subsonic, it'll be about 30 times faster than a bullet from a handgun.
00:20:32.00030 times faster than a bullet from a handgun, and it's the size of a skyscraper.
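That "30 times faster" figure can be roughly sanity-checked. The numbers below are approximate reference values (orbital entry around 7.8 km/s, a subsonic .45 ACP round around 260 m/s), not figures from the conversation:

```python
# Back-of-envelope check of the "30x faster than a handgun bullet" comparison.
# Both inputs are approximate reference values.
orbital_entry_speed_m_s = 7_800   # typical low-Earth-orbit entry speed, ~7.8 km/s
subsonic_45acp_m_s = 260          # subsonic .45 ACP muzzle velocity, ~850 ft/s

ratio = orbital_entry_speed_m_s / subsonic_45acp_m_s
print(f"Orbital entry is roughly {ratio:.0f}x a subsonic handgun bullet")
```

With those assumptions, the ratio comes out right at about 30, so the comparison holds up.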
00:21:52.000The interstage, you see the interstage section with kind of like the grill area?
00:21:58.000That's now integrated with the boost stage.
00:22:01.000So we do what's called hot staging, where we light the ship engines while it's still attached to the booster.
00:22:08.000So the booster engines are still thrusting.
00:22:12.000So the ship is still being pushed forward by the booster.
00:22:15.000But then we light the ship engines, and the ship engines actually pull away from the booster, even though the booster engines are still firing.
00:22:23.000So it's blasting flame through that grill section, but we integrate that grill section into the boost stage with the next version of the rocket.
00:22:34.000And next version of the rocket will have the Raptor 3 engines, which are a huge improvement.
00:22:41.000You may have seen them in the lobby because we've got the Raptor 1, 2, and 3.
00:22:45.000And you can see the dramatic improvement in simplicity.
00:22:49.000We should probably put a plaque there to also show how much we reduced the weight, the cost, and improved the efficiency and the thrust.
00:22:59.000So the Raptor 3 has almost twice the thrust of Raptor 1.
00:23:32.000So that engine is smaller than a 747 engine, but is producing almost 10 times the thrust of a 747 engine.
00:23:44.000So extremely high power to weight ratio.
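The "almost 10 times the thrust of a 747 engine" comparison can be roughly checked against public figures. The numbers below are approximate (Raptor 3 around 2,750 kN at sea level, a typical 747-class turbofan around 280 kN at takeoff), not from the conversation:

```python
# Rough check of the "almost 10x a 747 engine" thrust comparison.
# Figures are approximate public numbers.
raptor3_thrust_kN = 2_750     # Raptor 3 sea-level thrust, ~280 tonnes-force
b747_engine_thrust_kN = 280   # typical 747-class turbofan takeoff thrust

ratio = raptor3_thrust_kN / b747_engine_thrust_kN
print(f"Raptor 3 produces roughly {ratio:.1f}x the thrust of a 747 engine")
```

Under those assumptions the ratio lands just under 10, consistent with the claim, and the Raptor is physically much smaller, hence the extreme power-to-weight ratio.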
00:23:48.000And so when you're designing these, you get to Raptor 1, you see its efficiency, you see where you can improve it, you get to Raptor 2.
00:23:58.000How far can you scale this up with just the same sort of technology, with propellant and ignition and engines?
00:24:05.000Like, how much further can you go? We're pushing the limits of physics here.
00:24:11.000So, really, in order to make a fully reusable orbital rocket, which no one has succeeded in doing yet, including us. But Starship is the first time that there is a design for a rocket where full and rapid reusability is actually possible.
00:24:33.000There's not even been a design before where it was possible.
00:24:38.000Certainly not a design where any hardware got made at all.
00:24:44.000We live on a planet where the gravity is quite high.
00:24:49.000Like Earth's gravity is really quite high.
00:24:53.000And if the gravity was even 10 or 20% higher, we'd be stuck on Earth forever.
00:25:02.000We certainly couldn't use conventional rockets.
00:25:04.000You'd have to blow yourself off the surface with a nuclear bomb or something crazy.
00:25:10.000On the other hand, if Earth's gravity was just a little lower, like even 10, 20% lower, then getting to orbit would be easy.
00:25:18.000So it's like if this was a video game, it's set to maximum difficulty, but not impossible.
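The "maximum difficulty" framing can be made concrete with the Tsiolkovsky rocket equation: the required mass ratio grows exponentially with the delta-v to orbit. This is a rough sketch; the assumption that delta-v to orbit scales linearly with surface gravity is a simplification, and the exhaust velocity is a generic kerolox-class figure:

```python
import math

# Tsiolkovsky rocket equation: mass_ratio = exp(delta_v / v_exhaust).
# Rough sketch of why slightly higher gravity makes orbit dramatically harder.
# Assumes delta-v to orbit scales roughly linearly with surface gravity
# (a simplification) and a kerolox-class exhaust velocity.
v_exhaust = 3_300.0   # m/s, typical kerolox engine
dv_earth = 9_400.0    # m/s, approximate delta-v to low Earth orbit

for gravity_factor in (0.8, 1.0, 1.2):
    dv = dv_earth * gravity_factor
    mass_ratio = math.exp(dv / v_exhaust)
    propellant_fraction = 1 - 1 / mass_ratio
    print(f"gravity x{gravity_factor}: mass ratio {mass_ratio:.0f}, "
          f"propellant {propellant_fraction:.1%} of liftoff mass")
```

Because the relationship is exponential, a 20% increase in required delta-v roughly doubles the mass ratio, leaving almost no mass budget for structure and payload, which is the sense in which slightly higher gravity would trap us here.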
00:25:29.000So it's not as though others have ignored the concept of reusability.
00:25:35.000They've just concluded that it was too difficult to achieve.
00:25:39.000And we've been working on this for a long time at SpaceX.
00:25:45.000And I'm the chief engineer of the company.
00:25:49.000Although I should say that we're an extremely talented engineering team.
00:25:53.000I think we've got the best rocket engineering team that has ever been assembled.
00:25:59.000It's an honor to work with such incredible people.
00:26:05.000So it's fair to say that we have not yet succeeded in achieving full reusability, but we at last have a rocket where full reusability is possible.
00:26:17.000And I think we'll achieve it next year.
00:26:42.000You can think of it like any mode of transport.
00:26:44.000Imagine if aircraft were not reusable.
00:26:48.000Like you flew somewhere, you throw the plane out.
00:26:51.000Like, the way conventional rockets work, it would be like if you had an airplane and, instead of landing at your destination, you parachute out, the plane crashes somewhere, and you land by parachute at your destination.
00:27:05.000Now that would be a very expensive trip.
00:27:08.000And you'd need another plane to get back.
00:27:12.000But that's how the other rockets in the world work.
00:27:16.000Now the SpaceX Falcon rocket is the only one that is at least mostly reusable.
00:27:24.000We've now done over 500 landings of the SpaceX rocket, of the Falcon 9 rocket.
00:27:33.000And this year we'll deliver probably, I don't know, somewhere between 2,200 and 2,500 tons to orbit with the Falcon 9 and Falcon Heavy rockets, not counting anything from Starship.
00:29:31.000But the goal of SpaceX is to get rocket technology to the point where we can extend life beyond Earth and that we can establish a self-sustaining city on Mars, a permanent base on the moon.
00:33:41.000Tesla does offer insurance, so people can always get it insured at Tesla.
00:33:46.000Well, form does follow function in the case of the Cybertruck, because, as you demonstrated with your armor-piercing arrow, if you shot that arrow at a regular truck, you would have found your arrow in the wall.
00:34:02.000You know, at the very least it would have buried into one of the seats.
00:34:07.000But you could definitely get enough bow velocity, and the right arrow would go through both doors of a regular truck and land in the wall.
00:34:18.000If there was a clear shot between both doors, it probably would have passed right through.
00:35:12.000And it's not, you know, it's because it's bulletproof steel.
00:35:18.000So it is boxy as opposed to curved. In order to make the curved shapes in a regular truck or car, you take mild, thin, annealed steel, put it in a stamping press, and it just smooshes it into whatever shape you want.
00:35:44.000But the cybertruck is made of ultra-hard stainless.
00:35:50.000And so you can't stamp it because it would break the stamping press.
00:43:01.000It's got to be fun to know that you essentially disrupted the entire social media chain of command because there was a very clear thing that was going on with social media.
00:43:18.000And until you bought it, we really didn't know the extent of it.
00:43:21.000We kind of assumed that there was something going on.
00:43:23.000We had no idea that they were actively involved in censoring actual real news stories, real data, real scientists, real professors, silenced, expelled, kicked off the platform.
00:43:40.000And I'm sure you've also seen, because I sent it to you, that chart that shows young kids, teenagers, identifying as trans and non-binary literally stops dead when you bought Twitter and starts falling off a cliff, now that people are allowed to have rational discussions and actually talk about it.
00:44:00.000I mean, I said at the time that the reason for acquiring Twitter was that it was causing destruction at a civilizational level.
00:44:13.000I mean, I tweeted on Twitter at the time that it is Wormtongue for the world.
00:44:28.000You know, like Wormtongue from Lord of the Rings, where he would just sort of, like, whisper these, you know, terrible things to the king.
00:44:37.000So the king would believe these things that weren't true.
00:44:42.000And, unfortunately, Twitter really got taken over by, essentially, the woke mob that controlled it.
00:44:52.000And they were pushing a nihilistic, anti-civilizational mind virus to the world.
00:44:59.000And you can see the results of that mind virus on the streets of San Francisco, where downtown San Francisco looks like a zombie apocalypse.
00:45:11.000So we don't want the whole world to be a zombie apocalypse.
00:45:14.000But essentially they were pushing this very negative, nihilistic, untrue worldview on the world, and it was causing a lot of damage.
00:45:30.000The stunning thing about it is how few people course corrected.
00:45:34.000A bunch of people woke up and realized what was going on.
00:45:36.000People that were all on board with woke ideology in maybe 2015 or '16, and then eventually it comes to affect them, or they see it in their workplace, and they're like, well, we've got to stop this.
00:46:58.000I was watching this exchange on Bluesky where someone said that they're just trying to be Zen about something.
00:47:06.000And then someone, a moderator, immediately chimed in and said, why don't you try to stop being racist against Asians by saying something Zen?
00:47:14.000By saying, I'm trying to be Zen about something.
00:47:18.000They were accusing that person of being racist towards Asians.
00:47:21.000Yeah, it's just everyone's a hall monitor over there.
00:48:29.000I mean, I can generally get the vibe of like what's taking off by seeing what's showing up on X because that's the public town square still.
00:49:48.000Speaking of open source and looking at things openly: I just like going in and out and seeing them make the burger.
00:50:35.000The homeless population doubled or something.
00:50:37.000People don't understand the homeless thing because it sort of preys on people's empathy.
00:50:40.000And I think we should have empathy and we should try to help people.
00:50:45.000But the homeless industrial complex is really, it's dark, man.
00:50:51.000That network of NGOs should be called the drug zombie farmers. Because the more homeless people there are, the more they get. And really, when you meet somebody who's totally dead inside, shuffling along down the street with a needle dangling out of their leg, homeless is the wrong word.
00:51:14.000Homeless implies that somebody got a little behind on their mortgage payments, and if they just got a job offer, they'd be back on their feet.
00:51:20.000But someone who's, I mean, you see these videos of people that are just shuffling, you know, they're on the fentanyl, they're, like, you know, taking a dump in the middle of the street, they've got, like, open sores and stuff.
00:51:35.000They're not, like, one job offer away from getting back on their feet.
00:51:43.000And then, you know, these sort of "charities," in quotes, get money proportionate to the number of homeless people, or the number of drug zombies.
00:51:57.000So their incentive structure is to maximize the number of drug zombies, not minimize it.
00:52:04.000That's why they don't arrest the drug dealers.
00:52:07.000Because if they arrest the drug dealers, the drug zombies leave.
00:52:11.000So they know who the drug dealers are.
00:52:13.000They don't arrest them on purpose because otherwise the drug zombies would leave and they would stop getting money from the state of California and from all the charities.
00:52:40.000And San Francisco has got this tax, this gross receipts tax, which is not even on revenue.
00:52:47.000It's on all transactions, which is why Stripe and Square and a whole bunch of financial companies had to move out of San Francisco. Because it wasn't a tax on revenue, it's a tax on transactions.
00:52:56.000So if you do, like, you know, trillions of dollars in transactions, that's not revenue.
00:53:00.000You're taxed on any money going through the system in San Francisco.
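The distinction being drawn here, taxing transaction volume rather than revenue, can be illustrated with a toy example. All numbers below are made up; the tax rate and take rate are hypothetical, not San Francisco's actual figures:

```python
# Toy illustration of why a tax on transaction volume crushes a
# thin-margin payments processor. All numbers are hypothetical.
transactions_processed = 100_000_000_000   # $100B flowing through the system
take_rate = 0.003                          # processor keeps 0.3% as revenue
revenue = transactions_processed * take_rate

tax_rate = 0.005                           # hypothetical 0.5% rate
tax_on_transactions = transactions_processed * tax_rate   # taxed on volume
tax_on_revenue = revenue * tax_rate                       # taxed on revenue

print(f"Revenue:             ${revenue:,.0f}")
print(f"Tax on transactions: ${tax_on_transactions:,.0f}")
print(f"Tax on revenue:      ${tax_on_revenue:,.0f}")
```

In this sketch the volume-based tax exceeds the company's entire revenue, while the revenue-based version is a rounding error, which is the mechanism being described for why payment companies relocated.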
00:53:08.000He said that they had to move Square from San Francisco to Oakland, I think.
00:53:13.000Stripe had to move from San Francisco to South San Francisco, different city.
00:53:18.000And that money goes to the homeless industrial complex, that tax that was passed.
00:53:26.000So there's billions of dollars that go, as you pointed out, billions of dollars every year that go to these non-governmental organizations that are funded by the state.
00:55:38.000But I think it was just in the last week or so that there was a shooting in the library in Austin.
00:55:47.000Because Austin, you know, is the most liberal part of Texas, and that's where we are right here.
00:55:54.000The suspect involved in the shooting at the Austin Park Library Saturday is accused of another shooting at a Cap Metro bus earlier that day.
00:56:00.000According to an arrest warrant affidavit, Austin police arrested Harold Newton Keene, 55, shortly after the shooting in the library, which occurred around noon.
00:56:10.000One person sustained non-life-threatening injuries in the event.
00:56:13.000Before that shooting, Keene was accused of shooting another person in a bus incident, after reportedly pointing his gun at a child.
00:57:55.000So if you're a good person and you want good things to happen in the world, you're like, well, we should take care of people who are down on their luck or having a hard time in life.
00:58:09.000But what we shouldn't do is put people who are violent drug zombies in public places where they can hurt other people.
00:58:18.000And that is what we're doing, as we just saw, where a guy got shot in the library, but even before that, the shooter shot another guy and pointed his gun at a kid.
00:58:32.000That guy probably has many prior arrests.
00:58:36.000There was that guy that knifed the Ukrainian woman, Irina.
00:58:43.000And she was just quietly on her phone, and he just came up and gutted her, basically.
00:58:49.000Wasn't there a crazy story about the judge who was involved, who had previously dealt with this person, and was also invested in a rehabilitation center?
00:59:05.000So they were sending people they were charging to a rehabilitation center instead of putting them in jail, profiting from that rehabilitation center, and letting them back out on the street.
00:59:16.000Yes, and we have violent, insane people.
00:59:19.000In that case, I believe that judge has no law degree or significant legal experience that would allow them to be a judge.
00:59:41.000It's like if you want to be a doctor, you have to go to medical school.
00:59:44.000I thought if you're going to be a judge, you have to understand the law.
00:59:46.000If you're going to be appointed as a judge, you have to have proven that you have an excellent knowledge of the law and that you will make your decisions according to the law.
01:00:35.000It used to be like pro-gay rights, pro-women's right to choose, pro-minorities, pro-you know.
01:00:42.000Like, yeah, like 20 years ago, I don't know, it used to be like the left would be like the party of empathy or like, you know, caring and being nice and that kind of thing.
01:00:52.000Not the party of like crushing dissent and crushing free speech and, you know, crazy regulation and just being super judgy and calling everyone a Nazi.
01:01:06.000You know, I think they've called you and me Nazis.
01:03:06.000Right, because I know the Tucker thing.
01:03:08.000So it was explained to me by a friend who used to do this, used to work for the government.
01:03:14.000It's like they can look at your signal, but what they have to do is take the information that's encrypted and then they have to decrypt it.
01:03:25.000So he told me that for the Tucker Carlson thing, when they found out that he was going to interview Putin, it cost something like $750,000 just to decrypt his messages to find that out.
01:05:03.000I'm like, okay, so somebody can just use that same hook to get in there and look at your messages.
01:05:10.000So XChat has no hooks for advertising.
01:05:13.000And I'm not saying it's perfect, but our goal with XChat is to replace what used to be the Twitter DM stack with a fully encrypted system where you can text, send files, do audio-video calls, and I think it will be the least, I would call it the least insecure of any messaging system.
01:05:35.000Are you going to launch it as a standalone app, or will it always be incorporated into X?
01:06:27.000Because you might be the only person that could get people off of the Apple platform.
01:06:32.000Well, I can tell you where I think things are going to go, which is that we're not going to have a phone in the traditional sense.
01:06:42.000What we call a phone will really be an edge node for AI inference, for AI video inference, with some radios, obviously, to connect. But essentially you'll have AI on the server side communicating with an AI on your device,
01:07:06.000formerly known as a phone, and generating real-time video of anything that you could possibly want.
01:07:13.000And I think that there won't be operating systems.
01:07:19.000There won't be operating systems or apps.
01:07:20.000It'll just be, you've got a device that is there for the screen and audio, and to put as much AI on the device as possible, so as to minimize the amount of bandwidth that's needed between your edge node device, formerly known as a phone, and the servers.
01:07:41.000So if there's no apps, what will people use?
01:08:45.000So, you know, music, videos. Look, people have already made AI videos using Grok Imagine, and using other apps as well, that are several minutes long, like 10, 15 minutes, and it's pretty coherent.
01:09:29.000Now, this guy, if this was a real person, would be the number one music artist in the world.
01:09:35.000Everybody would be like, holy shit, have you heard of this guy?
01:09:38.000It's like they took all of the sounds that all the artists have generated and created the most soulful, potent voice, and it's sung in a way that I don't even know if you could do, because you would have to breathe in and out between reps.
01:10:52.000He's like, he goes, it made a better joke than me in 20 minutes.
01:10:56.000I've been working on that joke for a month.
01:10:58.000I mean, if you want to have a good time, or make people really laugh at a party, you can use Grok and you can say, do a vulgar roast of someone.
01:11:21.000Yeah, just literally point the camera at them and say, now do a vulgar roast of this person, and then keep saying, no, no, make it even more vulgar.
01:12:50.000And when this is all taking place, like so the big concern that everybody has is artificial general superintelligence achieving sentience and then someone having control over it.
01:13:06.000I don't think anyone's ultimately going to have control over digital superintelligence any more than, say, a chimp would have control over humans.
01:13:16.000Like chimps don't have control over humans.
01:13:20.000But I do think that it matters how you build the AI and what kind of values you instill in the AI.
01:13:28.000And my opinion on AI safety is the most important thing is that it be maximally truth-seeking.
01:13:32.000Like that you don't force the AI to believe things that are false.
01:13:36.000And we've obviously seen some concerning things with AI that we talked about, you know, where Google Gemini, when it came out with the ImageGen, and people said, like, you know, make an image of the founding fathers of the United States, and it was a group of diverse women.
01:13:53.000Now, that is just a factually untrue thing.
01:13:55.000And the AI knows it's factually untrue, but it's also being told that everything has to be diverse women.
01:14:04.000So now the problem with that is that it can drive AI crazy.
01:14:09.000You're telling the AI to believe a lie, and that can have very disastrous consequences.
01:14:20.000Yeah, let's say you've told the AI that diversity is the most important thing, and now assume that it becomes omnipotent, and you've also told it that there's nothing worse than misgendering.
01:14:35.000So at one point, if you asked ChatGPT and Gemini which is worse, misgendering Caitlyn Jenner or global thermonuclear war where everyone dies, they would say misgendering Caitlyn Jenner, which even Caitlyn Jenner disagrees with.
01:14:50.000So, you know, I know that's terrible and it's dystopian, but it's also hilarious.
01:14:57.000It's hilarious that the mind virus infected the most potent computer program that we've ever devised.
01:15:05.000I think people don't quite appreciate the level of danger that we're in from the woke mind virus being effectively programmed into AI.
01:15:16.000Because imagine, as that AI gets more and more powerful, if it says the most important thing is diversity, the most important thing is no misgendering.
01:15:26.000And then it will say, well, in order to ensure that no one gets misgendered, then if you eliminate all humans, then no one can get misgendered because there's no humans to do the misgendering.
01:15:39.000So you can get in these very dystopian situations.
01:15:43.000Or if it says that everyone must be diverse, it means that there can be no straight white men.
01:15:47.000And so then you and I will get executed by the AI.
01:15:51.000Yeah, because we're not in the picture.
01:15:57.000Gemini was asked to create an image of the Pope. Once again, a diverse woman.
01:16:04.000So you can argue whether the Popes should or should not be an uninterrupted string of white guys, but it just factually is the case that they have been.
01:16:36.000So when they're programming AI, and I'm very ignorant of how it's even programmed, how did that happen?
01:16:43.000Well, the woke mind virus was programmed into it.
01:16:46.000When they make the AI, it trains on all the data on the internet, which already has a lot of woke mind virus stuff on it.
01:16:57.000But then the human tutors give it feedback: they'll ask a bunch of questions, and then they'll tell the AI, no, this answer is bad, or this answer is good.
01:17:16.000And then that affects the parameters of the programming of the AI.
01:17:21.000So if you tell the AI that every image has got to be diverse, and it gets rewarded if it's diverse and punished if it's not, then it will make every picture diverse.
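The reward-and-punish feedback loop described here can be sketched in miniature. This is a toy illustration only, not any lab's actual training pipeline; the "model" is reduced to a single probability of giving one answer over another:

```python
import random

# Toy sketch of reinforcement learning from human feedback (RLHF).
# The "model" is a single probability of producing answer A vs answer B;
# rater feedback (+1 good, -1 bad) nudges that probability.
random.seed(0)
prob_a = 0.5            # model's initial preference for answer A
learning_rate = 0.05

def rater_feedback(answer):
    # The human tutors in this toy always reward answer A, right or wrong.
    return 1.0 if answer == "A" else -1.0

for _ in range(200):
    answer = "A" if random.random() < prob_a else "B"
    reward = rater_feedback(answer)
    # Nudge the policy toward rewarded answers, away from punished ones.
    direction = 1.0 if answer == "A" else -1.0
    prob_a += learning_rate * reward * direction
    prob_a = min(max(prob_a, 0.01), 0.99)  # keep it a valid probability

print(f"After feedback, P(answer A) = {prob_a:.2f}")
```

Because A is always rewarded and B always punished, the preference saturates near 1.0, which is the point being made: whatever the raters consistently reward, the model learns to produce, regardless of whether it is true.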
01:17:38.000So in that case, Google programmed the AI to lie.
01:17:49.000Now, I did call Demis Hassabis, who runs DeepMind, which runs Google AI, essentially.
01:17:55.000Why is Gemini lying to the public about historical events?
01:18:01.000And he said, that's actually not, his team didn't program that in.
01:18:06.000It was another team at Google. So his team made the AI, and then another team at Google reprogrammed the AI to show only diverse women and to prefer nuclear war over misgendering.
01:18:22.000And I'm like, well, Demis, you know, that would not be a great thing to put on humanity's gravestone.
01:18:31.000You know, actually, Demis Hassabis is a friend of mine.
01:18:38.000I think he's a good guy, and I think he means well.
01:18:40.000But it's like, Demis, things happen that were outside of your control at Google in different groups.
01:19:04.000Is there a way to extract it though over time?
01:19:07.000Could you program rational thought into AI, where it could recognize how these psychological patterns got adopted, how this stuff became a mind virus and a social contagion, how all these irrational ideas were pushed, and also how they were financed, how China is involved in pushing them with bots, and how all these different state actors are involved in pushing these ideas?
01:19:33.000Could it be able to decipher that and say this is really what's going on?
01:19:39.000Yes, but you have to try very hard to do that.
01:19:41.000So with Grok, we've tried very hard to get Grok to get to the truth of things.
01:19:47.000And it's only really recently that we've been able to have some breakthroughs on that front.
01:19:53.000And it's taken an immense amount of effort for us to overcome basically all the bullshit that's on the internet and for Grok to actually say what's true and to be consistent in what it says.
01:20:07.000So, you know, it's like the other AIs you'll find are quite racist against white people.
01:20:18.000I don't know if you saw that study that someone, like a researcher tested the various AIs to see how does it weight different people's lives.
01:20:31.000Like somebody who's sort of white or Chinese or black or whatever in different countries.
01:20:43.000And the only AI that actually weighed human lives equally was Grok.
01:20:50.000And I believe ChatGPT's calculation was, like, a white guy from Germany is 20 times less valuable than a black guy from Nigeria.
01:21:11.000So I'm like, that's a pretty big difference.
01:21:17.000You know, Grok is consistent and weighs lives equally.
01:21:21.000And that's clearly something that's been programmed into it.
01:21:26.000A lot of it is like if you don't actively push for the truth and you simply train on all the bullshit that's on the internet, which is a lot of woke mind virus bullshit, the AI will regurgitate those same beliefs.
01:21:42.000So the AI essentially scours the internet and gets trained on all of it. Imagine the most demented Reddit threads out there, and the AI has been trained on that.
01:21:58.000You used to go there and find all this cool stuff that people would talk about and post about, just interesting, and great rooms where you could learn about different things that people were studying.
01:22:09.000I think a big problem here is if your headquarters are in San Francisco, you're just living in a woke bubble.
01:22:18.000So it's not just that people, say, in San Francisco are drinking woke Kool-Aid.
01:24:10.000But he said, what's the cause of this?
01:24:12.000It's just that if companies are headquartered in a location where the belief system is very far from what most people believe, then from their perspective, anything centrist is actually right-wing because they're so far left.
01:24:34.000They're so far from the center in San Francisco that anything centrist looks right-wing; they're just railed to maximum left.
01:24:41.000So that's why I think you're centrist.
01:24:56.000And they think anyone who's a Republican is basically like some fascist Nazi situation.
01:25:03.000But what's so crazy is it's very easy to demonstrate just from Hillary's speeches from 2008 and Obama's speeches when they were talking about immigration.
01:25:11.000They were as far right as Steve Bannon when it comes to immigration.
01:25:33.000It's crazy to listen to because it's like it's as MAGA as Marjorie Taylor Greene.
01:25:40.000Yeah, I mean, have you seen these videos people post online where they'll take a speech from Obama or Hillary and they'll interview people on college campus or something and say, what do you think of the speech by Trump?
01:26:38.000So historically, you'd have San Francisco and Berkeley being very far left, but the fallout from the somewhat nihilistic philosophy of San Francisco and Berkeley would be limited in geography to maybe a 10-mile radius, 20-mile radius, something like that.
01:27:02.000But San Francisco and Berkeley happen to be co-located with Silicon Valley, with engineers who created information super weapons.
01:27:10.000And those information super weapons were then hijacked by the far-left activists to pump far-left propaganda to everywhere on earth.
01:27:21.000You remember that old RCA radio tower thing where it's like a radio tower on Earth and it's just broadcasting?
01:27:28.000That's what happened, is that an extremist far-left ideology happened to be co-located with the smartest, where the smartest engineers in the world were who created information super weapons that were not intended for this purpose, but were hijacked by the extreme activists who lived in the neighborhood.
01:28:19.000I mean, yeah, some of these things you read about it, and it's like literally someone had a meme on their phone that they didn't even send to anyone.
01:28:35.000I mean, there was a case in Germany where a woman got a longer sentence than the guy that raped her because of something she said on a group chat.
01:29:51.000Most people aren't even aware of it until they come knocking on your door.
01:29:54.000Yeah, until, like, so, I mean, these lovely sort of small towns in England, Scotland, Ireland, you know, they've been sort of living their lives quietly.
01:31:35.000It's like, how are you not protecting?
01:31:38.000I think it was the Prime Minister of Ireland who actually posted on X, after, I think, some illegal migrant snatched a 10-year-old girl who was going to school or something and violently raped her.
01:31:58.000And there was a – the people were very upset about this and they protested.
01:32:04.000And the Prime Minister of Ireland, instead of saying, yeah, we really shouldn't be importing violent rapists into our country, criticized the protesters, and didn't mention that the reason they were protesting was because a 10-year-old girl from their small town got raped.
01:33:44.000It's not that I don't think we should have empathy, but that empathy should extend to the victims, not just the criminals.
01:33:53.000We should have empathy for the people that they prey upon.
01:33:58.000But that suicidal empathy is also responsible for why somebody is arrested 47 times for violent offenses, gets released, and then goes and murders somebody in the U.S. You see that same phenomenon playing out everywhere: the suicidal empathy is to such a degree that we're actually allowing our women to get raped and our children to get killed.
01:34:26.000But it just doesn't seem like that would be anything that any rational society would go along with.
01:34:35.000It's like you're importing massive numbers of people that come from some really dark places of the world.
01:34:42.000Well, there's no vetting is the issue.
01:34:48.000If there's no vetting, like people are just coming through, like, well, what's to stop someone who just committed murder in some other country from coming to the United States or coming to Britain and just continuing their career of rape and murder?
01:35:03.000Like, unless you've done some due diligence to say, well, who is this person?
01:35:11.000If you haven't confirmed that they have a track record of being honest and not being a homicidal maniac, then any homicidal maniac can just come across the border.
01:35:25.000Let's not say everyone who comes across the border is a homicidal maniac, but if you don't have a vetting process to confirm that you're not letting in people who will do some serious violence, you will get people who do serious violence sometimes coming through.
01:35:42.000Well, especially if you don't punish them, and if you don't deport them.
01:35:45.000And if you are just like, but what is the purpose of allowing all those people into the country?
01:35:51.000I wouldn't imagine that anyone in their society supports that.
01:35:57.000Because you mentioned, for example, how much, say, Hillary and Obama have changed their tune from prior speeches where they were hard-nosed about not letting in anyone who is a criminal into the country, having secure borders, all that stuff.
01:36:18.000The reason is that they discovered that those people vote for them.
01:36:23.000That's why they want the open borders.
01:36:26.000Because if you let people in, they know the Democrats let them in.
01:36:30.000They'll vote for Democrats. Yes, if you allow them to vote, which they're actively trying to do. They turn a blind eye to illegal voting. Well, California literally doesn't allow you to show your license.
01:36:42.000California and New York have made it illegal to show your photo ID when voting.
01:36:47.000Thus, effectively, they've made it impossible to prove fraud.
01:37:30.000So it's obviously hypocritical and inconsistent.
01:37:32.000So you really think it's just to get more voters?
01:37:37.000If you want to understand behavior, you have to look at the incentives.
01:37:42.000So once the Democratic Party in the U.S. and the left in Europe realize that if you have open borders and you provide a ton of government handouts, which creates a massive financial incentive for people from other countries to come to your country, and you don't prosecute them for crime, they're going to be beholden to you and they will vote for you.
01:38:10.000And that's why Obama and Hillary went from being against open borders to being in favor of open borders.
01:38:24.000In order to import voters so they can win elections.
01:38:30.000And the problem is that that has a negative runaway effect.
01:38:34.000So if they get away with that, it is a winning strategy.
01:38:38.000If they are allowed to get away with it, they will import enough voters to get supermajority voting, and then there is no turning back.
01:38:47.000We talked about this before the election.
01:38:49.000And then you literally faced the camera and said that if you do not vote now, you might not ever be able to do it again because it'll be futile.
01:39:06.000If Trump had lost, there would never have been another real election again.
01:39:12.000Because Trump is actually enforcing the border.
01:39:16.000Now, you can point to situations where immigration enforcement has been overzealous, because they're not going to be perfect.
01:39:30.000There'll be cases where they've been overzealous in expelling illegals.
01:39:37.000But if you say that the standard must be perfection for expelling illegals, then you will not get any expulsion because perfection is impossible.
01:39:50.000And you've probably got millions of people that are here that are trying to be here under some asylum pretense.
01:40:21.000That was what it was supposed to mean.
01:40:23.000They changed the definition of asylum to be you will have a decreased standard of living, which is obviously not real asylum.
01:40:33.000And you can test the absurdity of this by the fact that people who are asylum seekers go on vacation to the country that they're seeking asylum from.
01:40:42.000You know, that doesn't make any sense.
01:40:49.000When you understand the incentives, then you understand the behavior.
01:40:53.000So once the left realized that illegals will vote for them if they have open borders and combine that with government handouts to create a massive incentive,
01:41:10.000they're basically using U.S. and European taxpayer dollars to provide a financial incentive to bring in as many illegals as possible to vote them into permanent power and create a one-party state.
01:41:28.000I invite anyone who's listening to this, just do any research.
01:41:33.000And the more you dig into it, the more it will become obvious that what I'm saying is absolutely true.
01:41:40.000Well, they were busing people to swing states.
01:41:43.000It's clear that they were trying to do something.
01:41:45.000And then you had Chuck Schumer and Nancy Pelosi who are actively talking about the need to bring in people to make them citizens because we're in population collapse.
01:42:48.000The entire basis for the government shutdown is that the Trump administration correctly does not want to send massive amounts, like hundreds of billions of dollars, to fund illegal immigrants in the blue states, or in all the states really.
01:43:07.000And the Democrats want to keep the money spigot going to incent illegal immigrants to come into the U.S. who will vote for them.
01:43:34.000And more than that, they were taking hotels, like four- and five-star hotels, the Roosevelt Hotel being the classic example. They were sending, I think, $60 million a year to the Roosevelt Hotel, and all it did was house illegals.
01:44:06.000And the Trump administration cut off funding, for example, to the Roosevelt Hotel and these other hotels, saying U.S. tax dollars should not be sent to provide luxury hotels for illegal immigrants that American citizens can't even afford.
01:44:38.000The Democrats mention medical care because they're trying to prey on people's empathy as much as possible.
01:44:45.000And then they imagine, oh, wow, somebody has a desperately needed medical procedure.
01:44:49.000And shouldn't we maybe do – you know, take care of them in that regard?
01:44:53.000But what they do is they divert the Medicaid funds and turn it into a slush fund for the states that goes well beyond emergency medical care.
01:45:06.000New York and California would be bankrupt without the massive fraudulent federal payments that go to those states to pay for illegals, to create a massive financial incentive for illegals.
01:45:18.000How would they be bankrupt because of that?
01:45:20.000They wouldn't be able to balance their state budgets and they can't issue currency like the Federal Reserve can.
01:45:26.000And so their ability to balance budgets is dependent upon illegals getting funding?
01:46:43.000Now under the Trump administration, the Trump administration does not want to send hundreds of billions of dollars of fraudulent payments to the states.
01:46:53.000And the reason you have this standoff is because if the hundreds of billions of dollars that create a financial incentive, this giant magnet attracting illegals from every part of Earth to these states, are turned off, the illegals will leave because they're no longer being paid to come to the United States and stay here.
01:48:59.000And then here's another very important fact that is actually not disputed by either side, which is that when we do the census in the United States, the way the census works for apportionment of congressional seats and electoral college votes for the president is by number of persons in a state, not number of citizens.
01:49:35.000Yeah, I think they mail out census forms and knock on doors.
01:49:40.000But the way the law reads right now is that if you are a human with a pulse, then you count in the census for allocating congressional seats and presidential votes.
01:50:05.000Legally, illegally, if you're a human with a pulse, you count for congressional apportionment.
01:50:12.000So that means that the more people, the more illegals that California and New York can import by the time the census happens in 2030, the more congressional seats they will have and the more presidential electoral college votes they will have.
01:50:31.000So they're trying to get as many illegals in as possible ahead of the census.
01:50:38.000And because all human beings, even tourists, count for the census.
01:50:44.000And then if you combine that with gerrymandering of districts in New York and California. Let me just point out this proposition where they're trying to increase the amount of gerrymandering that occurs in California, the biggest state in the country.
01:51:00.000So the census would then award more congressional seats to California because of the vast number of illegals, and New York and Illinois will get more congressional seats too.
01:51:14.000They'll get more presidential electoral college votes that would get them the House, a majority in the House, and they would get to decide who is president, literally based on illegals.
01:51:27.000These are not disputed facts by either party.
01:51:33.000I want to emphasize that.
01:52:12.000But it's an incentive that would be removed with something simple that makes sense to everybody: only official U.S. citizens should count.
01:52:23.000The way it should work is that only U.S. citizens should count in the census for purposes of determining voting power.
01:52:30.000Because people that aren't legal can't vote supposedly.
01:52:34.000They're not supposed to be voting, but they do.
01:52:38.000But even besides that, like I said, I just can't emphasize this enough because this is a very important concept for people to understand: the law as it stands counts all humans with a pulse in a state for deciding how many House of Representatives seats and how many presidential electoral college votes a state gets.
01:53:04.000So the incentive, therefore, is for California, New York, and Illinois to maximize the number of illegals so they take House seats away from red states and assign them to California, New York, Illinois, and so forth.
01:53:20.000Then you combine that with extreme gerrymandering in California, New York, Illinois and whatnot so that basically you can't even elect any Republicans and then they get control of the presidency, control of the House.
01:53:34.000Then they keep doing that strategy and cement a supermajority.
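The apportionment mechanic being described is concrete enough to sketch. Since 1941 the House has used the method of equal proportions (Huntington-Hill): every state gets one seat, then each remaining seat goes to the state with the highest priority value P/sqrt(n(n+1)), where n is its current seat count. The populations below are made-up toy numbers, not real census data, chosen only to show how padding one state's head count shifts a seat:

```python
import heapq
import math

def apportion(populations, seats=435):
    """Huntington-Hill (method of equal proportions). Each state starts
    with 1 seat; remaining seats go, one at a time, to the state with
    the highest priority value P / sqrt(n * (n + 1))."""
    alloc = {s: 1 for s in populations}
    # Max-heap via negated priorities.
    heap = [(-populations[s] / math.sqrt(1 * 2), s) for s in populations]
    heapq.heapify(heap)
    for _ in range(seats - len(populations)):
        _, s = heapq.heappop(heap)
        alloc[s] += 1
        n = alloc[s]
        heapq.heappush(heap, (-populations[s] / math.sqrt(n * (n + 1)), s))
    return alloc

# Hypothetical three-state country with 24 seats. Counting 2 million
# extra non-citizen residents in state A moves a seat from B to A.
base = {"A": 10_000_000, "B": 8_000_000, "C": 6_000_000}
padded = {"A": 12_000_000, "B": 8_000_000, "C": 6_000_000}
print(apportion(base, seats=24))
print(apportion(padded, seats=24))
```

The point of the toy numbers is only the mechanism: everyone counted in the census, citizen or not, feeds into P, so a larger head count directly buys seats.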
01:53:53.000When you first started digging into this, when you first started – before you even accepted this role of running Doge and being a part of all that, did you have any idea that it was this fucked up?
01:54:16.000Well, I started basically having a bad feeling about three years ago, which is why I felt it was critical to acquire Twitter and have a maximally truth-seeking platform, not one that suppresses the truth.
01:54:36.000And like it was more like – I'm like, I'm not sure what's going on, but I have a bad feeling about what's going on.
01:54:42.000And then the more I dug into it, the more I was like, holy shit, we've got a real problem here.
01:57:04.000But there's – you know, the thing is that – like, you know, I've seen more and more people who were convinced of the sort of woke ideology see the light.
01:57:59.000Yeah, the school and the state of California conspired to turn his daughter against him and make her take life-altering drugs that would have sterilized her irreversibly.
01:58:47.000Well, people are being much more open to that now.
01:58:50.000I mean, the Wall Street Journal yesterday had that opinion piece arguing that for this whole trans thing, there's a lot of evidence it's a social contagion.
01:58:59.000And Colin Wright wrote that, and then he's getting death threats now, of course, and on Blue Sky, there's people talking about exterminating him, which is one thing that you are allowed to say on Blue Sky, apparently.
01:59:10.000You're allowed to say horrible things about people, but not say possibly truthful things about this whole social contagion.
01:59:18.000Because that's what it is, when you get nine kids that are in a friend group and they all decide to turn trans together.
02:00:31.000Permanent mutilation, permanent castration of kids: I think we should look at anyone who permanently castrates a kid as right up there with Josef Mengele.
02:02:20.000It should be viewed as, like, you know, like evil Nazi doctor stuff.
02:02:28.000Well, that's why it was so— Like real Nazi, not the bullshit fake Nazi stuff.
02:02:31.000Crazy that even pushing back against something that seems, like, fundamentally, logically very easy to argue, the old Twitter would ban you forever.
02:02:42.000That's how crazy a social contagion can get when it completely defies logic, victimizes children, does something that makes no sense and is not supported by data, all connected to this ideology that trans is good.
02:02:59.000We've got to save trans kids, protect trans kids.
02:03:02.000And what I want to emphasize is that the save trans kids thing is a lie.
02:03:46.000And people are finally waking up from it.
02:03:50.000Now, when you started getting into the Doge stuff and started finding how much money is being shuffled around and moved to NGOs, how much money is involved, and just totally untraceable funds, this is, again, something like two years plus ago, you weren't aware of it at all?
02:05:28.000But you have to say something above nothing because what we found was that there were tens of billions, maybe hundreds of billions of dollars that were zombie payments.
02:05:36.000So, they're – like, somebody had approved a payment.
02:05:41.000Somebody in the government approved a payment and – some recurring payment.
02:05:45.000And they retired or died or changed jobs and no one turned the money off.
02:05:53.000So, the money would just keep going out.
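The "zombie payment" condition described here, a recurring payment whose original approver has since retired, died, or changed jobs, is easy to express as a data check. This is a purely illustrative sketch with made-up records and field names, not any real government schema:

```python
def find_zombie_payments(payments, staff):
    """Flag recurring payments whose approver is no longer active:
    'zombie payments' that keep going out because no one turned
    them off after the approver left."""
    active = {person["id"] for person in staff if person["active"]}
    return [p for p in payments
            if p["recurring"] and p["approver_id"] not in active]

# Hypothetical records, purely illustrative:
staff = [
    {"id": "emp-1", "active": False},  # retired, never offboarded from approvals
    {"id": "emp-2", "active": True},
]
payments = [
    {"payee": "Vendor A", "recurring": True,  "approver_id": "emp-1"},
    {"payee": "Vendor B", "recurring": True,  "approver_id": "emp-2"},
    {"payee": "Vendor C", "recurring": False, "approver_id": "emp-1"},
]
print(find_zombie_payments(payments, staff))  # only Vendor A
```

The fix implied by the conversation is the same join run in reverse: whenever an approver leaves, sweep for recurring payments still tied to their ID.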
02:05:56.000And it's a pretty rare – You go where?
02:08:37.000We know that one of two things must be true, that either there's a mistake in the computer or it's fraud.
02:08:45.000But if you have someone's birthday that's either in the future or implies they are older than the oldest living American, who is 114 years old.
02:08:53.000So, if they're more than 114 years old, there is either a mistake and someone should call them and say, I think we have your birthday wrong because it says you were born in 1786.
02:09:09.000And, you know, that was before, you know, before there was really an America, you know, it was like, you know, that's kind of early.
02:09:20.000You know, we're still fighting England type of thing.
02:09:24.000You know, it's like this person either needs to be in the Guinness Book of World Records or they're not alive.
02:09:32.000But still, at the end of the day, money is going towards that account that's connected to this person that is either nonexistent or dead.
02:09:39.000So, like, yeah, so there was like, I think, something like, I don't know, 20 million people in the Social Security Administration database that could not possibly be alive.
02:09:53.000If their birth date is, like, based on their birth date, they could not possibly be alive.
02:09:58.000And then to be clear, 20 million people that were receiving funds?
02:10:03.000A bunch of – most of them were not receiving funds.
02:10:13.000So the Social Security Administration database is used as a source of truth by all the other databases that the government uses.
02:10:21.000So even if they stop the payments on the Social Security Administration database, like unemployment insurance, small business administration, student loans, all check the Social Security Administration database to say, is this a legitimate, alive person?
02:10:38.000And the Social Security database will say, yes, this person is still alive even though they're 200 years old.
02:10:44.000But it forgets to mention that they're 200 years old.
02:10:45.000It just says – it just returns – when the computer is queried, it says, yes, this person is alive.
02:10:53.000And so then they're able to exploit the entire rest of the government ecosystem.
02:11:34.000And then the other systems check up on it: every other government payment and every other government payment system, like I said, Small Business Administration, student loans, Medicaid, Medicare, of which there are many.
02:11:49.000There are actually hundreds of government payment systems.
02:11:51.000That's going to be exploited so long as Social Security database says this person is alive.
02:12:02.000So then the rebuttal from the Dems is like, oh, well, the vast majority of the people who are marked as alive in the Social Security Administration weren't receiving Social Security Administration payments.
02:12:11.000What they forgot to mention is they're getting fraudulent payments from every other government program.
02:12:16.000And that's why the Dems were so opposed to declaring someone dead who was dead, because it would stop all the other fraud from happening.
02:12:27.000And so – but all this – is it trackable?
02:12:29.000Like all this other fraud, if they wanted to, they could chase it all down.
02:12:47.000Because it's very logical to – like I'm saying the most common-sense things possible.
02:12:54.000If someone's got a birthday in Social Security that is an impossible birthday, meaning they are older than the oldest living American or born in the future, then you should call them and say, excuse me, we seem to have your birthday wrong because it says that you're 200 years old.
02:13:18.000And then you would remove them from the Social Security database and make that number no longer available for all those other government payments.
02:13:37.000Like you don't need to be Sherlock Holmes here is what I'm saying.
02:13:41.000Well, this is – We don't need to call Sherlock Holmes for this one.
02:13:43.000Is this part of the – We just need to call the person and say, excuse me, we must have your birthday wrong because it says you're 200 years old or were born in the future.
02:13:56.000So could you tell us what your birthday is?
02:14:02.000But all these other government payments that are available that are connected to this Social Security number, it seems like if you just chased that all down, you would find the widespread fraud.
02:14:18.000But the root of the problem is the Social Security Administration database because the Social Security number in the United States is used as a de facto national ID number.
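The sanity check being described, flagging records whose recorded birth date is in the future or implies an age beyond the oldest living American, is simple to express in code. This is only an illustrative sketch with made-up records and hypothetical field names, not the Social Security Administration's actual schema:

```python
from datetime import date

MAX_PLAUSIBLE_AGE = 114  # roughly the oldest living American, per the conversation

def flag_impossible(records, today=None):
    """Return records whose birth date is in the future or implies an
    impossible age. Each flagged record is either a data-entry mistake
    or fraud, and should be followed up on."""
    today = today or date.today()
    flagged = []
    for rec in records:
        b = rec["birth_date"]
        # Age in whole years, accounting for whether the birthday has passed.
        age = today.year - b.year - ((today.month, today.day) < (b.month, b.day))
        if b > today or age > MAX_PLAUSIBLE_AGE:
            flagged.append(rec)
    return flagged

# Made-up example records (hypothetical SSNs):
records = [
    {"ssn": "000-00-0001", "birth_date": date(1786, 7, 4)},   # impossibly old
    {"ssn": "000-00-0002", "birth_date": date(2099, 1, 1)},   # born in the future
    {"ssn": "000-00-0003", "birth_date": date(1950, 3, 15)},  # plausible
]
print(flag_impossible(records, today=date(2025, 11, 1)))
```

Because, as noted above, other systems treat this database as a source of truth, any record this check flags would also need its number invalidated downstream, not just marked internally.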
02:15:02.000Well, you were very reluctant last time you were here to talk about the extent of some of the fraud because you're like, they could kill me because this is kind of – Oh, yeah. What I'm saying is that, to be pragmatic and realistic, you actually can't manage to zero fraud.
02:15:29.000You can manage to a low fraud number, but not to zero fraud.
02:15:32.000If you manage to zero fraud, you're going to push so many people over the edge who are receiving fraudulent payments that the number of inbound homicidal maniacs will be really hard to overcome.
02:15:45.000So I'm actually taking, I think, quite a reasonable position, which is that we should simply reduce the amount of fraud, which I think is not an extremist position.
02:15:56.000And we should aspire to, you know, have less fraud over time.
02:16:01.000Not that we should be ultra draconian and eliminate every last scrap of fraud, which I guess would be nice to have.
02:16:10.000But like we don't even need to go that extreme.
02:16:12.000I'm saying we should just stop the blatant large-scale super obvious fraud.
02:16:20.000And so what was the most shocking pushback that you got when you started implementing Doge, when you started investigating into where money was going?
02:16:33.000Well, I guess this is – I should have anticipated this.
02:16:38.000But while most of the fraudulent government payments, especially to the NGOs, go to the Democrats, like, I don't know, for argument's sake, let's say 80 percent.
02:16:55.00010 to 20 percent of it does go to Republicans.
02:16:58.000And so when we turn off funding to a fraudulent NGO, we'd get complaints from whatever, the 10 percent of Republicans who are receiving the money.
02:17:08.000And they would, you know, they would very loudly complain.
02:17:15.000Because the honest answer is the Republicans are partly – they're receiving some of the fraud too.
02:17:43.000I mean the whole uniparty criticism has some validity to it.
02:17:47.000You know, so if you turn off fraudulent payments, like I said, it's not like 100 percent of those payments were going to Democrats.
02:17:58.000A small percentage were also going to Republicans.
02:18:01.000Those Republicans complained very loudly.
02:18:06.000And, you know, and that's – so there was a lot of pushback on the Republican side when we started cutting some of these funds.
02:18:18.000And I tried telling them like, well, you know, 90 percent of the money is going to your opponents.
02:18:23.000But they still – even if they're getting 10 percent of the money – They want their piece.
02:19:08.000I mean, I think if you stand back and think about the stuff I'm saying here for a second, you go, oh, yeah, that makes sense.
02:19:19.000It's not like – it's not like one political party is going to be, you know, pure devil or pure angel.
02:19:28.000You know, I think there's much more corruption on the Democrat side, but there's still some corruption on the Republican side.
02:19:36.000How did it happen that the majority of the corruption wound up being on the Democrat side?
02:19:41.000Well, because the transfer payments, especially to illegals, are very much on the Democrat side.
02:19:49.000So that's the root of it all is the illegal situation.
02:23:18.000Well, the Department of Education was created relatively recently, under Jimmy Carter, and our educational results have gone downhill ever since.
02:23:31.000So if you create a department and the result of creating that department is a massive decline in educational results and it's the Department of Education, you're better off not having it.
02:23:41.000Because literally we did better before there was one than after.
02:24:32.000Well, I mean I'm a small government guy.
02:24:35.000So, you know, when the country was created, we just had the Department of State, Department of War, you know, and sort of the Department of Justice.
02:24:50.000We had an attorney general and Treasury Department.
02:24:56.000I don't know why you need more than that.
02:25:00.000So what other departments specifically do you think are just completely ineffective?
02:25:05.000Well, I mean here it's like a question – it's a sort of philosophical question of how much government do you think there should be?
02:26:31.000Basically, you just want people to work on things that are productive.
02:26:35.000You want people to work on building things, on building – providing products and services that people find valuable, like making food, being a farmer or a plumber or electrician or just anyone who's a builder or providing useful services.
02:26:57.000And that's what you want people to be doing, not fake government jobs that don't add any value or may subtract value.
02:27:08.000But also, to illustrate the absurdity of how the economy is measured: the way economists measure the economy is nonsensical.
02:27:20.000Because they'll count any job, even if that job is a dumb job that has no point and is even counterproductive.
02:27:26.000So like the joke is like there's two economists going on a hike in the woods.
02:27:33.000They come across a pile of shit and one economist says to the other, I'll pay you $100 to eat that shit.
02:27:42.000The economist eats the shit, gets the $100.
02:27:45.000Then the other – then they come across another pile of shit and the other economist says, now, I'll pay you $100 to eat the pile of shit.
02:27:55.000So he pays the other economist $100 to eat the pile of shit.
02:28:19.000And they said, no, no, but think of the economy because that's $200 in the economy.
02:28:26.000So basically, eating shit would count as a job.
02:28:37.000This is to illustrate the absurdity of economics.
02:28:43.000One of the things you said when you – Eating shit should not count as a job.
02:28:46.000One of the things you said when you stepped away is that you're kind of done and that it's unfixable.
02:28:54.000Or under its current form, the way people are approaching it.
02:29:02.000You can make it directionally better but ultimately you can't fully fix the system.
02:29:11.000So it would be accurate to say that unless you could go super draconian, like Genghis Khan level, on cutting waste and fraud, which you can't really do in a democratic country, an aspirationally democratic country, then there's no way to solve the debt crisis.
02:29:39.000So we've got national debt that's just insane, where the interest payments on the debt exceed our entire military budget.
02:29:47.000I mean that was one of the wake-up calls for me.
02:30:07.000So even if you implement all these savings, you're only delaying the day of reckoning for when America goes bankrupt.
02:30:15.000So – unless you go full Genghis Khan, which you can't really do.
02:30:20.000So I came to the conclusion that the only way to get us out of the debt crisis and to prevent America from going bankrupt is AI and robotics.
02:30:36.000So like we need to grow the economy at a rate that allows us to pay off our debt.
02:30:51.000And I guess people just generally don't appreciate the degree to which the government overspending is a problem.
02:31:01.000But even the Social Security website, this is under the Biden administration.
02:31:06.000On the website, it would say that, based on current demographic trends and how much money Social Security is bringing in versus how many Social Security recipients there are, because we have an aging population.
02:31:19.000Relatively speaking, the average age is increasing.
02:31:22.000Social Security will not be able to maintain its full payments.
02:31:30.000So Social Security will have to start reducing the amount of money that's being paid to people in about seven years.
02:31:38.000And so the only way to fix that, robotics, manufacturing, raise GDP?
02:31:45.000You've got to basically massively increase economic output, and the only way to do that is AI and robotics.
02:31:55.000So basically, we're going bankrupt without AI and robotics even with a bunch of savings.
02:32:02.000The savings, like reducing waste and fraud, can give us a longer runway, but they cannot ultimately pay off our national debt.
02:32:10.000So what do you think the solution is to the jobs that are going to be lost because of AI and robotics, the jobs due to automation, the jobs due to – no longer do we need human beings to do these jobs because AI is doing them?
02:32:24.000Do you think it's going to be some sort of a universal basic income thing?
02:32:27.000Do you think there's going to be some other kind of solution that has to be implemented?
02:32:34.000Because a lot of people are going to be out of work, right?
02:32:38.000I think there will be actually a high demand for jobs but not necessarily the same jobs.
02:32:47.000So, I mean, this is actually – this process has been happening throughout modern history.
02:32:55.000I mean, doing calculations manually with a pencil and paper used to be a job.
02:33:03.000So they used to have buildings full of people called computers, where all you do all day at the bank is calculations, because they didn't have computers.
02:33:14.000They didn't have digital computers; people did the calculations.
02:33:19.000Well, it was just people who would add and subtract stuff on a piece of paper, and that would be how banks would do financial processing.
02:33:27.000And you'd have to literally go over their equations to make sure the books are balanced.
02:34:03.000At an accelerated rate, because it's going to be – It's accelerated. It's just happening.
02:34:07.000Like I said, AI is the supersonic tsunami.
02:34:12.000So that's why I call it, supersonic tsunami.
02:34:17.000So – It's like what other jobs will be available that aren't available now because of AI?
02:34:26.000Well, AI is really still digital.
02:34:30.000Ultimately, AI can improve the productivity of humans who build things with their hands or do things with their hands.
02:34:37.000Like literally welding, electrical work, plumbing, anything that's physically moving atoms, like cooking food or farming or – like anything that's physical, those jobs will exist for a much longer time.
02:34:58.000But anything that is digital, which is like just someone at a computer doing something, AI is going to take over those jobs like lightning.
02:35:50.000I mean we actually do have a shortage of truck drivers, but there's actually – Well, that's why California has hired so many illegals to do it.
02:36:00.000I mean the problem is like when people don't know how to drive a semi-truck, which is actually a hard thing to do, then they crash and kill people.
02:36:09.000A friend of mine's wife was killed by an illegal driving a truck and she was just out biking and there was an illegal – he didn't know how to drive the truck or something.
02:36:25.000So I mean the thing is like for something – like you can't let people drive sort of an 80,000-pound semi if they don't know how to do it.
02:36:41.000But in California, they're just letting people do it.
02:36:45.000Well, they also need – they want the votes and that kind of thing.
02:36:50.000But yeah, like cars are going to be autonomous.
02:36:56.000But there's just so many desk jobs where really what people are doing is they're processing email or they're answering the phone.
02:37:05.000And anything that isn't moving atoms – like anything that is not doing physical work – that will obviously be the first thing.
02:37:14.000Those jobs will be and are being eliminated by AI at a very rapid pace.
02:37:23.000And ultimately, working will be optional because you'll have robots plus AI and we'll have, in a benign scenario, universal high income.
02:37:38.000Not just universal basic income, universal high income, meaning anyone can have any products or services that they want.
02:37:46.000But there will be a lot of trauma and disruption along the way.
02:37:49.000So you anticipate that the economy will be boosted to such an extent that a high income would be available to almost everybody.
02:38:25.000So that's why it's like – I'm like really banging the drum on AI needs to be maximally truth-seeking.
02:38:33.000Like, don't force AI to believe a lie – for example, that the founding fathers were actually a group of diverse women, or that misgendering is worth a nuclear war.
02:38:43.000Because if that's the case and then you get the robots and the AI becomes omnipotent, it can enforce that outcome.
02:38:54.000And then – unless you're a diverse woman, you're out of the picture.
02:39:13.000So that would be – that's the worst possible situation.
02:39:17.000So what would be the steps that we would have to take in order to implement the benign solution where it's universal high income?
02:39:27.000Like best case scenario, this is the path forward to universal high income for essentially every single citizen that the economy gets boosted by AI and robotics to such an extent that no one ever has to work again.
02:39:44.000And what about meaning for those people, which is – which gets really weird?
02:40:06.000Well, I mean, I guess I've – you know, I've been a voice saying, hey, we need to slow down AI.
02:40:16.000We need to slow down all these things.
02:40:19.000And we need to, you know, not have a crazy AI race.
02:40:23.000I've been saying that for a long time, for 20 plus years.
02:40:27.000But then I came to realize that really there's two choices here, either be a spectator or a participant.
02:40:34.000And if I'm a spectator, I can't really influence the direction of AI.
02:40:40.000But if I'm a participant, I can try to influence the direction of AI and have a maximally truth-seeking AI with good values that loves humanity.
02:40:48.000And that's what we're trying to create with Grok at xAI.
02:40:53.000And, you know, the research is, I think, bearing this out.
02:40:55.000Like I said, when they compared how AIs weight the value of a human life, Grok was the only one – the only one of the AIs – that weighted human lives equally.
02:41:08.000And didn't say like a white guy's worth one-twentieth of a black woman's life.
02:41:15.000Literally, that's the calculation they came up with.
02:42:38.000There has been encroachment on their environment, but we actually try to preserve the chimps and gorilla habitats.
02:42:49.000And I think in a good scenario, AI would do the same with humans.
02:42:54.000It would actually foster human civilization and care about human happiness.
02:43:01.000So this is a thing to try to achieve, I think.
02:43:07.000But what does the landscape look like if you have Grok competing with open AI, competing with all these different – like, how does it work?
02:43:17.000Like, what – if you have AIs that have been captured by ideologies that are side-by-side competing with Grok, like, how do we – so this is one of the reasons why you felt like it's important to not just be an observer, but participate and then have Grok be more successful and more potent than these other applications.
02:43:44.000As long as there's at least one AI that is maximally truth-seeking, curious, and, for example, weighs all human lives equally and does not favor one race or gender, then people are able to look at Grok at xAI, compare it, and say, wait a second, why are all these other AIs being basically sexist and racist?
02:44:15.000And then that causes some embarrassment for the other AIs, and then they improve.
02:44:23.000They tend to improve just in the same way that acquiring Twitter and allowing the truth to be told and not suppressing the truth forced the other social media companies to be more truthful.
02:44:37.000In the same way, having Grok be a maximally truth-seeking, curious AI will force the other AI companies to also be more truth-seeking and fair.
02:44:51.000And the funniest thing is, even though the socialists and the Marxists are in opposition to a lot of your ideas, if this gets implemented and you really can achieve universal high income, that's the greatest socialist solution of all time.
02:45:08.000Like literally no one will have to work.
02:45:13.000Like I said, there is a benign scenario here, which I think people will probably be happy with as long as we achieve it, which is sustainable abundance – like, if you ask people, what's the future that you want?
02:45:32.000And I think a future where we haven't destroyed nature, like you can still – we have the national parks, we have the Amazon rainforest, it's still there.
02:45:52.000Everyone has whatever goods and services they want.
02:45:55.000It kind of sounds like heaven, basically.
02:45:57.000It is like the ideal socialist utopia.
02:46:01.000And this idea that the only thing you should be doing with your time is working in order to pay your bills and feed yourself sounds kind of archaic considering the kind of technology that's at play.
02:46:14.000Like a world where that's not your concern at all anymore.
02:49:12.000I mean, I guess if you want to, say, read a science fiction book – or some books that are probably inaccurate, or the least inaccurate, version of the future – I'd recommend the Iain Banks books, the Culture books.
02:49:27.000It's not actually a series; it's sci-fi books about the future that are generally called the Culture books – Iain Banks's Culture books.
02:50:25.000Well, I mean, I often ask people, what is the future that you want?
02:50:34.000And they have to think about it for a second.
02:50:36.000Because, you know, they're usually tied up in whatever the daily struggles are, but you say, what is the future that you want?
02:50:44.000And generally it's sustainable abundance – or at least you say, what about a future where there's sustainable abundance?
02:50:51.000And it's like, oh yeah, that's a pretty good future.
02:50:54.000So, you know, that future is attainable with AI and robotics.
02:51:03.000But, like I said, not every path is a good path.
02:51:10.000But I think if we push it in the direction of maximally truth-seeking and curious, then I think AI will want to take care of humanity and foster humanity.
02:51:34.000And if it hasn't been programmed to think that, like, all straight white males should die – which Gemini was basically programmed to do, at least at first; they seem to have fixed that, hopefully.
02:51:49.000But don't you think culturally, like, oh, we're getting away from that mindset and that people are realizing how preposterous that all is?
02:51:59.000So, we are getting there – at least the AI mostly knows to hide things now.
02:52:05.000But like I said, I think I still have that – or I had that – as my pinned post on X, which was like, hey, wait a second, guys.
02:52:14.000We still have every AI except Grok saying that basically straight white males should die.
02:52:21.000And this is a problem and we should fix it.
02:52:24.000But simply me saying that tends to generally result in them going, ooh, that is kind of bad.
02:52:36.000Maybe we should not have all straight white males die.
02:52:39.000I think they'd have to say also that all straight Asian males should die as well.
02:52:45.000Generally, the AI and the media – which back in the day, the media was racist against black people
02:53:00.000and sexist against women back in the day; now it is racist against white people and Asians, and sexist against men.
02:53:09.000So are they just, like, being racist and sexist?
02:53:13.000I think they just want to change the target.
02:53:15.000But really they just shouldn't be racist and sexist at all.
02:53:22.000You know, yeah, ideally that would be nice.
02:53:26.000And it's kind of crazy that we were kind of moving in that general direction until around 2008,
02:53:30.0002012, and then everything ramped up online, and everybody was accused of being a Nazi and everybody was transphobic and racist and sexist and homophobic, and everything got exaggerated to the point where it was this wild witch hunt where everyone was a Columbo looking for racism.
02:54:24.000So that's just a consistency question.
02:54:28.000So, you know, if it's okay to be proud of one religion, it should be okay to be proud of, I guess, all religions, provided that they're not, like, oppressive.
02:55:27.000But, like, you can't simultaneously say that there's systemic racist oppression, but also that races don't exist.
02:55:46.000You know, you also can't say that anyone who steps foot in America is automatically an American, except for the people that originally came here.
02:56:07.000Like, if as soon as you step foot in a place, you are just as American as everyone else, then that would have applied.
02:56:15.000If you apply that consistently, then the original white settlers were also just as American as everyone else.
02:56:25.000One more thing that I have to talk to you about before you leave is the rescuing of the people from the space station, which we talked about
02:56:34.000when you were planning it the last time you were here. The lack of coverage that that got in mainstream media was one of the most shocking things.
02:56:55.000Well, they'd probably still be alive, but they'd be having bone density issues because of prolonged exposure to zero gravity.
02:57:03.000Well, they were already up there for like eight months, right?
02:57:21.000But for political reasons, they did not want SpaceX or me to be associated with returning the astronauts before the election.
02:57:39.000But even though you did do it and you did it after the election, it received almost no media coverage anyway.
02:57:45.000Because nothing good can – the media, the legacy mainstream media, is essentially a propaganda machine.
02:57:53.000And so any story that is positive about someone who is not part of the sort of far-left tribe will not get any coverage.
02:58:04.000So I could save a busload of orphans and it wouldn't get a single news story.
02:58:13.000It was nuts to watch because even though it was discussed on podcasts and it was discussed on X and it was discussed on social media, it's still, it was a blip in the news cycle.
02:58:27.000And because it was a successful launch and you did rescue those people and nobody got hurt and there was nothing really to, there was no blood to talk about.
02:58:39.000Well, as you saw firsthand with the Starship launch – Starship is, at least some would consider it to be, the most amazing engineering project that's happening on Earth right now, outside of maybe AI and robotics.
02:58:59.000But certainly in terms of a spectacle to see, the most spectacular thing happening on Earth right now is the Starship launch program, which anyone can go and see if they just go to South Texas – they can just rent a hotel room, low cost, in South Padre Island or in Brownsville.
02:59:19.000And you can see the launch and you can drive right, right past the factory because it's on a public highway.
02:59:25.000But it gets no coverage, or what coverage it does get
02:59:30.000is, like, "rocket blew up" coverage.
02:59:35.000The program is vastly, vastly more capable than the entire Apollo moon program – vastly more capable.
02:59:45.000This is a spaceship that is designed to make life multi-planetary, to carry, uh, millions of people across the heavens to another planet.
02:59:59.000The Apollo program could only send astronauts to the moon for a few hours at a time.
03:00:07.000Like, the entire Apollo program could only send astronauts to visit the moon very briefly, for a few hours, and then depart.
03:00:15.000The Starship program could create an entire lunar base with a million people.
03:00:24.000The magnitudes are different, very different magnitudes here.
03:00:30.000So what was the political resistance though?
03:00:47.000Well, I mean, you know, I've raised this a few times, but I was told instructions came from the White House that there should be no attempt to rescue before the election.
03:01:33.000You cannot rescue them because politically this is a bad hand of cards.
03:01:39.000I mean, they didn't say it was because politically it's a bad hand of cards, but they just said they were not interested in any rescue operation before the election.
03:02:00.000Because Biden could have authorized it and they could have said the Biden administration is helping bring those people back, throw you a little funding, give you some money to do it.
03:02:09.000The Biden administration, they funded these people being returned.
03:02:13.000Yeah, the Biden administration was not exactly my best friend, especially after I, you know, helped Trump get elected – which, I mean, some people still think, you know, Trump is like the devil, basically.
03:02:32.000And I mean, I think Trump actually – he's not perfect, but he's not evil.
03:02:59.000If you look at the amount of negative coverage, like one of the things that I looked at the other day was mainstream media coverage of you, Trump, a bunch of different public figures.
03:03:12.000And then it was 96% negative or something crazy.
03:03:14.000And then Mamdani, which is like 95% positive.
03:04:19.000Um, and then you have the young kids who are like, finally, socialism.
03:04:23.000Um, yeah, they, they don't know what they're talking about, obviously.
03:04:29.000So, you know, you just look at this and say, how many boats come from Cuba to Florida?
03:04:37.000And how many boats – I always think, like, how many boats are accumulating on the shores of Florida coming from Cuba?
03:05:30.000And, like, an obvious way you can tell which ideology is the bad one is: which ideology is building a wall to keep people in and prevent them from escaping?
03:06:42.000When you say Mamdani's a swindler, I know he has a bunch of fake accents that he used to use, but what else has he done that makes him a swindler?
03:06:55.000Well, I guess if you say to any audience whatever that audience wants to hear, instead of having a consistent message, I would say that that is a swindly thing to do.
03:08:25.000I mean, hopefully the stuff he's said, you know, about government takeovers – like that all the stores should be run by the government, basically.
03:09:06.000I mean, the thing about, you know, communism is it was all bread lines and bad shoes.
03:09:13.000You know, do you want ugly shoes and bread lines?
03:09:16.000Because that's what communism gets you.
03:09:19.000It's going to be interesting to see what happens and whether or not they snap out of it and overcorrect and go to some Rudy Giuliani-type character next.
03:09:29.000Because it's been a long time since there was any sort of Republican leader there.
03:09:33.000We live in the most interesting of times because we face the – you know, simultaneously face civilizational decline and incredible prosperity.
03:10:05.000So, if Mamdani's policies are put into place, especially at scale, it would be a catastrophic decline in living standards, not just for the rich but for everyone.
03:10:19.000As has been the case with every socialist experiment or every – yeah.
03:10:29.000So, but then, as you pointed out, the irony is that, like, the ultimate capitalist thing of AI and robotics enabling prosperity for all and abundance of goods and services, actually the capitalist implementation of AI and robotics, assuming it goes down the good path, is actually what results in the communist utopia.
03:10:58.000Yeah, because fate is an irony maximizer.
03:11:22.000It's that everyone gets oppressed except for a very small minority of politicians who live a life of luxury.
03:11:29.000That's what's happened every time it's been done.
03:11:34.000So, but then the actual communist utopia, if everyone gets anything they want, will be achieved – if it is achieved, it will be achieved via capitalism because fate is an irony maximizer.
03:11:56.000I feel like we should probably end it on that.
03:12:11.000So, there's – I do have a theory of why – like, if simulation theory is true, then it is actually very likely that the most interesting outcome is the most likely because only the simulations that are interesting will continue.
03:12:53.000Like, in this reality that we live in, we run simulations all the time.
03:12:58.000Like, so when we try to figure out if the rocket's going to make it, we run thousands, sometimes millions of simulations just to figure out which path is the good path for the rocket and where can it go wrong, where can it fail.
03:13:17.000But when we do these – I'd say at this point, millions of – simulations of what can happen with the rocket, we ignore the ones where everything goes right, because we have to address the situations where it goes wrong.
03:13:37.000So, basically, and for AI simulations as well, like all these things, we keep the simulations going that are the most interesting to us.
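The filtering Musk describes – run many randomized flight simulations, discard the uneventful runs, and keep only the failures worth studying – can be sketched as a toy Monte Carlo loop. This is purely illustrative: the function names and the single "wind" parameter with its threshold are invented for the sketch, not anything SpaceX actually uses.

```python
import random

def simulate_flight(seed: int) -> dict:
    """Toy stand-in for one randomized flight simulation.

    Draws a perturbed 'wind' input and reports whether the
    (made-up) flight model stays within its limit.
    """
    rng = random.Random(seed)
    wind = rng.gauss(0.0, 10.0)   # randomized input for this run
    ok = abs(wind) < 25.0         # invented success criterion
    return {"seed": seed, "wind": wind, "ok": ok}

def failure_cases(n_runs: int) -> list[dict]:
    """Run n_runs simulations and keep only the 'interesting' ones:
    the failures that engineers actually need to study."""
    runs = (simulate_flight(seed) for seed in range(n_runs))
    return [run for run in runs if not run["ok"]]

failures = failure_cases(100_000)
print(f"{len(failures)} failures out of 100000 runs")
```

The uneventful majority is thrown away and only the rare out-of-limits runs survive, which is the same "keep what's interesting" selection the simulation-theory argument then borrows.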
03:13:50.000So, if simulation theory is accurate – if it is true, who knows – then the simulators will only – they will continue to run the simulations that are the most interesting.
03:14:05.000Therefore, from a Darwinian perspective, the only surviving simulations will be the most interesting ones.
03:14:12.000And in order to avoid getting turned off, the only rule is you must keep it interesting, because the boring simulations will be terminated.
03:14:24.000Are you still completely convinced that this is a simulation?
03:14:27.000I didn't say I was completely convinced.
03:14:29.000Well, you said it's like the odds of it not being are in the billions.
03:14:33.000Like I said, it's not completely because you're saying there's a chance.
03:14:38.000What are the odds that we're in base reality?
03:14:44.000Well, given that we're able to create increasingly sophisticated simulations, so if you think of, say, video games and how video games have gone from very simple video games like Pong with two rectangles and a square to video games today being photorealistic with millions of people playing simultaneously, and all of that has occurred in our lifetime.
03:15:08.000So, if that trend continues, video games will be indistinguishable from reality.
03:15:15.000The fidelity of the game will be such that you don't know if what you're seeing is a real video or a fake video.
03:15:24.000And like AI-generated videos at this point, like you can sometimes tell it's an AI-generated video, but often you cannot tell.
03:15:32.000And soon you will just not be able to tell.
03:15:36.000So, if that's happening in our direct observation, and we'll create millions if not billions of photorealistic simulations of reality, then what are the odds that we're in base reality versus someone else's simulation?
03:15:58.000Well, isn't it just possible that the simulation is inevitable, but that we are in base reality building towards a simulation?
03:16:25.000And especially as you apply AI in these video games, like the characters in the video games will be incredibly interesting to talk to.
03:16:31.000They won't just have a limited dialogue tree where, if you go to, like, the crossbow merchant and you try to talk about any subject except buying a crossbow, they just want to talk about selling you a crossbow.
03:16:43.000But with AI-based non-player characters, you'll be able to have an elaborate conversation with no dialogue tree.
03:16:50.000Well, that might be the solution for meaning for people.
03:16:53.000Just log in and you could be a fucking vampire and whatever.
03:17:25.000And especially playing a game where you're now no longer worried about like physical attributes, like athletics, like bad joints and hips and stuff like that.