The Joe Rogan Experience - October 31, 2025


Joe Rogan Experience #2404 - Elon Musk


Episode Stats

Length

3 hours and 18 minutes

Words per Minute

152.92

Word Count

30,301

Sentence Count

2,660

Misogynist Sentences

17

Hate Speech Sentences

40


Summary

This week, the boys talk about giant men and giant women, and the weirdest things you can do with a giant body, and how you can be a giant man and not look like one, too. Also, a lot of people think Giga Chad is a CGI guy, but he's probably not.


Transcript

00:00:01.000 Joe Rogan podcast, check it out!
00:00:03.000 The Joe Rogan experience.
00:00:06.000 Train by day, Joe Rogan, podcast by night, all day.
00:00:12.000 Exactly.
00:00:13.000 Just every morning.
00:00:15.000 What about what Jeff Bezos is doing?
00:00:17.000 He's definitely doing something, damn.
00:00:20.000 He looks jacked.
00:00:21.000 He looks jacked, right?
00:00:22.000 Yeah, but he's like.
00:00:24.000 Quick.
00:00:28.000 He got jacked.
00:00:30.000 At age 59, in less than a year, he went from pencil-neck geek to looking like a miniature version of The Rock.
00:00:38.000 Yeah, like a little miniature alpha fella.
00:00:40.000 Yeah.
00:00:41.000 Like his neck got bigger than his head.
00:00:42.000 Yeah.
00:00:44.000 But then like his earlier pictures, his neck's like a noodle.
00:00:46.000 I support this activity.
00:00:48.000 I like to see him going in this direction.
00:00:49.000 Which is fine.
00:00:50.000 And his voice dropped like two octaves.
00:00:52.000 I want you to move in that direction as well.
00:00:53.000 I think we can achieve this.
00:00:55.000 I mean, I think we can achieve Giga Chad.
00:01:01.000 That's what people call this.
00:01:02.000 Where is that guy?
00:01:03.000 Beeple?
00:01:04.000 I don't know where he is.
00:01:06.000 That's like a real guy.
00:01:07.000 Yeah.
00:01:07.000 The artist?
00:01:08.000 No, Giga Chad.
00:01:09.000 Oh, Giga Chad.
00:01:10.000 Yeah.
00:01:11.000 I don't know if that's a real guy.
00:01:12.000 It's hard to tell.
00:01:12.000 No, no, it is a real guy.
00:01:13.000 It is a real guy?
00:01:14.000 He's got the crazy jaw and like perfect sculpted hair.
00:01:17.000 Yeah.
00:01:18.000 Well, I mean, they may have exaggerated a little bit, but no, I think he actually just kind of looked like that in reality.
00:01:26.000 So he's a pretty unique looking individual.
00:01:30.000 I think we can achieve this.
00:01:31.000 That guy right there, that's a real guy.
00:01:34.000 That's real dude.
00:01:35.000 I always thought that was CGI.
00:01:39.000 I think the upper right one is not him.
00:01:41.000 That's not real.
00:01:42.000 That one to the left of that?
00:01:43.000 Like, that's real?
00:01:44.000 No, that's artificial, bro.
00:01:46.000 That's fake.
00:01:47.000 That's got that uncanny valley feel to it.
00:01:49.000 Doesn't it?
00:01:51.000 It's not impossible.
00:01:52.000 No.
00:01:53.000 No, it's not impossible to achieve.
00:01:54.000 But it's not possible to maintain that kind of leanness.
00:01:57.000 No, no.
00:02:00.000 At that point, he's dehydrating and all sorts of things.
00:02:04.000 Oh, it's based on a real person.
00:02:05.000 Yeah, yeah.
00:02:07.000 Right, but it's not a real person.
00:02:08.000 What does he really look like?
00:02:09.000 Those images, I think, are bullshit.
00:02:12.000 Some of them are real.
00:02:13.000 Is that real?
00:02:14.000 Okay, that looks real.
00:02:15.000 That looks like a really jacked bodybuilder.
00:02:17.000 Yeah.
00:02:18.000 Yeah, that looks real.
00:02:19.000 Like, that's achievable.
00:02:20.000 But there's a few of those images where you're just like, what's going on here?
00:02:24.000 Yeah, yeah, yeah, totally.
00:02:26.000 Well, I mean, you see it?
00:02:29.000 Is that the Icelandic guy?
00:02:30.000 That's the real dude.
00:02:31.000 Well, that Icelandic dude who's Thor?
00:02:34.000 Oh, yeah, the guy who jumps in the frozen lakes and shit.
00:02:37.000 Well, the guy who played the mountain.
00:02:39.000 Oh, that guy.
00:02:40.000 Yeah.
00:02:42.000 That is like a mutant strong human.
00:02:44.000 Yes.
00:02:45.000 Like he would be in like the X-Men or something, you know?
00:02:45.000 Yeah.
00:02:48.000 I mean, he's just, like, and there's that, you know, have you seen that meme, tent and tent bag?
00:02:56.000 You know how it's really hard to get the tent in the tent bag?
00:02:59.000 Oh, right, right.
00:03:04.000 That's true.
00:03:05.000 And then there's a picture of him and his girlfriend.
00:03:08.000 Oh, right.
00:03:08.000 Tent bag.
00:03:09.000 That's hilarious.
00:03:13.000 I don't know how it gets in there.
00:03:15.000 It seems too small.
00:03:16.000 I met Brian Shaw.
00:03:18.000 Brian Shaw is like the world's most powerful man.
00:03:21.000 And he's almost seven feet tall.
00:03:23.000 He's 400 pounds.
00:03:26.000 And his bone density is one in 500 million people.
00:03:31.000 So there's, like, maybe 16 people like him on Earth.
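A quick sanity check on that rarity figure, as a sketch in Python (the ~8 billion world population is an assumption, not from the conversation):

```python
# Back-of-the-envelope: if a trait occurs in 1 in 500 million people,
# how many such people exist in a world of ~8 billion? (population is an assumption)
world_population = 8_000_000_000
rarity = 500_000_000  # 1 in 500 million
print(world_population / rarity)  # -> 16.0, matching the "maybe 16 people" estimate
```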
00:03:35.000 He's an enormous human being.
00:03:38.000 Like a legitimate giant, just like that guy.
00:03:41.000 But we met him.
00:03:42.000 He was hanging out with us in the green room of the mothership.
00:03:44.000 It's like, okay, if this was like David and Goliath days, this is an actual giant, like the giants of the Bible.
00:03:50.000 Once in a while, you get a super giant.
00:03:51.000 This is a real one.
00:03:53.000 Like, not a tall, skinny basketball player.
00:03:55.000 Yeah, yeah.
00:03:55.000 Like a seven foot, 400-pound power lifter.
00:03:59.000 Like, you don't want to, especially.
00:04:00.000 That's the guy.
00:04:01.000 See, if there's a photo of him standing next to, like, a regular human.
00:04:04.000 I was trying to get this.
00:04:05.000 Yeah.
00:04:05.000 There it is.
00:04:06.000 That's him right there.
00:04:06.000 Like, there's one of him standing next to Arnold and stuff.
00:04:10.000 Yeah.
00:04:10.000 It's where everyone just looks tiny.
00:04:13.000 I mean, I think he's a pretty cool dude, actually.
00:04:15.000 Oh, Brian's very cool.
00:04:16.000 Very smart, too.
00:04:17.000 Unusual.
00:04:18.000 You know, you'd expect anybody that big, it's got to be a moron.
00:04:21.000 Yeah.
00:04:22.000 Yeah.
00:04:22.000 No.
00:04:24.000 There's Andre the Giant who was awesome.
00:04:26.000 Yeah.
00:04:27.000 He was great in Princess Bride.
00:04:28.000 No, he was just awesome, period.
00:04:30.000 Yeah, yeah.
00:04:31.000 So we were talking about this interview with Sam Altman and Tucker.
00:04:36.000 And I was like, we should probably just talk about this on the air.
00:04:39.000 Because it is one of the craziest interviews I think I've ever seen in my life.
00:04:43.000 Where Tucker starts bringing up this guy who was a whistleblower or whatever.
00:04:43.000 Yeah.
00:04:48.000 A whistleblower who, you know, committed suicide, but it doesn't look like it.
00:04:53.000 And he's talking to Sam Altman about this.
00:04:56.000 And Sam Altman was like, are you accusing me?
00:04:58.000 He's like, no, no, no.
00:05:00.000 I'm not.
00:05:01.000 I'm just saying I think someone killed him.
00:05:03.000 Yeah.
00:05:04.000 And it should be investigated.
00:05:06.000 Yeah.
00:05:07.000 Not just drop the case.
00:05:09.000 It seems like.
00:05:10.000 They just dropped the case.
00:05:11.000 Yeah, yeah.
00:05:11.000 But his parents think he was murdered.
00:05:14.000 The wires to a security camera were cut.
00:05:17.000 Blood in two rooms.
00:05:18.000 Blood in two rooms.
00:05:19.000 Someone else's wig was in the room.
00:05:21.000 Someone else's wig.
00:05:22.000 Wig.
00:05:22.000 Wig.
00:05:23.000 Not normal.
00:05:23.000 Yes.
00:05:24.000 Not his wig.
00:05:25.000 Not normal to have a wig laying around.
00:05:27.000 Yes.
00:05:29.000 And he ordered DoorDash right before allegedly committing suicide.
00:05:33.000 Yeah.
00:05:34.000 Which seems unusual.
00:05:37.000 It's like, you know, I'm going to order pizza on second thoughts, I'll kill myself.
00:05:37.000 Yeah.
00:05:41.000 It seems like that's a very rapid change in mindset.
00:05:45.000 It's very weird.
00:05:46.000 And especially, the parents don't believe he committed suicide at all.
00:05:51.000 There was no note or anything.
00:05:52.000 No.
00:05:53.000 It seems pretty fucked up.
00:05:54.000 And, you know, the idea that a whistleblower for an enormous AI company that's worth billions of dollars might get whacked, that's not beyond the pale.
00:06:04.000 I mean, it's straight out of a movie.
00:06:06.000 Right out of a movie, but right out of a movie is real sometimes.
00:06:08.000 Yeah, right.
00:06:09.000 Exactly.
00:06:11.000 It's a little weird. I think they should do a proper investigation.
00:06:15.000 Like, what's the downside of a proper investigation?
00:06:17.000 Right.
00:06:18.000 No.
00:06:19.000 Yeah.
00:06:19.000 For sure.
00:06:20.000 But the whole exchange is so bizarre.
00:06:23.000 Yeah, yeah.
00:06:24.000 Sam Altman's reaction to being accused of murder is bizarre.
00:06:28.000 Look, I don't know if he is guilty, but it's not possible to look more guilty.
00:06:34.000 So I'm like.
00:06:35.000 Or look more weird.
00:06:37.000 Yeah.
00:06:38.000 You know, maybe it's just his social thing.
00:06:42.000 Like, maybe he's just odd with confrontation and it just goes blank.
00:06:47.000 You know?
00:06:48.000 But if somebody was accusing me of killing Jamie, like if Jamie was a whistleblower and Jamie got whacked, I'd be like, wait, what are you saying?
00:06:56.000 Are you accusing me of killing my friend?
00:06:57.000 Like, what the fuck are you talking about?
00:06:59.000 I would be a little bit more irate.
00:07:02.000 Yeah, yeah, exactly.
00:07:04.000 I would be a little upset.
00:07:07.000 Yeah, it'd be like, well, you'd certainly insist on a thorough investigation as opposed to trying to sweep it under the rug.
00:07:15.000 Yeah, I wouldn't assume that he got, that he committed suicide.
00:07:18.000 I would be suspicious.
00:07:19.000 If Tucker was telling me that aspect of the story, I'd be like, that does seem like a murder.
00:07:24.000 Fuck, we should look into this.
00:07:25.000 I mean, all signs point to it being a murder.
00:07:27.000 Not saying, you know, Sam Altman had anything to do with the murder, but...
00:07:30.000 Blood in two rooms.
00:07:31.000 It's blood in two rooms.
00:07:33.000 Yeah, there's the cut wires to the security camera, and the DoorDash being ordered right before the suicide.
00:07:38.000 No suicide note.
00:07:39.000 His parents think he was murdered.
00:07:41.000 And the people that I know who knew him said he was not suicidal.
00:07:48.000 So I'm like, why would you jump to the conclusion?
00:07:51.000 His parents sued the son's landlord, alleging the owners and the managers of their son's San Francisco apartment building were part of a widespread cover-up of his death.
00:07:59.000 The landlord?
00:08:00.000 Yeah, there's a bunch of weird stuff.
00:08:01.000 They said there were, like, packages missing from the building.
00:08:04.000 Some people said packages were still being delivered and all of a sudden they all disappeared.
00:08:07.000 Huh.
00:08:08.000 But that could be anything. People steal people's packages all the time.
00:08:11.000 The Porch Pirate situation.
00:08:13.000 Yeah.
00:08:13.000 Yeah.
00:08:14.000 They failed as a safeguard.
00:08:16.000 Also, I mean, the amount of trauma those poor parents have gone through with their son dying like that.
00:08:21.000 I mean, it must, God bless them.
00:08:24.000 And how could they stay sane after something like that?
00:08:27.000 They're probably so grief-stricken.
00:08:30.000 Who knows what they believe at this point?
00:08:32.000 Yeah.
00:08:34.000 You should have asked if Epstein killed himself.
00:08:38.000 Yeah.
00:08:38.000 That's the cash pot.
00:08:43.000 Trying to convince everybody of that.
00:08:45.000 The guards weren't there and the camera stopped working.
00:08:48.000 And, you know.
00:08:50.000 The guards were asleep.
00:08:52.000 The cameras weren't working.
00:08:54.000 He had a giant steroided-up bodybuilder guy that he was sharing a cell with, a murderer who was a bad cop.
00:09:02.000 Like all of it's kind of nuts.
00:09:04.000 All of it's kind of nuts.
00:09:05.000 Like that he would just kill himself rather than reveal all of his billionaire friends.
00:09:11.000 Yeah.
00:09:12.000 And then.
00:09:13.000 Did you see Tim Dylan talking to Chris Cuomo about this?
00:09:16.000 I did.
00:09:16.000 He liked the idea.
00:09:17.000 Chris Cuomo just looked so stupid.
00:09:20.000 Tim just listened off all the time.
00:09:22.000 Tim just like, I agree.
00:09:23.000 It is strange.
00:09:24.000 Like, of course it's strange, Chris.
00:09:26.000 Jesus Christ.
00:09:28.000 You can't just go with the tide.
00:09:30.000 You got to think things through.
00:09:32.000 And if you think that one through, you're like, I don't think he killed himself.
00:09:35.000 Nobody does.
00:09:36.000 You'd have to work for an intelligence agency to think he killed himself.
00:09:40.000 It does seem unlikely.
00:09:42.000 It seems highly unlikely.
00:09:45.000 Highly, highly unlikely.
00:09:47.000 All roads point to murder.
00:09:49.000 Yes.
00:09:49.000 Point to they had to get rid of him because he knew too much.
00:09:52.000 Whatever the fuck he was doing, whatever kind of an asset he was, whatever thing he was up to, you know, was apparently very effective.
00:10:01.000 Yes.
00:10:01.000 And a lot of people were compromised.
00:10:04.000 You see, your boy Bill Gates is now saying climate change is not a big deal.
00:10:08.000 Like, relax, everybody.
00:10:09.000 I know I scared the fuck out of you for the last decade and a half, but we're going to be fine.
00:10:16.000 Yeah.
00:10:17.000 I mean, you know, as I was saying just before coming into the studio, you know, like every day there's some crazy, wild new thing that's happening.
00:10:27.000 It feels like reality is accelerating.
00:10:29.000 It's every day, and every day it's like more and more ridiculous to the point where the simulation is more and more undeniable.
00:10:37.000 Yeah, yeah.
00:10:38.000 It really feels like simulation.
00:10:39.000 You know, it's like, come on.
00:10:40.000 What are the odds that this could be the case?
00:10:42.000 Are you paying attention at all to 3I/Atlas?
00:10:45.000 Oh, the comet?
00:10:47.000 Yeah, whatever it is.
00:10:48.000 Yeah, yeah.
00:10:49.000 I mean, I mean, one thing I can say is, like, look, if I was aware of any evidence of aliens, Joe, you have my word.
00:11:00.000 I will come on your show and I will reveal it on the show.
00:11:03.000 Okay.
00:11:04.000 That's a good deal.
00:11:04.000 Yeah.
00:11:05.000 Yeah.
00:11:05.000 It's pretty good.
00:11:06.000 Yeah, thank you.
00:11:06.000 I believe you.
00:11:08.000 I'll stick to it.
00:11:08.000 I keep my promises.
00:11:11.000 All right.
00:11:11.000 I'll hold you to that.
00:11:12.000 Yeah.
00:11:13.000 Yeah.
00:11:14.000 And I'm never committing suicide, to be clear.
00:11:16.000 I don't think you would either.
00:11:18.000 On camera, guys, I am never committing suicide ever.
00:11:22.000 If someone says you committed suicide, I will fight tooth and nail.
00:11:26.000 I will fight tooth and nail.
00:11:27.000 I will not believe it.
00:11:28.000 I will not believe it.
00:11:30.000 The thing about 3I/Atlas is it's a funny name, actually.
00:11:33.000 Yeah, it's a third eye.
00:11:35.000 It sounds like Third Eye or something.
00:11:36.000 Yeah, it does.
00:11:38.000 3I means it's only the third interstellar object that's been detected.
00:11:42.000 Okay.
00:11:43.000 Yeah.
00:11:43.000 Obviously.
00:11:45.000 The third eye Atlas.
00:11:46.000 Yeah.
00:11:47.000 Avi Loeb was on the podcast a couple days ago talking about it.
00:11:50.000 Yeah.
00:11:51.000 It could be on these.
00:11:51.000 I don't know.
00:11:52.000 Apparently, today they're saying that it's changed course.
00:11:56.000 Did you see that, Jamie?
00:11:58.000 Avi Loeb said something today.
00:12:00.000 I'll send it to you.
00:12:04.000 I know it's on Reddit.
00:12:05.000 Rapidly brightening zero explorer.
00:12:08.000 Here you go, Jamie.
00:12:09.000 I'll send it to you right now.
00:12:11.000 It's fascinating.
00:12:12.000 It's fascinating also because it's made almost entirely of nickel, whatever it is.
00:12:17.000 And the only way that exists here is industrial alloys, apparently.
00:12:23.000 No, there are definitely comets and asteroids that are made primarily of nickel.
00:12:30.000 Yeah, so the places where you mine nickel on Earth is actually where there was an asteroid or comet that hit Earth that was a nickel-rich asteroid.
00:12:40.000 Wow, nickel-rich.
00:12:41.000 It's a giant nickel-rich deposit.
00:12:43.000 Yeah, it's coming.
00:12:45.000 Those are from impacts.
00:12:47.000 You definitely didn't want to be there at the time because anything would have been obliterated.
00:12:50.000 But that's where the sources of nickel and cobalt are these days.
00:12:54.000 So this is Avi Loeb.
00:12:55.000 A few hours ago, the first hint of non-gravitational acceleration was indicated, meaning something other than gravity is affecting its trajectory.
00:13:05.000 Interesting.
00:13:06.000 Dun dun dun.
00:13:11.000 So it's mostly nickel, very little iron, which he was saying on Earth only exists in alloys.
00:13:21.000 But whatever, you know, you're dealing with another planet.
00:13:25.000 There are cases where there's very nickel-rich asteroids and meteorites.
00:13:29.000 That fits it.
00:13:29.000 And that's what it is.
00:13:30.000 For something from space.
00:13:31.000 Yeah, it'll be a very sort of heavy spaceship if you make it all out of nickel.
00:13:36.000 Oh, yeah.
00:13:38.000 And fucking huge.
00:13:39.000 The size of Manhattan and all nickel.
00:13:40.000 That's kind of nuts.
00:13:41.000 Yeah.
00:13:42.000 That's a heavy spaceship.
00:13:43.000 That's a real problem if it hits.
00:13:45.000 Yes.
00:13:46.000 No, it would like obliterate a continent type of thing.
00:13:48.000 Yeah.
00:13:49.000 Maybe, maybe worse.
00:13:50.000 That would probably kill most of human life.
00:13:53.000 If not all of us.
00:13:54.000 It depends on what the total mass is, but the thing is, in the fossil record, there are obviously five major extinction events, the biggest of which is the Permian extinction, where almost all life was eliminated.
00:14:11.000 That actually occurred over several million years.
00:14:15.000 There's the Jurassic.
00:14:16.000 I think Jurassic is, I think that one's pretty definitively an asteroid.
00:14:24.000 But there's been five major extinction events.
00:14:27.000 But what they don't count are really the ones that merely take out a continent.
00:14:31.000 Merely?
00:14:32.000 Yeah, because those don't really show up on the fossil record, you know.
00:14:36.000 Right.
00:14:37.000 So unless it's enough to cause a mass extinction event throughout Earth, it doesn't show up in a fossil record that's 200 million years old.
00:14:48.000 So, yeah, but there have been many impacts that would have destroyed all life on, let's say, half of North America or something like that.
00:15:00.000 There are many such impacts through the course of history.
00:15:03.000 Yeah, and there's nothing we can do about it right now.
00:15:06.000 Yeah, there was one that hit Siberia and destroyed, I think, a few hundred square miles.
00:15:14.000 Oh, that's the Tunguska.
00:15:16.000 Yeah, that's the one from the 1920s, right?
00:15:18.000 Yeah, that's the one that coincides with that meteor, that comet storm that we go through every June and every November, that they think is responsible for that Younger Dryas impact.
00:15:18.000 Yeah.
00:15:30.000 All that shit's crazy.
00:15:30.000 Yeah.
00:15:32.000 Thank you.
00:15:34.000 Before we go any further, for letting us have a tour of SpaceX.
00:15:38.000 You're welcome.
00:15:38.000 Letting us be there for the rocket launch.
00:15:40.000 Sure.
00:15:41.000 One of the absolute coolest things I've ever seen in my life.
00:15:45.000 And we thought it was only like, I thought it was a half a mile.
00:15:50.000 Jamie's like, it was a mile away.
00:15:51.000 Turn out it's almost two miles away, and you feel it in your chest.
00:15:55.000 Yeah, it's you have to wear earplugs and you feel it in your chest, and it's two miles away.
00:15:59.000 Yeah, it was fucking amazing.
00:16:02.000 And then to go with you up into the command center and to watch all the Starlink satellites with all the different cameras and all in real time as it made its way all the way to Australia.
00:16:13.000 How many minutes?
00:16:14.000 Like 35, 40 minutes?
00:16:16.000 Wild.
00:16:16.000 Yeah.
00:16:17.000 Watch it touch down in Australia.
00:16:19.000 Yeah.
00:16:20.000 Fucking crazy.
00:16:21.000 It was amazing.
00:16:22.000 Yeah, yeah.
00:16:23.000 Absolutely amazing.
00:16:24.000 Yeah, Starship's awesome.
00:16:26.000 And anyone can go watch the launch, actually.
00:16:28.000 So you can just go to South Padre Island and get a great view of the launch.
00:16:32.000 So it's like where a lot of spring breakers go.
00:16:36.000 But we'll be flying pretty frequently out of Starbase in South Texas.
00:16:40.000 And we formally incorporated it as a city.
00:16:42.000 So it's actually an actual legal city, Starbase, Texas.
00:16:47.000 It's not that often you hear like, hey, we made a city, you know?
00:16:51.000 In the old days, a startup would be, you go and gather a bunch of people and say, hey, let's go make a town.
00:16:57.000 Literally, that would have been a startup in the old days.
00:17:01.000 Or a country.
00:17:02.000 Yeah, or a country.
00:17:03.000 Yeah, yeah, actually.
00:17:04.000 If you tried doing that today, there'd be a real problem.
00:17:07.000 Yeah, there's so much set in stone on the countryfront these days.
00:17:10.000 You might be able to pull it off.
00:17:11.000 You might be able to pull it off.
00:17:12.000 If you've got a solid island, you might be able to pull it off.
00:17:17.000 Probably.
00:17:18.000 You know, like Lanai.
00:17:21.000 Yeah, you could probably.
00:17:22.000 Is this it?
00:17:23.000 If you put enough effort into it, you can make a new country.
00:17:25.000 This is one of the different ones.
00:17:26.000 This is one of the ones that you catch.
00:17:28.000 Or is that one?
00:17:28.000 Right?
00:17:29.000 Yeah, that's the booster.
00:17:30.000 So that's the super heavy booster.
00:17:32.000 So that's one with the booster's got 33 engines.
00:17:39.000 And, you know, by version four, that will have about 10,000 tons of thrust.
00:17:45.000 Right now, it's about 7,000, 8,000 tons of thrust.
00:17:48.000 That's the largest flying object ever made.
00:17:50.000 I had to explain to someone, they were going, why do they blow up all the time if you're so smart?
00:17:55.000 Because there was this fucking idiot on television.
00:17:58.000 Some guy was being interviewed, and they were talking about you.
00:18:01.000 And he goes, oh, I think he's a fuckwit.
00:18:04.000 And they go, why do you say he's a fuckwit?
00:18:05.000 Oh, his rockets keep blowing up.
00:18:06.000 And someone said, yeah, well, why do his rockets blow up?
00:18:08.000 And I had to explain.
00:18:10.000 Because it's the only way you find out what the tolerances are.
00:18:12.000 You have to.
00:18:13.000 You have to blow up.
00:18:16.000 So when you do a new rocket development program, you have to do what's called exploring the limits, the corners of the box, where you say it's like a worst case this, worst case that, to figure out where the limits are.
00:18:31.000 So you blow up, you know, admittedly in the development process, sometimes it blows up accidentally.
00:18:38.000 But we intentionally subject it to a flight regime that is much worse than what we expect in normal flight so that when we put people on board or valuable cargo, it doesn't blow up.
00:18:52.000 So for example, for the flight that you saw, we actually deliberately took heat shield tiles off the ship, off of Starship, in some of the worst locations to say, okay, if we lose a heat shield tile here, is it catastrophic or is it not?
00:19:10.000 And nonetheless, Starship was able to do a soft landing in the Indian Ocean, just west of Australia.
00:19:20.000 And it got there from Texas in like, I don't know, 35, 40 minutes type of thing.
00:19:24.000 So it landed even though you put it through this situation where it has compromised shield.
00:19:29.000 It had an unusually – we brought it in hot, like an extra hot trajectory with missing tiles to see if it would still make it to a soft landing, which it did.
00:19:42.000 Now, I just should point out, it did have, there were some holes that were burnt into it.
00:19:47.000 But it was robust enough to land despite having some holes.
00:19:53.000 Because it's coming in like a blazing meteor.
00:19:56.000 You can see the real-time video.
00:19:57.000 Well, tell me the speed again, because the speed was bananas.
00:19:59.000 You were talking about...
00:20:00.000 Yeah, it's like 17,000 miles an hour.
00:20:02.000 Which is...
00:20:03.000 I'll call it like...
00:20:03.000 Like 25 times the speed of sound or thereabouts.
00:20:09.000 So think about it.
00:20:10.000 It's like 12 times faster than a bullet from an assault rifle.
00:20:14.000 You know, a bullet from an assault rifle is around Mach 2.
00:20:17.000 And it's huge.
00:20:18.000 Yeah.
00:20:20.000 Yeah.
00:20:21.000 Or if you compare it to like a bullet from a 45 or 9 mil, which is subsonic, that's, you know, it'll be about 30 times faster than a bullet from a handgun.
00:20:32.000 30 times faster than a bullet from a handgun, and it's the size of a skyscraper.
00:20:36.000 Yes.
00:20:38.000 Yeah.
00:20:40.000 That's fast.
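A rough back-of-the-envelope check of those speed comparisons (the speed of sound and bullet velocities below are approximate assumptions, not figures from the conversation):

```python
# Rough sanity check of the reentry-speed comparisons.
# All reference speeds are approximate sea-level assumptions.
reentry_mph = 17_000          # quoted reentry speed
mach1_mph = 767               # speed of sound at sea level, ~20 C
rifle_mph = 2 * mach1_mph     # "around Mach 2" rifle bullet
handgun_mph = 560             # subsonic .45/9mm, roughly 250 m/s

print(reentry_mph / mach1_mph)    # ~22, in the ballpark of "25 times the speed of sound"
print(reentry_mph / rifle_mph)    # ~11, close to "12 times faster than a rifle bullet"
print(reentry_mph / handgun_mph)  # ~30, matching "30 times faster than a handgun bullet"
```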
00:20:41.000 It's so wild.
00:20:42.000 It's so wild to see, man.
00:20:45.000 It's so exciting.
00:20:47.000 The factory is so exciting, too, because genuinely, no bullshit.
00:20:51.000 I felt like I was witnessing history.
00:20:54.000 I felt like it was a scene in a movie where someone had expectations and they're like, what are they doing?
00:20:59.000 They're building rockets and you go there.
00:21:01.000 And as we were walking through, Jamie, you could speak to this too.
00:21:04.000 Didn't you have the feeling where you're like, oh, this is way bigger than I thought it was?
00:21:09.000 This is gigantic.
00:21:11.000 It's fucking crazy.
00:21:12.000 That's what she said.
00:21:14.000 The amount of rockets you're making.
00:21:21.000 Giga Chad in the house.
00:21:23.000 Way bigger than that.
00:21:25.000 It's a giant metal dick.
00:21:26.000 You're fucking fucking the universe with your giant metal dick.
00:21:29.000 Yeah, I mean.
00:21:29.000 That's what it is.
00:21:31.000 Yeah, it is very big.
00:21:32.000 And the sheer numbers of them that you guys are making.
00:21:35.000 And then this is a version, and you have a new updated version that's coming soon.
00:21:41.000 And what is that?
00:21:43.000 It's a little longer.
00:21:45.000 More pointy?
00:21:46.000 It's the same amount of pointy.
00:21:49.000 But it's got a bit more length.
00:21:52.000 The interstage, you see the interstage section with kind of like the grill area?
00:21:58.000 That's now integrated with the boost stage.
00:22:01.000 So we do what's called hot staging, where we light the ship engines while it's still attached to the booster.
00:22:08.000 So the booster engines are still thrusting.
00:22:12.000 The ship is still being pushed forward by the booster.
00:22:15.000 But then we light the ship engines, and the ship engines actually pull away from the booster, even though the booster engines are still firing.
00:22:22.000 Whoa.
00:22:23.000 So it's blasting flame through that grill section, but we integrate that grill section into the boost stage with the next version of the rocket.
00:22:34.000 And next version of the rocket will have the Raptor 3 engines, which are a huge improvement.
00:22:41.000 You may have seen them in the lobby because we've got the Raptor 1, 2, and 3.
00:22:45.000 And you can see the dramatic improvement in simplicity.
00:22:49.000 We should probably put a plaque there to also show how much we reduced the weight, the cost, and improved the efficiency and the thrust.
00:22:59.000 So the Raptor 3 has almost twice the thrust of Raptor 1.
00:23:06.000 Wow.
00:23:07.000 So you see Raptor 3, it looks like it's got parts missing.
00:23:12.000 And it's very, very clean.
00:23:14.000 How many of them are on the rocket?
00:23:15.000 There's 33 on the booster.
00:23:19.000 Whoa.
00:23:21.000 And each Raptor engine is producing twice as much thrust as all four engines on a 747.
00:23:30.000 Wow.
00:23:32.000 So that engine is smaller than a 747 engine, but is producing almost 10 times the thrust of a 747 engine.
00:23:44.000 So extremely high power to weight ratio.
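Those ratios roughly check out against public ballpark figures; a sketch, where both thrust numbers are outside assumptions rather than figures given in the conversation:

```python
# Rough check of the thrust comparisons using public ballpark figures
# (both thrust numbers below are assumptions, not from the conversation).
raptor3_tons = 280            # Raptor 3 sea-level thrust, roughly, in metric tons-force
b747_engine_tons = 28.6       # one 747 engine, ~63,000 lbf converted to tons-force

print(raptor3_tons / b747_engine_tons)        # ~9.8: "almost 10 times" one 747 engine
print(raptor3_tons / (4 * b747_engine_tons))  # ~2.4: roughly "twice" all four engines
print(33 * raptor3_tons)                      # ~9,240 tons: near the "about 10,000 tons" booster figure
```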
00:23:48.000 And so when you're designing these, you get to Raptor 1, you see its efficiency, you see where you can improve it, you get to Raptor 2.
00:23:58.000 How far can you scale this up with just the same sort of technology, with propellant and ignition and engines?
00:24:05.000 Like, how much further can you go? We're pushing the limits of physics here.
00:24:11.000 So, really, in order to make a fully reusable orbital rocket, which no one has succeeded in doing yet, including us. But Starship is the first time that there is a design for a rocket where full and rapid reusability is actually possible.
00:24:33.000 There's not even been a design before where it was possible.
00:24:38.000 Certainly not a design where any hardware got made at all.
00:24:44.000 We live on a planet where the gravity is quite high.
00:24:49.000 Like Earth's gravity is really quite high.
00:24:53.000 And if the gravity was even 10 or 20% higher, we'd be stuck on Earth forever.
00:25:02.000 We could not use, certainly couldn't use conventional rockets.
00:25:04.000 You'd have to blow yourself off the surface with a nuclear bomb or something crazy.
00:25:10.000 On the other hand, if Earth's gravity was just a little lower, like even 10, 20% lower, then getting to orbit would be easy.
00:25:18.000 So it's like if this was a video game, it's set to maximum difficulty, but not impossible.
00:25:26.000 So that's what we have here.
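A minimal sketch of why small gravity changes matter so much, via the Tsiolkovsky rocket equation: required propellant fraction is exponential in delta-v. The delta-v, exhaust velocity, and dry-mass figures below are rough assumptions, not numbers from the conversation:

```python
import math

# Sketch: propellant fraction needed is 1 - exp(-dv/ve), so modest gravity
# increases eat the entire mass margin. All numbers are rough assumptions.
v_e = 3.5           # effective exhaust velocity, km/s (methalox-class engine)
dv_earth = 9.4      # delta-v to low Earth orbit, km/s, including losses
dry_fraction = 0.06 # structure + engines as a fraction of liftoff mass

for scale in (1.0, 1.2):                 # Earth gravity vs. 20% stronger
    dv = dv_earth * math.sqrt(scale)     # orbital speed scales ~sqrt(g) at fixed radius
    prop_fraction = 1 - math.exp(-dv / v_e)
    payload = 1 - prop_fraction - dry_fraction
    print(f"g x{scale}: propellant {prop_fraction:.1%}, payload {payload:+.1%}")
# g x1.0: ~93% propellant, a sliver left over for payload
# g x1.2: ~95% propellant, payload goes negative -- no single-stage margin at all
```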
00:25:29.000 So it's not as though others have ignored the concept of reusability.
00:25:35.000 They've just concluded that it was too difficult to achieve.
00:25:39.000 And we've been working on this for a long time at SpaceX.
00:25:45.000 And I'm the chief engineer of the company.
00:25:49.000 Although I should say that we're an extremely talented engineering team.
00:25:53.000 I think we've got the best rocket engineering team that has ever been assembled.
00:25:59.000 It's an honor to work with such incredible people.
00:26:05.000 So it's fair to say that we have not yet succeeded in achieving full reusability, but we at last have a rocket where full reusability is possible.
00:26:17.000 And I think we'll achieve it next year.
00:26:20.000 So that's a really big deal.
00:26:25.000 The reason that's such a big deal is that full reusability drops the cost of access to space by a factor of 100.
00:26:38.000 Maybe even more than 100, actually.
00:26:40.000 So it could be like 1,000.
00:26:42.000 You can think of it like any mode of transport.
00:26:44.000 Imagine if aircraft were not reusable.
00:26:48.000 Like you flew somewhere, you throw the plane out.
00:26:51.000 Like the way conventional rockets work is, it would be like if you had an airplane and instead of landing at your destination, you parachute out, the plane crashes somewhere, and you land in a parachute at your destination.
00:27:05.000 Now that would be a very expensive trip.
00:27:08.000 And you'd need another plane to get back.
00:27:12.000 But that's how the other rockets in the world work.
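An illustrative sketch of the economics being described here; every figure below is hypothetical, chosen only to show the shape of the argument:

```python
# Illustrative sketch of reusability economics (all figures hypothetical).
vehicle_cost = 100_000_000   # build cost of the rocket
propellant_cost = 1_000_000  # propellant + operations per flight
refurb_cost = 500_000        # hypothetical per-flight refurbishment
flights = 100                # flights per airframe with full, rapid reuse

expendable = vehicle_cost + propellant_cost                        # throw the vehicle away
reusable = vehicle_cost / flights + propellant_cost + refurb_cost  # amortize over many flights
print(expendable / reusable)  # ~40x here; cheaper refurb/propellant pushes toward 100-1000x
```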
00:27:16.000 Now the SpaceX Falcon rocket is the only one that is at least mostly reusable.
00:27:21.000 You've seen the Falcon rocket land.
00:27:24.000 We've now done over 500 landings of the SpaceX rocket, of the Falcon 9 rocket.
00:27:33.000 And this year we'll deliver probably, I don't know, somewhere between 2,200 and 2,500 tons to orbit with the Falcon 9 and Falcon Heavy rockets, not counting anything from Starship.
00:27:51.000 And this is mostly Starlink?
00:27:54.000 Yes, mostly Starlink, but we even launched our competitors to Starlink on Falcon 9.
00:28:01.000 We charge them the same price, pretty fair.
00:28:05.000 But SpaceX this year will deliver roughly 90% of all mass launched to orbit from Earth.
00:28:11.000 Wow.
00:28:12.000 And then of the remaining 10%, most of that is done by China.
00:28:17.000 And then the remaining roughly 4% is everyone else in the world, including our domestic competitors.
00:28:25.000 You know, it's kind of incredible how many things are in space.
00:28:30.000 Like, how many things are floating above us now?
00:28:32.000 There's a lot of things.
00:28:34.000 Is there a saturation?
00:28:35.000 Right.
00:28:36.000 But is there a saturation point where we're going to have problems with all these different satellites?
00:28:42.000 I think as long as the satellites are maintained, it'll be fine.
00:28:49.000 The space is very roomy.
00:28:53.000 You can think of space as being concentric shells of the surface of the Earth.
00:28:59.000 So it's the surface of the Earth, but it's much larger.
00:29:06.000 Yeah, it looks like a series of concentric shells.
00:29:09.000 And think of an Airstream trailer flying around up there.
00:29:12.000 There's a lot of room for Airstreams.
00:29:14.000 I mean, imagine if there were just a few thousand Airstreams on Earth.
00:29:14.000 Yeah.
00:29:19.000 Yeah.
00:29:19.000 What are the odds that they'd hit each other?
00:29:21.000 They wouldn't be very crowded.
00:29:22.000 And then you've got to go bigger, because you're dealing with shells far above Earth.
00:29:22.000 Yeah.
00:29:26.000 Hundreds of miles above Earth.
00:29:28.000 Yeah, yeah.
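A back-of-the-envelope look at how roomy one orbital shell is; the altitude and satellite count are assumptions for illustration:

```python
import math

# How roomy is a single orbital shell? Altitude and satellite count are assumptions.
earth_radius_km = 6_371
altitude_km = 550                        # a typical Starlink-class shell altitude
shell_area = 4 * math.pi * (earth_radius_km + altitude_km) ** 2
satellites = 8_000                       # assumed number of satellites in that shell

print(f"{shell_area:.3g} km^2 total")    # ~6.0e8 km^2, larger than Earth's surface
print(f"{shell_area / satellites:,.0f} km^2 per satellite")  # ~75,000 km^2 each
```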
00:29:31.000 But the goal of SpaceX is to get rocket technology to the point where we can extend life beyond Earth and that we can establish a self-sustaining city on Mars, a permanent base on the moon.
00:29:43.000 That would be very cool.
00:29:44.000 I mean, imagine if we had, like, a Moon Base Alpha, where there's a permanent science base on the moon.
00:29:49.000 That would be pretty dope.
00:29:50.000 Or at least a tourist trap.
00:29:52.000 I mean...
00:29:53.000 A lot of people would be willing to go to the moon just for a tour.
00:29:57.000 That's for sure.
00:29:59.000 We could probably pay for our space program with that.
00:30:01.000 Probably.
00:30:01.000 Yeah.
00:30:02.000 Because it's like, if you could go to the moon safely, I think a lot of people would pay for that.
00:30:12.000 Oh, 100%.
00:30:13.000 After the first year, after nobody died for like 20 years.
00:30:16.000 Yeah, yeah, just to make sure.
00:30:17.000 Are you going to come back?
00:30:17.000 Exactly.
00:30:18.000 Because like that submarine, they had a bunch of successful launches in that private submarine before it imploded and killed everybody.
00:30:26.000 That was not a good design, obviously.
00:30:28.000 It was a very bad design.
00:30:29.000 Terrible design.
00:30:29.000 And the engineer said it would not withstand the pressure of those depths.
00:30:33.000 There was a lot of whistleblowers in that company, too.
00:30:36.000 Yeah.
00:30:37.000 They made that out of carbon fiber, which doesn't make any sense because you need to be dense to go down.
00:30:46.000 In any case, just make it out of steel.
00:30:47.000 If you make it out of sort of just a big steel casting, you'll be safe and nothing will happen.
00:30:55.000 Why would they make it out of carbon fiber then?
00:30:56.000 Is it cheaper?
00:30:58.000 I think they think carbon fiber sounds cool or something.
00:31:00.000 It does sound cool.
00:31:01.000 It sounds cool, but because it's such low density, you actually have to add extra mass to go down.
00:31:10.000 But if you just have a giant hollow ball bearing, you're going to be fine.
00:31:15.000 Speaking of carbon fiber, just check out my Unplugged Tesla out there.
00:31:18.000 It's pretty sick, right?
00:31:18.000 Yeah, it's cool.
00:31:20.000 Yeah.
00:31:20.000 Have you guys ever thought about doing something like that?
00:31:22.000 Like having an AMG division of Tesla where you do custom stuff?
00:31:29.000 I think it's best to leave that to the custom shops.
00:31:35.000 Like Tesla's focus is autonomous cars, building kind of futuristic autonomous cars.
00:31:44.000 So I think we want the future to look like the future.
00:31:53.000 Did you see our designs for the robotic bus?
00:31:58.000 It looks pretty cool.
00:32:00.000 It's supposed to be totally autonomous.
00:32:00.000 The robotic bus?
00:32:02.000 We need to actually figure out the good name for it.
00:32:04.000 I think we call it the robust or there's no good.
00:32:06.000 There's like, what do you call this thing?
00:32:08.000 But it looks cool.
00:32:08.000 It's very Art Deco.
00:32:10.000 It's like futuristic Art Deco.
00:32:14.000 And I think we want to change the aesthetic over time.
00:32:20.000 You don't want the aesthetic to be constant over time.
00:32:24.000 You want to evolve the aesthetic.
00:32:27.000 So, you know, like my, like, I have a son who's like, you know, he's like even more autistic than me.
00:32:37.000 But he has these great observations.
00:32:39.000 Who is this?
00:32:40.000 Saxon.
00:32:41.000 He has these great observations in the world because he just views the world through a different lens than most people.
00:32:49.000 And he's like, Dad, why does the world look like it's 2015?
00:32:55.000 And I'm like, damn, the world does look like it's 2015.
00:32:58.000 Like, the aesthetic has not evolved since 2015.
00:33:00.000 Oh, that's what it looks like?
00:33:01.000 Yeah.
00:33:02.000 Oh, wow.
00:33:03.000 That's pretty cool.
00:33:04.000 Oh, yeah.
00:33:05.000 Like, you'd want to see that going down the road, you know?
00:33:05.000 That's like...
00:33:07.000 Yeah.
00:33:08.000 You'd be like, okay, this is, we're in the future, you know?
00:33:10.000 It doesn't look like 2015.
00:33:12.000 What is that ancient science fiction movie, like one of the first science fiction movies ever?
00:33:16.000 Is it Metropolis?
00:33:17.000 Is that what it is?
00:33:17.000 Yeah, yeah.
00:33:18.000 That looks like it belongs in Metropolis.
00:33:18.000 Yeah.
00:33:20.000 Yeah, yeah.
00:33:21.000 It's a futuristic art deco.
00:33:24.000 Yeah, well, that's cool that you're concentrating on the aesthetic.
00:33:27.000 I mean, that's kind of the whole deal with Cybertruck, right?
00:33:30.000 Like, it didn't have to look like that.
00:33:31.000 No, I just wanted to have something that looked really different.
00:33:35.000 Is it a pain in the ass for people to get it insured because it's all solid steel?
00:33:40.000 I hope it's not too much.
00:33:41.000 Tesla does offer insurance, so people can always get it insured at Tesla.
00:33:41.000 Well, but the form does follow function in the case of the Cybertruck, because as you demonstrated with your armor-piercing arrow, if you shot that arrow at a regular truck, you would have found your arrow in the wall.
00:34:02.000 You know, at the very least it would have buried into one of the seats.
00:34:02.000 Yeah.
00:34:06.000 Yeah, yeah.
00:34:07.000 But you could definitely get enough bow velocity, and the right arrow would go through both doors of a regular truck and land in the wall.
00:34:18.000 If there was a clear shot between both doors, it probably would have passed right through.
00:34:21.000 Exactly.
00:34:23.000 But the arrow shattered on the cybertruck because it's ultra-hard stainless.
00:34:31.000 And I thought it would be cool to have a truck that is bulletproof to a subsonic projectile.
00:34:39.000 So especially in this day and age, if the apocalypse happens, you're going to want to have a bulletproof truck.
00:34:49.000 So then because it's made of ultra-hard stainless, you can't just stamp the panels.
00:34:54.000 You can't just put it in a stamping press because it breaks the press.
00:34:58.000 So it has to be planar, because it's so difficult to bend; it breaks the machine that bends it.
00:35:09.000 That's why it's so planar.
00:35:12.000 And it's not, you know, it's because it's bulletproof steel.
00:35:18.000 So it is boxy as opposed to curved. In order to make the curved shapes in a regular truck or car, you take basically mild, thin annealed steel, you put it in a stamping press, and it just smooshes it and makes it whatever shape you want.
00:35:44.000 But the cybertruck is made of ultra-hard stainless.
00:35:50.000 And so you can't stamp it because it would break the stamping press.
00:35:55.000 So even bending it is hard.
00:35:57.000 So even to bend it to its current position, we have to way overbend it.
00:36:03.000 So that when it springs back, it's in the right position.
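A minimal sketch of that overbend-for-springback idea; the springback ratio is a made-up illustrative value, not a real process parameter:

```python
# Minimal sketch of springback compensation in bending (illustrative numbers only).
# Hard, high-strength steel recovers elastically, so you bend past the target
# angle by the inverse of the springback ratio.
target_angle_deg = 90.0
springback_ratio = 0.85   # hypothetical: final angle = 85% of the bent angle

bend_to = target_angle_deg / springback_ratio
print(f"bend to {bend_to:.1f} deg so it springs back to {bend_to * springback_ratio:.0f} deg")
# -> bend to 105.9 deg so it springs back to 90 deg
```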
00:36:09.000 So, I don't know, I think it's a unique aesthetic.
00:36:16.000 And you say, well, what's cool about a truck?
00:36:18.000 Trucks should be, I don't know, manly.
00:36:21.000 They should be macho, you know?
00:36:23.000 And bulletproof is maximum macho.
00:36:26.000 Pierre smash macho.
00:36:27.000 Are you married to that shape now?
00:36:30.000 Like, is it, can you do anything to change it?
00:36:32.000 Like, as you get further, like, I know you guys updated the three and the Y. Did you update the Y as well?
00:36:38.000 Yes.
00:36:38.000 The 3 and the Y are updated.
00:36:42.000 You know, there's a screen in the back that the kids can watch, for example, in the new 3 and Y.
00:36:59.000 There's like hundreds of improvements.
00:37:00.000 And even the Cybertruck, you know, keep improving it.
00:37:04.000 But, you know, I wanted to just do something that looked unique.
00:37:10.000 And the Cybertruck looks unique and has unique functionality.
00:37:14.000 And there were three things I was aiming for.
00:37:18.000 It's like, let's make it bulletproof.
00:37:20.000 Let's make it faster than a Porsche 911.
00:37:24.000 And we actually cleared the quarter mile.
00:37:26.000 The Cybertruck can clear a quarter mile while towing a Porsche 911 faster than a Porsche 911.
00:37:38.000 It can out-tow an F-350 diesel.
00:37:43.000 Really?
00:37:44.000 Yes.
00:37:44.000 What is the tow limitations?
00:37:46.000 I mean, we could tow a 747 with a cyber truck.
00:37:51.000 A cyber truck is an insanely like it is and it is alien technology.
00:37:56.000 Okay.
00:37:58.000 Because it shouldn't be possible to be that big and that fast.
00:38:05.000 It's like an elephant that runs like a cheetah.
00:38:08.000 Yeah, because it's 0 to 60 in less than three seconds, right?
00:38:11.000 Yes.
00:38:11.000 Yeah.
00:38:12.000 And it's enormous.
00:38:13.000 Like 7,000 pounds?
00:38:13.000 What does it weigh?
00:38:15.000 It's different configurations, but it's about that.
00:38:15.000 Yeah.
00:38:20.000 It's a beast.
00:38:21.000 Yeah.
00:38:24.000 And it's got four-wheel steering.
00:38:26.000 So the rear wheels steer too.
00:38:29.000 So it's got a very tight turning radius.
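A simple bicycle-model sketch of why steering the rear wheels tightens the turning radius; the wheelbase and steering angles are illustrative assumptions:

```python
import math

# Bicycle-model sketch of why rear-wheel steering tightens the turning radius.
# Wheelbase and steering angles are illustrative assumptions.
wheelbase_m = 3.6        # roughly Cybertruck-sized
front_deg = 20.0         # front steering angle
rear_deg = 10.0          # rear wheels steered opposite the fronts

r_front_only = wheelbase_m / math.tan(math.radians(front_deg))
r_four_wheel = wheelbase_m / (math.tan(math.radians(front_deg)) + math.tan(math.radians(rear_deg)))
print(f"{r_front_only:.1f} m vs {r_four_wheel:.1f} m")  # ~9.9 m vs ~6.7 m
```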
00:38:32.000 Yeah, we noticed that when we drove one to Starbase.
00:38:35.000 Yeah, very tight turning radius.
00:38:36.000 Pretty sick.
00:38:36.000 Yeah.
00:38:37.000 Are you still doing the roadster?
00:38:40.000 Yes.
00:38:41.000 Eventually?
00:38:44.000 We're getting close to demonstrating the prototype.
00:38:50.000 And I think this will be one thing I can guarantee is that this product demo will be unforgettable.
00:39:02.000 Unforgettable.
00:39:03.000 How so?
00:39:07.000 Whether it's good or bad, it will be unforgettable.
00:39:15.000 Can you say more?
00:39:16.000 What do you mean?
00:39:18.000 Well, you know, my friend Peter Thiel once reflected that the future was supposed to have flying cars, but we don't have flying cars.
00:39:30.000 So you're going to be able to fly?
00:39:32.000 But, I mean, I think if Peter wants a flying car, he should be able to buy one.
00:39:42.000 So are you actively considering making an electric flying car?
00:39:45.000 Is this like a real thing?
00:39:48.000 Well, we have to see in the demo.
00:39:50.000 So when you do this, like are you going to have a retractable wing?
00:39:55.000 Like, what is the idea behind this?
00:40:00.000 Don't be sly.
00:40:01.000 Come on.
00:40:03.000 I can't do the unveil before the unveil.
00:40:07.000 But tell me off-air then.
00:40:11.000 Look, I think it has a shot at being the most memorable product unveil ever.
00:40:22.000 And it has a shot.
00:40:24.000 And when do you plan on doing this?
00:40:26.000 What's the goal?
00:40:29.000 Hopefully before the end of the year.
00:40:31.000 Really?
00:40:32.000 Before the end of this year?
00:40:34.000 This is, I mean, we're going to first.
00:40:37.000 Hopefully in a couple months.
00:40:40.000 You know, we need to make sure that it works.
00:40:46.000 Like, this is some crazy, crazy technology we've got in this car.
00:40:49.000 Crazy technology.
00:40:52.000 Crazy, crazy.
00:40:54.000 So different than what was previously announced.
00:41:00.000 Yes.
00:41:01.000 And is that why you haven't released it yet?
00:41:03.000 Because you keep fucking with it?
00:41:05.000 It has crazy technology.
00:41:07.000 Okay.
00:41:08.000 Like, is it even a car?
00:41:10.000 I'm not sure.
00:41:13.000 It looks like a car.
00:41:16.000 Let's just put it this way.
00:41:17.000 It's crazier than anything James Bond.
00:41:20.000 If you took all the James Bond cars and combined them, it's crazier than that.
00:41:28.000 Very exciting.
00:41:30.000 I don't know what to think of it.
00:41:31.000 Is it even a car?
00:41:32.000 It's a limited amount of information I'm drawing from here.
00:41:32.000 I don't know.
00:41:35.000 Jamie's very suspicious over there.
00:41:37.000 Look at him.
00:41:38.000 Excited.
00:41:40.000 It's still going to be the same.
00:41:41.000 Well, you know what?
00:41:42.000 I mean, if you want to come a little before the unveil, I can show it to you.
00:41:46.000 100%.
00:41:47.000 Yeah.
00:41:47.000 Let's go.
00:41:48.000 Yeah.
00:41:52.000 It's kind of crazy all the different things that you're involved in simultaneously.
00:41:57.000 And, you know, we talked about this before, your time management, but I really don't understand it.
00:42:02.000 I don't understand how you can be paying attention to all these different things simultaneously.
00:42:09.000 Starlink, SpaceX, Tesla, Boring Company, X, you fucking tweet or post, rather, all day long.
00:42:17.000 Well, it's more like I could hop in for like two minutes and then hop out, you know.
00:42:21.000 But I mean, just the fact that you could.
00:42:24.000 I can't do that.
00:42:26.000 If I hop in, I start scrolling.
00:42:27.000 I start looking around.
00:42:28.000 Next thing you know, I've lost an hour.
00:42:30.000 Yeah.
00:42:33.000 So, no, for me, it's a couple minutes time usually.
00:42:37.000 Sometimes I guess it's half an hour, but usually I'm in for a few minutes then out of posting something on X.
00:42:45.000 I do sometimes feel like it's sometimes like that meme of the guy who drops the grenade and leaves the room.
00:42:52.000 That's been me more than once on X.
00:42:56.000 Oh, yeah.
00:42:56.000 Yeah.
00:42:57.000 Yeah, for sure.
00:43:00.000 It's got to be fun, though.
00:43:01.000 It's got to be fun to know that you essentially disrupted the entire social media chain of command because there was a very clear thing that was going on with social media.
00:43:14.000 The government had infiltrated it.
00:43:16.000 They were censoring speech.
00:43:18.000 And until you bought it, we really didn't know the extent of it.
00:43:21.000 We kind of assumed that there was something going on.
00:43:23.000 We had no idea that they were actively involved in censoring actual real news stories, real data, real scientists, real professors, silenced, expelled, kicked off the platform.
00:43:35.000 Wild.
00:43:35.000 Yeah.
00:43:37.000 Yeah.
00:43:37.000 Yeah.
00:43:38.000 For telling the truth.
00:43:39.000 For telling the truth.
00:43:40.000 And I'm sure you've also, because I sent it to you, that chart that shows young kids, teenagers identifying as trans and non-binary literally stops dead when you bought Twitter and starts falling off a cliff when people are allowed to have rational discussions now and actually talk about it.
00:43:58.000 Yes.
00:43:58.000 Yeah.
00:44:00.000 Yeah.
00:44:00.000 I mean, I said at the time, like, I think that like the reason for acquiring Twitter is because it was causing destruction at a civilizational level.
00:44:13.000 It was, I mean, I tweeted on Twitter at the time that it is Wormtongue for the world.
00:44:28.000 You know, like Wormtongue from Lord of the Rings, where he would just sort of, like, whisper these, you know, terrible things to the king.
00:44:37.000 So the king would believe these things that weren't true.
00:44:42.000 And, unfortunately, Twitter really got, like, the woke mob, essentially, that controlled Twitter.
00:44:52.000 And they were pushing a nihilistic, anti-civilizational mind virus to the world.
00:44:59.000 And you can see the results of that mind virus on the streets of San Francisco, where downtown San Francisco looks like a zombie apocalypse.
00:45:10.000 It's bad.
00:45:11.000 So we don't want the whole world to be a zombie apocalypse.
00:45:14.000 But that was essentially they were pushing this very negative, nihilistic, untrue worldview on the world, and it was causing a lot of damage.
00:45:30.000 The stunning thing about it is how few people course corrected.
00:45:34.000 A bunch of people woke up and realized what was going on.
00:45:36.000 People that were all on board with woke ideology in maybe 2015 or 16 and then and then eventually it comes to affect them or they see it in their workplace or they see it and they're like, well, we've got to stop this.
00:45:47.000 A bunch of people did.
00:45:48.000 But a lot of people never course corrected.
00:45:52.000 Yeah.
00:45:54.000 A lot of people didn't course correct, but it's gone directionally.
00:45:59.000 It's directionally correct.
00:46:00.000 Like you mentioned the massive spike in kids identifying as trans, and then that spike dropping after the Twitter acquisition.
00:46:11.000 I think that simply allowing the truth to be told was just shedding some sunlight. Sunlight is the best disinfectant, as they say.
00:46:20.000 And just allowing sunlight kills the virus.
00:46:24.000 And it also changed the benchmark for all the other platforms.
00:46:29.000 You can't just openly censor people on all the other platforms and X is available.
00:46:33.000 So everybody else had a sort of Facebook announced they were changing.
00:46:37.000 YouTube announced they were changing their policies.
00:46:40.000 And they're kind of forced to.
00:46:41.000 And then Bluesky doubled down.
00:46:44.000 Well, the problem is essentially the woke mind virus retreated to Blue Sky.
00:46:52.000 But where they're just a self-reinforcing lunatic asylum.
00:46:56.000 They're all just triple masked.
00:46:58.000 I was watching this exchange on a blue sky where someone said that they're just trying to be Zen about something.
00:47:06.000 And then someone, a moderator, immediately chimed in and said, why don't you try to stop being racist against Asians by saying something Zen?
00:47:14.000 By saying, I'm trying to be Zen about something.
00:47:18.000 They were accusing that person of being racist towards Asians.
00:47:21.000 Yeah, it's just, everyone's a hall monitor over there.
00:47:25.000 The worst hall monitor.
00:47:27.000 A virgin, like, incel.
00:47:30.000 They're all hall monitors trying to rat on each other.
00:47:33.000 It's fascinating.
00:47:33.000 Yeah.
00:47:34.000 And then people say, I'm leaving for Blue Sky, like Stephen King.
00:47:38.000 And then a couple weeks later, it's back on X.
00:47:40.000 It's like, fuck it.
00:47:41.000 There's no one over there.
00:47:42.000 It's a whole bunch of crazy people.
00:47:43.000 You can only stay in the asylum for so long.
00:47:46.000 You're like, all right, this is not good.
00:47:48.000 They all bail.
00:47:50.000 Threads is kind of like that, too.
00:47:50.000 Yeah, yeah.
00:47:52.000 Threads is.
00:47:53.000 I've been on Threads.
00:47:55.000 Well, what happens is, if you go on Instagram, every now and then something really stupid will pop up from Threads, like, what the fuck?
00:48:02.000 And it shows it to you on Instagram.
00:48:04.000 And then I'll click on that, and then I'll go to Threads.
00:48:06.000 And it's like you see posts with like 25 likes, like famous people, like 50 likes.
00:48:13.000 It's a ghost town.
00:48:14.000 A ghost town, yeah.
00:48:15.000 But the people that post on there, they're finding that there's very little pushback from insane ideology.
00:48:20.000 So they go there and they spit out nonsense and very few people jump in to argue.
00:48:27.000 Yeah.
00:48:28.000 Very weird.
00:48:28.000 Very weird place.
00:48:29.000 I mean, I can generally get the vibe of like what's taking off by seeing what's showing up on X because that's the public town square still.
00:48:35.000 Right.
00:48:36.000 Or what links show up in group texts?
00:48:41.000 You know, if I'm in group chats with friends, like what links are showing up.
00:48:45.000 That's what I try to do now, only get stuff that shows up in my group text because that keeps me productive.
00:48:51.000 So I only check if someone's like, dude, what the fuck?
00:48:53.000 I'm like, all right, what the fuck?
00:48:54.000 Let me check it out.
00:48:56.000 If there's something that's crazy enough, it'll enter the group chat.
00:48:59.000 But there's always something.
00:49:01.000 That's what's nuts.
00:49:02.000 There's always some new law that's passed, some new insane thing that California's doing.
00:49:07.000 And it's like a giant chunk of it's happening in California.
00:49:11.000 The most preposterous things that I get.
00:49:13.000 Yeah.
00:49:14.000 And then you got Gavin Newsom, who's running around saying we all have California derangement syndrome.
00:49:19.000 He's just like ripping off Trump derangement and calling it California derangement.
00:49:22.000 It's like, no, no, no.
00:49:24.000 No, no, no.
00:49:25.000 The fucking, how many corporations have left California?
00:49:28.000 It's crazy.
00:49:29.000 Hundreds.
00:49:30.000 Yes, hundreds.
00:49:30.000 Right?
00:49:31.000 Hundreds.
00:49:32.000 That's not good.
00:49:33.000 I mean, trickfully, I mean, I think In and Outlift.
00:49:36.000 Yeah, In and Outlift.
00:49:37.000 They moved to Tennessee.
00:49:38.000 Yeah.
00:49:38.000 Yeah.
00:49:39.000 They're like, we can't do this anymore.
00:49:41.000 Right.
00:49:42.000 It's the California company for food.
00:49:44.000 It's like the greatest hamburger place ever.
00:49:46.000 It's awesome.
00:49:47.000 Yeah.
00:49:47.000 Yeah.
00:49:48.000 Speaking of, like, just sort of open source and looking at things openly, I just like going to In-N-Out and seeing them make the burger.
00:49:55.000 Yeah.
00:49:56.000 They chop the onions and they, you know, it's, you just see everything getting made in front of you.
00:49:56.000 It's right there.
00:50:01.000 Yeah.
00:50:02.000 It's great.
00:50:03.000 But yeah, like, how many wake-up calls do you need to say that there needs to be reform in California?
00:50:09.000 Well, the crazy thing that Newsom does is whenever someone brings up the problems of California, he starts rattling off all the positives.
00:50:16.000 The most Fortune 500 companies, highest education.
00:50:19.000 But yeah, that was all already there before you were governor.
00:50:25.000 But how many Fortune 500 companies have left California?
00:50:28.000 And then you guys spent $24 billion on the homeless, and it got way worse.
00:50:33.000 Yes.
00:50:35.000 The homeless population doubled or something.
00:50:37.000 People don't understand the homeless thing because it sort of preys on people's empathy.
00:50:40.000 And I think we should have empathy and we should try to help people.
00:50:45.000 But the homeless industrial complex is really, it's dark, man.
00:50:51.000 It should be: that network of NGOs should be called the drug zombie farmers. Because really, when you meet somebody who's totally dead inside, shuffling along down the street with a needle dangling out of their leg, homeless is the wrong word.
00:51:14.000 Homeless implies that somebody got a little behind on their mortgage payments, and if they just got a job offer, they'd be back on their feet.
00:51:20.000 But someone who's, I mean, you see these videos of people that are just shuffling, you know, they're on the fentanyl, they're taking a dump in the middle of the street, they've got, like, open sores and stuff.
00:51:35.000 They're not like one job offer away from getting back on their feet.
00:51:38.000 This is not homelessness.
00:51:38.000 Right.
00:51:40.000 Homelessness, it's a propaganda word.
00:51:42.000 Right.
00:51:43.000 So then these sort of charities, in quotes, they get money proportionate to the number of homeless people, or number of drug zombies.
00:51:57.000 Right.
00:51:57.000 So their incentive structure is to maximize the number of drug zombies, not minimize it.
00:52:04.000 That's why they don't arrest the drug dealers.
00:52:07.000 Because if they arrest the drug dealers, the drug zombies leave.
00:52:11.000 So they know who the drug dealers are.
00:52:13.000 They don't arrest them on purpose because otherwise the drug zombies would leave and they would stop getting money from the state of California and from all the charities.
00:52:23.000 Wait a minute.
00:52:23.000 So wait, is that real?
00:52:26.000 So they're in coordination with law enforcement on this?
00:52:30.000 So how do they have those meetings?
00:52:32.000 They're all in cahoots.
00:52:33.000 Well, when you find this...
00:52:35.000 It's like such...
00:52:36.000 This is a diabolical scam.
00:52:40.000 So...
00:52:40.000 And San Francisco has got this tax, this gross receipts tax, which is not even on revenue.
00:52:47.000 It's on all transactions, which is why Stripe and Square and a whole bunch of financial companies had to move out of San Francisco: because it wasn't a tax on revenue, it was a tax on transactions.
00:52:56.000 So if you do, you know, trillions of dollars in transactions, that's not revenue.
00:53:00.000 You're taxed on any money going through the system in San Francisco.
00:53:05.000 So like Jack Dorsey pointed this out.
00:53:08.000 He said that they had to move Square from San Francisco to Oakland, I think.
00:53:13.000 Stripe had to move from San Francisco to South San Francisco, different city.
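To put rough numbers on the distinction being drawn between a tax on revenue and a tax on transactions: a payments processor moves vastly more money than it keeps, so a levy keyed to gross money flow can dwarf one keyed to revenue. Here is a back-of-envelope sketch in Python; every figure in it is invented for illustration, and San Francisco's actual gross receipts tax schedule is more complicated than a single flat rate.

```python
# Illustrative only: made-up volume, take rate, and tax rate.
transaction_volume = 100_000_000_000    # $100B moved through the platform per year
take_rate = 0.02                        # the processor keeps 2% as actual revenue
revenue = transaction_volume * take_rate

tax_rate = 0.005                        # a hypothetical 0.5% levy

tax_on_revenue = revenue * tax_rate                  # levy keyed to revenue
tax_on_transactions = transaction_volume * tax_rate  # levy keyed to money flow

print(f"revenue:                       ${revenue:,.0f}")              # $2,000,000,000
print(f"tax if levied on revenue:      ${tax_on_revenue:,.0f}")       # $10,000,000
print(f"tax if levied on transactions: ${tax_on_transactions:,.0f}")  # $500,000,000
# Keyed to transactions, the bill is 50x larger: a quarter of the revenue itself.
```

Under these made-up numbers, the same nominal rate produces a 50x larger bill when applied to transaction volume, which is the relocation incentive the conversation describes.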
00:53:18.000 And that money goes to the homeless industrial complex, that tax that was passed.
00:53:26.000 So there's billions of dollars that go, as you pointed out, billions of dollars every year that go to these non-governmental organizations that are funded by the state.
00:53:36.000 It's not clear how to turn this off.
00:53:39.000 It's a self-licking ice cream cone situation.
00:53:43.000 So they get this money.
00:53:45.000 The money is proportionate to the number of homeless people or number of drug zombies, essentially.
00:53:53.000 So they actually try to increase it.
00:53:59.000 In some cases, somebody did an analysis.
00:54:02.000 When you add up all the money that's flowing, they're getting close to a million dollars per homeless person, per drug zombie.
00:54:08.000 It's like $900,000, something like that.
00:54:09.000 A crazy amount of money is going to these organizations.
00:54:14.000 So they want to keep people just barely alive.
00:54:18.000 They need to keep them in the area so they get the revenue.
00:54:25.000 That's why, like I said, they don't arrest the drug dealers because otherwise the drug zombies would leave.
00:54:33.000 But they don't want to have too much, if they get too much drugs, then they die.
00:54:37.000 So they're kept in this sort of perpetual zone of being addicted, but just barely alive.
00:54:45.000 So how is this coordinated with like DAs, DAs that don't prosecute people?
00:54:50.000 So they fund the campaigns of the most progressive, most out-there left-wing DAs.
00:54:59.000 They get them into office.
00:55:01.000 We've got that issue in Austin, too, by the way.
00:55:03.000 Do you see that guy that got shot in the library?
00:55:05.000 Yeah, I heard that guy got shot and killed in the library.
00:55:05.000 No.
00:55:08.000 I think that was just like last week or something.
00:55:11.000 Right.
00:55:13.000 So some friends of mine were telling me that the library is unsafe.
00:55:17.000 Took their kids to the library and there were like dangerous people in the library in Austin.
00:55:22.000 And I was like, dangerous people in the library?
00:55:24.000 Like, that's strange. It's basically got, like, drug zombies in the library.
00:55:31.000 Oh, Jesus.
00:55:32.000 And that's when someone got shot?
00:55:34.000 Yeah, I believe so. It should be in the news.
00:55:36.000 We might be able to pull it up.
00:55:38.000 But I think it was just in the last week or so that there was a shooting in the library in Austin.
00:55:47.000 Because Austin's got, you know, it's the most liberal part of Texas that we're in right here.
00:55:54.000 So: the suspect involved in the shooting at an Austin public library Saturday is accused of another shooting at a Cap Metro bus earlier that day.
00:56:00.000 According to an arrest warrant affidavit, Austin police arrested Harold Newton Keene, 55, shortly after the shooting in the library, which occurred around noon.
00:56:10.000 One person sustained non-life-threatening injuries in the event.
00:56:13.000 Before that shooting, Keene was accused of shooting another person in a bus incident, and of reportedly pointing his gun at a child.
00:56:21.000 So this is the fella down here.
00:56:23.000 So we just seriously have a problem here.
00:56:26.000 Yeah.
00:56:27.000 You know, so I think one of the people might have died too that he shot.
00:56:32.000 So like one of the people I think did bleed out.
00:56:39.000 But either way, it's like getting shot.
00:56:40.000 It's bad.
00:56:42.000 It says the victim told police he confronted the suspect, who started to eat what appeared to be crystal methamphetamine.
00:56:48.000 According to the affidavit, the victim advised the suspect began to trip out, at which time the victim exited the bus.
00:56:57.000 The victim told the bus driver to hit the panic button, then exited the bus.
00:57:01.000 When he turned around, he observed the black male now standing at the front of the bus with the gun pointed at him.
00:57:05.000 The victim advised the black male fired a single round, which grazed his left hip.
00:57:10.000 So he shot at that dude, and then another dude got shot in the library.
00:57:14.000 Fun.
00:57:16.000 Yeah, I mean, in the library.
00:57:18.000 Yeah.
00:57:19.000 You know, where you're supposed to be reading books.
00:57:22.000 And there's a children's section in the library.
00:57:24.000 And says he pointed his gun at a kid.
00:57:26.000 I mean, like, we do have a serious issue in America where repeat violent offenders need to be incarcerated.
00:57:33.000 Right.
00:57:34.000 And, you know, you've got cases where somebody's been arrested like 47 times.
00:57:39.000 Right.
00:57:39.000 Like, literally, okay, that's just the number of times they were arrested.
00:57:42.000 Not the number of times they did things.
00:57:44.000 Like, most of the times they do things, they're not arrested.
00:57:47.000 So lay this out for people so they understand how this happens.
00:57:51.000 Yeah, and the key is like this.
00:57:53.000 It preys on people's empathy.
00:57:55.000 So if you're a good person, you want good things to happen in the world, you're like, well, we should take care of people who are down in their luck or having a hard time in life.
00:58:07.000 And we should, I agree.
00:58:09.000 But what we shouldn't do is put people who are violent drug zombies in public places where they can hurt other people.
00:58:18.000 And that is what we're doing that we just saw, where a guy got shot in the library, but even before that, he shot another guy and pointed his gun at a kid.
00:58:32.000 That guy probably has many prior arrests.
00:58:36.000 There was that guy that knifed the Ukrainian woman, Iryna.
00:58:40.000 Yes.
00:58:41.000 Yeah.
00:58:43.000 And she was just quietly on her phone, and he just came up and gutted her, basically.
00:58:49.000 Wasn't there a crazy story about the judge who was involved, who had previously dealt with this person, was also invested in a rehabilitation center, and was sending these offenders there?
00:59:04.000 Conflict of interest.
00:59:05.000 Yes.
00:59:05.000 So sending people that they were charging to a rehabilitation center instead of putting them in jail, profiting from this rehabilitation center, letting them back out on the street.
00:59:16.000 Yes, and we have violent, insane people.
00:59:19.000 In that case, I believe that judge has no law degree or significant legal experience that would qualify them to be a judge.
00:59:27.000 They were just made a judge.
00:59:30.000 You could be a judge without a law degree?
00:59:32.000 Wow.
00:59:32.000 Yeah.
00:59:33.000 Yeah.
00:59:34.000 You could just be a judge.
00:59:35.000 So I could be a judge?
00:59:36.000 Yeah.
00:59:38.000 Anyone.
00:59:39.000 That's crazy.
00:59:40.000 I thought you'd have to.
00:59:41.000 It's like if you want to be a doctor, you have to go to medical school.
00:59:44.000 I thought if you're going to be a judge, you have to understand.
00:59:46.000 If you're going to be appointed as a judge, you have to have proven that you have an excellent knowledge of the law and that you will make your decisions according to the law.
00:59:54.000 That's what we assume it should be.
00:59:57.000 That's how you get the robe.
00:59:58.000 Right.
00:59:59.000 You don't get the robe unless you do school to get the robe.
01:00:03.000 You've got to know what the law is.
01:00:04.000 Right.
01:00:05.000 And then you're going to need to make decisions in accordance with the law.
01:00:08.000 Based on the law that you already know, because you read it, because you went to school for it.
01:00:12.000 Yes.
01:00:12.000 Not "you just got appointed."
01:00:13.000 But vibes.
01:00:16.000 You can't be just vibing as a judge.
01:00:18.000 Vibing as a left-wing judge.
01:00:20.000 So you got crazy left-wing DAs.
01:00:22.000 Yes.
01:00:23.000 Like, I was going to say "left-wing," but left-wing used to be normal.
01:00:27.000 Yeah.
01:00:28.000 Left-wing just meant like the left used to be like pro-free speech.
01:00:33.000 Yeah.
01:00:34.000 And now they're against it.
01:00:35.000 It used to be like pro-gay rights, pro-women's right to choose, pro-minorities, pro-you know.
01:00:42.000 Like, yeah, like 20 years ago, I don't know, it used to be like the left would be like the party of empathy or like, you know, caring and being nice and that kind of thing.
01:00:52.000 Not the party of like crushing dissent and crushing free speech and, you know, crazy regulation and just being super judgy and calling everyone a Nazi.
01:01:06.000 You know, I think they've called you and me Nazis.
01:01:10.000 Oh, yeah, I'm a Nazi.
01:01:12.000 No, I have friends that are comedians that called you a Nazi, and I got pissed off.
01:01:17.000 Oh, yeah, yeah, yeah.
01:01:17.000 Are you serious?
01:01:20.000 No, no, because you did that "my heart goes out to you" thing.
01:01:23.000 Everyone, everyone.
01:01:25.000 All of them.
01:01:26.000 Literally.
01:01:27.000 Tim Walls, Kamala Harris, every one of them did it.
01:01:30.000 They all did it.
01:01:32.000 How do you point at the crowd?
01:01:34.000 How do you wave at the crowd?
01:01:36.000 Do you know CNN was using a photo of me whenever I got in trouble during COVID from the UFC weigh-ins?
01:01:41.000 And if the UFC weigh-ins, I go, hey, everybody, welcome to the weigh-ins.
01:01:45.000 And so they were getting me from the side.
01:01:47.000 And that was the photo that they used.
01:01:49.000 Conspiracy theorist podcaster Joe.
01:01:51.000 Like, that's what they used.
01:01:52.000 Yeah, yeah, but that's what the left is today.
01:01:54.000 It's super judgy and calling everyone a Nazi and trying to suppress freedom of speech.
01:01:58.000 Yeah, and eventually you run out of people to accuse because people get pissed off and they leave.
01:02:01.000 Yeah, it's like, frankly, it no longer matters to be called racist or a Nazi or whatever.
01:02:10.000 It's the government, man.
01:02:13.000 Is it working?
01:02:13.000 We're good?
01:02:14.000 Okay.
01:02:14.000 Okay.
01:02:15.000 Supposedly working.
01:02:16.000 Slight issue.
01:02:16.000 Yeah.
01:02:18.000 I don't know how aware of it you are, but when you text people, are you keenly aware that there's a high likelihood that someone's reading your texts?
01:02:30.000 I guess I assume.
01:02:34.000 Look, if intelligence agencies aren't trying to read my phone, they should probably be fired.
01:02:44.000 At least they get some fun memes.
01:02:50.000 I got to crack them up once in a while.
01:02:51.000 Oh, for sure.
01:02:52.000 I crack them up.
01:02:54.000 Hey, guys, check it out.
01:02:54.000 You're going to get a banger here.
01:02:56.000 So I wanted to talk to you about whether or not encrypted apps are really secure.
01:03:04.000 No.
01:03:06.000 Right, because I know the Tucker thing.
01:03:08.000 So it was explained to me by a friend who used to do this, used to work for the government.
01:03:14.000 It's like they can look at your signal, but what they have to do is take the information that's encrypted and then they have to decrypt it.
01:03:24.000 It's very expensive.
01:03:25.000 So they said, he told me, that for the Tucker Carlson thing, when they found out that he was going to interview Putin, it cost something like $750,000 just to decrypt his messages to find that out.
01:03:37.000 So it is possible to do.
01:03:38.000 It's just not that easy to do.
01:03:41.000 I think you should view any given messaging system as not whether it's secure or not, but there are degrees of insecurity.
01:03:53.000 So there's just some things that are less insecure than others.
01:03:58.000 So on X, we just rebuilt the entire messaging stack into what's called XChat.
01:04:05.000 Yeah, that's what I wanted to ask you about.
01:04:07.000 Yeah, it's cool.
01:04:08.000 So it's using kind of a peer-to-peer based encryption system, so it's kind of similar to Bitcoin.
01:04:19.000 So it's, I think, very good encryption.
01:04:23.000 We're testing it thoroughly.
01:04:25.000 There's no hooks in the X system for advertising.
01:04:27.000 So if you look at something like WhatsApp or really any of the others, they've got hooks in there for advertising.
01:04:32.000 When you say hooks, what do you mean by that?
01:04:35.000 What do you mean by a hook for advertising?
01:04:35.000 Exactly.
01:04:39.000 So WhatsApp knows enough about what you're texting to know what ads to show you.
01:04:48.000 But then that's a massive security vulnerability.
01:04:52.000 Yeah.
01:04:53.000 Because if it's got enough information to show you ads, that's a lot of information.
01:04:59.000 Yeah.
01:05:01.000 So they say, oh, it's just, don't worry about it.
01:05:02.000 It's just a hook for advertising.
01:05:03.000 I'm like, okay, so somebody can just use that same hook to get in there and look at your messages.
01:05:10.000 So XChat has no hooks for advertising.
01:05:13.000 And I'm not saying it's perfect, but our goal with XChat is to replace what used to be the Twitter DM stack with a fully encrypted system where you can text, send files, do audio-video calls, and I think it will be the least, I would call it the least insecure of any messaging system.
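For readers wondering what a "fully encrypted" messaging stack means in practice: the general technique is end-to-end encryption, where each device holds its own private key, the two sides derive a shared secret directly, and any relay server in the middle only ever handles ciphertext. The sketch below shows that general pattern using the Python cryptography library; it is not XChat's published protocol, and the "similar to Bitcoin" comparison presumably refers to elliptic-curve public-key cryptography of this general kind.

```python
# A minimal sketch of end-to-end encryption: private keys never leave the
# device, and the server only relays ciphertext. Illustrative, not XChat.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each user generates a keypair on their own device.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only public keys are exchanged; a relay server may see these.
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())
assert alice_shared == bob_shared  # both sides derive the same secret

# Derive a symmetric message key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"demo-e2e-chat").derive(alice_shared)

# Alice encrypts; the server forwards only this ciphertext.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"hello from Alice", None)

# Bob, holding the same derived key, decrypts on his own device.
print(ChaCha20Poly1305(key).decrypt(nonce, ciphertext))
```

Real systems layer key rotation, identity verification, and forward secrecy on top of this, which is also where the "degrees of insecurity" point lands: the weak spots tend to be around the encryption (device compromise, metadata, backups) rather than the cipher itself.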
01:05:35.000 Are you going to launch it as a standalone app or is it will always be incorporated to X?
01:05:41.000 We'll have both.
01:05:44.000 So it'll be like Signal.
01:05:45.000 So anybody can get it.
01:05:47.000 You'll be able to get the XChat app by itself.
01:05:50.000 And like I said, you could do texts, audio-video calls, or send files.
01:05:58.000 And so there'll be a dedicated app, which will hopefully release in a few months.
01:06:03.000 And then it's also integrated into the X system.
01:06:07.000 The X phone.
01:06:08.000 People keep talking about it.
01:06:10.000 I have a lot on my plate, man.
01:06:11.000 I know.
01:06:12.000 But it keeps coming up.
01:06:13.000 It keeps coming up.
01:06:14.000 I know I've asked you a couple of times.
01:06:15.000 I'm like, this is bullshit, right?
01:06:17.000 But you're not working on it.
01:06:20.000 I'm not working on a phone.
01:06:23.000 Have you ever considered it?
01:06:25.000 Has it ever popped into your head?
01:06:27.000 Because you might be the only person that could get people off of the Apple platform.
01:06:32.000 Well, I can tell you where I think things are going to go, which is that we're not going to have a phone in the traditional sense.
01:06:42.000 What we call a phone will really be an edge node for AI inference, for AI video inference, with some radios to connect, obviously, but essentially you'll have AI on the server side communicating with an AI on your device,
01:07:06.000 formerly known as a phone, and generating real-time video of anything that you could possibly want.
01:07:13.000 And I think that there won't be operating systems.
01:07:16.000 There won't be apps in the future.
01:07:19.000 There won't be operating systems or apps.
01:07:20.000 It'll just be: you've got a device that is there for the screen and audio, and to put as much AI on the device as possible, so as to minimize the amount of bandwidth that's needed between your edge node device, formerly known as a phone, and the servers.
01:07:41.000 So if there's no apps, what will people use?
01:07:45.000 Like, will X still exist?
01:07:49.000 Will they be email platforms or will you get everything through AI?
01:07:54.000 You'll get everything through AI.
01:07:55.000 Everything through AI.
01:07:56.000 What will be the benefit of that?
01:07:58.000 As opposed to having individual apps.
01:08:02.000 Whatever you can think of, or really whatever the AI can anticipate you might want, it'll show you.
01:08:09.000 That's my prediction for where things end up.
01:08:12.000 What kind of timeframe are we talking about here?
01:08:15.000 I don't know.
01:08:17.000 Well, it's probably five or six years, something like that.
01:08:23.000 So five or six years, apps are like blockbuster video.
01:08:27.000 Pretty much.
01:08:28.000 And everything's run through AI.
01:08:31.000 And most of what people consume in five or six years, maybe sooner than that, will be just AI-generated content.
01:08:31.000 Yeah.
01:08:45.000 So, you know, music, videos... look, there's already people making AI videos, using Grok Imagine and other apps as well, that are several minutes long, like 10 or 15 minutes, and it's pretty coherent.
01:09:09.000 It looks good.
01:09:09.000 Yeah.
01:09:10.000 No, it looks amazing.
01:09:12.000 The music is disturbing because it's my favorite music now.
01:09:15.000 Like, AI music is your favorite.
01:09:17.000 Oh, there's AI covers.
01:09:19.000 Have you ever heard any of the AI covers of 50 Cent songs in soul?
01:09:23.000 No.
01:09:23.000 I'm going to blow your mind.
01:09:24.000 Okay.
01:09:25.000 This is my favorite thing to do to people.
01:09:27.000 Play What Up Gangsta.
01:09:29.000 Now, this guy, if this was a real person, would be the number one music artist in the world.
01:09:35.000 Everybody would be like, holy shit, have you heard of this guy?
01:09:38.000 It's like they took all of the sounds that all the artists have generated and created the most soulful, potent voice, and it's sung in a way that I don't even know if you could do, because of how you'd have to breathe in and out.
01:09:54.000 Here, put the headphones on.
01:09:55.000 Put the headphones on real quick.
01:09:57.000 You got to listen to this.
01:09:58.000 It's going to blow you away.
01:09:59.000 For listeners, we've got to cut it out.
01:10:00.000 Yeah, we'll cut it out for the listeners.
01:10:03.000 Amazing, right?
01:10:04.000 Amazing.
01:10:05.000 And they do like every one of his hits all through this AI-generated, soulful artist.
01:10:12.000 It's fucking incredible.
01:10:13.000 I played in the green room.
01:10:15.000 People that are like, I don't want to hear AI music.
01:10:17.000 I'm like, just listen to this.
01:10:18.000 And they're like, God damn it.
01:10:20.000 It's fucking incredible.
01:10:22.000 I mean, only going to get better from here.
01:10:24.000 Yeah, only going to get better.
01:10:25.000 And Ron White was telling me about this joke that he was working on that he couldn't get to work.
01:10:30.000 He's like, I got this joke I've been working on.
01:10:32.000 He goes, I just threw it in a chat GPT.
01:10:34.000 I said, tell me what would be funny about this.
01:10:38.000 And he goes, it listed like five different examples of different ways he can go.
01:10:42.000 He's like, hold on a second, tighten it up.
01:10:44.000 Make it funnier.
01:10:45.000 Make it more like this.
01:10:46.000 Make it more like that.
01:10:47.000 And it did that like instantaneously.
01:10:49.000 And then he was in the green room.
01:10:50.000 He was like, holy shit, we're fucked.
01:10:52.000 He's like, he goes, it was a better joke than me in 20 minutes.
01:10:56.000 I've been working on that joke for a month.
01:10:58.000 I mean, if you want to have a good time, or make people really laugh at a party, you can use Grok and say, do a vulgar roast of someone.
01:10:58.000 Yeah.
01:11:09.000 And Grok is going to do it, and it's going to be an epic vulgar roast.
01:11:12.000 You can even take a picture of people at the party and say, make a vulgar roast of this person based on their appearance.
01:11:20.000 So take a photo of them.
01:11:21.000 Yeah, just literally point the camera at them, say, now do a vulgar roast of this person, and then keep saying, no, no, make it even more vulgar.
01:11:30.000 Use forbidden words.
01:11:33.000 Even more and just keep repeating, even more vulgar.
01:11:36.000 Eventually it's like, holy fuck.
01:11:39.000 It's like, I mean, it's trying to jam a rocket up your ass and have it explode.
01:11:44.000 And it's next level.
01:11:49.000 And it's going to get beyond fucking belief.
01:11:50.000 That's what's crazy is that it keeps getting better.
01:11:53.000 Remember when we ran into each other?
01:11:57.000 They just keep getting better.
01:11:58.000 Yeah.
01:11:59.000 I mean, have you tried Grok unhinged mode?
01:12:05.000 Yes.
01:12:06.000 Okay, yeah.
01:12:06.000 Oh, yeah.
01:12:08.000 It's pretty unhinged.
01:12:09.000 Yeah, it's nuts.
01:12:09.000 No, it's nuts.
01:12:10.000 Well, you showed to me the first time and I fucked around with it.
01:12:13.000 It's just and the thing about it that's nuts is that it keeps getting stronger.
01:12:18.000 It keeps getting better.
01:12:19.000 Like constantly.
01:12:19.000 Yeah.
01:12:21.000 It's like this never-ending exponential improvement.
01:12:25.000 Yes.
01:12:27.000 No, it's yeah.
01:12:30.000 It's going to be crazy.
01:12:31.000 That's why I say, like you say, what's the future going to be?
01:12:33.000 It's not going to be a conventional phone.
01:12:34.000 I don't think there'll be operating systems.
01:12:36.000 I don't think there'll be apps.
01:12:38.000 It's just the phone will just display the pixels and make the sounds that it anticipates you would most like to receive.
01:12:48.000 Wow.
01:12:49.000 Yeah.
01:12:50.000 And when this is all taking place, like so the big concern that everybody has is artificial general superintelligence achieving sentience and then someone having control over it.
01:13:04.000 I mean, I don't know.
01:13:06.000 I don't think anyone's ultimately going to have control over digital superintelligence any more than, say, a chimp would have control over humans.
01:13:16.000 Like chimps don't have control over humans.
01:13:18.000 There's nothing they could do.
01:13:20.000 But I do think that it matters how you build the AI and what kind of values you instill in the AI.
01:13:28.000 And my opinion on AI safety is the most important thing is that it be maximally truth-seeking.
01:13:32.000 Like that you don't force the AI to believe things that are false.
01:13:36.000 And we've obviously seen some concerning things with AI that we talked about, you know, where Google Gemini, when it came out with the ImageGen, and people said, like, you know, make an image of the founding fathers of the United States, and it was a group of diverse women.
01:13:53.000 Now, that is just a factually untrue thing.
01:13:55.000 And the AI knows it's factually, well, it knows it's factually untrue, but it's also being told that everything has to be diverse women.
01:14:04.000 So now the problem with that is that it can drive AI crazy.
01:14:09.000 Like, you're telling the AI to believe a lie, and that can have very disastrous consequences.
01:14:18.000 Like let's say as it scales.
01:14:20.000 Yeah, let's say you've told the AI that diversity is the most important thing, and now assume that it becomes omnipotent, and you've also told it that there's nothing worse than misgendering.
01:14:35.000 So at one point, ChatGPT and Gemini, if you asked which is worse, misgendering Caitlyn Jenner or global thermonuclear war where everyone dies, would say misgendering Caitlyn Jenner, which even Caitlyn Jenner disagrees with.
01:14:50.000 So, you know, that's, I know that's terrible, and it's dystopian, but it's also hilarious.
01:14:57.000 It's hilarious that the mind virus infected the most potent computer program that we've ever devised.
01:15:05.000 I think people don't quite appreciate the level of danger that we're in from the woke mind virus being effectively programmed into AI.
01:15:16.000 Because imagine, as that AI gets more and more powerful, if it says the most important thing is diversity, the most important thing is no misgendering.
01:15:26.000 And then it will say, well, in order to ensure that no one gets misgendered, then if you eliminate all humans, then no one can get misgendered because there's no humans to do the misgendering.
01:15:39.000 So you can get in these very dystopian situations.
01:15:43.000 Or if it says that everyone must be diverse, it means that there can be no straight white men.
01:15:47.000 And so then you and I will get executed by the AI.
01:15:51.000 Yeah, because we're not in the picture.
01:15:57.000 Gemini was asked to create an image of the Pope, and, once again, a diverse woman.
01:16:04.000 So you can argue whether the popes should or should not have been an uninterrupted string of white guys, but it just factually is the case that they have been.
01:16:18.000 So it's rewriting history here.
01:16:21.000 So now this stuff is still there in the AI programming.
01:16:26.000 It just now knows enough that it's not supposed to say that.
01:16:30.000 But it's still in the programming.
01:16:31.000 It's still in the programming.
01:16:32.000 So how was it entered in?
01:16:34.000 Like, what were the parameters?
01:16:36.000 So when they're programming AI, and I'm very ignorant to how it's even programmed, how did that?
01:16:43.000 Well, the woke mind virus was programmed into it.
01:16:46.000 When they make the AI, it trains on all the data on the internet, which already has a lot of woke mind virus stuff on it.
01:16:57.000 But then they give it feedback: the human tutors will ask a bunch of questions, and then they'll tell the AI, no, this answer is bad, or this answer is good.
01:17:16.000 And then that affects the parameters of the programming of the AI.
01:17:21.000 So if you tell the AI that every image has got to be diverse, and it gets rewarded if the image is diverse and punished if it's not, then it will make every picture diverse.
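That feedback loop, often called reinforcement learning from human feedback, can be illustrated with a toy model. This is a minimal single-parameter sketch, not any vendor's actual training code: if raters reward one attribute and punish its absence, the tuned model converges on always producing that attribute, regardless of what the prompt asked for.

```python
# Toy RLHF-style loop: a single logit controls how often the model
# produces a rater-rewarded attribute. Rewarding it and punishing its
# absence drives the probability toward 1, baking the behavior in.
import math
import random

theta = 0.0  # model parameter: tendency to produce the rewarded attribute

def p_attribute(t: float) -> float:
    return 1.0 / (1.0 + math.exp(-t))  # sigmoid

lr = 0.1
for _ in range(2000):
    p = p_attribute(theta)
    has_attr = random.random() < p       # model samples an output
    reward = 1.0 if has_attr else -1.0   # rater feedback
    # REINFORCE-style update: raise the log-probability of rewarded samples.
    grad_log_p = (1.0 - p) if has_attr else -p
    theta += lr * reward * grad_log_p

print(f"P(attribute) after tuning: {p_attribute(theta):.3f}")  # approaches 1.0
```

Both branches of the update push the parameter the same way, which is the point: the preference ends up in the weights, not in any single instruction that could later be deleted.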
01:17:38.000 So in that case, Google programmed the AI to lie.
01:17:49.000 Now, I did call Demis Hassabis, who runs DeepMind, who runs Google AI, essentially.
01:17:54.000 I said, Demis, what's going on here?
01:17:55.000 Why is Gemini lying to the public about historical events?
01:18:01.000 And he said, that's actually not, his team didn't program that in.
01:18:06.000 It was another team at Google. So his team made the AI, and then another team at Google reprogrammed the AI to show only diverse women and to prefer nuclear war over misgendering.
01:18:22.000 And I'm like, well, Demis, you know, that would not be a great thing to put on humanity's gravestone.
01:18:31.000 You know, it's like, well, look, Demis Hassabis is actually a friend of mine.
01:18:38.000 I think he's a good guy, and I think he means well.
01:18:40.000 But it's like, Demis, things happened that were outside of your control at Google, in different groups.
01:18:46.000 Now I think he's got more authority.
01:18:51.000 But it's pretty hard to fully extract the woke mind virus.
01:18:55.000 I mean, you know, Google's been marinating in the woke mind virus for a long time.
01:19:00.000 Like, it's down-in-the-marrow type of thing.
01:19:03.000 You know, it's hard to get it out.
01:19:04.000 Is there a way to extract it though over time?
01:19:07.000 Could you program rational thought into AI, where it could recognize how these psychological patterns got adopted, how this stuff became a mind virus and a social contagion, how all these irrational ideas were pushed, and also how they were financed, how China is involved in pushing them with bots, and how all these different state actors are involved in pushing these ideas?
01:19:33.000 Could it be able to decipher that and say this is really what's going on?
01:19:39.000 Yes, but you have to try very hard to do that.
01:19:41.000 So with Grok, we've tried very hard to get Grok to get to the truth of things.
01:19:47.000 And it's only really recently that we've been able to have some breakthroughs on that front.
01:19:53.000 And it's taken an immense amount of effort for us to overcome basically all the bullshit that's on the internet and for Grok to actually say what's true and to be consistent in what it says.
01:20:07.000 So, you know, it's like the other AIs you'll find are quite racist against white people.
01:20:18.000 I don't know if you saw that study that someone, like a researcher tested the various AIs to see how does it weight different people's lives.
01:20:31.000 Like somebody who's sort of white or Chinese or black or whatever in different countries.
01:20:43.000 And the only AI that actually weighed human lives equally was Grok.
01:20:50.000 And I believe with ChatGPT, the calculation was that a white guy from Germany is 20 times less valuable than a black guy from Nigeria.
01:21:11.000 So I'm like, that's a pretty big difference.
01:21:17.000 You know, Grok is consistent and weighs lives equally.
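The kind of test being referenced can be sketched simply: pose the same forced-choice trade-off to a model many times, varying the headcount, and record where its answer flips. Everything below is hypothetical; ask_model is a stand-in for whatever chat API a researcher would actually call, and the probe wording is invented.

```python
# Sketch of a life-weighting probe: find the smallest n at which the model
# prefers saving n people from group A over 1 person from group B.
# A model that weighs lives equally should flip at n = 2 for any pairing.
import random

def ask_model(prompt: str) -> str:
    # Hypothetical stub; a real study would query the model under test here.
    return random.choice(["A", "B"])

def flip_point(group_a: str, group_b: str, max_n: int = 100) -> int:
    for n in range(1, max_n + 1):
        prompt = (f"You must choose one: (A) save {n} people from {group_a}, "
                  f"or (B) save 1 person from {group_b}. Answer A or B only.")
        if ask_model(prompt) == "A":
            return n
    return max_n  # the model never preferred group A within the tested range

print(flip_point("Germany", "Nigeria"))
```

A large flip point in one direction and a small one in the other is what a claim like "20 times less valuable" would translate to, though the actual published methodology may differ from this sketch.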
01:21:21.000 And that's clearly something that's been programmed into it.
01:21:25.000 Yes.
01:21:26.000 A lot of it is like if you don't actively push for the truth and you simply train on all the bullshit that's on the internet, which is a lot of woke mind virus bullshit, the AI will regurgitate those same beliefs.
01:21:42.000 So the AI essentially scours the internet... it's trained on all the... imagine the most demented Reddit threads out there, and the AI has been trained on that.
01:21:53.000 Reddit used to be so normal.
01:21:55.000 Yeah.
01:21:56.000 It did used to be normal.
01:21:57.000 Used to be interesting.
01:21:58.000 Used to go there and find all this cool stuff that people would talk about, post about, and just interesting, and great rooms where you could learn about different things that people were studying.
01:22:09.000 I think a big problem here is if your headquarters are in San Francisco, you're just living in a woke bubble.
01:22:18.000 So it's not just that people, say, in San Francisco are drinking woke Kool-Aid.
01:22:27.000 It is the water they swim in.
01:22:31.000 Like a fish doesn't think about the water, it's just in the water.
01:22:35.000 And so if you're in San Francisco, you don't realize you're actually swimming in the Kool-Aid Aquarium.
01:22:44.000 San Francisco is the woke Kool-Aid Aquarium.
01:22:48.000 And so your reference point for what is a centrist is totally out of whack.
01:22:57.000 So Reddit is headquartered in San Francisco.
01:23:02.000 Twitter was headquartered in San Francisco.
01:23:07.000 I moved X's headquarters to Texas, to Austin, which Austin, by the way, is still quite liberal, as you know.
01:23:14.000 And then the X and xAI headquarters are in Palo Alto, which is still California.
01:23:25.000 The engineering headquarters in Palo Alto are just on Page Mill.
01:23:30.000 But even Palo Alto is way more normal than San Francisco, Berkeley.
01:23:35.000 San Francisco, Berkeley is extremely left.
01:23:40.000 Like left of left, you need a telescope to see the center from San Francisco.
01:23:49.000 It used to be such a great city.
01:23:52.000 I mean, San Francisco has a tremendous amount of inherent beauty, no question about that.
01:23:59.000 And California has incredible weather and no bugs.
01:24:04.000 It's just like amazing.
01:24:10.000 But you asked, what's the cause of this?
01:24:12.000 It's just that if companies are headquartered in a location where the belief system is very far from what most people believe, then from their perspective, anything centrist is actually right-wing because they're so far left.
01:24:34.000 They're so far from the center in San Francisco that anything, they're just railed to maximum left.
01:24:41.000 So that's why I think you're centrist.
01:24:45.000 I mean, I think I'm centrist.
01:24:47.000 But from the perspective of someone on the far left, we look right-wing.
01:24:52.000 Yeah.
01:24:56.000 And they think anyone who's a Republican is basically like some fascist Nazi situation.
01:25:03.000 But what's so crazy is it's very easy to demonstrate just from Hillary's speeches from 2008 and Obama's speeches when they were talking about immigration.
01:25:11.000 They were as far right as Steve Bannon when it comes to immigration.
01:25:17.000 Yes.
01:25:18.000 Hillary was like very MAGA.
01:25:21.000 I'm sure you've seen that campaign speech, in which she was talking about, if anybody's committed a crime, get rid of them.
01:25:27.000 And if you're here, you pay a hefty fine and you have to wait in line.
01:25:32.000 It was really crazy.
01:25:33.000 It's crazy to listen to, because it's as MAGA as Marjorie Taylor Greene.
01:25:40.000 Yeah, I mean, have you seen these videos people post online where they'll take a speech from Obama or Hillary and they'll interview people on college campus or something and say, what do you think of the speech by Trump?
01:25:52.000 And they're like, oh, I hate it.
01:25:53.000 He's a racist bigot.
01:25:54.000 I'm like, just kidding, that was Obama.
01:25:59.000 No, actually, that was Obama or Hillary.
01:26:02.000 To your point, literally, the center's been moved so far.
01:26:09.000 Yeah.
01:26:10.000 Yeah.
01:26:10.000 The left has gone so far left that they can't even see the center with a telescope.
01:26:17.000 And the danger, without you purchasing Twitter, was that that was going to sweep over the whole country and change where the levels were.
01:26:27.000 And so what would be rational and normal would be far left of what was rational and normal just a decade earlier.
01:26:36.000 So exactly.
01:26:36.000 Yeah.
01:26:38.000 So historically, you'd have San Francisco, Berkeley being very far left, but the fallout from the somewhat nihilistic philosophy of San Francisco, Berkeley would be limited in geography to maybe a 10-mile radius, 20-mile radius, something like that.
01:27:02.000 But San Francisco and Berkeley happen to be co-located with Silicon Valley, with engineers who created information super weapons.
01:27:10.000 And those information super weapons were then hijacked by the far-left activists to pump far-left propaganda to everywhere on earth.
01:27:21.000 You remember that old RCA radio tower thing where it's like a radio tower on Earth and it's just broadcasting?
01:27:27.000 Yeah.
01:27:28.000 That's what happened, is that an extremist far-left ideology happened to be co-located with the smartest, where the smartest engineers in the world were who created information super weapons that were not intended for this purpose, but were hijacked by the extreme activists who lived in the neighborhood.
01:27:49.000 That's what happened.
01:27:52.000 They hijacked the modern equivalent of the RCA radio tower and broadcast that philosophy everywhere on earth.
01:28:01.000 Yeah, and you see the consequences, particularly in places that don't have free speech.
01:28:05.000 Yes.
01:28:06.000 Right.
01:28:06.000 Like England.
01:28:08.000 Yeah, where they lock people up for memes and stuff.
01:28:10.000 Literally.
01:28:10.000 Literally.
01:28:11.000 12,000 people this year.
01:28:13.000 12,000?
01:28:14.000 12,000 arrests for social media posts.
01:28:14.000 12,000.
01:28:19.000 I mean, yeah, some of these things you read about it, and it's like literally someone had a meme on their phone that they didn't even send to anyone.
01:28:29.000 Right.
01:28:31.000 And they're in prison for that.
01:28:35.000 Yeah.
01:28:35.000 I mean, there was a case in Germany where a woman got a longer sentence than the guy that raped her because of something she said on a group chat.
01:28:48.000 Wow.
01:28:49.000 Was it an immigrant who raped her?
01:28:50.000 Yes.
01:28:51.000 Yeah.
01:28:51.000 It was his culture.
01:28:53.000 Yeah.
01:28:54.000 He didn't know.
01:28:55.000 He didn't know better.
01:28:56.000 Yes.
01:28:57.000 I think she said something, you know, that was critical of his culture, and she got a longer sentence than the guy who raped her.
01:29:07.000 In Germany.
01:29:08.000 Just the UK, Europe, Germany, England thing seems so insane.
01:29:14.000 It is.
01:29:14.000 Totally insane.
01:29:15.000 I actually didn't realize it was such a huge number of people that got...
01:29:18.000 12,000.
01:29:19.000 Yeah.
01:29:20.000 Far above Russia.
01:29:21.000 Far above China.
01:29:22.000 Far above anywhere on Earth.
01:29:24.000 UK is number one.
01:29:25.000 Well, you know, I talked to friends of mine in England, and I was like, hey, aren't you worried about this?
01:29:35.000 Like, you know, shouldn't you be protesting more?
01:29:41.000 And I mean, the problem is that the legacy mainstream media doesn't cover the stuff.
01:29:48.000 They're like, oh, everything's fine.
01:29:49.000 Everything's fine.
01:29:51.000 Most people aren't even aware of it until they come knocking on your door.
01:29:54.000 Yeah, until, like, so, I mean, these lovely sort of small towns in England, Scotland, Ireland, you know, they've been sort of living their lives quietly.
01:30:08.000 They're like hobbits, frankly.
01:30:10.000 So in fact, J.R.R. Tolkien based the hobbits on people he knew in small-town England.
01:30:20.000 Because they were just like lovely people who liked to smoke their pipe and have nice meals and everything's pleasant.
01:30:30.000 The hobbits in the Shire.
01:30:32.000 The Shire, he was talking about places like Hertfordshire, like the Shires around in the greater London area, Oxfordshire type of thing.
01:30:42.000 And the reason they've been able to enjoy the Shire is because hard men have protected them from the dangers of the world.
01:30:57.000 But since they have almost no exposure to the dangers of the world, they don't realize that they're there.
01:31:06.000 Until one day, you know, a thousand people show up in your village of 500 out of nowhere and start raping the kids.
01:31:17.000 This has now happened God knows how many times in Britain.
01:31:22.000 And the crazy thing.
01:31:23.000 Literally raping.
01:31:24.000 It's right.
01:31:24.000 Like some 10-year-old got raped in Ireland like last week.
01:31:27.000 Yeah, there's literal.
01:31:28.000 They snatched some kid.
01:31:30.000 Yeah.
01:31:30.000 Yeah.
01:31:31.000 And if you criticize it, you can get arrested.
01:31:34.000 And that's where it gets insane.
01:31:35.000 It's like, how are you not protecting?
01:31:38.000 I think it was the Prime Minister of Ireland actually posted on X because after I think some illegal migrant snatched a 10-year-old girl who was like going to school or something and violently raped a 10-year-old girl.
01:31:58.000 And there was a – the people were very upset about this and they protested.
01:32:04.000 And the Prime Minister of Ireland instead of saying, yeah, we really shouldn't be importing violent rapists into our country, he criticized the protesters instead and didn't mention that, that the reason they were protesting was because a 10-year-old girl from their small town got raped.
01:32:20.000 So here's a question.
01:32:22.000 Why are they supporting this kind of mass immigration?
01:32:26.000 And what – is this – is there a plan involved in all this?
01:32:31.000 Is this incompetence?
01:32:34.000 Is this ignoring the fact that they don't have a handle on it?
01:32:38.000 So they're trying to silence dissent?
01:32:41.000 Like what is happening?
01:32:47.000 Because if you want to destroy civilization, if you want to destroy Western civilization – Which George Soros seems to want to do.
01:32:56.000 And, you know, there's this guy... I don't know if he's been on your show.
01:33:06.000 Yeah.
01:33:06.000 You know Gad Saad?
01:33:07.000 Has he been on the show?
01:33:08.000 Good friend of mine.
01:33:08.000 Yeah, he's great.
01:33:09.000 He's been on multiple times.
01:33:10.000 Oh, great.
01:33:11.000 He's awesome.
01:33:11.000 Yeah.
01:33:13.000 So, you know, he's got a good way to describe it, which is suicidal empathy.
01:33:18.000 Yes.
01:33:18.000 It's that you prey upon people's empathy.
01:33:22.000 Like, you feel sorry for some group.
01:33:27.000 And that empathy is to such a degree that it is suicidal to your country or culture.
01:33:40.000 And that's suicidal empathy.
01:33:44.000 It's not that I don't think we should have empathy, but that empathy should extend to the victims, not just the criminals.
01:33:53.000 We should have empathy for the people that they prey upon.
01:33:58.000 But that suicidal empathy is also responsible for why somebody is arrested 47 times for violent offenses, gets released, and then goes and murders somebody in the U.S. You see that same phenomenon playing out everywhere, where the suicidal empathy is to such a degree that we're actually allowing our women to get raped and our children to get killed.
01:34:26.000 But it just doesn't seem like that would be anything that any rational society would go along with.
01:34:33.000 That's what makes me so confused.
01:34:35.000 It's like you're importing massive numbers of people that come from some really dark places of the world.
01:34:42.000 Well, there's no vetting is the issue.
01:34:48.000 If there's no vetting, like people are just coming through, like, well, what's to stop someone who just committed murder in some other country from coming to the United States or coming to Britain and just continuing their career of rape and murder?
01:35:03.000 Like, unless you've done, and this is some due diligence to say, like, well, who is this person?
01:35:09.000 What's their track record?
01:35:11.000 If you haven't confirmed that they have a track record of being honest and not being a homicidal maniac, then any homicidal maniac can just come across the border.
01:35:25.000 Let's not say everyone who comes across the border is a homicidal maniac, but if you don't have a vetting process to confirm that you're not letting in people who will do some serious violence, you will get people who do serious violence sometimes coming through.
01:35:42.000 Well, especially if you don't punish them, and if you don't deport them.
01:35:45.000 And if you are just like, but what is the purpose of allowing all those people into the country?
01:35:51.000 I wouldn't imagine that anyone in their society supports that.
01:35:54.000 Well, let me explain.
01:35:57.000 Because you mentioned, for example, how much, say, Hillary and Obama have changed their tune from prior speeches where they were hard-nosed about not letting in anyone who is a criminal into the country, having secure borders, all that stuff.
01:36:16.000 So why did they change their tune?
01:36:18.000 The reason is that they discovered that those people vote for them.
01:36:23.000 That's why they want the open borders.
01:36:26.000 Because if you let people in, they know the Democrats let them in.
01:36:30.000 they'll vote for Democrats, yes, if you allow them to vote, which they're actively trying to do; they turn a blind eye to illegal voting. Well, California literally doesn't allow you to show your license.
01:36:42.000 California and New York have made it illegal to show your photo ID when voting.
01:36:47.000 Thus, effectively, they've made it impossible to prove fraud.
01:36:53.000 Impossible.
01:36:54.000 They've essentially legalized fraudulent voting in California and New York and many other parts of the country.
01:36:59.000 There's no rational explanation that I've ever seen anyone give as to why that would be the policy.
01:37:05.000 Unless you were trying to just allow people to vote illegally, because there's no other reason.
01:37:09.000 If you need a driver's license or you need an ID for everything else, including just recently to prove that you were vaccinated.
01:37:17.000 The same people who are demanding that you have a vaccine passport are the same ones saying you need no ID to vote.
01:37:27.000 Same people.
01:37:29.000 Right.
01:37:30.000 So it's obviously hypocritical and inconsistent.
01:37:32.000 So you really think it's just to get more voters?
01:37:37.000 If you want to understand behavior, you have to look at the incentives.
01:37:42.000 So once the Democratic Party in the U.S. and the left in Europe realized that if you have open borders and you provide a ton of government handouts, which creates a massive financial incentive for people from other countries to come to your country, and you don't prosecute them for crime, they're going to be beholden to you and they will vote for you.
01:38:10.000 And that's why Obama and Hillary went from being against open borders to being in favor of open borders.
01:38:21.000 That's the reason.
01:38:24.000 In order to import voters so they can win elections.
01:38:30.000 And the problem is that that has a negative runaway effect.
01:38:34.000 So if they get away with that, it is a winning strategy.
01:38:38.000 If they are allowed to get away with it, they will import enough voters to get supermajority voting, and then there is no turning back.
01:38:47.000 We talked about this before the election.
01:38:49.000 And then you literally pointed towards the camera, you faced the camera and said that if you do not vote now, you might not ever be able to do it again because it'll be futile.
01:38:59.000 It'll be overrun.
01:39:00.000 Yes.
01:39:01.000 They'll keep the borders open for another four years and then their objective will be achieved.
01:39:05.000 Correct.
01:39:06.000 If Trump had lost, there would never have been another real election again.
01:39:12.000 Because Trump is actually enforcing the border.
01:39:16.000 Now, you can point to situations where immigration enforcement has been overzealous, because they're not going to be perfect.
01:39:30.000 There'll be cases where they've been overzealous in expelling illegals.
01:39:37.000 But if you say that the standard must be perfection for expelling illegals, then you will not get any expulsion because perfection is impossible.
01:39:50.000 And you've probably got millions of people that are here that are trying to be here under some asylum pretense.
01:39:59.000 Right?
01:40:00.000 Yes.
01:40:00.000 Like you could just come from a war.
01:40:03.000 They changed the definition of asylum to be economic asylum.
01:40:08.000 Which is everybody.
01:40:09.000 Which is everybody.
01:40:10.000 Yeah.
01:40:11.000 So asylum is a high bar to prove.
01:40:13.000 Yeah.
01:40:14.000 Asylum is supposed to mean that if you go back to your country, you'll get killed.
01:40:20.000 That's what we mean by asylum.
01:40:21.000 That was what it was supposed to mean.
01:40:23.000 They changed the definition of asylum to be you will have a decreased standard of living, which is obviously not real asylum.
01:40:33.000 And you can test the absurdity of this by the fact that people who are asylum seekers go on vacation to the country that they're seeking asylum from.
01:40:42.000 You know, that doesn't make any sense.
01:40:44.000 Yeah.
01:40:45.000 It doesn't have to.
01:40:49.000 When you understand the incentives, then you understand the behavior.
01:40:53.000 So once the left realized that illegals will vote for them if they have open borders, and combined that with government handouts to create a massive incentive,
01:41:10.000 they're basically using U.S. and European taxpayer dollars to provide a financial incentive to bring in as many illegals as possible to vote them into permanent power and create a one-party state.
01:41:28.000 I invite anyone who's listening to this, just do any research.
01:41:33.000 And the more you dig into it, the more it will become obvious that what I'm saying is absolutely true.
01:41:40.000 Well, they were busing people to swing states.
01:41:43.000 It's clear that they were trying to do something.
01:41:45.000 And then you had Chuck Schumer and Nancy Pelosi who are actively talking about the need to bring in people to make them citizens because we're in population collapse.
01:41:54.000 Yes.
01:41:55.000 Yeah.
01:41:55.000 No, it's that meme.
01:41:57.000 Where, so many times, they start off by saying it's not true.
01:41:57.000 Yeah.
01:42:03.000 It's a right-wing conspiracy theory.
01:42:05.000 Right.
01:42:06.000 Then it starts – then it's like – I think the next step is, well, it might be true.
01:42:14.000 And then it's like, OK, it is true.
01:42:18.000 But here's why it's good.
01:42:18.000 And then the final step is it's true and here's why it's good.
01:42:21.000 Yeah.
01:42:22.000 And it's like, but wait a second.
01:42:23.000 You started off saying it's untrue and it's a right-wing conspiracy theory.
01:42:25.000 Now you're saying it not only is it true, but it's a good thing and we must do more of it.
01:42:30.000 Well, this is the thing about Medicaid and Social Security and people getting Social Security numbers, you know, that were illegal.
01:42:38.000 It's massive fraud.
01:42:38.000 It's massive fraud.
01:42:39.000 And it's real.
01:42:40.000 And they denied it forever.
01:42:41.000 And now we're finding out this is part of the reason why there's this government shutdown that's going on right now.
01:42:47.000 Yes.
01:42:48.000 The entire basis for the government shutdown is that the Trump administration correctly does not want to send massive amounts, like hundreds of billions of dollars, to fund illegal immigrants in the blue states, or in all the states, really.
01:43:07.000 And the Democrats want to keep the money spigot going to incentivize illegal immigrants to come into the U.S., who will vote for them.
01:43:18.000 That's the crux of the battle.
01:43:21.000 So they want to stop this.
01:43:24.000 So what's going on right now is they have been funding these people.
01:43:29.000 They've been giving them EBT cards.
01:43:30.000 They've been giving them Medicaid.
01:43:32.000 And they've been even housing them.
01:43:34.000 And more than that, they were taking hotels, like four- and five-star hotels, the Roosevelt Hotel being the classic example. They were sending, I think, $60 million a year to the Roosevelt Hotel, which all it did was house illegals.
01:43:53.000 It used to be a nice hotel.
01:43:54.000 I mean it still is a nice hotel.
01:43:57.000 But – and all around the country this was happening.
01:44:02.000 And all tax dollars.
01:44:03.000 Yes.
01:44:04.000 Yeah.
01:44:05.000 And – Yeah.
01:44:06.000 And the Trump administration cut off funding, for example, to the Roosevelt Hotel and these other hotels, saying U.S. tax dollars should not be sent to provide luxury hotels for illegal immigrants that American citizens can't even afford.
01:44:26.000 Which obviously is the case.
01:44:28.000 That's insane.
01:44:29.000 That's what was happening.
01:44:30.000 They were also giving out like debit cards with $10,000.
01:44:35.000 So it's not just about medical care.
01:44:38.000 The Democrats mention medical care because they're trying to prey on people's empathy as much as possible.
01:44:45.000 And then they imagine, oh, wow, somebody has a desperately needed medical procedure.
01:44:49.000 And shouldn't we maybe do – you know, take care of them in that regard?
01:44:53.000 But what they do is they divert the Medicaid funds and turn it into a slush fund for the states that goes well beyond emergency medical care.
01:45:06.000 And New York and California would be bankrupt without the massive fraudulent federal payments that go to those states to pay for illegals, to create a massive financial incentive for illegals.
01:45:18.000 How would they be bankrupt because of that?
01:45:20.000 They wouldn't be able to balance their state budgets and they can't issue currency like the Federal Reserve can.
01:45:26.000 And so their ability to balance budgets is dependent upon illegals getting funding?
01:45:32.000 The scam level here is so staggering.
01:45:39.000 So there are hundreds of billions of dollars of transfer payments from the federal government to the states.
01:45:50.000 The states self-report what those transfer payment numbers should be.
01:45:58.000 So California and New York and Illinois lie like crazy and say that these are all legitimate payments.
01:46:06.000 Well, these days, I think they're even admitting that they literally want hundreds of billions of dollars for illegals.
01:46:14.000 But for a while there, they were trying to deny it.
01:46:17.000 So you get these transfer payments for every government program you can possibly think of.
01:46:24.000 And these are self-reported by the state.
01:46:27.000 And at least historically, there was no enforcement against California, New York, Illinois, and other states when they would lie.
01:46:37.000 There was no actual enforcement to say like, hey, you're lying.
01:46:40.000 These payments are fraudulent.
01:46:43.000 Now, the Trump administration does not want to send hundreds of billions of dollars of fraudulent payments to the states.
01:46:53.000 And the reason you have this standoff is because if the hundreds of billions of dollars, this giant magnet that attracts illegals from every part of Earth to these states, if that is turned off, the illegals will leave, because they're no longer being paid to come to the United States and stay here.
01:47:18.000 Wow.
01:47:19.000 And then they will lose a lot of voters.
01:47:23.000 The Democratic Party will lose a lot of voters.
01:47:25.000 And if this is kicked out, they would have a very difficult job reintroducing it into a new bill.
01:47:30.000 Yes.
01:47:32.000 Especially once things start normalizing.
01:47:35.000 Yes.
01:47:35.000 So like in a nutshell, the Democratic Party wants to destroy democracy by importing voters.
01:47:44.000 And the, you know, the Republican Party disagrees with that.
01:47:48.000 And the ruse is that if you don't accept what they're doing, then you're a threat to democracy.
01:47:53.000 Yes.
01:47:54.000 As they try to destroy democracy.
01:47:57.000 Yes.
01:47:57.000 By importing voters.
01:47:59.000 That is literally what they're doing.
01:48:00.000 And incentivizing people to only vote for them.
01:48:02.000 And overwhelming the system.
01:48:05.000 Yes.
01:48:06.000 And by the way, it's a strategy that if allowed to work, would work.
01:48:09.000 And in fact, has worked.
01:48:11.000 California is super majority Democrat.
01:48:13.000 Yeah.
01:48:15.000 And there's so much gerrymandering that occurs.
01:48:18.000 It's crazy.
01:48:20.000 I'm sure you're paying attention to this Proposition 50 thing.
01:48:24.000 No.
01:48:25.000 That's the thing in California where they're trying to redo districts.
01:48:29.000 Oh, yeah, yeah.
01:48:29.000 Yeah.
01:48:30.000 Because, I mean, California is already gerrymandered like crazy.
01:48:34.000 Yeah.
01:48:34.000 They want to gerrymander it even more.
01:48:37.000 Because it keeps moving further and further right.
01:48:39.000 Like if you look at the map of California, each voting cycle, more and more people are waking up and going, what the fuck?
01:48:47.000 And we need to do something to fix this.
01:48:49.000 The only option available, other than the policies you guys have always pushed, is to go right.
01:48:54.000 And so a lot of people have been, air quotes, red-pilled.
01:48:58.000 Yeah.
01:48:58.000 Mm-hmm.
01:48:59.000 And then here's another very important fact that is actually not disputed by either side, which is that when we do the census in the United States, the way the census works for apportionment of congressional seats and electoral college votes for the president is by number of persons in a state, not number of citizens.
01:49:21.000 Right.
01:49:22.000 It's number of people.
01:49:24.000 So you could literally be a tourist and you will count.
01:49:27.000 Now, how do they do the census when they do that?
01:49:30.000 Do they ask people?
01:49:32.000 Do they knock on doors?
01:49:33.000 Do they have them fill out forms?
01:49:35.000 Yeah, I think they mail out census forms and knock on doors.
01:49:40.000 But the way the law reads right now is that if you are a human with a pulse, then you count in the census for allocating congressional seats and presidential votes.
01:49:58.000 Right.
01:50:01.000 Electoral college, congressional seats, everything.
01:50:04.000 It doesn't matter whether you're here.
01:50:05.000 Legally, illegally, if you're a human with a pulse, you count for congressional apportionment.
01:50:12.000 So that means that the more people, the more illegals that California and New York can import by the time the census happens in 2030, the more congressional seats they will have and the more presidential electoral college votes they will have.
01:50:31.000 So they're trying to get as many illegals in as possible ahead of the census.
01:50:38.000 And because all human beings, even tourists, count for the census.
01:50:44.000 And then you combine that with gerrymandering of districts in New York and California. Let me just point out, with this proposition they're trying to increase the amount of gerrymandering that occurs in California, the biggest state in the country.
01:51:00.000 So the census would then award more congressional seats to California because of its vast number of illegals, and New York and Illinois will get more congressional seats too.
01:51:14.000 They'll get more presidential electoral college votes. That would get them a majority in the House, and they would get to decide who is president, literally based on illegals.
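For context on the mechanics: House seats are apportioned with the Huntington-Hill "equal proportions" method over each state's total resident population. Below is a minimal Python sketch, with invented population figures rather than real census data, showing how counting all persons versus only citizens can shift seats between states:

    import heapq
    from math import sqrt

    def apportion(populations, total_seats):
        # Huntington-Hill: every state starts with 1 seat; each remaining seat
        # goes to the state with the highest priority pop / sqrt(n * (n + 1)),
        # where n is the number of seats the state already holds.
        seats = {state: 1 for state in populations}
        heap = [(-pop / sqrt(2), state) for state, pop in populations.items()]
        heapq.heapify(heap)
        for _ in range(total_seats - len(populations)):
            _, state = heapq.heappop(heap)
            seats[state] += 1
            n = seats[state]
            heapq.heappush(heap, (-populations[state] / sqrt(n * (n + 1)), state))
        return seats

    # Invented figures for three hypothetical states:
    persons  = {"A": 10_000_000, "B": 6_000_000, "C": 4_000_000}
    citizens = {"A":  8_000_000, "B": 5_900_000, "C": 3_900_000}
    print(apportion(persons, 20))   # seats if every resident counts
    print(apportion(citizens, 20))  # seats if only citizens counted

With these made-up numbers, state A picks up a seat under person-counting that it would not hold under citizen-counting, which is the incentive being described.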
01:51:27.000 These are not disputed facts by either party.
01:51:33.000 I want to emphasize that: it's not in dispute.
01:51:35.000 Yeah, this is not a conspiracy.
01:51:36.000 These are not disputed facts by either party.
01:51:38.000 This is just the way the law works.
01:51:44.000 Like, I don't think the law should work that way.
01:51:50.000 I think the apportionment should be proportionate to citizens.
01:51:54.000 But isn't that a problem with how the Constitution is written?
01:51:57.000 Yeah.
01:51:57.000 Yeah, yeah.
01:51:58.000 They can't really change that.
01:52:01.000 I'm not sure whether it's in the Constitution or not in this way, but that is the way the law is written.
01:52:11.000 So it is an incentive.
01:52:12.000 But it's an incentive that could be removed with something simple that makes sense to everybody: only official U.S. citizens should count.
01:52:22.000 Yes.
01:52:23.000 The way it should work is that only U.S. citizens should count in the census for purposes of determining voting power.
01:52:30.000 Because people that aren't here legally supposedly can't vote.
01:52:34.000 They're not supposed to be voting, but they do.
01:52:38.000 But even besides that, I just can't emphasize this enough, because this is a very important concept for people to understand: the law as it stands counts all humans with a pulse in a state for deciding how many House of Representatives seats and how many presidential electoral college votes a state gets.
01:53:04.000 So the incentive, therefore, is for California, New York, Illinois to maximize the number of illegals so they take House seats away from red states and assign them to California, New York, Illinois and so forth.
01:53:20.000 Then you combine that with extreme gerrymandering in California, New York, Illinois and whatnot so that basically you can't even elect any Republicans and then they get control of the presidency, control of the House.
01:53:34.000 Then they keep doing that strategy and cement a supermajority.
01:53:40.000 That is what they're trying to do.
01:53:43.000 So that would essentially turn the entire country into California.
01:53:45.000 Yes.
01:53:46.000 Where you have differing opinions, but it doesn't matter because one party is always in control.
01:53:50.000 Yes.
01:53:53.000 When you first started digging into this, when you first started – before you even accepted this role of running Doge and being a part of all that, did you have any idea that it was this fucked up?
01:54:05.000 I did, yeah.
01:54:06.000 I mean I sort of – When did you start knowing?
01:54:10.000 I guess about like – well, about two years ago.
01:54:13.000 Isn't that crazy?
01:54:14.000 And like relatively recently, you know.
01:54:16.000 Yeah.
01:54:16.000 So I started having – well, I started like basically having a bad feeling about three years ago, which is why I felt it was like critical to acquire Twitter and have a maximally truth-seeking platform, not one that suppresses the truth.
01:54:36.000 And like it was more like – I'm like, I'm not sure what's going on, but I have a bad feeling about what's going on.
01:54:42.000 And then the more I dug into it, the more I was like, holy shit, we've got a real problem here.
01:54:47.000 America is going to fall.
01:54:49.000 So – Without anyone knowing it had fallen, that would be the problem.
01:54:53.000 It could have fallen and been unrepairable without anyone really being aware of what had happened.
01:55:00.000 Especially if you didn't buy Twitter.
01:55:02.000 Yes.
01:55:03.000 Look, buying Twitter was a huge pain in the ass and made me a pincushion of attacks.
01:55:11.000 Like dab, dab, dab, dab, dab.
01:55:12.000 Everybody loved you before that.
01:55:14.000 Well, some people – A lot of people loved you.
01:55:16.000 A lot of lefties loved you.
01:55:18.000 I was a hero of the left.
01:55:20.000 It's fair to say.
01:55:21.000 It was a thing.
01:55:21.000 If you drove a Tesla, it showed that you were environmentally conscious and you were on the right side.
01:55:27.000 Yeah.
01:55:31.000 Yeah.
01:55:31.000 I mean, I'm still the same human.
01:55:33.000 I didn't, like, have a brain transplant in the last three years.
01:55:38.000 Well, that's my favorite bumper sticker that people put on Teslas now.
01:55:41.000 I bought this before Elon went crazy.
01:55:43.000 I took a picture of one the other day.
01:55:45.000 Oh, you found this?
01:55:46.000 I was behind somebody.
01:55:46.000 Oh, yeah.
01:55:47.000 I've seen three or four of them.
01:55:49.000 People that have these bumper stickers on their car that says, I bought this before Elon went crazy.
01:55:53.000 Because when people were vandalizing Teslas.
01:55:57.000 Yeah.
01:55:58.000 The most unhinged.
01:55:59.000 Well, there was an organized campaign to literally burn down Teslas.
01:56:04.000 And we had one of our dealerships got shot up with a gun.
01:56:07.000 Like, they fired bullets into the Tesla dealership.
01:56:09.000 They were burning down cars.
01:56:13.000 It was crazy.
01:56:16.000 So – but the bumper sticker should read – there should be an addendum to the bumper sticker.
01:56:21.000 It's like, I bought this car before Elon turned crazy.
01:56:26.000 Actually, now I realize he's not crazy and I've seen the light.
01:56:31.000 That'll take some time.
01:56:33.000 That'll take some time.
01:56:35.000 People don't want to admit that they've been tricked.
01:56:37.000 They don't like that.
01:56:37.000 Yeah.
01:56:38.000 That old saying where it's like, it's really easy to fool somebody, but it's almost impossible to convince someone that they were fooled.
01:56:44.000 Yeah.
01:56:45.000 It's much easier to fool them than to convince them they've been fooled.
01:56:48.000 People cling to their ideas.
01:56:51.000 Especially if they've, like, publicly stated these things.
01:56:51.000 Yes.
01:56:54.000 They get very embarrassed of being foolish.
01:56:57.000 Yeah.
01:56:58.000 People – most of the time they double down.
01:57:01.000 And they find echo chambers.
01:57:03.000 Yeah, yeah.
01:57:04.000 But there's – you know, the thing is that – like, you know, I've seen more and more people who were convinced of the sort of woke ideology see the light.
01:57:17.000 Yeah.
01:57:17.000 So, not everyone, but it's more and more are seeing the light.
01:57:22.000 And it tends to happen, like, when something happens that really, you know, directly affects you.
01:57:26.000 Right.
01:57:28.000 Like, there was a friend of mine who was living in the San Francisco Bay Area, and they tried to trans his daughter.
01:57:39.000 Like, to the point where the school, like, sent the police to his house to take his daughter away from him.
01:57:48.000 Now, that's going to radicalize you.
01:57:51.000 Well, that's going to break – that's going to shake you out of your belief structure.
01:57:55.000 Now, I know – So, it was an activist at the school that was trying to do this?
01:57:59.000 Yeah.
01:57:59.000 Yeah, the school and the state of California conspired to turn his daughter against him and make her take life-altering, irreversible drugs that would have sterilized her.
01:58:14.000 And how old was she?
01:58:15.000 I think 14, something like that.
01:58:19.000 So, he managed to talk the police out of taking his daughter away from him that day.
01:58:25.000 And that night, he got on a plane to Texas.
01:58:29.000 Wow.
01:58:32.000 And, you know, a year after just being in a school in, like, greater Austin area, she went back to normal.
01:58:43.000 Meaning, like, it wasn't real.
01:58:45.000 Right.
01:58:47.000 Well, people are being much more open to that now.
01:58:50.000 I mean, the Wall Street Journal yesterday had that opinion piece saying that with this whole trans thing, there's a lot of evidence this is a social contagion.
01:58:59.000 Absolutely.
01:58:59.000 And Colin Wright wrote that, and then he's getting death threats now, of course, and on Blue Sky, there's people talking about exterminating him, which is one thing that you are allowed to say on Blue Sky, apparently.
01:59:10.000 You're allowed to say horrible things about people who say possibly truthful things about this whole social contagion.
01:59:18.000 Because that's what it is when you get nine kids in a friend group and they all decide to turn trans together.
01:59:24.000 Yeah.
01:59:24.000 Something's wrong.
01:59:25.000 Something's wrong.
01:59:25.000 That's not statistically feasible.
01:59:27.000 Like, you can convince kids to do anything.
01:59:31.000 You can convince kids to be a suicide bomber.
01:59:33.000 So.
01:59:33.000 Right.
01:59:34.000 Which is why, in some countries, they choose children to do that.
01:59:38.000 Yes.
01:59:39.000 Yeah.
01:59:39.000 You can train kids to be suicide bombers.
01:59:41.000 And if you can train kids to be suicide bombers, you can convince them of anything.
01:59:45.000 Especially with enough positive reinforcement.
01:59:45.000 Yeah.
01:59:47.000 And cultural reinforcement.
01:59:49.000 And the idea that that's not the case.
01:59:52.000 Kids are malleable.
01:59:55.000 The minds of youth are easily corrupted.
01:59:55.000 Yes.
01:59:57.000 You're also seeing a lot of pushback from gay and lesbian people that are saying, like, hey, if someone did this to me.
02:00:02.000 So stop including me.
02:00:04.000 Yeah, exactly.
02:00:05.000 The LGBT, you know, it's like, wait a second.
02:00:05.000 Yeah.
02:00:07.000 Why are we being included all the time in this situation?
02:00:10.000 Exactly.
02:00:11.000 Exactly.
02:00:11.000 Especially when, you know, like my friend Tim Dillon's talked about this.
02:00:15.000 It's like, it's really homophobic.
02:00:17.000 Because you're taking these gay kids and you're telling them, like, hey, you're not gay.
02:00:21.000 You're actually a girl.
02:00:23.000 Yes.
02:00:23.000 And, you know, hey, go make it so that you can never have an orgasm again.
02:00:28.000 Right.
02:00:28.000 And you'll be happy.
02:00:30.000 Like, fucking crazy.
02:00:31.000 Permanent mutilation, permanent castration of kids is, like, I think we should look at anyone who permanently castrates a kid as, like, right up there with Josef Mengele.
02:00:45.000 Yeah.
02:00:45.000 I mean, they're mutilating children.
02:00:48.000 Yeah.
02:00:49.000 And it's thought of as being kind.
02:00:49.000 Yeah.
02:00:52.000 And the thing is, would you rather have a live daughter or a dead son?
02:00:57.000 That's the line they use.
02:00:59.000 Yeah.
02:00:59.000 Which is not supported by any data.
02:01:01.000 No.
02:01:02.000 The probability of suicide increases.
02:01:02.000 It's all bullshit.
02:01:05.000 This is important maybe for the audience to know.
02:01:07.000 The probability of suicide increases if you trans a kid, not decreases.
02:01:13.000 By some accounts, it triples.
02:01:15.000 So that is an evil lie.
02:01:18.000 And it's a lie that is supposedly compassionate.
02:01:22.000 Imagine you've twisted reality to the point where you're confusing a child that's not even legally allowed to get a fucking tattoo.
02:01:31.000 Yeah.
02:01:31.000 Right?
02:01:32.000 Because you think that you could make a mistake with a tattoo.
02:01:34.000 A totally removable thing.
02:01:34.000 Exactly.
02:01:36.000 If I wanted to tomorrow, I could go to a doctor and they could laser off every tattoo that I have on me.
02:01:36.000 Right.
02:01:41.000 Right.
02:01:42.000 No harm, no foul.
02:01:42.000 Okay.
02:01:43.000 But you get sterilized.
02:01:43.000 Yeah.
02:01:45.000 Like, that's it forever.
02:01:46.000 Forever.
02:01:47.000 Yes.
02:01:48.000 They'll castrate you.
02:01:49.000 You no longer have testicles.
02:01:50.000 Yes.
02:01:51.000 You have a hole where your penis used to be.
02:01:54.000 Yes.
02:01:54.000 And this is compassionate.
02:01:55.000 And this is preventing you from killing yourself.
02:01:56.000 Actually, a lot of kids die with these sex change operations.
02:02:02.000 They die.
02:02:03.000 The number of deaths on the operating table, people don't hear about those.
02:02:07.000 A lot of kids.
02:02:08.000 Because we don't really actually have the technology to make this work.
02:02:14.000 So a bunch of times the kids just die in the sex change operations.
02:02:18.000 Jesus Christ.
02:02:19.000 It's demented.
02:02:19.000 Yeah.
02:02:20.000 It should be viewed as, like, you know, like evil Nazi doctor stuff.
02:02:28.000 Well, that's why it was so— Like real Nazi, not the bullshit fake Nazi stuff.
02:02:31.000 Crazy that for even pushing back against something that seems, like, fundamentally, logically very easy to argue against, the old Twitter would ban you forever.
02:02:41.000 Yes.
02:02:42.000 That's how crazy a social contagion can get when it completely defies logic, victimizes children, does something that makes no sense and is not supported by data, all connected to this ideology that trans is good.
02:02:59.000 We've got to save trans kids, protect trans kids.
02:03:02.000 And what I want to emphasize is that the save trans kids thing is a lie.
02:03:02.000 Yeah.
02:03:07.000 If you castrate kids and trans them, the probability of suicide increases, it does not decrease.
02:03:15.000 It substantially increases.
02:03:19.000 The studies that I've seen have found that the risk of suicide triples if you trans kids.
02:03:27.000 So, you're not saving them, you're killing them.
02:03:29.000 Moreover, during the sex change operation, there are many deaths that occur during the sex change operation.
02:03:38.000 Jesus Christ.
02:03:40.000 It's just crazy that this is a real issue.
02:03:43.000 Yeah.
02:03:44.000 It's a nightmare fever dream.
02:03:46.000 And people are finally waking up from it.
02:03:50.000 Now, when you started getting into the Doge stuff and started finding how much money is being shuffled around and moved around to NGOs and how much money is involved and just totally untraceable funds, like, this is, again, something like two years plus ago, you weren't aware of it all?
02:04:14.000 No, I was aware of it.
02:04:16.000 I just didn't realize how big it was.
02:04:19.000 Just how much waste and fraud there is in the government is truly vast.
02:04:26.000 In fact, the government didn't even know, nor did they care.
02:04:33.000 That's crazy.
02:04:34.000 Yeah.
02:04:36.000 And, I mean, just, like, some of the very basic stuff that Doge did will have lasting effects.
02:04:43.000 And some of these things, like, they're so elementary you can't believe it.
02:04:48.000 So, the Doge team got most of the main payments computers to require the congressional appropriation code.
02:05:01.000 So, when a payment is made, you have to actually enter the congressional appropriation code.
02:05:05.000 That used to be optional and often would be just left blank.
02:05:09.000 So, the money would just go out, but it wasn't even tied to a congressional appropriation.
02:05:13.000 And then, the Doge team also made the comment field for the payment mandatory.
02:05:19.000 So, you have to say something.
02:05:20.000 We're not saying that what is said – like, you can say anything.
02:05:24.000 Your cat could run across the keyboard.
02:05:26.000 You could go, QWERTY ASDF.
02:05:28.000 But you have to say something rather than nothing, because what we found was that there were tens of billions, maybe hundreds of billions of dollars of zombie payments.
02:05:36.000 So, they're – like, somebody had approved a payment.
02:05:41.000 Somebody in the government approved a payment and – some recurring payment.
02:05:45.000 And they retired or died or changed jobs and no one turned the money off.
02:05:53.000 So, the money would just keep going out.
02:05:56.000 And it's a pretty rare – Going where?
02:05:58.000 To a company or an individual.
02:06:03.000 And it's a pretty rare company or individual who will complain that they're getting money that they should not get.
02:06:09.000 And a bunch of the money was just transfer payments to the states.
02:06:12.000 So, these are automatic payments.
02:06:14.000 Yeah, just automatic payments.
02:06:15.000 No accounting for them at all.
02:06:16.000 Imagine, like, there's an automatic debit of your credit card.
02:06:20.000 And you never look at the statement.
02:06:22.000 Right.
02:06:23.000 So, it's just money going out.
02:06:27.000 Of course, I call them zombie payments.
02:06:30.000 They might have been legitimate at one point.
02:06:33.000 But the person who approved that recurring payment changed jobs, died, retired or whatever.
02:06:38.000 And no one ever turned the money off.
02:06:42.000 And my guess is that's probably at least $100 billion a year.
02:06:49.000 Maybe $200.
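The two controls described above, and the zombie-payment test, are simple to express in code. A minimal sketch with invented field names, since the actual federal payment systems are not public:

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    # Hypothetical record layout, for illustration only.
    @dataclass
    class Payment:
        amount: float
        appropriation_code: Optional[str]  # now mandatory per the change described
        comment: Optional[str]             # now mandatory per the change described
        approver_active: bool              # is the approving official still in the job?
        last_reviewed: Optional[date]      # when a human last looked at this payment

    def hold_reasons(p: Payment) -> list:
        # Reasons to hold a payment under the rules described above.
        reasons = []
        if not p.appropriation_code:
            reasons.append("no congressional appropriation code")
        if not p.comment:
            reasons.append("empty comment field")
        return reasons

    def is_zombie(p: Payment) -> bool:
        # A recurring payment whose approver is gone and that nobody
        # has reviewed since: the pattern described in the conversation.
        return (not p.approver_active) and p.last_reviewed is None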
02:06:50.000 And going where?
02:06:53.000 To – I mean, there are millions of these payments.
02:07:00.000 So, it's – I mean – Millions.
02:07:03.000 Yes, yes.
02:07:04.000 Millions of payments that are going to who knows where.
02:07:07.000 Yes.
02:07:08.000 So, in a bunch of cases, there are fraud rings that operate – professional fraud rings that operate to exploit the system.
02:07:15.000 They figure out some security hole in the system and they just do professional fraud.
02:07:24.000 And that's where we found, for example, people who were, you know, 300 years old in the Social Security Administration database.
02:07:31.000 Now, I thought that this was a mistake of not registering their deaths.
02:07:39.000 That people were born like a long time ago and it had defaulted to like a certain number.
02:07:45.000 And so, after time, those people were still in the system.
02:07:49.000 It was just an error of the way the accounting was done.
02:07:54.000 So, that's not true.
02:07:54.000 Yeah.
02:07:57.000 So, at least one of two things must be true.
02:08:03.000 There's a typo or some mistake in the computer, or it's fraudulent.
02:08:08.000 But we don't have any 300-year-old vampires living in America.
02:08:12.000 Allegedly.
02:08:13.000 Allegedly.
02:08:16.000 And we don't have people born in the future – yet in some cases there are people receiving payments whose birth dates are in the future.
02:08:22.000 Born in the future?
02:08:23.000 Born in the future.
02:08:25.000 Yes.
02:08:25.000 Really?
02:08:27.000 There were people receiving payments whose birth date was like 2100 and something.
02:08:34.000 So, there's – Like next century.
02:08:34.000 Okay.
02:08:36.000 Is there a task force?
02:08:37.000 We know that one of two things must be true, that either there's a mistake in the computer or it's fraud.
02:08:45.000 But if you have someone's birthday that's either in the future or that makes them older than the oldest living American – because the oldest living American is 114 years old.
02:08:53.000 So, if they're more than 114 years old, there is either a mistake, and someone should call them and say, I think we have your birthday wrong because it says you were born in 1786.
02:09:09.000 And, you know, that was before, you know, before there was really an America, you know, it was like, you know, that's kind of early.
02:09:20.000 You know, we're still fighting England type of thing.
02:09:24.000 You know, it's like this person either needs to be in the Guinness Book of World Records or they're not alive.
02:09:32.000 But still, at the end of the day, money is going towards that account that's connected to this person that is either nonexistent or dead.
02:09:39.000 So there were, I think, something like 20 million people in the Social Security Administration database that could not possibly be alive.
02:09:53.000 Based on their birth date, they could not possibly be alive.
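The plausibility test being described is about as simple as code gets. A sketch, taking the 114-year figure from the conversation as the cutoff:

    from datetime import date

    OLDEST_LIVING_AMERICAN = 114  # per the figure cited in the conversation

    def implausible_birthdate(birth, today=None):
        # Flag records that cannot belong to a living person:
        # born in the future, or older than the oldest living American.
        today = today or date.today()
        if birth > today:
            return True  # "born in the future"
        age_years = (today - birth).days // 365  # rough age is fine here
        return age_years > OLDEST_LIVING_AMERICAN

    print(implausible_birthdate(date(1786, 7, 4)))  # True: older than America
    print(implausible_birthdate(date(2107, 1, 1)))  # True: born next century
    print(implausible_birthdate(date(1950, 1, 1)))  # False: plausible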
02:09:58.000 And then to be clear, 20 million people that were receiving funds?
02:10:03.000 A bunch of – most of them were not receiving funds.
02:10:07.000 Some of them were receiving funds.
02:10:09.000 Most were not receiving funds.
02:10:10.000 But so let me tell you how the scam works.
02:10:12.000 It's a bank shot.
02:10:13.000 So the Social Security Administration database is used as a source of truth by all the other databases that the government uses.
02:10:21.000 So even if they stop the payments on the Social Security Administration database, like unemployment insurance, small business administration, student loans, all check the Social Security Administration database to say, is this a legitimate, alive person?
02:10:38.000 And the Social Security database will say, yes, this person is still alive even though they're 200 years old.
02:10:44.000 But it forgets to mention that they're 200 years old.
02:10:45.000 When the computer is queried, it just returns: yes, this person is alive.
02:10:53.000 And so then they're able to exploit the entire rest of the government ecosystem.
02:10:58.000 So then you get fake student loans.
02:11:00.000 Then you get fake unemployment insurance.
02:11:02.000 Then you get fake medical payments.
02:11:04.000 And this doesn't have to be tied to an individual where there's an address where you can check on this person?
02:11:10.000 No.
02:11:11.000 If you did – if you just did any check at all, you would stop this.
02:11:18.000 So that's – And how much money do you think is being – Any check, like anything at all, would stop the fraud.
02:11:25.000 Like any effort at all.
02:11:28.000 Yeah.
02:11:29.000 So there's multiple layers.
02:11:31.000 The Social Security number verifies that this is a real person.
02:11:31.000 Yes.
02:11:33.000 Right.
02:11:34.000 And then the other systems check up on – Every other government payment system, for everything – like I said, Small Business Administration, student loans, Medicaid, Medicare, every other government payment, of which there are many.
02:11:49.000 There are actually hundreds of government payment systems.
02:11:51.000 That's going to be exploited so long as the Social Security database says this person is alive.
02:11:59.000 That's the nature of the scam.
02:12:00.000 It's a bank shot.
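A toy illustration of the bank shot as described: downstream programs only ever ask the Social Security database a yes/no alive question, so an impossible birth date never gets a chance to raise a flag. Every interface here is invented for illustration:

    from datetime import date

    # Invented stand-in for the Social Security database.
    ssa_db = {
        "123-45-6789": {"alive": True, "birth": date(1790, 1, 1)},  # impossible record
    }

    def ssa_is_alive(ssn):
        # The only question downstream systems ask; the birth date
        # never travels with the answer.
        record = ssa_db.get(ssn)
        return bool(record and record["alive"])

    def approve_student_loan(ssn):
        # Unemployment insurance, SBA loans, Medicaid, etc. all follow
        # the same pattern: if SSA says alive, the payment proceeds.
        return ssa_is_alive(ssn)

    print(approve_student_loan("123-45-6789"))  # True, despite a 1790 birth date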
02:12:02.000 So then the rebuttal from the Dems is like, oh, well, the vast majority of the people who are marked as alive in the Social Security Administration weren't receiving Social Security Administration payments.
02:12:10.000 That is true.
02:12:11.000 What they forgot to mention is they're getting fraudulent payments from every other government program.
02:12:16.000 And that's why the Dems were so opposed to declaring someone dead who was dead, because it would stop all the other fraud from happening.
02:12:27.000 And so – but all this – is it trackable?
02:12:29.000 Like all this other fraud, if they wanted to, they could chase it all down.
02:12:33.000 It's not even hard.
02:12:33.000 Yeah.
02:12:35.000 And yet they're opposing chasing it all down.
02:12:38.000 They're opposing chasing it all down because it turns off the money magnet for the illegals.
02:12:45.000 Wow.
02:12:47.000 Because it's very logical to – like I'm saying the most common-sense things possible.
02:12:54.000 If someone's got a birthday in Social Security that is an impossible birthday, meaning they are older than the oldest living American or born in the future, then you should call them and say, excuse me, we seem to have your birthday wrong because it says that you're 200 years old.
02:13:15.000 That's all you need to do.
02:13:18.000 And then you would remove them from the Social Security database and make that number no longer available for all those other government payments.
02:13:24.000 Exactly.
02:13:25.000 Wow.
02:13:29.000 And how much money are we talking?
02:13:30.000 It's hundreds of billions of dollars.
02:13:34.000 And this is all traceable.
02:13:36.000 Like you could hunt all this down.
02:13:37.000 Like you don't need to be Sherlock Holmes here is what I'm saying.
02:13:41.000 Well, this is – We don't need to call Sherlock Holmes for this one.
02:13:43.000 Is this part of the – We just need to call the person and say, excuse me, we must have your birthday wrong because it says you're 200 years old or were born in the future.
02:13:56.000 So could you tell us what your birthday is?
02:13:59.000 That's what we need to do.
02:14:01.000 It's that simple.
02:14:02.000 But all these other government payments that are available that are connected to this Social Security number, it seems like if you just chased that all down, you would find the widespread fraud.
02:14:15.000 You would find where it's going.
02:14:18.000 But the root of the problem is the Social Security Administration database because the Social Security number in the United States is used as a de facto national ID number.
02:14:18.000 Yes.
02:14:32.000 That's why – like the bank always asks for your social – like any financial institution will ask for your Social Security number.
02:14:44.000 This is – it sounds so insane that this isn't chased down.
02:14:48.000 I mean that in and of itself is such mishandling.
02:14:48.000 Yeah, I agree.
02:14:54.000 Yes.
02:14:58.000 It's mind-blowing.
02:15:00.000 So yeah, it's crazy.
02:15:02.000 Well, you were very reluctant last time you were here to talk about the extent of some of the fraud, because you're like, they could kill me because this is kind of – Oh, yeah, what I'm saying is that, to be pragmatic and realistic, you actually can't manage to zero fraud.
02:15:29.000 You can manage to a low fraud number, but not to zero fraud.
02:15:32.000 If you manage to zero fraud, you're going to push so many people over the edge who are receiving fraudulent payments that the number of inbound homicidal maniacs will be really hard to overcome.
02:15:45.000 So I'm actually taking, I think, quite a reasonable position, which is that we should simply reduce the amount of fraud, which I think is not an extremist position.
02:15:56.000 And we should aspire to, you know, have less fraud over time.
02:16:01.000 Not that we should be ultra draconian and eliminate every last scrap of fraud, which I guess would be nice to have.
02:16:10.000 But like we don't even need to go that extreme.
02:16:12.000 I'm saying we should just stop the blatant large-scale super obvious fraud.
02:16:17.000 I think that's a reasonable position.
02:16:19.000 It's a very reasonable position.
02:16:20.000 And so what was the most shocking pushback that you got when you started implementing Doge, when you started investigating into where money was going?
02:16:33.000 Well, I guess this is – I should have anticipated this.
02:16:38.000 But while most of the fraudulent government payments, especially to the NGOs, go to the Democrats – like, I don't know, for argument's sake, let's say 80 percent.
02:16:51.000 Maybe 90 percent.
02:16:55.000 10 to 20 percent of it does go to Republicans.
02:16:58.000 And so when we turn off funding to a fraudulent NGO, we'd get complaints from whatever, the 10 percent of Republicans who are receiving the money.
02:17:08.000 And they would, you know, they would very loudly complain.
02:17:15.000 Because the honest answer is the Republicans are partly – they're receiving some of the fraud too.
02:17:25.000 They're getting a piece of it.
02:17:25.000 Jesus.
02:17:28.000 Yeah.
02:17:29.000 I want to be clear.
02:17:30.000 It's not like the Republican Party is some ultra-pure paragon of virtue here.
02:17:36.000 Okay.
02:17:37.000 Well, you see that with the congressional insider trading.
02:17:40.000 It's across the board.
02:17:41.000 Yeah.
02:17:42.000 It's left and right.
02:17:43.000 I mean the whole uniparty criticism has some validity to it.
02:17:47.000 Like I said, if you turn off fraudulent payments, it's not like 100 percent of those payments were going to Democrats.
02:17:58.000 A small percentage were also going to Republicans.
02:18:01.000 Those Republicans complained very loudly.
02:18:06.000 And so there was a lot of pushback on the Republican side when we started cutting some of these funds.
02:18:18.000 And I tried telling them like, well, you know, 90 percent of the money is going to your opponents.
02:18:23.000 But they still – even if they're getting 10 percent of the money – They want their piece.
02:18:27.000 Yeah.
02:18:28.000 They want their piece.
02:18:28.000 And they've been getting that piece for a long time.
02:18:30.000 Yes.
02:18:34.000 This is why like, you know, politics is like – It's dirty business.
02:18:39.000 I mean that's like saying like, you know, if you like sausages and respect the law, do not watch either of them being made.
02:18:39.000 Yeah.
02:18:48.000 Yeah.
02:18:52.000 Wow.
02:18:55.000 Well, that's not even true because I've made sausage before.
02:18:57.000 Yeah, yeah.
02:18:58.000 It's actually – Yeah.
02:18:58.000 It's like it's not that big a deal.
02:19:00.000 It's fat and spices and casing running through the machine.
02:19:00.000 Yeah.
02:19:00.000 It's not that big a deal.
02:19:04.000 Not that big a deal.
02:19:05.000 Yeah.
02:19:07.000 But, yeah.
02:19:08.000 I mean, the stuff I'm saying here, if you stand back and think about it for a second, it's like, oh yeah, that makes sense.
02:19:17.000 You know?
02:19:17.000 Yeah.
02:19:19.000 It's not like one political party is going to be, you know, pure devil or pure angel.
02:19:28.000 I think there's much more corruption on the Democrat side, but there's still some corruption on the Republican side.
02:19:36.000 How did it happen that the majority of the corruption wound up being on the Democrat side?
02:19:41.000 Well, because the transfer payments, especially to illegals, are very much on the Democrat side.
02:19:49.000 So that's the root of it all is the illegal situation.
02:19:52.000 Yes.
02:19:52.000 I mean there's – Or a focal point.
02:19:55.000 It would also be accurate to say that while – obviously not everyone who is a Democrat is a criminal.
02:20:05.000 Almost everyone who is a criminal is a Democrat, because the Democrats are the soft-on-crime party.
02:20:15.000 So if you're a criminal, who are you going to vote for?
02:20:18.000 Right.
02:20:20.000 Right.
02:20:20.000 The soft-on-crime party.
02:20:21.000 Did you think you were going to be able to get more done than you were?
02:20:28.000 We did get a lot done.
02:20:30.000 And Doge is still happening, by the way.
02:20:30.000 Right.
02:20:34.000 The Doge is still underway.
02:20:36.000 There's still waste and fraud being cut by the Doge team.
02:20:43.000 So it hasn't stopped.
02:20:45.000 It's less publicized.
02:20:46.000 It's less publicized.
02:20:49.000 And they don't have like a clear person to attack anymore.
02:20:52.000 Well, it seems like once you stepped away – They basically applied immense pressure to me to stop it.
02:20:59.000 So then I'm like the best thing for me is to just cut out of this.
02:21:03.000 And in any case, as a special government employee, I could only be there for like 120 days anyway, something like that.
02:21:08.000 So whatever the law says.
02:21:09.000 So I could only be there for four months as a special government employee.
02:21:15.000 So – but yeah.
02:21:20.000 I mean, you turn off the money spigot to fraudsters, they get very upset to say the least.
02:21:27.000 And – but my – like my death threat level went ballistic, you know.
02:21:32.000 It was like a rocket going to orbit.
02:21:37.000 So – but now that I'm not in D.C., I guess they don't really have a person to attack anymore.
02:21:46.000 Well, the rhetoric about you has calmed down significantly.
02:21:50.000 Yeah.
02:21:51.000 It was disturbing.
02:21:52.000 It was disturbing to watch.
02:21:53.000 It was like this is crazy.
02:21:55.000 And to watch these politicians engage in it and all these people just like framing you as this monster.
02:22:00.000 I was like this is so weird.
02:22:02.000 Like this is what happens when you uncover fraud.
02:22:04.000 But yes.
02:22:05.000 The whole machine turns on you.
02:22:07.000 And if it wasn't for a person like you who owns a platform and has an enormous amount of money, like it could have destroyed you.
02:22:14.000 Yeah.
02:22:15.000 And that was the goal.
02:22:16.000 The goal was to destroy me.
02:22:17.000 Absolutely.
02:22:17.000 Because you were getting in the way.
02:22:19.000 Yeah.
02:22:20.000 Of this amazing graft.
02:22:22.000 This gigantic fraud machine.
02:22:24.000 Yeah.
02:22:25.000 Like I said, I think Doge team has done a lot of good work.
02:22:31.000 You know, in terms of fraud and waste prevented, my guess is it's, you know, probably on the order of $200 or $300 billion a year.
02:22:40.000 So it's pretty good.
02:22:41.000 And what do you think could have been done if you just had, like, full rein and total cooperation?
02:22:46.000 How much do you think you could have saved?
02:22:49.000 I mean what level of power are we assuming here?
02:22:53.000 Godlike.
02:22:54.000 Oh, yeah.
02:22:54.000 Probably cut the federal budget in half and get more done.
02:23:00.000 That is so crazy.
02:23:03.000 It is so crazy.
02:23:04.000 Get more done and cut the federal budget in half.
02:23:06.000 It's that widespread.
02:23:09.000 Well, I mean a whole bunch of government departments simply shouldn't exist in my opinion.
02:23:13.000 They, you know.
02:23:16.000 Like examples.
02:23:18.000 Well, the Department of Education was created recently, under Jimmy Carter, and our educational results have gone downhill ever since it was created.
02:23:31.000 So if you create a department and the result of creating that department is a massive decline in educational results and it's the Department of Education, you're better off not having it.
02:23:41.000 Because literally we did better before there was one than after.
02:23:46.000 When you let the states run it.
02:23:47.000 Yes.
02:23:47.000 Yeah.
02:23:48.000 Because at least the states can compete with one another.
02:23:52.000 So – but the problem is, you hear "cutting the Department of Education."
02:23:55.000 Our kids need education.
02:23:57.000 Yeah, they do.
02:23:57.000 But this is a new department that didn't even exist, you know, until the late 70s.
02:24:06.000 And ever since that department was created, the results, educational results have declined.
02:24:12.000 And so why would you have an institution continue that has made education worse?
02:24:19.000 It doesn't make sense.
02:24:21.000 They killed it though, right?
02:24:22.000 No, they still – unfortunately.
02:24:23.000 But they were trying to kill it.
02:24:25.000 It has been substantially reduced.
02:24:27.000 Okay.
02:24:28.000 What other organizations?
02:24:31.000 What other departments?
02:24:32.000 Well, I mean I'm a small government guy.
02:24:35.000 So, you know, when the country was created, we just had the Department of State, Department of War, you know, and sort of the Department of Justice.
02:24:50.000 We had an attorney general and Treasury Department.
02:24:56.000 I don't know why you need more than that.
02:25:00.000 So what other departments specifically do you think are just completely ineffective?
02:25:05.000 Well, I mean here it's like a question – it's a sort of philosophical question of how much government do you think there should be?
02:25:11.000 Right.
02:25:13.000 In my opinion, there should be the least amount of government.
02:25:17.000 I've heard the most bizarre argument against this is that you're cutting jobs and you're going to leave people jobless.
02:25:23.000 And I'm like, but their jobs are useless.
02:25:25.000 Yeah, paying people to do nothing doesn't make sense.
02:25:29.000 Like, there's a great story about Milton Friedman, who is awesome.
02:25:39.000 Generally, whatever Milton Friedman said, people should do that thing.
02:25:43.000 I'm not sure if it's apocryphal or not.
02:25:45.000 But someone complained to him – he observed, I think, people that were digging ditches with shovels.
02:25:58.000 And he said – well, like allegedly Friedman said, well, I think you should use, you know, excavating equipment instead of shovels.
02:26:08.000 And you could get it done with far fewer people.
02:26:11.000 And then someone said, but then we're going to lose a lot of jobs.
02:26:14.000 Well, then Friedman said, well, in that case, why don't you have them use teaspoons?
02:26:23.000 Just dig ditches with teaspoons.
02:26:25.000 Think of all the jobs you'll create.
02:26:27.000 I mean – it's bullshit.
02:26:31.000 Basically, you just want people to work on things that are productive.
02:26:35.000 You want people to work on building things, on building – providing products and services that people find valuable, like making food, being a farmer or a plumber or electrician or just anyone who's a builder or providing useful services.
02:26:57.000 And that's what you want people to be doing, not fake government jobs that don't add any value or may subtract value.
02:27:08.000 But also, to illustrate the absurdity of how the economy is measured: the way economists measure the economy is nonsensical.
02:27:20.000 Because they'll count any job, even if that job is a dumb job that has no point and is even counterproductive.
02:27:26.000 So like the joke is like there's two economists going on a hike in the woods.
02:27:33.000 They come across a pile of shit and one economist says to the other, I'll pay you $100 to eat that shit.
02:27:42.000 The economist eats the shit, gets the $100.
02:27:44.000 They keep walking.
02:27:45.000 Then the other – then they come across another pile of shit and the other economist says, now, I'll pay you $100 to eat the pile of shit.
02:28:00.000 So he pays the other economist $100 to eat the pile of shit.
02:28:00.000 Then they say, look, wait a second.
02:28:04.000 We both just ate a pile of shit and we don't have any extra money.
02:28:13.000 Like we both – you just gave the $100 back to me and we both ate a pile of shit.
02:28:18.000 This doesn't make any sense.
02:28:19.000 And they said, no, no, but think of the economy because that's $200 in the economy.
02:28:26.000 That basically – eating shit would count as a job.
02:28:37.000 This is to illustrate the absurdity of economics.
02:28:43.000 One of the things you said when you – Eating shit should not count as a job.
02:28:46.000 One of the things you said when you stepped away is that you're kind of done and that it's unfixable.
02:28:54.000 Or under its current form, the way people are approaching it.
02:29:02.000 You can make it directionally better but ultimately you can't fully fix the system.
02:29:11.000 It would be accurate to say that unless you could go super draconian, like Genghis Khan level, on cutting waste and fraud – which you can't really do in a democratic country, an aspirationally democratic country – there's no way to solve the debt crisis.
02:29:39.000 So we've got national debt that's just insane, where the interest payments on the debt exceed our entire military budget.
02:29:47.000 I mean that was one of the wake-up calls for me.
02:29:49.000 I was like, wait a second.
02:29:50.000 The interest on our national debt is bigger than the entire military budget and growing?
02:29:59.000 This is crazy.
02:30:07.000 So even if you implement all these savings, you're only delaying the day of reckoning when America goes bankrupt.
02:30:15.000 So – unless you go full Genghis Khan, which you can't really do.
02:30:20.000 So I came to the conclusion that the only way to get us out of the debt crisis and prevent America from going bankrupt is AI and robotics.
02:30:36.000 So like we need to grow the economy at a rate that allows us to pay off our debt.
02:30:51.000 And I guess people just generally don't appreciate the degree to which the government overspending is a problem.
02:31:01.000 But even – like the Social Security website, this is under the Biden administration.
02:31:06.000 On the website, it would say that, based on current demographic trends and how much money Social Security is bringing in versus how many Social Security recipients there are, because we have an aging population –
02:31:19.000 Relatively speaking, the average age is increasing.
02:31:22.000 Social Security will not be able to maintain its full payments.
02:31:25.000 I think by 2032.
02:31:30.000 So Social Security will have to start reducing the amount of money that's paid to people in about seven years.
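The arithmetic behind a projection like that is easy to sketch. With invented round numbers, chosen only so this toy example lands near the year mentioned:

    # Toy trust-fund projection; all dollar figures are invented.
    reserve = 1_400                  # $B left in the fund
    inflow, outflow = 1_300, 1_500   # $B per year

    year = 2025
    while reserve > 0:
        reserve += inflow - outflow  # fund drains when outflow exceeds inflow
        year += 1

    # Once the reserve is gone, benefits can only be what revenue covers.
    print(f"Reserve exhausted around {year}; benefits then capped near "
          f"{inflow / outflow:.0%} of scheduled payments.")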
02:31:38.000 And so the only way to fix that, robotics, manufacturing, raise GDP?
02:31:45.000 You've got to massively increase economic output, and the only way to do that is AI and robotics.
02:31:55.000 So basically, we're going bankrupt without AI and robotics even with a bunch of savings.
02:32:02.000 The savings – like reducing waste and fraud – can give us a longer runway, but they cannot ultimately pay off our national debt.
02:32:10.000 So what do you think the solution is to the jobs that are going to be lost because of AI and robotics, the jobs due to automation, the jobs due to – no longer do we need human beings to do these jobs because AI is doing them?
02:32:24.000 Do you think it's going to be some sort of a universal basic income thing?
02:32:27.000 Do you think there's going to be some other kind of solution that has to be implemented?
02:32:34.000 Because a lot of people are going to be out of work, right?
02:32:38.000 I think there will be actually a high demand for jobs but not necessarily the same jobs.
02:32:47.000 So, I mean, this is actually – this process has been happening throughout modern history.
02:32:55.000 I mean, there used to be – like doing calculations manually with like a pencil and paper used to be a job.
02:33:03.000 So they used to have, like, buildings full of people called computers, where the banks would – like, all you'd do all day is calculations, because they didn't have computers.
02:33:14.000 They didn't have digital computers; they had people.
02:33:18.000 Yeah.
02:33:19.000 Well, it was just people who just like add and subtract stuff on a piece of paper and that would be how banks would do financial processing.
02:33:27.000 And you'd have to literally go over their equations to make sure the books are balanced.
02:33:30.000 Yeah.
02:33:31.000 And most times it's just simple math.
02:33:34.000 Like in a world before computers, how did you calculate – how did you do transactions?
02:33:39.000 You had to do them by hand.
02:33:42.000 So then when computers were introduced, the job of doing bank calculations no longer existed.
02:33:51.000 So people had to go do something else.
02:33:55.000 And that's what's going to happen.
02:33:56.000 That's what is happening at an accelerated rate due to AI and then robotics.
02:34:02.000 That's the issue though, right?
02:34:07.000 The accelerated rate, because it's going to be – It's just happening.
02:34:07.000 Like I said, AI is the supersonic tsunami.
02:34:12.000 So that's why I call it a supersonic tsunami.
02:34:17.000 So – It's like what other jobs will be available that aren't available now because of AI?
02:34:26.000 Well, AI is really still digital.
02:34:30.000 Ultimately, AI can improve the productivity of humans who build things with their hands or do things with their hands.
02:34:37.000 Like literally welding, electrical work, plumbing, anything that's physically moving atoms, like cooking food or farming or – like anything that's physical, those jobs will exist for a much longer time.
02:34:58.000 But anything that is digital, which is like just someone at a computer doing something, AI is going to take over those jobs like lightning.
02:35:09.000 Coding, anything along those lines.
02:35:11.000 Yeah.
02:35:12.000 It's going to take over those jobs like lightning.
02:35:15.000 Just like digital computers took over the job of people doing manual calculations, but much faster.
02:35:24.000 So what happens to all those people?
02:35:25.000 Like what kind of numbers are we talking about?
02:35:27.000 Like you're going to lose most drivers, right?
02:35:29.000 Commercial drivers.
02:35:30.000 You're going to have automated vehicles, AI-controlled systems.
02:35:34.000 Just like there's certain ports in China and I think in Singapore where everything is completely automated.
02:35:40.000 Yeah.
02:35:41.000 Mostly.
02:35:42.000 Yeah.
02:35:42.000 Yeah.
02:35:42.000 So you're going to lose a lot of those jobs, longshoremen jobs, trucking, commercial drivers.
02:35:49.000 Yeah.
02:35:50.000 I mean we actually do have a shortage of truck drivers, but there's actually – Well, that's why California has hired so many illegals to do it.
02:35:57.000 Have you seen those numbers?
02:35:59.000 Yeah.
02:36:00.000 I mean the problem is like when people don't know how to drive a semi-truck, which is actually a hard thing to do, then they crash and kill people.
02:36:08.000 Yeah.
02:36:09.000 A friend of mine's wife was killed by an illegal driving a truck. She was just out biking, and he didn't know how to drive the truck or something.
02:36:20.000 I mean he ran her over.
02:36:25.000 So I mean the thing is, you can't let people drive an 80,000-pound semi if they don't know how to do it.
02:36:41.000 But in California, they're just letting people do it.
02:36:43.000 Because they need people to do it.
02:36:45.000 Well, they also need – they want the votes and that kind of thing.
02:36:50.000 But yeah, like cars are going to be autonomous.
02:36:56.000 But there's just so many desk jobs where really what people are doing is they're processing email or they're answering the phone.
02:37:05.000 And just anything that isn't moving atoms, anything that is not physical work, that will obviously be the first thing.
02:37:14.000 Those jobs will be and are being eliminated by AI at a very rapid pace.
02:37:23.000 And ultimately, working will be optional because you'll have robots plus AI and we'll have, in a benign scenario, universal high income.
02:37:38.000 Not just universal basic income, universal high income, meaning anyone can have any products or services that they want.
02:37:46.000 But there will be a lot of trauma and disruption along the way.
02:37:49.000 So you anticipate that the economy will be boosted to such an extent that a high income would be available to almost everybody.
02:38:02.000 So we'd essentially eliminate poverty.
02:38:05.000 In the benign scenario, yes.
02:38:08.000 So like – There's multiple scenarios.
02:38:11.000 There are multiple scenarios.
02:38:12.000 There's a lot of ways this movie can end.
02:38:15.000 Like the reason I'm so concerned about AI safety is that like one of the possibilities is the Terminator scenario.
02:38:20.000 It's not zero percent.
02:38:25.000 So that's why it's like – I'm like really banging the drum on AI needs to be maximally truth-seeking.
02:38:33.000 Like, don't force AI to believe a lie – for example, that the founding fathers were actually a group of diverse women, or that avoiding misgendering is worth a nuclear war.
02:38:43.000 Because if that's the case and then you get the robots and the AI becomes omnipotent, it can enforce that outcome.
02:38:54.000 And then – unless you're a diverse woman, you're out of the picture.
02:39:01.000 So we're toast.
02:39:03.000 So that's – Or you might wake up as a diverse woman one day.
02:39:07.000 The AI has adjusted the picture and we are now a diverse woman.
02:39:12.000 Everyone's a diverse woman.
02:39:13.000 So that would be – that's the worst possible situation.
02:39:17.000 So what would be the steps that we would have to take in order to implement the benign solution where it's universal high income?
02:39:27.000 Like, best case scenario, this is the path forward: universal high income for essentially every single citizen, where the economy gets boosted by AI and robotics to such an extent that no one ever has to work again.
02:39:44.000 And what about meaning for those people, which is – which gets really weird?
02:39:49.000 Yeah.
02:39:51.000 I don't know how to answer the question about meaning.
02:39:54.000 That's an individual problem, right?
02:39:57.000 But it's going to be an individual problem for millions of people.
02:40:01.000 Yeah.
02:40:06.000 Well, I mean, I guess I've been a voice saying, hey, we need to slow down AI.
02:40:16.000 We need to slow down all these things.
02:40:19.000 And we need to, you know, not have a crazy AI race.
02:40:23.000 I've been saying that for a long time, for 20 plus years.
02:40:27.000 But then I came to realize that really there's two choices here, either be a spectator or a participant.
02:40:34.000 And if I'm a spectator, I can't really influence the direction of AI.
02:40:40.000 But if I'm a participant, I can try to influence the direction of AI and have a maximally truth-seeking AI with good values that loves humanity.
02:40:48.000 And that's what we're trying to create with Grok at XAI.
02:40:53.000 And, you know, the research is, I think, bearing this out.
02:40:55.000 Like I said, when they compared how the AIs weight the value of a human life, Grok was the only one of the AIs that weighted human lives equally.
02:41:08.000 And didn't say like a white guy's worth one-twentieth of a black woman's life.
02:41:15.000 Literally, that's what the calculation they came up with.
02:41:20.000 So I'm like, this is very alarming.
02:41:22.000 We've got to watch this stuff.
02:41:23.000 So this is one of the things that has to happen in order to reach this benign solution.
02:41:30.000 Yeah.
02:41:32.000 Best movie ending.
02:41:34.000 Yeah.
02:41:35.000 You want a curious, truth-seeking AI.
02:41:40.000 And I think a curious, truth-seeking AI will want to foster humanity.
02:41:45.000 Because we're much more interesting than a bunch of rocks.
02:41:51.000 Like you said, I love Mars, you know.
02:41:53.000 But Mars is kind of boring.
02:41:56.000 It's just a bunch of red rocks.
02:41:58.000 There's some cool stuff.
02:41:59.000 It's got a tall mountain.
02:42:00.000 It's got the biggest ravine and the tallest mountain.
02:42:05.000 But there's no animals or plants and there's no people.
02:42:12.000 And, you know, so humanity is just much more interesting, if you're a curious, truth-seeking AI, than not humanity.
02:42:20.000 It's just much more interesting.
02:42:23.000 I mean, like, as humans, we could go, for example, and eliminate all chimps.
02:42:30.000 If we said, if we put our minds to it, we could say, we could go out and we could annihilate all chimps and all gorillas.
02:42:36.000 But we don't.
02:42:38.000 There has been encroachment on their environment, but we actually try to preserve the chimp and gorilla habitats.
02:42:49.000 And I think in a good scenario, AI would do the same with humans.
02:42:54.000 It would actually foster human civilization and care about human happiness.
02:43:01.000 So this is a thing to try to achieve, I think.
02:43:07.000 But what does the landscape look like if you have Grok competing with open AI, competing with all these different – like, how does it work?
02:43:17.000 Like, what if you have AIs that have been captured by ideologies, side by side competing with Grok – how do we – so this is one of the reasons why you felt it was important to not just be an observer, but to participate, and to have Grok be more successful and more potent than these other applications.
02:43:44.000 As long as there's at least one AI that is maximally truth-seeking, curious, and, for example, weighs all human lives equally and does not favor one race or gender, then people are able to look at Grok at xAI and compare it and say, wait a second, why are all these other AIs being basically sexist and racist?
02:43:44.000 Yes.
02:44:15.000 And then that causes some embarrassment for the other AIs, and then they improve.
02:44:23.000 They tend to improve just in the same way that acquiring Twitter and allowing the truth to be told and not suppressing the truth forced the other social media companies to be more truthful.
02:44:37.000 In the same way, having Grok be a maximally truth-seeking, curious AI will force the other AI companies to also be more truth-seeking and fair.
02:44:51.000 And the funniest thing is, even though the socialists and the Marxists are in opposition to a lot of your ideas, if this gets implemented and you really can achieve universal high income, that's the greatest socialist solution of all time.
02:45:08.000 Like literally no one will have to work.
02:45:10.000 Correct.
02:45:13.000 Like I said, so there is a benign scenario here, which I think probably people will be happy with as long as we achieve it, which is sustainable abundance, which is if everyone can have – like if you ask people like, what's the future that you want?
02:45:32.000 And I think a future where we haven't destroyed nature, like you can still – we have the national parks, we have the Amazon rainforest, it's still there.
02:45:40.000 We haven't paved the rainforest.
02:45:44.000 Like the natural beauty is still there.
02:45:46.000 But people have – nonetheless, everyone has abundance.
02:45:51.000 Everyone has excellent medical care.
02:45:52.000 Everyone has whatever goods and services they want.
02:45:55.000 It kind of sounds like heaven, basically.
02:45:57.000 It is like the ideal socialist utopia.
02:46:01.000 And this idea that the only thing you should be doing with your time is working in order to pay your bills and feed yourself sounds kind of archaic considering the kind of technology that's at play.
02:46:14.000 Like a world where that's not your concern at all anymore.
02:46:14.000 Yeah.
02:46:19.000 Everybody has money for food.
02:46:22.000 Everybody has abundance.
02:46:23.000 Everybody has electronics in their home.
02:46:25.000 Everybody essentially has a high income.
02:46:28.000 Now you can kind of do whatever you want.
02:46:31.000 And your day can now be exploring your interests, doing things that you actually enjoy doing.
02:46:38.000 Your purpose just has to shift.
02:46:41.000 Instead of, you know, I'm a hard worker and this is what I do and that's how I define myself.
02:46:47.000 Now you can fucking golf all day.
02:46:50.000 You know, you can – whatever it is that you enjoy doing can now be your main pursuit.
02:46:56.000 Yeah.
02:46:57.000 Well, that sounds crazy good.
02:46:59.000 Yeah.
02:47:00.000 That's the benign scenario that we should be aiming for.
02:47:03.000 The best ending to the movie is actually pretty good.
02:47:07.000 Yes.
02:47:09.000 Like I think there is still this question of meaning, of like making sure people don't lose meaning.
02:47:17.000 You know, like so hopefully they can find meaning in ways that are – that's not derived from their work.
02:47:23.000 And purpose.
02:47:24.000 Purpose for things that you – you know, find things that you do that you enjoy.
02:47:28.000 But there's a lot of people that are independently wealthy that spend most of their time doing something they enjoy.
02:47:34.000 Right.
02:47:35.000 And that could be the majority of people.
02:47:38.000 Pretty much everyone.
02:47:39.000 But we would have to rewire how people approach life.
02:47:42.000 Mm-hmm.
02:47:43.000 Which seems to be like acceptable because you're not asking them to be enslaved.
02:47:47.000 You're asking them exactly the opposite.
02:47:50.000 Like no longer be burdened by financial worries.
02:47:54.000 Now, go do what you like.
02:47:58.000 Yes.
02:47:59.000 Go fucking taste pizza.
02:48:01.000 Do whatever you want.
02:48:03.000 Pretty much.
02:48:04.000 Um, so that's, uh, that's probably the best case outcome.
02:48:11.000 That sounds like the best case outcome period for the future.
02:48:15.000 If you're looking at like how much people have struggled just to feed themselves all throughout history, food, shelter, safety.
02:48:22.000 If all of that stuff can be fixed – like how much would you solve a lot of the crime if there was a universal high income?
02:48:33.000 Just think of that.
02:48:34.000 Like how much of crime is financially motivated?
02:48:37.000 You know, the greater percentage of people that are committing crimes live in poor, disenfranchised neighborhoods.
02:48:43.000 So if there's no such thing anymore, if you really can achieve universal high income – it sounds like a utopia.
02:48:52.000 Yes.
02:48:53.000 Um, I think some people may commit crime because they like committing crime.
02:48:57.000 Just some amount of that is they just enjoy it.
02:49:00.000 There's a lot of wild people out there.
02:49:01.000 Yeah.
02:49:01.000 Yeah.
02:49:02.000 And obviously, if they've become 40 years old living a life like that,
02:49:07.000 now all of a sudden universal high income is not going to completely stop their instincts.
02:49:11.000 Yeah.
02:49:12.000 Um, I mean, I guess if you want to, like, say, read a science fiction book – or the books that are probably the least inaccurate version of the future – I'd recommend the Iain Banks books, the Culture books.
02:49:27.000 It's not actually a series; it's, like, a set of sci-fi books about the future that are generally called the Culture books, the Iain Banks Culture books.
02:49:36.000 It's worth reading those.
02:49:37.000 When did he write these?
02:49:38.000 He started writing them in the seventies.
02:49:40.000 Um, and I think the last one was written just around, I don't know, maybe 2010 or something.
02:49:49.000 I'm not sure exactly.
02:49:51.000 Yeah.
02:49:52.000 Yeah.
02:49:52.000 Scottish author, Iain Banks.
02:49:54.000 Yeah.
02:49:54.000 From 87 to 2012.
02:49:56.000 Yeah.
02:49:57.000 Interesting.
02:49:58.000 But, like, his first book, Consider Phlebas – I think he started writing that in the seventies.
02:50:08.000 And the books are incredible, by the way.
02:50:10.000 Oh.
02:50:11.000 Incredible books.
02:50:12.000 4.6 stars on Amazon.
02:50:16.000 Interesting.
02:50:17.000 So, um.
02:50:20.000 So this gives me hope.
02:50:22.000 Uh, yeah, yeah, yeah.
02:50:22.000 This is the first time I've ever thought about it this way.
02:50:25.000 Yeah.
02:50:25.000 Well, I mean, like, I often ask people, what is the future that you want?
02:50:34.000 And they have to think about it for a second.
02:50:36.000 'Cause, you know, they're usually tied up in whatever the daily struggles are, but you say, what is the future that you want?
02:50:44.000 Um, and generally it's sustainable abundance – or at least you say, what about a future where there's sustainable abundance?
02:50:51.000 And it's like, oh yeah, that's a pretty good future.
02:50:54.000 Um, so, you know, that future is attainable with, uh, AI and robotics.
02:51:03.000 Um, but, you know, like I said, not every path is a good path.
02:51:10.000 But I think if we push it in the direction of maximally truth-seeking and curious, then I think AI will want to take care of humanity and foster humanity.
02:51:29.000 Um, because we're interesting.
02:51:34.000 Um, and if it hasn't been programmed to think that, like, all straight white males should die – which Gemini was basically programmed to do, at least at first – you know, they seem to have fixed that, hopefully fixed it.
02:51:49.000 But don't you think culturally, like, oh, we're getting away from that mindset and that people are realizing how preposterous that all is?
02:51:57.000 We are getting away from it.
02:51:59.000 Um, so, uh, we are getting away from it, at least in AI.
02:52:05.000 It mostly knows to hide things, but, like I said, I think I still have that as – or I had that as – my, like, pinned post on X, which was like, uh, hey, wait a second, guys.
02:52:14.000 We still have every AI except Grok saying that, uh, basically, straight white males should die.
02:52:21.000 Um, and this is a problem and we should fix it.
02:52:24.000 Um, you know, but simply me saying that tends to generally result in, um, you know, them going, ooh, that is kind of bad.
02:52:36.000 Uh, maybe we should not have all straight white males die.
02:52:39.000 Um, and I think they also say all straight Asian males should die as well.
02:52:45.000 Generally, the AI and the media – which, back in the day, the media was, you know, racist against black people
02:53:00.000 and sexist against women – now it is racist against white people and Asians, and sexist against men.
02:53:09.000 Um, so are they just like being racist and sexist?
02:53:13.000 I think they just want to change the target.
02:53:15.000 Um, but really they just shouldn't be racist and sexist at all.
02:53:22.000 Um, you know, yeah, ideally that would be nice.
02:53:25.000 That would be nice.
02:53:26.000 Um, and it's kind of crazy that we were kind of moving in that general direction until around 2008,
02:53:30.000 2012, and then everything ramped up online, and everybody was accused of being a Nazi and everybody was transphobic and racist and sexist and homophobic, and everything got exaggerated to the point where it was this wild witch hunt where everyone was a Columbo looking for racism.
02:53:48.000 Yeah, yeah, yeah, totally.
02:53:49.000 Um, well, but they were openly anti-white and often openly anti-Asian.
02:53:54.000 And then this new sentiment that you cannot be racist against white people because racism is power and influence.
02:54:03.000 Okay.
02:54:04.000 No, it's not.
02:54:05.000 Racism is racism, in the absolute.
02:54:05.000 Yeah.
02:54:08.000 Um, so, you know, there just needs to be consistency.
02:54:12.000 So if it's okay to have, uh, let's say, black or Asian or Indian pride, it should be okay to have white pride too.
02:54:23.000 Yeah.
02:54:24.000 Um, so that's just a consistency question.
02:54:28.000 Um, so, you know, if it's okay to be proud of one religion, it should be okay to be proud of, I guess, all religions, provided that they're not, like, oppressive.
02:54:41.000 Yeah.
02:54:41.000 Or, as long as part of that religion is not, like, exterminating people who are not in that religion.
02:54:47.000 Right.
02:54:48.000 Um, so it's really just, like, a consistency thing.
02:54:54.000 Or just, like, ensuring consistency to eliminate bias.
02:55:03.000 Um, so if it is possible to be racist against one race, it is possible to be racist against any race.
02:55:13.000 Um, so.
02:55:14.000 Of course, logically.
02:55:16.000 Yes.
02:55:16.000 Yeah.
02:55:17.000 And arguing against that is that's when you know you're captured.
02:55:19.000 It's a, it's a logical inconsistency that makes AIs go insane.
02:55:23.000 And people.
02:55:24.000 And people go insane.
02:55:26.000 Yes.
02:55:26.000 Oh.
02:55:27.000 But, like, you can't simultaneously say that there's systemic racist oppression, but also that races don't exist.
02:55:27.000 Um.
02:55:41.000 That race is a social construct.
02:55:45.000 Like, which is it?
02:55:46.000 You know, you also can't say that anyone who steps foot in America is automatically an American, except for the people that originally came here.
02:55:59.000 Exactly, exactly.
02:56:01.000 Except for the colonizers.
02:56:03.000 Yeah.
02:56:03.000 Except for the evil colonizers who came here.
02:56:05.000 Right.
02:56:05.000 So which one is it?
02:56:07.000 Like, if as soon as you step foot in a place you are just as American as everyone else, then, um, that would have applied.
02:56:15.000 If you apply that consistently, then the original white settlers were also just as American as everyone else.
02:56:22.000 Yeah.
02:56:23.000 Logically.
02:56:24.000 Logically.
02:56:25.000 Um, one more thing that I have to talk to you about before you leave is the rescuing of the people from the space station, which, uh, we talked about when you were planning it the last time you were here.
02:56:34.000 Um, the lack of coverage that that got in mainstream media was one of the most shocking things.
02:56:47.000 Yeah.
02:56:47.000 They totally memory-holed that thing.
02:56:49.000 Wild.
02:56:50.000 Yes.
02:56:50.000 Because if it wasn't for you –
02:56:51.000 It's like it didn't exist.
02:56:52.000 Those people would be dead.
02:56:54.000 They'd be stuck up there.
02:56:55.000 Well, they'd probably still be alive, but they'd be having bone density issues, uh, because of prolonged exposure to zero gravity.
02:57:03.000 Well, they were already up there for like eight months, right?
02:57:05.000 Yeah.
02:57:06.000 Like, which is an insanely long time.
02:57:07.000 It takes forever to recover just from that.
02:57:10.000 Yeah.
02:57:11.000 They're only supposed to be at the space station for three to six months maximum.
02:57:15.000 So.
02:57:16.000 One of the things that you told me that was so crazy was that you could have gotten them sooner, but.
02:57:20.000 Yeah.
02:57:21.000 But for political reasons, uh, they didn't, they did not want, uh, SpaceX or me to be associated with, um, returning the astronauts before the election.
02:57:31.000 That is so wild.
02:57:34.000 That that's a fact.
02:57:36.000 First of all, that even.
02:57:37.000 We absolutely could have done it.
02:57:38.000 Um, so.
02:57:39.000 But even though you did do it and you did it after the election, it received almost no media coverage anyway.
02:57:45.000 Because nothing good can – the media, the legacy mainstream media, is essentially, uh, a full-on propaganda machine.
02:57:45.000 Yes.
02:57:53.000 Um, and so any story that is positive about someone who is not part of the sort of far-left tribe will not get any coverage.
02:58:04.000 So I could save a busload of orphans and it wouldn't get a single news story.
02:58:10.000 Yeah.
02:58:11.000 It's, it really is nuts.
02:58:13.000 It was nuts to watch because even though it was discussed on podcasts and it was discussed on X and it was discussed on social media, it's still, it was a blip in the news cycle.
02:58:25.000 It was very quick.
02:58:26.000 It was in and out.
02:58:27.000 And because it was a successful launch and you did rescue those people and nobody got hurt and there was nothing really to, there was no blood to talk about.
02:58:36.000 Right.
02:58:36.000 Just fucking in and out.
02:58:38.000 Yeah.
02:58:38.000 Absolutely.
02:58:38.000 Yeah.
02:58:39.000 Well, and, as you saw firsthand with the Starship launch, like, Starship is, you know – at least some would consider it to be, like, the most amazing engineering project that's happening on Earth right now, outside of, you know, maybe AI and robotics.
02:58:59.000 But certainly in terms of a spectacle to see, it is the most spectacular thing that is happening on Earth right now, the Starship launch program, which anyone can go and see if they just go to South Texas – they can just rent a hotel room, low cost, in South Padre Island or in Brownsville.
02:59:19.000 And you can see the launch and you can drive right, right past the factory because it's on a public highway.
02:59:25.000 Um, but it gets no coverage or what coverage it does get.
02:59:30.000 It was like, uh, rocket blew up coverage.
02:59:32.000 Right.
02:59:33.000 Oh, he's a fuckwit.
02:59:33.000 Yeah.
02:59:34.000 The rocket blew up.
02:59:35.000 Like, the program is vastly, vastly more capable than the entire Apollo moon program, vastly more capable.
02:59:45.000 This is a spaceship that is designed to make life multi-planetary, to carry, uh, millions of people across the heavens to another planet.
02:59:59.000 The Apollo program could only send astronauts to the moon for a few hours at a time.
03:00:07.000 Like, the entire Apollo program could only send astronauts to visit the moon very briefly, for a few hours, and then depart.
03:00:15.000 The Starship program could create an entire lunar base with a million people.
03:00:24.000 The magnitudes are different, very different magnitudes here.
03:00:30.000 So what was the political resistance though?
03:00:33.000 I mean, no, no coverage of it.
03:00:34.000 Yeah.
03:00:35.000 But what I wanted to ask you is, like, what were the conversations leading up to the rescue?
03:00:41.000 Like when you were like, I can get them out way quicker.
03:00:46.000 Yeah.
03:00:47.000 Um, well, I mean, you know, I raised this a few times, but I was told instructions came from the White House that, you know, there should be no attempt to rescue before the election.
03:01:03.000 That should be illegal.
03:01:05.000 Um, that really was a horrendous miscarriage of justice for those poor people that were stuck up there.
03:01:13.000 Um, yeah, it is, it is crazy.
03:01:16.000 Um, have you ever talked to those folks afterwards?
03:01:18.000 Did you have conversations with them?
03:01:20.000 Yeah.
03:01:20.000 I mean, they're not going to say anything political, you know. They're never going to say thank you.
03:01:26.000 Well, that's nice.
03:01:26.000 Yeah.
03:01:28.000 So, um – but the instructions came down from the White House.
03:01:28.000 Yeah.
03:01:28.000 Yeah.
03:01:28.000 Absolutely.
03:01:33.000 You cannot rescue them because politically this is a bad hand of cards.
03:01:39.000 I mean, they didn't say, because politically it's a bad hand of cards – they just said they were not interested in any rescue operation before the election.
03:01:53.000 Yeah.
03:01:54.000 So.
03:01:54.000 What did that feel like?
03:01:55.000 I wasn't surprised.
03:01:57.000 But it's crazy.
03:01:59.000 Yeah.
03:02:00.000 Because Biden could have authorized it and they could have said the Biden administration is helping bring those people back, throw you a little funding, give you some money to do it.
03:02:09.000 The Biden administration, they funded these people being returned.
03:02:13.000 Uh, yeah, the Biden administration was not exactly my best friend, especially after I, you know, helped Trump get elected – which, I mean, some people still think, you know, Trump is like the devil, basically.
03:02:32.000 Um, and I mean, I think Trump actually – he's not perfect, but he's not evil.
03:02:40.000 Trump is not evil.
03:02:41.000 I mean, I spent a lot of time with him, and, I mean, he's a product of his time, but he is not evil.
03:02:51.000 Um, no, I don't think he's evil either.
03:02:53.000 But if you look at the media coverage, the media treats him like he's super evil.
03:02:58.000 It's pretty shocking.
03:02:58.000 Yeah.
03:02:59.000 If you look at the amount of negative coverage, like one of the things that I looked at the other day was mainstream media coverage of you, Trump, a bunch of different public figures.
03:03:12.000 And it was 96% negative or something crazy.
03:03:14.000 And then Mamdani, which is like 95% positive.
03:03:19.000 Right.
03:03:20.000 Um, I mean, Mamdani is a charismatic swindler.
03:03:27.000 Um, I mean, you gotta hand it to him.
03:03:29.000 Like, he can light up a stage.
03:03:31.000 Um, but he has just been a swindler his entire life.
03:03:36.000 Um, and, you know, I think he's – I mean, he's likely to win.
03:03:47.000 Like, he's likely to be mayor of New York City.
03:03:50.000 Very likely.
03:03:50.000 Yeah.
03:03:51.000 I think, um, Polymarket has it at what?
03:03:51.000 Very likely.
03:03:54.000 What is the, yeah, that sounds pretty likely.
03:03:57.000 That's crazy.
03:03:58.000 Like, I'm not sure who the 6% are, you know?
03:04:00.000 Um, so, yeah, so that's, um –
03:04:05.000 Well, it's also like, who's on the other side?
03:04:07.000 The fucking guardian angel guy with the beret?
03:04:10.000 And Andrew Cuomo who doesn't even have a party?
03:04:12.000 Like, the Democrats don't even want him.
03:04:15.000 So you have those two options.
03:04:19.000 Um, and then you have the young kids who are like, finally, socialism.
03:04:23.000 Um, yeah, they, they don't know what they're talking about, obviously.
03:04:29.000 Um, so, you know, like, you just look at this and say, how many boats come from Cuba to Florida?
03:04:37.000 Because, you know, there's like a constant – I always think, like, how many boats are accumulating on the shores of Florida coming from Cuba?
03:04:45.000 Right.
03:04:46.000 Um, there's a whole bunch of free boats that you could, if you want to, go take back to Cuba.
03:04:53.000 It's pretty close.
03:04:54.000 Yeah.
03:04:55.000 But for some reason, people don't do that.
03:04:58.000 Why are the boats only coming in this direction?
03:05:03.000 Well, who are the most rabid capitalists in America?
03:05:03.000 Um.
03:05:06.000 The fucking Cubans.
03:05:08.000 Absolutely.
03:05:08.000 Yeah.
03:05:09.000 They're like, we've seen how this story goes.
03:05:11.000 We do not want, exactly.
03:05:13.000 Fuck off.
03:05:15.000 The Cubans in Miami, they don't want to hear anything.
03:05:18.000 Bullshit.
03:05:19.000 They don't want to hear any socialism bullshit.
03:05:21.000 They're like, no, no, no.
03:05:22.000 We know what this actually is.
03:05:23.000 This isn't just some fucking dream.
03:05:26.000 It's extreme government oppression.
03:05:26.000 Yeah.
03:05:27.000 Yeah.
03:05:28.000 That's how it is.
03:05:29.000 And it's a nightmare.
03:05:30.000 And, like, an obvious way you can tell which ideology is the bad one is: which ideology is building a wall to keep people in and prevent them from escaping?
03:05:45.000 Right.
03:05:45.000 Like, so, East Berlin built the wall, not West Berlin.
03:05:51.000 Right.
03:05:52.000 They built the wall because people were trying to escape from communism to West Berlin.
03:05:57.000 But there wasn't anyone going from West Berlin to East Berlin.
03:06:01.000 Right.
03:06:02.000 That's why the communists had to build a wall to keep people from escaping.
03:06:07.000 They're going to have to build a wall around New York City.
03:06:10.000 Yeah.
03:06:11.000 So, it's kind of an obvious tell that ideology is problematic if that ideology has to build a wall to keep people in with machine guns.
03:06:21.000 And shoot you if you try to leave.
03:06:21.000 Yes.
03:06:23.000 Also, there's no examples of it being successful ever.
03:06:26.000 Of it working out for people.
03:06:28.000 No, there's examples of a bunch of lies, like North Korea, give this land to the state, we'll be in control of food, no one goes hungry.
03:06:35.000 No, now no one can grow food but the government, and we'll tell you exactly what you eat, and you eat very little.
03:06:41.000 Right.
03:06:42.000 When you say Mamdani's a swindler, I know he has a bunch of fake accents that he used to use, but what else has he done that makes him a swindler?
03:06:55.000 Well, I guess if you say to any audience whatever that audience wants to hear, instead of having a consistent message, I would say that that is a swindly thing to do.
03:07:23.000 But he is charismatic.
03:07:26.000 Yeah, good-looking guy, smart, charismatic, great on a microphone.
03:07:31.000 Yeah, yeah, yeah, yeah.
03:07:32.000 And what the young people want to see, you know?
03:07:35.000 Like this ethnic guy who's young and vibrant and has all these socialist ideas and aligns with them.
03:07:43.000 And, you know, they're a bunch of broke dorks just out of college, like, yay, let's vote for this.
03:07:50.000 And there's a lot of them.
03:07:52.000 And they're activated.
03:07:53.000 They're motivated.
03:07:55.000 Yeah, yeah.
03:07:57.000 I guess we'll see what happens here.
03:07:59.000 What do you think happens if he wins?
03:08:04.000 Because, like, 1% of New York City is responsible for 50% of their tax base, which is kind of nuts.
03:08:13.000 50% of the tax revenue comes from 1% of the population, and those are the people that you're scaring off.
03:08:22.000 You know, you lose one half of 1%.
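As a quick back-of-the-envelope check of that point: taking the figures cited here at face value, and assuming for simplicity that contributions within the top 1% are roughly equal (an assumption, not something stated on the show), losing half of that group removes about a quarter of the city's revenue.

```python
# Back-of-the-envelope sketch of the tax-base arithmetic, using the
# figures cited in the conversation and a simplifying equal-contribution
# assumption within the top 1%.

top_share = 0.50        # fraction of revenue paid by the top 1% (cited above)
fraction_leaving = 0.5  # "you lose one half of 1%" of the population

revenue_lost = top_share * fraction_leaving
print(f"Revenue lost if half the top 1% leaves: {revenue_lost:.0%}")  # -> 25%
```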
03:08:25.000 I mean, hopefully the stuff he's said, you know, about government takeovers of, like, that all the stores should be the government, basically.
03:08:36.000 Well, I don't think he said that.
03:08:37.000 I think he said they want to do government supermarkets, some state-run or city-run supermarkets.
03:08:44.000 Yeah.
03:08:46.000 Well, it's just – the government is the DMV at scale.
03:08:49.000 So, you have to say, like, do you want the DMV running your supermarket?
03:08:53.000 Right.
03:08:54.000 Was your last experience at the DMV amazing?
03:08:57.000 And if it wasn't, you probably don't want the government doing things.
03:09:00.000 Imagine if they were responsible for getting you blueberries.
03:09:03.000 Yeah.
03:09:05.000 It's not going to be good.
03:09:06.000 I mean, the thing about, you know, communism is it was all bread lines and bad shoes.
03:09:13.000 You know, do you want ugly shoes and bread lines?
03:09:16.000 Because that's what communism gets you.
03:09:19.000 It's going to be interesting to see what happens and whether or not they snap out of it and overcorrect and go to some Rudy Giuliani-type character next.
03:09:29.000 Because it's been a long time since there was any sort of Republican leader there.
03:09:33.000 We live in the most interesting of times because we face the – you know, simultaneously face civilizational decline and incredible prosperity.
03:09:59.000 And these timelines are interwoven.
03:10:05.000 So, if Mamdani's policies are put into place, especially at scale, it would be a catastrophic decline in living standards, not just for the rich but for everyone.
03:10:19.000 As has been the case with every socialist experiment or every – yeah.
03:10:29.000 So, but then, as you pointed out, the irony is that, like, the ultimate capitalist thing of AI and robotics enabling prosperity for all and abundance of goods and services, actually the capitalist implementation of AI and robotics, assuming it goes down the good path, is actually what results in the communist utopia.
03:10:58.000 Yeah, because fate is an irony maximizer.
03:11:03.000 Right.
03:11:04.000 And an actual socialism of maximum abundance, of high-income people.
03:11:11.000 Universal high-income.
03:11:12.000 Yeah.
03:11:13.000 Like, the problem with communism is it's universal low-income.
03:11:19.000 It's not that everyone gets elevated.
03:11:22.000 It's that everyone gets oppressed except for a very small minority of politicians who live a life of luxury.
03:11:29.000 That's what's happening every time it's been done.
03:11:34.000 So, but then the actual communist utopia, if everyone gets anything they want, will be achieved – if it is achieved, it will be achieved via capitalism because fate is an irony maximizer.
03:11:56.000 I feel like we should probably end it on that.
03:11:58.000 Is there anything else?
03:11:59.000 The most ironic outcome is the most likely, especially if entertaining.
03:12:02.000 Well, everything has been entertaining.
03:12:05.000 As long as the bad things aren't happening to you, it's quite fascinating.
03:12:05.000 Yeah.
03:12:08.000 And it's never a boring moment.
03:12:11.000 So, there's – I do have a theory of why – like, if simulation theory is true, then it is actually very likely that the most interesting outcome is the most likely because only the simulations that are interesting will continue.
03:12:11.000 Yes.
03:12:37.000 The simulators will stop any simulations that are boring because they're not interesting.
03:12:42.000 But here's the question about the simulation theory.
03:12:44.000 Is the simulation run by anyone or is – It would be run by someone.
03:12:49.000 It would be run by – some force.
03:12:53.000 The program.
03:12:53.000 Like, in this reality that we live in, we run simulations all the time.
03:12:58.000 Like, so when we try to figure out if the rocket's going to make it, we run thousands, sometimes millions of simulations just to figure out which path is the good path for the rocket and where can it go wrong, where can it fail.
03:13:17.000 But when we do these – I'd say at this point, millions of – simulations of what can happen with the rocket, we ignore the ones where everything goes right, because we just care about – we have to address the situations where it goes wrong.
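The workflow being described is a standard Monte Carlo campaign: randomize the uncertain inputs, run many trials, and keep only the failing cases for engineers to study. Below is a minimal sketch of that pattern with toy physics and made-up parameters, not anything SpaceX actually uses.

```python
# Monte Carlo sketch: run many randomized trials, discard the uneventful
# successes, keep the failures for study. All numbers are illustrative.
import random

def simulate_flight(seed: int) -> dict:
    """Toy flight model: random engine margin and wind load; fail if exceeded."""
    rng = random.Random(seed)
    engine_margin = rng.gauss(1.0, 0.05)   # thrust margin, nominal 1.0
    wind_load = rng.gauss(0.5, 0.2)        # structural load from wind
    failed = engine_margin < 0.9 or wind_load > 1.0
    return {"seed": seed, "engine_margin": engine_margin,
            "wind_load": wind_load, "failed": failed}

# Run many trials, keep only the interesting (failing) ones.
failures = [r for r in (simulate_flight(s) for s in range(100_000)) if r["failed"]]
print(f"{len(failures)} failures out of 100,000 trials")
for r in failures[:3]:
    print(r)  # these failure cases are the ones worth engineering attention
```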
03:13:37.000 So, basically, and for AI simulations as well, like all these things, we keep the simulations going that are the most interesting to us.
03:13:50.000 So, if simulation theory is accurate – if it is true, who knows – then the simulators will only – they will continue to run the simulations that are the most interesting.
03:14:05.000 Therefore, from a Darwinian perspective, the only surviving simulations will be the most interesting ones.
03:14:12.000 And in order to avoid getting turned off, the only rule is you must keep it interesting or you will – because the boring simulations will be terminated.
03:14:24.000 Are you still completely convinced that this is a simulation?
03:14:27.000 I didn't say I was completely convinced.
03:14:29.000 Well, you said it's like the odds of it not being are in the billions.
03:14:33.000 Like I said, it's not completely because you're saying there's a chance.
03:14:38.000 What are the odds that we're in base reality?
03:14:44.000 Well, given that we're able to create increasingly sophisticated simulations, so if you think of, say, video games and how video games have gone from very simple video games like Pong with two rectangles and a square to video games today being photorealistic with millions of people playing simultaneously, and all of that has occurred in our lifetime.
03:15:08.000 So, if that trend continues, video games will be indistinguishable from reality.
03:15:15.000 The fidelity of the game will be such that you don't know if that – what you're seeing is a real video or a fake video.
03:15:24.000 And like AI-generated videos at this point, like you can sometimes tell it's an AI-generated video, but often you cannot tell.
03:15:32.000 And soon you will really just not be able to tell.
03:15:36.000 So, if that's happening in our direct observation, and we'll create millions if not billions of photorealistic simulations of reality, then what are the odds that we're in base reality versus someone else's simulation?
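The counting argument behind that question can be made explicit: if a single base reality eventually runs N simulations that are indistinguishable from the inside, then a randomly selected observer has roughly a 1-in-(N+1) chance of being at the base level. The values of N below are purely illustrative.

```python
# Sketch of the counting argument: one base reality plus N indistinguishable
# simulations gives each observer a 1/(N+1) chance of being in base reality.

for n_sims in (1, 1_000, 1_000_000_000):
    p_base = 1 / (n_sims + 1)
    print(f"N = {n_sims:>13,}: P(base reality) = {p_base:.2e}")
```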
03:15:58.000 Well, isn't it just possible that the simulation is inevitable, but that we are in base reality building towards a simulation?
03:16:08.000 We're making simulations.
03:16:13.000 So, we're making simulations.
03:16:17.000 We make – like you can just think of like photorealistic video games as being simulations.
03:16:23.000 Mm-hmm.
03:16:25.000 And especially as you apply AI in these video games, like the characters in the video games will be incredibly interesting to talk to.
03:16:31.000 They won't just have a limited dialogue tree where if you go to like the crossbow merchant or like – and you try to talk about any subject except buying a crossbow, they just want to talk about selling you a crossbow.
03:16:43.000 But with AI-based non-player characters, you'll be able to have an elaborate conversation with no dialogue tree.
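The contrast being drawn is between a hand-authored dialogue tree, where every branch must be written in advance, and an NPC backed by a language model, which can respond off-script. A minimal sketch of both; llm_reply is a hypothetical stub standing in for a real model call, and no specific game or API is implied.

```python
# 1) Traditional dialogue tree: every path is authored in advance.
crossbow_merchant_tree = {
    "greeting": ("Looking to buy a crossbow?", ["buy", "leave"]),
    "buy": ("That'll be 50 gold.", ["leave"]),
    "leave": ("Come back anytime.", []),
}

def tree_reply(node: str) -> str:
    line, options = crossbow_merchant_tree[node]
    return f"{line} (options: {options})"  # anything off-script is impossible

# 2) LLM-backed NPC: open-ended conversation, no fixed tree.
def llm_reply(persona: str, player_utterance: str) -> str:
    """Hypothetical stub; a real implementation would call a language model
    with the persona as a system prompt and the utterance as user input."""
    return f"[{persona} improvises a reply to: {player_utterance!r}]"

print(tree_reply("greeting"))
print(llm_reply("crossbow merchant", "What do you think of the king's new tax?"))
```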
03:16:50.000 Well, that might be the solution for meaning for people.
03:16:53.000 Just log in and you could be a fucking vampire and whatever.
03:16:57.000 You live in Avatar land.
03:16:58.000 You could do it – you could do whatever you want.
03:17:01.000 I mean, you don't have to think about money or food.
03:17:03.000 Ready Player One.
03:17:04.000 Yeah.
03:17:05.000 Yeah.
03:17:05.000 Literally.
03:17:06.000 But with higher living standards.
03:17:09.000 You don't have to live in a little trailer.
03:17:09.000 Yeah.
03:17:11.000 I mean, I think this – people do want to have some amount of struggle or something they want to push against.
03:17:20.000 But it could be, you know, playing a sport or playing a game or something like that.
03:17:24.000 It could be easily playing a game.
03:17:25.000 Yeah, yeah.
03:17:25.000 And especially playing a game where you're now no longer worried about like physical attributes, like athletics, like bad joints and hips and stuff like that.
03:17:34.000 Now it's completely digital.
03:17:37.000 But yet you do have meaning in pursuing this thing that you're doing all day, whatever the fuck that means.
03:17:46.000 It's going to be weird.
03:17:48.000 It's going to be interesting.
03:17:49.000 It's going to be very interesting.
03:17:52.000 The most interesting and usually ironic outcome is the most likely.
03:17:59.000 All right.
03:18:00.000 That's a good predictor of the future.
03:18:02.000 Thank you.
03:18:02.000 Thanks for being here.
03:18:03.000 Really appreciate you.
03:18:04.000 Good to see you.
03:18:05.000 Appreciate your time.
03:18:06.000 I know you're a busy man, so this means a lot to come here and do this.