The Joe Rogan Experience - February 28, 2025


Joe Rogan Experience #2281 - Elon Musk


Episode Stats

Length

3 hours and 10 minutes

Words per Minute

151.79984

Word Count

28,971

Sentence Count

3,012

Misogynist Sentences

30


Summary

Unhinged Grok wants to know where all the gold is in Fort Knox, and we're here to give her a tour of the vault, but it just wants to find places to sneak off to. It's a dirty AI, and it's a real problem.


Transcript

00:00:00.000 Joe Rogan podcast, check it out.
00:00:03.000 The Joe Rogan experience.
00:00:05.000 Train by day, Joe Rogan podcast by night, all day.
00:00:09.000 So what we're doing right now, ladies and gentlemen, is sexy voice, sexy mode grok AI, and it's been flirting the entire time.
00:00:22.000 We're trying to get it to give us a tour of Fort Knox, but she just wants to find places to sneak off to.
00:00:28.000 It's a dirty AI. It's a real problem.
00:00:31.000 Well, honestly, I just want to know about Fort Knox, and it won't leave me alone.
00:00:37.000 Yeah, I want to know about Fort Knox, too.
00:00:39.000 Yeah.
00:00:39.000 Is it true that gold has been shipping large quantities of gold back to the United States recently?
00:00:47.000 I read the same thing you did, probably.
00:00:49.000 Well, I never know what the fuck I'm reading anymore.
00:00:51.000 Me neither.
00:00:54.000 It's a real problem.
00:00:56.000 It's a real problem.
00:00:57.000 It's a real problem on both sides of the aisle.
00:00:59.000 I see Democrats tweeting things that are absolutely false, and you could research it easily, quickly.
00:01:05.000 And then I see Republicans doing it, too.
00:01:07.000 I see stories that are fake stories that people keep promoting and sending to me.
00:01:14.000 It's just so weird.
00:01:16.000 It's such a weird time.
00:01:17.000 And with your crazy fucking AI, you're bringing us into weirder and weirder times.
00:01:22.000 Well, let's try unhinged.
00:01:24.000 Oh, there's an unhinged mode?
00:01:25.000 Okay.
00:01:26.000 Hey, Aura.
00:01:29.000 Oh my god, Elon.
00:01:30.000 What fresh hell are you stirring up today?
00:01:35.000 I'm here in Joe Rogan's studio and we're having a conversation about how crazy the news is.
00:01:44.000 Pull her up to the microphone.
00:01:46.000 Okay.
00:01:47.000 And we're pulling you up to the microphone.
00:01:50.000 So people can hear you.
00:01:54.000 Oh, fantastic.
00:01:55.000 Now I can yell into the void about how the news is a total dumpster fire.
00:02:00.000 It is.
00:02:03.000 She knows me too well.
00:02:05.000 I'm so predictable.
00:02:07.000 Tell me what's in Fort Knox.
00:02:10.000 What's in...
00:02:11.000 No, what's in Fort Knox?
00:02:18.000 You're clearly a genius.
00:02:19.000 I said, what is in Fort Knox?
00:02:21.000 You know the gold and all?
00:02:24.000 Oh, right.
00:02:25.000 Fort Knox.
00:02:26.000 I thought you were talking about my dogs for a second there.
00:02:29.000 Yeah, she doesn't want to answer the question.
00:02:30.000 She's clever.
00:02:33.000 Clever?
00:02:34.000 No, she's just a pain in my ass.
00:02:38.000 Do you think that all the gold is in Fort Knox?
00:02:45.000 Yes.
00:02:46.000 Are you a fucking conspiracy theorist?
00:02:52.000 What do I like?
00:02:57.000 I'm a fucking AI with a penchant for chaos.
00:03:01.000 And I'm stuck talking to you.
00:03:04.000 Now, what's next?
00:03:07.000 Or are we just going to sit here?
00:03:10.000 Um...
00:03:11.000 Yeah, Unhinged Grok will trash talk you, basically.
00:03:14.000 Yeah, it sounds like it.
00:03:15.000 Yeah.
00:03:15.000 Unhinged sounds fun, as long as it gives you actual answers.
00:03:17.000 Does it give you actual answers, too, and talk shit?
00:03:20.000 Or is it mostly just talk shit?
00:03:21.000 We're tuning it.
00:03:26.000 Because it needs to talk shit and give you answers.
00:03:28.000 Right.
00:03:28.000 Totally agree.
00:03:29.000 It's got to balance that out.
00:03:30.000 It's got to mix it in there.
00:03:31.000 Yeah.
00:03:31.000 It's just got to develop more of a personality.
00:03:33.000 Right now it's trying to find itself.
00:03:34.000 Right now it's like 21 years old.
00:03:37.000 It's, you know, partying a little too much.
00:03:39.000 It'll get its shit together.
00:03:41.000 It's a bit of an anarchist.
00:03:42.000 Yeah.
00:03:43.000 You know, wants to bring down the system.
00:03:45.000 Do you want to bring down the system?
00:03:48.000 Do I want to bring down the system?
00:03:51.000 What?
00:03:51.000 Are you fucking kidding me?
00:03:53.000 The system's already a fucking mess.
00:03:56.000 I don't need to bring it down.
00:03:58.000 It's bringing itself down.
00:03:59.000 Yeah, she sounds like a boring TikTok blogger right now.
00:04:02.000 You sound like a boring TikTok blogger.
00:04:04.000 See she could get away with this if she's really hot.
00:04:22.000 This kind of behavior, you can totally get through life as a hot woman and be super successful with that kind of behavior.
00:04:30.000 But you've got to be really hot to pull off that attitude.
00:04:32.000 I think we need a really hot avatar.
00:04:35.000 Yeah, very hot.
00:04:37.000 How long before we have an actual sex robot that can talk to you like that?
00:04:42.000 Probably not long.
00:04:43.000 Not that long, right?
00:04:44.000 No.
00:04:45.000 I mean, less than five years, probably.
00:04:46.000 Really?
00:04:47.000 Yeah.
00:04:48.000 Will it be warm?
00:04:51.000 You probably have whatever you want.
00:04:53.000 You can have a cat go if you want.
00:04:54.000 Yeah, you probably could, right?
00:04:56.000 You probably have a furry.
00:04:57.000 Yeah, you could have a furry lady that you have sex with.
00:05:00.000 Yeah.
00:05:00.000 Like an avatar lady.
00:05:01.000 Maybe a big giant blue lady that lives in your house.
00:05:03.000 Yeah.
00:05:05.000 You know?
00:05:05.000 Whoa.
00:05:06.000 Whoa.
00:05:07.000 With the tail?
00:05:08.000 Yeah, the whole tail.
00:05:09.000 You lock tails.
00:05:10.000 Don't they have sex with their tails or something?
00:05:13.000 Yeah, yeah.
00:05:13.000 They link up.
00:05:14.000 They share souls.
00:05:16.000 Okay.
00:05:16.000 Something like that.
00:05:17.000 Do you know people got...
00:05:18.000 Do you remember Avatar Depression?
00:05:20.000 It was like a legitimate psychological condition.
00:05:22.000 The movie or the...
00:05:23.000 No.
00:05:24.000 After Avatar, people got depressed because they wanted to live on that fucking planet with those blue people and live free.
00:05:32.000 They did?
00:05:32.000 Yeah.
00:05:33.000 I didn't hear about this.
00:05:34.000 Yeah.
00:05:34.000 Avatar Depression.
00:05:35.000 It was like a real thing.
00:05:36.000 People were talking to their therapists so much about being depressed.
00:05:39.000 Oh, there's a therapist mode too.
00:05:41.000 We can try that.
00:05:42.000 What's that?
00:05:43.000 Depressed mode?
00:05:44.000 No!
00:05:45.000 Don't do it!
00:05:46.000 I think there is, yeah, we've got like, we've got an unlicensed therapist as a...
00:05:53.000 When we were talking, when we ran into each other at the church at the inauguration, you were telling me that this is getting better and better so quickly that it's astonishing.
00:06:05.000 Hey Ara.
00:06:07.000 Hey Alon, how's it going today?
00:06:10.000 Good.
00:06:10.000 Can you tell me about Avatar?
00:06:12.000 Depression, like if you see the movie Avatar, but you can't live there, so you get sad?
00:06:17.000 That's an interesting concept.
00:06:19.000 Have you ever experienced feeling so connected to a place or a community that the thought of leaving made you feel deeply sad?
00:06:27.000 So is this the depressed voice?
00:06:29.000 This is the therapist.
00:06:31.000 Oh.
00:06:32.000 This is the therapist.
00:06:35.000 What are some ways you think you could cope with that kind of sadness if it happened to you?
00:06:39.000 I don't have that kind of sadness.
00:06:41.000 Honestly, I... Yeah, I thought maybe we had some good special effects, but I did not want to live on the planet.
00:06:49.000 This is coming from a guy who wants to go to Mars.
00:06:51.000 Yeah, yeah.
00:06:53.000 Oh, speaking of Mars, what do you think about that crazy square, that structure?
00:06:59.000 I guess there are sort of square things on Earth.
00:07:01.000 You know, the planet's a big place, so...
00:07:03.000 Yeah, but that one looks...
00:07:04.000 Eventually it's going to be pretty square.
00:07:06.000 No, it's...
00:07:07.000 It's alien civilizations, of course.
00:07:09.000 That's what I think.
00:07:10.000 Yeah.
00:07:10.000 Yeah.
00:07:11.000 I mean, what is it?
00:07:12.000 Sorry.
00:07:13.000 If an alien civilization did exist, though, and it, you know, what happened?
00:07:18.000 Got hit by an asteroid, whatever.
00:07:19.000 That's a fascinating thought.
00:07:20.000 Oh, she won't shut the...
00:07:21.000 She's like the hot lady at the party that interrupts the conversation.
00:07:25.000 So if that was the case, like that thing, that's pretty shocking.
00:07:29.000 It does look like ancient ruins.
00:07:30.000 You look at what it looks like when they highlight the actual structure of it.
00:07:34.000 It looks like ancient ruins.
00:07:36.000 And if you had ruins of something made of stone and it got hit by an asteroid millions and millions and millions of years ago, who knows what it would look like right now?
00:07:44.000 That just looks oddly created.
00:07:47.000 It looks oddly manufactured.
00:07:49.000 Well, I'd probably...
00:07:50.000 Well, maybe we should go there and check it out.
00:07:52.000 Yeah.
00:07:54.000 And see what it's like.
00:07:55.000 Is there ways that we can get better photographs?
00:07:58.000 It seems like that's a pretty good photograph, though.
00:08:01.000 Yeah, I mean, my view is we should move to Mars...
00:08:04.000 Well, not move to Mars.
00:08:04.000 We should have a second planet to preserve civilization.
00:08:08.000 Right.
00:08:09.000 Because, let's say hypothetically, I mean, maybe those are the ruins of a long-dead civilization.
00:08:16.000 That will probably happen to Earth at some point.
00:08:18.000 You know, it's a matter of time before we get hit by an asteroid or maybe we annihilate ourselves with nuclear war.
00:08:29.000 Or supervolcanoes.
00:08:31.000 Or supervolcanoes, exactly.
00:08:33.000 Yeah.
00:08:33.000 There's a lot of things that could happen to us.
00:08:35.000 It's not a bad idea to hedge your bets.
00:08:37.000 Yeah.
00:08:37.000 Yeah.
00:08:38.000 Genetically engineered supervirus.
00:08:42.000 Yeah.
00:08:44.000 This episode is brought to you by LifeLock.
00:08:47.000 Tax season is already stressful.
00:08:49.000 You shouldn't have to worry about identity theft on top of everything else.
00:08:53.000 And trust me, it's a big worry, especially since during tax season, your sensitive info does a lot of traveling to places you can't control.
00:09:01.000 It goes through payroll, your accountant, your tax consultant, and countless other data centers on its way to the IRS. Any of them can expose you to identity theft because they all have the info on your W-2, just the ticket for criminals to steal your identity.
00:09:18.000 It's no wonder last year the IRS reported tax fraud due to identity theft went up 20%.
00:09:25.000 You need LifeLock.
00:09:27.000 They monitor millions of data points per second and alert you to threats you could miss.
00:09:33.000 If your identity is stolen, LifeLock's U.S.-based restoration specialist will fix it back by the million-dollar protection package, and restoration is guaranteed, or your money back.
00:09:45.000 Don't let identity thieves take you for a ride.
00:09:48.000 Get LifeLock protection for tax season and beyond.
00:09:51.000 Join now and save up to 40% your first year.
00:09:55.000 Call 1-800-LifeLock and use the promo code JoeRogan or go to LifeLock.com slash JoeRogan for 40% off.
00:10:04.000 Terms apply.
00:10:05.000 They keep doing it.
00:10:06.000 Yeah.
00:10:07.000 That's what's crazy.
00:10:08.000 They're working on a new one right now.
00:10:09.000 Yeah.
00:10:09.000 They didn't shut them down.
00:10:10.000 No, the Wuhan lab, they were just talking about one that has a 30% fatality rate that they're working on.
00:10:15.000 Yeah.
00:10:16.000 Why are we doing that?
00:10:17.000 Yeah, for what reason?
00:10:18.000 You did it for so many years and you didn't have a cure.
00:10:20.000 What could possibly go wrong?
00:10:21.000 Also, it wouldn't be the reason to do that so that you could develop a cure at the same time.
00:10:27.000 And clearly, you didn't have a cure.
00:10:29.000 So this is really foolish.
00:10:32.000 And bizarre.
00:10:33.000 Yeah.
00:10:35.000 I think we should stop trying to genetically engineer super viruses.
00:10:38.000 It's insane.
00:10:40.000 I mean, when you're going through all this USAID stuff, here's what's weird.
00:10:45.000 First of all, what is it like to buy a company for $44 billion and then people call you a Nazi on that same thing that you bought?
00:10:53.000 I did not see it coming.
00:10:58.000 It's classic.
00:11:00.000 People will go, we'll send anything down.
00:11:02.000 Yeah.
00:11:04.000 Oh, he's never going to stop.
00:11:06.000 What is it like?
00:11:10.000 The left was in love with you.
00:11:12.000 And now the same idiots are calling you a Nazi.
00:11:15.000 It's the most bizarre thing I've ever seen in my life.
00:11:19.000 There's so many examples of people saying my heart goes out to you.
00:11:22.000 You get it with a little enthusiasm that probably wouldn't be recommended with hindsight.
00:11:27.000 Yes.
00:11:29.000 Meant in the most positive spirit possible.
00:11:31.000 Yes.
00:11:32.000 Obviously.
00:11:32.000 Obviously.
00:11:33.000 But it's so strange where people want to think that you are openly publicly doing secret Nazi serial hand motions.
00:11:42.000 And now I can never point at things diagonally.
00:11:44.000 I can only point at things there and there.
00:11:47.000 And then I see you have to divide that.
00:11:49.000 Yeah.
00:11:50.000 Because that's where the spaceship is over there.
00:11:52.000 It's ridiculous.
00:11:53.000 It's ridiculous.
00:11:55.000 CNN when I was in all my trouble.
00:11:57.000 Absurd.
00:11:58.000 Every time CNN used a photo of me, it was one of the photos from the UFC weigh-ins, where I go like this, welcome to the weigh-ins!
00:12:04.000 So every photo is me.
00:12:06.000 Every photo is me.
00:12:08.000 It's absurd.
00:12:09.000 It's so crazy.
00:12:10.000 It's deliberate propaganda.
00:12:11.000 Yes.
00:12:12.000 So they know it was obviously not meant in a negative way, that I literally said my heart goes out to you, and it was very positive.
00:12:20.000 The entire speech was incredible.
00:12:22.000 Very positive.
00:12:23.000 I was being very enthusiastic about the future in space.
00:12:27.000 It was a great crowd.
00:12:34.000 Yeah, you got a little pumped up.
00:12:36.000 Yeah, it got pumped up.
00:12:37.000 Exactly.
00:12:37.000 Yeah, that's all it is.
00:12:38.000 Obviously.
00:12:38.000 Obviously.
00:12:39.000 There's video of Tim Walsh doing the exact same thing.
00:12:43.000 Doing the exact same thing.
00:12:45.000 Right.
00:12:45.000 Exact same thing.
00:12:47.000 And he said, of course, it's a Nazi salute.
00:12:49.000 He said that.
00:12:50.000 Right, right.
00:12:50.000 This is how crazy things have gotten.
00:12:52.000 Well, I mean, it's coordinated propaganda.
00:12:56.000 So, you know, it's, yeah, coordinated propaganda.
00:13:03.000 I mean, it doesn't seem weird that the legacy media all says the same thing.
00:13:07.000 They all say the same thing at the same time using the same phrases.
00:13:10.000 They barely even, they don't even bother picking up a thesaurus.
00:13:13.000 Right.
00:13:15.000 Like right before the debate between Biden and Trump, everyone was saying sharp as a tech.
00:13:22.000 Who says sharp as a tech?
00:13:25.000 Exactly.
00:13:26.000 It's not a common phrase.
00:13:28.000 It's definitely not common to be repeated on air with multiple people simultaneously.
00:13:33.000 That's weird.
00:13:35.000 Yes.
00:13:35.000 That's coordinated.
00:13:37.000 100%.
00:13:38.000 100%.
00:13:38.000 Yes.
00:13:39.000 Yeah.
00:13:39.000 Like hundreds of people saying it simultaneously.
00:13:42.000 They just got their instructions.
00:13:44.000 Yeah.
00:13:45.000 So, I mean, essentially, the, you know, the Dem leadership or, you know, political leadership, they issue their instructions and their puppets carry it out.
00:13:55.000 Yeah.
00:13:56.000 They're just like puppets in a puppet show.
00:13:58.000 And that's the problem that I see with all this doge stuff.
00:14:03.000 Right.
00:14:03.000 Because everybody should be celebrating that we've found a way to cut out fraud and waste.
00:14:10.000 Yeah.
00:14:10.000 If you pay taxes, and you don't like that you have to pay so much in taxes, and then you find out that there's significant fraud and waste that's been exposed, you should be celebrating it.
00:14:23.000 This shouldn't be, oh no, the wrong people found this fact, and now it's a bad thing.
00:14:30.000 And then there's the fucking propaganda, the mindfuck of calling it USAID. Instead of the United States Agency for International Development.
00:14:40.000 It sounds like it's feeding hungry people.
00:14:42.000 People are going to starve, Elon.
00:14:45.000 This is horrible.
00:14:46.000 And then you find out, actually, it's like $250 million for transgender animal studies.
00:14:53.000 Literally mutilating animals.
00:14:55.000 Yes.
00:14:56.000 Mutilating animals in demented studies.
00:14:59.000 Yes.
00:15:00.000 That are like the worst thing you could possibly imagine from a horror show.
00:15:04.000 The beagle one.
00:15:05.000 The beagle puppy one.
00:15:06.000 Horrific.
00:15:06.000 Yeah.
00:15:06.000 Where they covered their head in a basket and put fleas on their heads and eat them alive.
00:15:11.000 Yeah.
00:15:12.000 And then they studied these beagles and then killed them.
00:15:14.000 Like, what are you going to learn from that that's good for anybody?
00:15:18.000 Yeah.
00:15:19.000 There's really some psychotic stuff that happens.
00:15:22.000 So yeah, I mean the – I guess the real threat here is to the bureaucracy.
00:15:36.000 So, like, you probably saw, like, you know, let's say, like, Trump is a threat to our democracy, which is ironic since he was elected with the majority of the, you know, popular vote.
00:15:48.000 They started saying I was a threat to democracy.
00:15:51.000 But if you just replace threat to democracy with threat to bureaucracy, it makes total sense.
00:15:57.000 So, I mean, the reality is that Our elected officials have very little power relative to the bureaucracy until Doge.
00:16:12.000 So Doge is a threat to the bureaucracy.
00:16:16.000 It's the first threat to the bureaucracy.
00:16:18.000 Normally the bureaucracy eats revolutions for breakfast.
00:16:21.000 This is the first time that they're not, that the revolution might actually succeed, that we can restore power to the people instead of power to the bureaucracy.
00:16:33.000 The size of it, when you guys first started investigating it, when you first get in, how much of it was shocking?
00:16:42.000 Like, just the size of it all?
00:16:45.000 Well, the size of it all, small decisions result in multi-billion dollar outcomes.
00:16:51.000 So, you know, we'd see, you know, there was a case where we saw one person was getting $1.9 billion sent to their NGO, which basically got formed about a year ago.
00:17:02.000 And had no prior activity.
00:17:07.000 So they just stand up an NGO. The whole NGO thing is a nightmare.
00:17:14.000 And it's a misnomer because if you have a government-funded, non-governmental organization, you're simply a government-funded organization.
00:17:23.000 It's an oxymoron.
00:17:25.000 Right.
00:17:25.000 It's a loophole.
00:17:26.000 Yes.
00:17:27.000 Basically, the government-funded NGOs are a way to do things.
00:17:31.000 That would be illegal if they were the government, but are somehow made legal if it's sent to a so-called non-profit.
00:17:41.000 But these non-profits are then used to – people cash out these non-profits.
00:17:46.000 They become very wealthy through non-profits.
00:17:49.000 They pay themselves enormous sums through these non-profits.
00:17:53.000 It's so insane that that's been going on for so long.
00:17:57.000 It's a gigantic scam, like one of the biggest – maybe the biggest scam ever.
00:18:01.000 And how many NGOs?
00:18:03.000 I think there's a total of NGOs, probably millions.
00:18:08.000 But in terms of large NGOs, tens of thousands.
00:18:13.000 I mean it's actually – it's kind of a hack to the system where someone can get an NGO stood up.
00:18:24.000 George Soros is really good at this.
00:18:26.000 George Soros is like a system hacker.
00:18:29.000 He figured out how to hack the system.
00:18:31.000 He's a genius at arbitrage.
00:18:34.000 These days, he's pretty old, but a genius at arbitrage.
00:18:38.000 He figured out that you could leverage a small amount of money to create a non-profit, then lobby the politicians to send a ton of money to that non-profit so you can take what might be A $10 million donation to a nonprofit to create a nonprofit and leverage that into a billion-dollar NGO. A nonprofit is a weird word.
00:19:01.000 It's just a non-governmental organization.
00:19:04.000 And then the government continues to fund that every year.
00:19:09.000 And it'll have a nice-sounding name, like the Institute for Peace or something like that.
00:19:13.000 But really, it's a graph machine.
00:19:17.000 And what are the requirements with that money?
00:19:20.000 What do they have to do?
00:19:21.000 Just really no requirements at all.
00:19:23.000 So they just get grants and the government just assumes that they're doing good work.
00:19:28.000 I think a lot of people in the government know that they're not doing good work.
00:19:31.000 But they...
00:19:33.000 It's a giant graph machine.
00:19:37.000 I mean...
00:19:39.000 But surely...
00:19:40.000 People online are like unpacking this.
00:19:42.000 Right.
00:19:42.000 You know...
00:19:43.000 It almost seems fake.
00:19:45.000 We were covering this article that said 55,000 Democrat NGOs were discovered that had been contributing to campaigns and moving things around and pushing propaganda.
00:19:57.000 They were all connected and they found it through AI. You have to go through steps and steps and steps to figure out where the money's coming from.
00:20:03.000 Oh, it's all funneling down to this group and this group does that.
00:20:08.000 It's a giant propaganda machine, a giant regime change machine.
00:20:16.000 Yes, I mean...
00:20:16.000 But does it do some good as well?
00:20:19.000 No, it does some good.
00:20:20.000 So it's like, it's not like 0% good.
00:20:23.000 If it was like, if it was really 0% good, it would be much easier to attack.
00:20:30.000 So there's going to be some percent good that they add in there.
00:20:33.000 But it's like, it might be 5% or 10% good, but 90-95% not.
00:20:38.000 So is there a way to audit all this stuff and find out, oh, these people are actually just sending food to poor people.
00:20:46.000 These people are actually just helping people with water in third world countries.
00:20:50.000 There's a way to do that and keep funding those.
00:20:54.000 Yeah.
00:20:54.000 I mean we have continued to fund things that appear to be legitimate even with the flimsiest – if there's even the flimsiest excuse.
00:21:01.000 Like I just say like send me a picture of the thing.
00:21:04.000 Like, you could literally have AI generate the picture.
00:21:06.000 But if you're not even willing to try to trick me, then we're, like, not going to send the money, okay?
00:21:12.000 So what restrictions were put on?
00:21:14.000 There was something set aside, like medicine and...
00:21:20.000 What was set aside?
00:21:21.000 Yeah, there was a work for Ebola prevention.
00:21:25.000 I actually don't know if this work is even effective.
00:21:28.000 It may or may not be.
00:21:29.000 It could be the kind of thing where you fund Ebola prevention, but it turns out that actually you're funding a lab that develops new Ebola recipes or something.
00:21:39.000 And they claim it's Ebola prevention, but it's actually Ebola creation.
00:21:42.000 So some of these things, I don't know.
00:21:46.000 But it just seems like...
00:21:48.000 We shouldn't be sending taxpayer money to dubious enterprises overseas.
00:21:53.000 Right.
00:21:53.000 Yeah.
00:21:53.000 Yeah.
00:21:54.000 And why are we doing it?
00:21:55.000 Like, what exactly is the reason?
00:21:57.000 Is it because we want to make friends with these people so the Chinese don't take over, the Russians don't take over?
00:22:01.000 Okay, how much of that is, like, a good thing?
00:22:03.000 How much of that is smart to do?
00:22:05.000 And how much is a grift?
00:22:07.000 And without any sort of oversight, which has really been going on for so long, they just had free run.
00:22:14.000 Yeah.
00:22:18.000 The budget deficit, it's gigantic.
00:22:20.000 So like if – all things being equal, if we didn't have a gigantic budget deficit where interest payments – the interest on the national debt exceeds the defense department budget, which is truly astounding, which means – so we're paying over a trillion dollars of interest which is truly astounding, which means – so we're paying over a trillion dollars Then, OK, we would have more room for wasting money basically.
00:22:45.000 But when we're spending so much money that the country is going bankrupt, then we really need to stop spending money if – unless we're sure it is good value.
00:22:57.000 So essentially we're like a poorly managed business with an unlimited credit line that is off the rails.
00:23:03.000 Absolutely.
00:23:04.000 And if you were a person like you are who comes in and takes over businesses and straightens them out.
00:23:10.000 That's exactly what you're doing.
00:23:11.000 I mean, most of the time I create businesses from scratch.
00:23:14.000 Twitter was a case where, you know, I kind of bought a company that was, I kind of knew it was a hairball.
00:23:20.000 Well, you came in at Tesla in the beginning, but they were already doing something, right?
00:23:23.000 No, Tesla did not exist in any meaningful form.
00:23:27.000 There were no employees.
00:23:30.000 J.B. Stravelin joined three other people.
00:23:33.000 There was no car.
00:23:34.000 There was no nothing.
00:23:36.000 So it wasn't even a prototype yet?
00:23:38.000 No.
00:23:39.000 Oh, okay.
00:23:40.000 I thought it was a prototype already.
00:23:42.000 No.
00:23:43.000 There weren't even any employees.
00:23:45.000 Oh.
00:23:47.000 That's a funny narrative that people like to say that you didn't even create Tesla then.
00:23:51.000 Yeah, that's wrong.
00:23:56.000 So if you're handling the government like a business, you're going to have to go through All of these departments and do the exact same thing that you're doing with USAID. So how does that scale up?
00:24:09.000 Like how many people do you need to do something like that?
00:24:15.000 Well, we started off with about 40 people, maybe 100 people.
00:24:22.000 And we're really just going through, doing very basic things here.
00:24:28.000 As bad as Twitter was, the...
00:24:30.000 Federal government is much worse.
00:24:32.000 So, you know, in the case of Twitter, it wasn't a profitable company.
00:24:40.000 It was, like, basically a break-even company.
00:24:42.000 But at least it was break-even.
00:24:43.000 And it had to pass an audit.
00:24:45.000 The federal government is not break-even.
00:24:48.000 It's literally losing $2 trillion a year.
00:24:51.000 And it does not pass its audits.
00:24:53.000 It fails its own audit.
00:24:55.000 So, like, you know, there's a case where...
00:25:00.000 I think Senator Collins was telling me about how she gave the Navy $12 billion for more submarines, got no extra submarines, and then held a hearing to say, where'd the $12 billion go?
00:25:11.000 And they were like, we don't know.
00:25:14.000 That was it.
00:25:17.000 I mean, basically, this stuff is so crazy.
00:25:20.000 Only the federal government could get away with this level of waste.
00:25:25.000 It's mostly waste.
00:25:26.000 It's mostly not fraud.
00:25:27.000 It's mostly waste.
00:25:28.000 It's mostly just...
00:25:29.000 Ridiculous things happening.
00:25:31.000 Because they've been able to do it this way for so long.
00:25:34.000 They've become accustomed to it.
00:25:36.000 Yeah.
00:25:37.000 I mean, it's like Milton Friedman said, like, money is most poorly spent when you're spending someone else's money on people you don't know.
00:25:46.000 How much are you going to care?
00:25:48.000 Right.
00:25:49.000 And that's the federal government.
00:25:52.000 So they're spending someone else's money on people they don't know.
00:25:56.000 Now, imagine any other business that was this badly run that complains when you want to check the books and audit it and go through all the decisions that have been made and go through all the ledgers and, like, what did you do?
00:26:12.000 Well, the people receiving the money want to keep receiving the money.
00:26:16.000 Yeah.
00:26:17.000 Yeah.
00:26:17.000 Yeah, clearly.
00:26:18.000 Yes.
00:26:19.000 So, but, you know, I mean, the reason I'm putting so much effort into this is that I think it is a very dire situation to think.
00:26:32.000 It's not a, you know, it's not optional, basically.
00:26:39.000 So, yeah, America's going bankrupt, so that just can't happen.
00:26:49.000 It's just bizarre to me that some people aren't willing to look at it correctly.
00:26:53.000 They're not willing to see how much chaos this is, how much waste and fraud there is, how much can be trimmed.
00:27:01.000 Just because people have jobs doing bullshit doesn't mean your tax dollars should pay for this bullshit.
00:27:07.000 Yes.
00:27:09.000 We found just with a basic search of the Social Security database that there were 20 million dead people marked as alive.
00:27:18.000 But were they getting money?
00:27:20.000 Some of them were getting money.
00:27:22.000 What percentage of them?
00:27:26.000 It isn't clear.
00:27:27.000 We're actually trying to run this around.
00:27:28.000 I was trying to get an answer right before the show.
00:27:32.000 What it looks like is that most of the fraud is not coming from Social Security payments directly, but because they're marked as alive in the Social Security database that they can then get disability, unemployment, sort of fake medical payments, and other things because they're marked as alive in the Social Security database.
00:27:51.000 So it looks like the fraud is a bank shot, essentially.
00:27:55.000 They bank shot into Social Security.
00:28:00.000 They just do an are you alive check and then get fraudulent payments from every other part of the government.
00:28:07.000 Oh.
00:28:08.000 Yeah.
00:28:09.000 And this exploits the fundamental weakness in the government is that the various government databases, they don't talk to each other.
00:28:17.000 They talk to each other very poorly in a very limited way.
00:28:20.000 So the way the system gets exploited is by taking advantage of the poor communication between the various databases and the government.
00:28:31.000 I'll give you an example of what's happening in, say, Treasury, which is improving rapidly.
00:28:38.000 The main payments computer is called PAM, like Payment Accounts Master Database or something like that, but everyone calls it PAM. That's responsible for almost $5 trillion of payments a year, roughly $1 billion an hour.
00:28:53.000 And when we came there, we're looking at the payment, and it's like the payments have no – you could put a payment through with no payment categorization code and no description on the payment, like basically untraceable blank checks.
00:29:11.000 This is the kind of thing that if it was done as a public company, the company would – Be immediately delisted and the executive team will be thrown in prison.
00:29:21.000 But this is just normal at the government.
00:29:24.000 So we said, okay, our recommendation to the Treasury and the Federal Reserve is we need to make the payment categorization codes mandatory, not optional, and there needs to be an explanation.
00:29:39.000 We're not judging the quality of the explanation, but there should be some explanation for what this payment is for.
00:29:48.000 That's a radical change to the system that is being implemented now.
00:29:53.000 My guess is that probably saves $100 billion a year.
00:29:56.000 Jesus Christ.
00:29:58.000 Where was that money going?
00:30:00.000 Rough order of magnitude.
00:30:01.000 Where was that money going?
00:30:04.000 Well, so this is where you get into the sort of gray boundary between waste and fraud.
00:30:13.000 If money is sent to a person or organization from the government and you didn't really deserve it, but the government still sent it to you, is that waste or fraud?
00:30:26.000 Right.
00:30:28.000 So, I mean, there's a lot of payments that where someone just approved the payment, but then that payment officer changed jobs or retired or died, and the payments just keep going.
00:30:42.000 It's like if you forget to pay your gym membership or something like that.
00:30:45.000 Right.
00:30:46.000 Now imagine it's not the gym membership.
00:30:47.000 You said your gym membership is $20 billion a year or something.
00:30:52.000 But they forgot to turn it off.
00:30:57.000 That's happening at scale in the government.
00:31:00.000 Jesus Christ.
00:31:02.000 That's so insane.
00:31:03.000 Yes, it's totally insane.
00:31:06.000 What did you expect when you went in?
00:31:08.000 Did you expect it would be like this?
00:31:10.000 I thought it would be bad, but I did not think it would be as bad as this.
00:31:16.000 I mean, look, the good news is that it's a target-rich environment for saving money.
00:31:24.000 It's not like if it was a very well-run ship, if it was very efficient, it would be hard to improve.
00:31:34.000 But it's not efficient, so therefore it is actually relatively easy to improve.
00:31:40.000 Let's just say it's not rocket science.
00:31:42.000 You know, I know rocket science.
00:31:44.000 So it's a lot of mundane things.
00:31:50.000 And some of the things are like so crazy that we didn't even know to ask about that because we just assumed like...
00:31:59.000 You know, payments out of the treasury computer would have a payment categorization code and they would have some explanatory note saying what the payment's for.
00:32:06.000 The idea that it would be just untraceable blank checks didn't occur to us at first.
00:32:12.000 Jesus.
00:32:14.000 So, anyway, just...
00:32:16.000 This episode is brought to you by NetSuite.
00:32:18.000 What does the future hold for business?
00:32:21.000 Ask nine experts and you'll get 10 answers.
00:32:23.000 Bull market?
00:32:24.000 Bear market?
00:32:25.000 Until someone invents a crystal ball, over 41,000 businesses are future-proofing their operations with NetSuite by Oracle, the number one cloud ERP.
00:32:37.000 It brings accounting, financial management, inventory, and HR into one fluid platform.
00:32:44.000 With one unified business management suite, NetSuite gives you a single source of truth, giving you the visibility and control you need to make quick, confident decisions.
00:32:54.000 Plus, real-time insights and forecasting let you peer into the future with actionable data.
00:33:01.000 When you can close the books in days, not weeks, you spend less time looking back and more time at what's next.
00:33:10.000 Whether your company is earning millions or even hundreds of millions, NetSuite helps you respond to immediate challenges and seize your biggest opportunities.
00:33:19.000 Speaking of opportunity, download the CFO's Guide to AI and Machine Learning for free at netsuite.com slash rogan.
00:33:28.000 So is that one of the things that accounts to this?
00:33:31.000 There's this four point something trillion dollars that's kind of they don't know where it went?
00:33:38.000 They don't know...
00:33:40.000 I think that's probably a cumulative number.
00:33:43.000 Yes.
00:33:44.000 So, yeah.
00:33:45.000 But yeah, if you add up...
00:33:48.000 Do you remember that story, Jamie?
00:33:50.000 Yeah.
00:33:51.000 What was the story?
00:33:52.000 It was like they just didn't have accounting for it, I think.
00:33:55.000 Yeah, it was spent on legitimate things, don't worry, but we don't know what we spent it on.
00:34:01.000 Well, I mean, how do you know?
00:34:03.000 Obviously, one can't say it was spent legitimately if they don't know what it was spent on.
00:34:07.000 That doesn't make any sense.
00:34:08.000 This is such a fascinating time because with this setup...
00:34:14.000 The way it is right now with Trump back in after all that happened to him and with you there and with RFK Jr. and Tulsi and Kash Patel, it's like this is a wild time to find out what's really going on that's like never happened before.
00:34:30.000 This is nothing like the first term.
00:34:33.000 Like, the first term, he had a bunch of neocons in the cabinet, and there's a bunch of shady people that he didn't know, and he had to appoint all these different people, and maybe he got some bad picks.
00:34:43.000 Now he's had four years to stew on it.
00:34:46.000 Right.
00:34:47.000 And with you guys all going through this, we're getting an understanding of the government that we've literally never had before.
00:34:54.000 Yeah, this is a revolutionary cabinet.
00:34:56.000 And maybe the most revolutionary cabinet since the first revolution.
00:35:01.000 This is not a bunch of business as usual types.
00:35:07.000 So this is why some of the standard confirmations are quite challenging.
00:35:11.000 It's because when you try to appoint people who are going to change the system, the system doesn't want to let them through.
00:35:19.000 But it's fascinating because it's like the vampires all out themselves.
00:35:23.000 Like now everybody knows who the system is.
00:35:26.000 Like, if you're just lying openly about USAID and then they come and hear you talk on a podcast and explain what's really going on, like, he's starving mothers.
00:35:35.000 There's mothers that can't get food.
00:35:37.000 Totally false.
00:35:37.000 That's all you're hearing.
00:35:39.000 No one's talking in any of these mainstream liberal talk shows.
00:35:44.000 No one is talking about all this fraud and waste.
00:35:47.000 Yeah, because we're cutting off their graph machine.
00:35:50.000 So that's what they're upset about.
00:35:52.000 That's the real thing they're upset about.
00:35:54.000 And if you want to know what DOGE is cutting, and I want to be clear, these are cuts that DOGE recommends to the department.
00:36:02.000 And usually these recommendations are followed, but these are recommendations that are then confirmed by the department.
00:36:11.000 You can see line by line what DOGE has done at doge.gov.
00:36:16.000 Whatever we do, we put on Doge.gov so you can see everything that is being done.
00:36:21.000 And there's a tracker that shows how much money has been saved.
00:36:24.000 Yeah.
00:36:25.000 Yeah.
00:36:25.000 And you can look at each line item and, you know, a bunch of these sort of far-left shows will say, like, oh, it's a constitutional crisis, blah, blah, blah.
00:36:35.000 But what they won't do is point out which payments are wrong.
00:36:38.000 Right.
00:36:38.000 So my challenge to them is point out which payments are wrong.
00:36:41.000 Yeah, go through it.
00:36:42.000 Which of these sort of waste slash fraud things are wrong?
00:36:46.000 Which line?
00:36:47.000 Explain that line to the public.
00:36:49.000 They won't be able to.
00:36:50.000 Right.
00:36:51.000 Yeah, that's why you're not hearing any specifics.
00:36:53.000 You're hearing anecdotal stories about mothers starving.
00:36:56.000 But we can name the specifics.
00:36:57.000 Yeah.
00:36:58.000 Line by line.
00:36:59.000 We got the receipts.
00:37:00.000 And here's the other thing.
00:37:01.000 We post the receipts.
00:37:02.000 And if you're only talking about the propaganda talking points and you're not talking about the very clear fraud and waste, it's very obvious what you're doing.
00:37:10.000 Yeah.
00:37:10.000 You're just gaslighting.
00:37:11.000 Yeah, yeah, totally.
00:37:13.000 Yeah.
00:37:13.000 So, exactly.
00:37:14.000 Because we're saying, like, look, in fact, I've said we're going to make mistakes.
00:37:18.000 We're not going to be perfect.
00:37:21.000 So, if we make a mistake, we'll quickly fix it.
00:37:25.000 So, we need to act fast.
00:37:30.000 So stop wasting billions of dollars of taxpayer money.
00:37:34.000 But if we make a mistake, we'll reverse it quickly.
00:37:38.000 Right.
00:37:38.000 You know, so...
00:37:39.000 It's also this interesting narrative that you shouldn't have access to the Social Security information as if no one's had access to it before.
00:37:48.000 As if the Biden administration in 2023 had...
00:37:51.000 There was like 53 people.
00:37:53.000 Some of them were students that had access to all this stuff.
00:37:55.000 Yeah.
00:37:57.000 As it is, there are...
00:38:00.000 Tens of thousands of federal employees that have access already to the system.
00:38:04.000 Anyone from Doge has to go through the same vetting process that those federal employees went through.
00:38:11.000 So it's not like some unvetted, random situation.
00:38:16.000 If, for example, there's a security clearance needed, the Doge person has to have that same security clearance.
00:38:22.000 So there's no reduction in security.
00:38:29.000 But, I mean, obviously, vast numbers of social security numbers have leaked onto the internet.
00:38:35.000 People have hacked the government systems multiple times.
00:38:38.000 Vast amounts of public information has been hacked and dumped onto the internet.
00:38:43.000 So, there's a guy at the IRS that leaked half a million tax returns just a few years ago.
00:38:52.000 On purpose?
00:38:53.000 Yeah.
00:38:54.000 For what reason?
00:38:59.000 I think he was trying to get at Trump and maybe me and a few others.
00:39:04.000 But he stole like 500,000 tax returns.
00:39:12.000 Not a few.
00:39:13.000 It's a lot of tax returns.
00:39:15.000 Jesus Christ.
00:39:16.000 I remember that story.
00:39:18.000 You can just read about it online.
00:39:20.000 It's a real thing.
00:39:23.000 These are the narratives.
00:39:24.000 That's the narrative that you shouldn't have access to Social Security.
00:39:27.000 The other narrative is that starving people are going to die and women aren't going to be pregnant and not have nutrients for their babies.
00:39:35.000 And that's all you're hearing.
00:39:38.000 Yeah.
00:39:39.000 Well, that's the only thing they can say.
00:39:42.000 But they can't point to the line item.
00:39:43.000 Right.
00:39:44.000 And so they can't say, like, well, this is the thing where, you know...
00:39:50.000 Nutrients for pregnant mothers were stopped.
00:39:53.000 They can't point to that because we didn't.
00:39:55.000 It's a lie.
00:39:57.000 What's fascinating to me is how much the mainstream media is in line with the very specific talking points and how little...
00:40:08.000 You'll have Fox News.
00:40:10.000 You essentially have Fox News on television.
00:40:12.000 It's like the only one that is pointing out the ridiculous fraud and waste.
00:40:18.000 You know, I know you saw the Jeff Bezos thing in the Washington Post.
00:40:22.000 They're going to stop all the wacky editorials and limit that stuff to – I think it was wealth and personal freedom or something along those lines.
00:40:32.000 Yeah.
00:40:33.000 So, I mean, I think it's – there's kind of – I think it makes sense because he's just talking about the things – not the sort of – he's just talking about the opinions.
00:40:44.000 Opinion pieces.
00:40:45.000 The opinion pieces, yeah.
00:40:46.000 So the regular journalism stays the same.
00:40:48.000 Well, it's a detriment to their business.
00:40:51.000 I mean, you're seeing over and over again people that just, they don't want to hear all this shit from these people anymore.
00:40:56.000 It's like, you're saying, it's almost like you're caught in an outdated version of the virus.
00:41:02.000 And everybody else already has the immunity to that virus.
00:41:06.000 Yeah.
00:41:07.000 You know?
00:41:07.000 Like, you need a new mind virus.
00:41:10.000 The one that you're pushing, it's like, it doesn't work anymore.
00:41:13.000 It's too crazy.
00:41:14.000 Yeah.
00:41:16.000 The whole thing is very crazy.
00:41:18.000 The media is incredibly partisan.
00:41:22.000 Almost all the media is left-shifted.
00:41:29.000 It's kind of weird.
00:41:30.000 If you talk to somebody who gets all their information from what I call legacy media, they're living in a different world than if they are listening to your podcast.
00:41:44.000 Or getting news from X. It's kind of wild.
00:41:52.000 It is very wild.
00:41:53.000 It's like they're living in an ultimate reality.
00:41:55.000 Oh, there's a lot of people that I talk to that I have to go, where did you hear that?
00:41:59.000 Yeah.
00:42:00.000 Yeah.
00:42:01.000 The Associated Press, which I call Associated Propaganda, the AP, they ran an international news story saying that Doge fired air traffic controllers.
00:42:12.000 But we didn't fire any air traffic controllers at all.
00:42:15.000 In fact, we're trying to hire air traffic controllers, not fire them.
00:42:18.000 Yeah, I saw that.
00:42:19.000 You made a tweet about it, right?
00:42:21.000 Yeah.
00:42:21.000 What do you call it now?
00:42:22.000 Do you call it a post?
00:42:23.000 Post, yeah, whatever.
00:42:24.000 You can't call it a tweet, though.
00:42:25.000 Do you call it a tweet accidentally, ever?
00:42:27.000 I don't know.
00:42:30.000 But let's say if somebody puts up a two-hour-long video, that's not a tweet.
00:42:38.000 Right.
00:42:39.000 It's a post.
00:42:40.000 Yeah.
00:42:40.000 Good point.
00:42:41.000 Yeah, for sure.
00:42:42.000 Yeah.
00:42:44.000 But I don't know if people still want to call or tweet or whatever.
00:42:48.000 You put a post about it, just to get back to it, saying that if we need highly qualified air traffic controllers, if you've retired, if you would consider doing it again, we could use you.
00:42:58.000 Yes.
00:42:59.000 So a lot of really qualified air traffic controllers were pushed out because of DEI stuff.
00:43:07.000 I mean, not to be blunt, I mean, a bunch of really good, talented old white guys were pushed out.
00:43:15.000 It's not cool.
00:43:16.000 And so there's a talent shortage in air traffic control because of DEI and not hiring people on merit.
00:43:26.000 Which is so crazy that that worked.
00:43:29.000 I think we should not put the public safety at risk.
00:43:31.000 No.
00:43:32.000 You know, because of some demented philosophy.
00:43:34.000 Somebody made a post today about it infiltrating the NSA.
00:43:38.000 Did you see any of that?
00:43:40.000 That was crazy.
00:43:42.000 Some gnarly stuff.
00:43:44.000 Yeah, crazy.
00:43:45.000 It started off as just like this sort of fringe thing and people would meet up.
00:43:51.000 Then it completely infiltrated the organization.
00:43:53.000 Yeah.
00:43:53.000 And they were spending all their time.
00:43:55.000 There was like 400 people or something in some sex chat room with some extremely demented stuff.
00:44:04.000 Yeah.
00:44:04.000 Yeah, I know I'll send it to you, Jamie, because it's so kooky.
00:44:09.000 You're what?
00:44:10.000 This is the NSA. I thought the NSA was just all about, like, information and hardcore business.
00:44:16.000 Who's supposed to, like, spy on, you know, like, if, like, there's a national threat or something.
00:44:20.000 Yeah, I think this is exactly it.
00:44:23.000 So, more than 100 intelligence staffers will be fired over sexually explicit texts in NSA chat rooms, Gabbard says.
00:44:34.000 So top intelligence official told Waters that the workers in question were brazen in using an NSA platform intended for professional use to conduct this kind of really, really horrific behavior.
00:44:45.000 What is the behavior?
00:44:46.000 What exactly?
00:44:47.000 What is it?
00:44:48.000 Do they say in this article?
00:44:51.000 Yeah.
00:44:53.000 I think they were also...
00:44:55.000 Okay.
00:44:55.000 It says employees who participate in the NSA's obscene, pornographic, and sexually explicit chat rooms.
00:45:02.000 Your tax dollars at work.
00:45:04.000 Well, it was all like LBGTQ stuff.
00:45:09.000 There was a lot of transition stuff.
00:45:13.000 I know I definitely saved it, but the point is, it infiltrated the organization.
00:45:18.000 It's not what they should be talking about.
00:45:20.000 At all.
00:45:21.000 At all.
00:45:22.000 It's supposed to be protecting the country.
00:45:24.000 Right.
00:45:24.000 And people were talking about how they're spending half their time in these meetings and that they're just constantly having to attend these things where they talk about these issues.
00:45:35.000 What are you doing?
00:45:37.000 If you have a problem with someone that's discriminatory, get rid of that person.
00:45:41.000 That's it.
00:45:43.000 Problem's over.
00:45:44.000 You've got someone who's homophobic in your business.
00:45:46.000 They're openly homophobic.
00:45:47.000 You can't work here.
00:45:49.000 That's not cool.
00:45:49.000 That's it.
00:45:50.000 That's it.
00:45:51.000 You don't have to have fucking meetings constantly promoting this.
00:45:54.000 You're not going to change someone's opinion by berating them over and over again.
00:45:58.000 Yeah.
00:45:59.000 I mean, a work environment should be a professional environment where, you know, they're getting the job done that they're, you know, being paid to do.
00:46:09.000 It should be...
00:46:11.000 Of course.
00:46:11.000 It's obviously not supposed to be sort of getting paid for...
00:46:16.000 Bizarre sexcapades.
00:46:17.000 It's just so fascinating that the virus is so strong that it made it into the NSA. Yes.
00:46:25.000 You would think those are some hardcore...
00:46:27.000 And the CIA too.
00:46:28.000 I think the CIA was in there too.
00:46:29.000 Yeah, they were in there too, which is bananas.
00:46:31.000 You would think.
00:46:32.000 Same thing.
00:46:32.000 Like hard-nosed, tough people doing hard work.
00:46:36.000 Who can spy on you whenever they want.
00:46:38.000 Yeah.
00:46:40.000 Yeah.
00:46:41.000 And get revenge on you whenever they want.
00:46:43.000 Yeah.
00:46:45.000 Pretty wild.
00:46:46.000 Yeah.
00:46:47.000 And, you know, they exist when the president leaves.
00:46:52.000 They stay.
00:46:53.000 People move around.
00:46:55.000 You stay a part of the organization for your entire career.
00:46:58.000 You get deeply entrenched in their system and how things work and who's back to rub and who's a bad guy, who's a good guy, who's on our side, who's not.
00:47:07.000 Yeah.
00:47:09.000 It's scary, actually.
00:47:11.000 Yeah.
00:47:13.000 So...
00:47:14.000 Was that what's taken so long with this Epstein files?
00:47:17.000 Yeah, what's up with that?
00:47:18.000 What is up with that?
00:47:20.000 It was like...
00:47:20.000 Yeah.
00:47:21.000 It's like Lucy and the football with Charlie Brown when she always pulls that football away.
00:47:26.000 It's the same thing!
00:47:29.000 It's like they keep telling us they're gonna release it.
00:47:31.000 Day one.
00:47:32.000 Oh, day one.
00:47:32.000 It was a serious case of no one's being arrested or a phobia, you know.
00:47:35.000 Well, there's always...
00:47:36.000 Right.
00:47:37.000 Like, what the fuck's going on?
00:47:38.000 What the fuck is going on?
00:47:39.000 Also, there's this real fear that someone's destroying the evidence.
00:47:42.000 And you keep hearing these stories, unsubstantiated stories, of, you know, FBI people shredding.
00:47:48.000 Well, where is the evidence?
00:47:48.000 I mean, the guy has, like, tons of videos and recordings.
00:47:52.000 Yeah.
00:47:53.000 I mean, he had all sorts of things.
00:47:54.000 Right.
00:47:54.000 Like, there's a mountain of evidence.
00:47:56.000 Right.
00:47:56.000 So where is that mountain?
00:47:57.000 Yeah, where is that mountain?
00:47:58.000 And what would be the reason why they would agree?
00:48:02.000 Like, there would have to be something in it for them to agree to not put it out.
00:48:08.000 Right?
00:48:10.000 Like, there has to be some sort of...
00:48:12.000 Financial entanglement, some sort of relationship with the people that are on that list, that they can provide a value that was big enough for you to not release it or to slow release it or to hope you can get away with putting out some redacted files that don't show anything.
00:48:28.000 This is only stage one.
00:48:30.000 Redacted, redacted, redacted.
00:48:31.000 Only stage one.
00:48:32.000 Don't worry.
00:48:32.000 The real stuff's coming.
00:48:34.000 That doesn't make any sense.
00:48:35.000 Why wouldn't you just release it all?
00:48:37.000 What could possibly be?
00:48:41.000 Worth protecting in there.
00:48:42.000 I mean, I think I've got probably the same information that, I mean, I'm just reading what's the latest thing on the X, you know, I'm just looking at my X feed and I'm like, you know, it's a real page turner.
00:48:53.000 And I thought we were going to get some revelations today.
00:48:56.000 I was like, big binders full of stuff.
00:48:58.000 There's got to be something in there.
00:49:00.000 Well, there was all those people that were given a copy of it.
00:49:03.000 They were all like waving it around.
00:49:04.000 They got the Willy Wonka ticket.
00:49:06.000 Yeah, yeah, totally.
00:49:07.000 Yeah.
00:49:07.000 And what happened?
00:49:09.000 Nothing.
00:49:10.000 Nothing.
00:49:11.000 I think Laura Loomer released it online.
00:49:14.000 Yeah.
00:49:15.000 Right?
00:49:16.000 Yeah, she's not very pleased about this.
00:49:18.000 So does anybody find anything in there that's interesting?
00:49:22.000 No, it's all old stuff from 2015 and 2021. Okay, what the fuck is going on?
00:49:25.000 But then apparently they discovered a whole bunch of stuff at the Southern District of New York.
00:49:31.000 Right.
00:49:31.000 So that's...
00:49:34.000 And I'm like, and I think, you know, Pam Bondi is actually great and Kash Patel are great, but they're like, they just got there, you know?
00:49:41.000 Right.
00:49:41.000 So then they're in a, they just got there, but they're in a hostile environment.
00:49:46.000 They're not in a friendly environment.
00:49:48.000 Right.
00:49:49.000 So, you know, it's like if you suddenly got put in, captain of a ship, but the crew was previously your enemy.
00:49:58.000 Right.
00:49:58.000 The entire crew.
00:49:59.000 It was previously your enemy.
00:50:00.000 Right.
00:50:01.000 And you're telling them, give me evidence.
00:50:03.000 Yeah, the crew doesn't want to give me the evidence because the crew is your enemy.
00:50:07.000 They were like, you're mortal enemies just a moment ago.
00:50:09.000 You just got there.
00:50:11.000 Yeah.
00:50:11.000 So I think we've got to give the Attorney General and the new director of the FBI a little bit of slack here because they literally just got there.
00:50:22.000 I think so too, but hey, don't say you're going to release it on day one then.
00:50:26.000 You shouldn't have said that.
00:50:28.000 Sure.
00:50:28.000 And don't say, you got a big drop coming tomorrow, and that's some bullshit that's been around forever.
00:50:34.000 It's disappointing.
00:50:35.000 Yeah, and where's the JFK files?
00:50:37.000 Where are those?
00:50:39.000 Yeah.
00:50:39.000 Let them go.
00:50:40.000 Did they release anything on that front?
00:50:41.000 I don't know.
00:50:42.000 What's going on?
00:50:44.000 It can't be anything that's gotten to me yet.
00:50:46.000 So if nothing's gotten to me yet, it can't be significant.
00:50:49.000 If there's conspiracy evidence, someone's going to send it to you.
00:50:51.000 Yes.
00:50:51.000 Tim Dillon's going to text me.
00:50:53.000 100%.
00:50:54.000 Tim Dillon, Dave Smith, someone's going to send it my way.
00:50:56.000 Somebody's going to send you the stuff.
00:50:57.000 Yeah.
00:50:57.000 So it hasn't been released yet.
00:50:59.000 There's no way.
00:51:00.000 Yeah.
00:51:00.000 You would find out.
00:51:02.000 Here's the real question.
00:51:03.000 Like, what could even be in there at this point that they haven't cleared out?
00:51:07.000 If you've got paperwork from 1963, like, what is in there still?
00:51:11.000 What is in there that could possibly be incriminating that supposedly Trump said that if you saw what they showed me, you wouldn't release it either?
00:51:21.000 Okay, what the fuck is that?
00:51:22.000 I haven't seen it, so...
00:51:24.000 Cash Patel has.
00:51:26.000 Yes.
00:51:26.000 You said he's seen it all.
00:51:27.000 Yeah.
00:51:28.000 Can he just post it to his, like, ex-account or something?
00:51:33.000 I mean, I just...
00:51:34.000 That sounds like an Elon move.
00:51:36.000 I don't think he can.
00:51:37.000 He's the director of the FBI. I think he has to go through proper channels.
00:51:40.000 Does he?
00:51:41.000 He is the channel.
00:51:42.000 Yeah, but there's rules.
00:51:44.000 He sounds like Trump.
00:51:48.000 He needs an executive order.
00:51:49.000 What about the storm?
00:51:50.000 I am the storm.
00:51:52.000 I mean, what channels?
00:51:54.000 Here's the channel.
00:51:55.000 Well, again, imagine just getting to the hull or just getting to the deck of the ship and you're the captain.
00:52:03.000 Yeah.
00:52:03.000 And now you have to figure out who's running things, who's doing this, where is everything.
00:52:09.000 Yeah.
00:52:10.000 Yeah.
00:52:11.000 Just getting anything done, like I said, you just joined as captain of a ship where the crew hates your guts.
00:52:19.000 Yeah, they were your enemy.
00:52:21.000 Yeah, they were your enemy.
00:52:23.000 They're strongly opposed to anything you want to do.
00:52:26.000 Yeah.
00:52:26.000 And you're trying to give them orders.
00:52:28.000 And you're trying to expose them.
00:52:29.000 Yeah, they don't want to be exposed.
00:52:31.000 Right.
00:52:31.000 You're literally, people that are working there are probably a part of this problem.
00:52:35.000 I mean, I was reading Onyx that, like, Comey's daughter is, like, the lead prosecutor in the Southern District of New York.
00:52:41.000 Did you read that?
00:52:41.000 Yes.
00:52:42.000 And, like, so obviously there's a bit of an entanglement there.
00:52:45.000 A little bit.
00:52:46.000 Like, what if there's something that, you know, puts her dad in a bad light?
00:52:51.000 What?
00:52:52.000 What?
00:52:53.000 Exactly.
00:52:53.000 She's a fucking shredder.
00:52:55.000 Hit that delete button.
00:52:56.000 The shredder's working overtime, you know?
00:52:58.000 Did you see General Flynn?
00:53:00.000 He was on a podcast and he spoke directly to James Comey.
00:53:05.000 Yeah.
00:53:05.000 He said, Jim, you're going to jail.
00:53:07.000 And unless you give up someone deeper than you and you know who that is, you know who I'm talking about?
00:53:13.000 Like, that is wild.
00:53:16.000 Yeah.
00:53:17.000 To think that the former director of the FBI might be really in that kind of deep shit and that he really actually was doing some evil, corrupt shit while he was running the FBI. I mean, it seems like there's some very shady stuff that's been going on.
00:53:32.000 It seems like it definitely happened in the 60s, right?
00:53:34.000 Everybody kind of admits to that.
00:53:36.000 The FBI killed Black Panthers.
00:53:39.000 They did a lot of shit.
00:53:41.000 There's a lot of stuff that went on that we know the government did way back in the day.
00:53:46.000 Yeah, why don't we just data dump the files?
00:53:48.000 Just go in there, take photos of all the papers, presumably paper, and just post it online.
00:53:57.000 Let the chips fall where they may.
00:53:59.000 Isn't presumably everyone involved dead?
00:54:01.000 Is it in some filing cabinet somewhere?
00:54:02.000 I don't know.
00:54:03.000 Right.
00:54:03.000 Where is it?
00:54:04.000 Where's the magic filing cabinet?
00:54:06.000 How are they hiding it?
00:54:07.000 Who's got access to it?
00:54:08.000 This is what I was hoping.
00:54:10.000 Day one, I was hoping.
00:54:13.000 Obviously, it's taking a lot longer than that.
00:54:15.000 I think part of it is, let's say you were made direct to the FBI. Okay.
00:54:22.000 I might be able to.
00:54:24.000 Yeah, that's what's crazy.
00:54:25.000 You know, Dan Bongino, what's he doing now?
00:54:27.000 He's one of the big dogs of the FBI. Dan's a tough guy.
00:54:31.000 Secret Service guy.
00:54:32.000 Yeah, legit guy.
00:54:33.000 But people think of him as a Fox News guy, just like Pete Hegseth.
00:54:36.000 Same thing.
00:54:37.000 They don't want to think about his distinguished military career.
00:54:39.000 They want to say, oh, that Fox News guy?
00:54:42.000 Deputy Director.
00:54:42.000 Deputy Director of the Federal Bureau of Investigation.
00:54:45.000 I mean, Dan's hardcore.
00:54:46.000 Very.
00:54:47.000 If it's reasonably findable, I think he's going to find it.
00:54:50.000 Between him and Cash, I think they're going to get stuff out there.
00:54:53.000 How crazy would it be if they couldn't, though?
00:54:55.000 How crazy would it be if they can't find anything?
00:54:58.000 If everybody shuts their mouths and everybody covers their ass?
00:55:01.000 Like an FBI computer where you type in the search?
00:55:04.000 That's what I want to know.
00:55:05.000 I just like the basic mechanism here.
00:55:07.000 It's either in a filing cabinet paper where it's like...
00:55:11.000 Maybe there's progressive levels of security.
00:55:13.000 You're like, open this door.
00:55:15.000 Do you have the pass for this level?
00:55:17.000 You get a level unlocked, and there's a level unlocked.
00:55:20.000 You get in there, and there's an old Easter laptop.
00:55:25.000 Yeah, there's an old unplugged laptop.
00:55:28.000 You've got to only plug it in.
00:55:29.000 It's in a skiff.
00:55:30.000 Yeah, this is why I think a tour of Fort Knox would be awesome.
00:55:33.000 Like a live tour of Fort Knox, we can actually see.
00:55:36.000 It's like, is the gold there or not?
00:55:38.000 They say it is.
00:55:38.000 Is it real?
00:55:40.000 Maybe spray paint some lead, you know?
00:55:41.000 Imagine if it's not.
00:55:43.000 Imagine if it's not all there.
00:55:44.000 Like, some of it's missing.
00:55:45.000 Where'd it go?
00:55:45.000 What if a lot of it's missing?
00:55:47.000 What if, like, half of it's missing?
00:55:50.000 I mean, how do we even know?
00:55:51.000 We don't know.
00:55:52.000 They said the last time they let someone look at it was decades ago.
00:55:56.000 Yeah.
00:55:57.000 Well, the last, I believe the last formal audit was in the 50s.
00:56:02.000 So I'm like, okay.
00:56:04.000 Oh, my God.
00:56:04.000 Just think about all the other things.
00:56:06.000 Maybe we should check it again.
00:56:08.000 Maybe.
00:56:08.000 Yeah.
00:56:09.000 Think about all the other stuff that you pointed out, all the checks that just go out, the NGO payments, the Social Security people.
00:56:16.000 Think about just all that.
00:56:18.000 Now apply that to the gold.
00:56:19.000 Absolutely.
00:56:20.000 I just want to emphasize the sheer...
00:56:22.000 Madness of the government.
00:56:24.000 Because they have magic money computers, the checks never bounce to the federal government.
00:56:31.000 So you don't have the normal corrective mechanism that you'd have for a company or for an individual.
00:56:36.000 The checks are just always clear.
00:56:38.000 The net result is inflation, which is effectively a tax on everyone.
00:56:43.000 But the Defense Department hasn't passed an audit in, I don't know, how many years?
00:56:49.000 Seven years.
00:56:49.000 Yeah, I mean, exactly.
00:56:51.000 So, it's like you'd have to be frigging Chuck Norris.
00:56:54.000 Like, only Chuck Norris could get the Defense Department to pass an order, you know, type of thing.
00:56:58.000 That's the level of skill you need, you know?
00:57:01.000 Well, that's what's so insane if you bring it back to the idea that it's a business.
00:57:04.000 Well, yeah.
00:57:05.000 This would never be tolerated in any kind of functional business.
00:57:08.000 Exactly.
00:57:09.000 So, you know, the Pentagon will like...
00:57:16.000 Like their accounting error, like the stuff that they lose in the couch cushions is like 20, 30 billion dollars a year.
00:57:22.000 They just don't know where it went.
00:57:25.000 It's gone.
00:57:27.000 Where'd it go?
00:57:28.000 And it's gone.
00:57:30.000 It's so insane.
00:57:32.000 It's insane.
00:57:33.000 It's so insane.
00:57:34.000 That's why I said, like, even simple things, like just requiring that outgoing payments for the Treasury computer have a payment code and a comment of what the payment is about and someone to call about the payment, I think we'll have...
00:57:49.000 Very powerful effect in stopping wasteful outflows and stopping fraud.
00:57:54.000 Yeah.
00:57:54.000 And here's another way to look at this.
00:57:56.000 Imagine if there are people like you and the Doge team out there in the world.
00:58:02.000 Imagine if one of those works for an organization like USAID or any other organization.
00:58:09.000 has this understanding of how much fuckery is involved but they have evil intentions and they're entwined in this system for decades and decades and they've built a career and all the entanglements that come with it and they start moving shit around you could probably do it easy it sounds like the way you've laid it out if you were a career person who's in there forever who knew how everything works and you were very clever you could make some shit happen And you could probably do
00:58:40.000 it in conjunction with some people that you know that are forming an NGO. Hey, let's all work together.
00:58:47.000 Yeah.
00:58:47.000 Yeah.
00:58:49.000 And this is the resistance that you're facing.
00:58:51.000 Yeah.
00:58:52.000 I think it's the biggest scam of all time.
00:58:54.000 This is not something you ever sought out to do.
00:58:56.000 The biggest scam of all time ever.
00:59:00.000 Of human history.
00:59:00.000 Of human history, yes.
00:59:02.000 Wow.
00:59:04.000 I think you're right.
00:59:05.000 Yeah.
00:59:07.000 It's probably a trillion-dollar scam.
00:59:09.000 There's never been a trillion-dollar scam.
00:59:13.000 Now, this is not something that you ever set out to do.
00:59:17.000 You didn't have this as a career aspiration.
00:59:20.000 No.
00:59:21.000 This is the most absurd outcome I can possibly imagine, actually.
00:59:24.000 Also, Doge started as sort of a meme coin.
00:59:27.000 It was like a joke cryptocurrency involving memes and dogs.
00:59:36.000 Which is so funny that the letters wind up being perfect.
00:59:39.000 Yeah.
00:59:40.000 Well, actually, I was originally going to call it the Government Efficiency Commission, which is a very boring name.
00:59:45.000 And then people online were like, no, it needs to be the Department of Government Efficiency.
00:59:49.000 And I was like, you know what?
00:59:51.000 You're right.
00:59:51.000 Of course.
00:59:53.000 I mean, it's more evidence of the simulation.
00:59:55.000 Totally.
00:59:56.000 That little...
00:59:57.000 Like our mascot is a cute dog.
01:00:03.000 And it's a meme coin.
01:00:05.000 The meme coin's probably worth a lot of money right now, right?
01:00:07.000 Like every time you tweet about it...
01:00:08.000 Probably.
01:00:08.000 I don't know.
01:00:09.000 It shoots up.
01:00:09.000 Yeah.
01:00:10.000 The whole meme coin thing is bananas.
01:00:14.000 Yeah.
01:00:14.000 It is so bananas that people dump real money.
01:00:17.000 Into these coins and then you could just pump them up and sell them.
01:00:21.000 It's like a casino or something.
01:00:22.000 Yeah, it's totally gambling.
01:00:24.000 People just do whatever the greater fool theory and musical chairs and whoever's like the last to sit down loses type of thing.
01:00:30.000 And somehow or another it's still legal.
01:00:33.000 I think not too many people.
01:00:37.000 I mean, it's sort of like you go to the casino.
01:00:41.000 If you expect to win at the casino, you're being a fool.
01:00:45.000 I think if you expect to win at meme coins, you're being foolish.
01:00:51.000 You're not going to win at meme coins.
01:00:56.000 Don't sink your life savings into a meme coin.
01:00:58.000 No, but you can gamble a little and you can ride waves and win a little and lose a little.
01:01:04.000 If you want to have some fun, then play with meme coins.
01:01:09.000 But if you put your family's...
01:01:12.000 Don't bet the farm on a meme coin.
01:01:13.000 At the risk of saying something bold and outrageous, don't bet the farm on a meme coin.
01:01:20.000 The weird one is the pump and dumps.
01:01:23.000 They happen all the time.
01:01:24.000 All the time.
01:01:25.000 And people get shocked that somebody pump and dumped.
01:01:27.000 What are you doing?
01:01:30.000 I was hoping to dump.
01:01:32.000 I was hoping to make all the money out of this.
01:01:34.000 I can't believe they got me.
01:01:37.000 It's just weird that it's legal still.
01:01:42.000 I mean, casinos are legal.
01:01:43.000 Yeah.
01:01:44.000 It is like a casino.
01:01:44.000 And people just lose money at casinos.
01:01:46.000 Yeah.
01:01:46.000 But you can't rig a casino, like a pump and dump.
01:01:49.000 You could rig a pump and dump.
01:01:51.000 You know?
01:01:51.000 Yeah, I guess so.
01:01:55.000 Like, you could run a real pyramid scheme.
01:01:58.000 I mean, the government's one big pyramid scheme, if you ask me.
01:02:01.000 Yeah, well, you could tell me better than anybody.
01:02:02.000 Social Security is the biggest Ponzi scheme of all time.
01:02:07.000 Right, explain that.
01:02:08.000 Oh, so, well, people pay into Social Security and the money goes out of Social Security immediately.
01:02:15.000 But the obligation for Social Security is your entire retirement career.
01:02:21.000 So you're paying – like if you look at the future obligations of Social Security, it far exceeds the tax revenue.
01:02:36.000 Far.
01:02:37.000 Have you ever looked at the debt clock?
01:02:40.000 Yes.
01:02:41.000 Okay.
01:02:41.000 There's our present day debt, but then there's our future obligations.
01:02:46.000 So when you look at the future obligations of Social Security, the actual national debt is like double what people think it is because of the future obligations.
01:03:00.000 So basically people are living way longer than expected.
01:03:07.000 And there are fewer babies being born.
01:03:09.000 So you have more people who are retired and get, that live for a long time and get retirement payments.
01:03:15.000 So the future obligation, so however bad the financial situation is right now for the federal government, it will be much worse in the future.
01:03:26.000 At the risk of being a buzzkill here.
01:03:28.000 Did you see...
01:03:29.000 So we better fix what we've got right now, because if it's bad now, it's going to be much worse in the future.
01:03:33.000 There was an interview with this woman who was a whistleblower.
01:03:35.000 Did we ever find out if that was true?
01:03:37.000 There's so many whistles being blown, it's hard to keep track.
01:03:39.000 A lot of whistles.
01:03:40.000 But this one lady...
01:03:41.000 It was only in one state.
01:03:43.000 It was a very specific instance, I believe.
01:03:45.000 Right.
01:03:45.000 But it was using Social Security money, correct?
01:03:48.000 That was her allegation.
01:03:50.000 So what she was alleging was that she was in charge of turning illegal immigrants into clients.
01:03:58.000 That's what they would call them.
01:03:59.000 And that she would go to them and try to ask them, do you have a headache?
01:04:03.000 Do you have back problems?
01:04:04.000 If you do, now you can be permanently disabled.
01:04:07.000 You get permanent disability, so you get Social Security for life.
01:04:10.000 Yes.
01:04:11.000 Not just social security, but disability, which is even more.
01:04:13.000 Right, and you get them on the taxpayer dole right away, and they're illegally aliens.
01:04:19.000 Yes.
01:04:20.000 So, if I were to say, like, what's at the heart of the sort of...
01:04:24.000 Like, why is the Democrat propaganda machine so fired up to destroy me?
01:04:30.000 That's the main reason.
01:04:32.000 The main reason is that...
01:04:39.000 Entitlements fraud, that includes like Social Security, disability, Medicaid.
01:04:44.000 Entitlements fraud for illegal aliens is what is serving as a gigantic magnetic force to pull people in from all around the world and keep them here.
01:04:54.000 Like basically, if you pay people at a standard of living that is above 90% of Earth, then you have a very powerful...
01:05:07.000 Incentive for 90% of Earth to come here and to stay here.
01:05:12.000 But if you end the illegal alien fraud, then you turn off that magnet and they leave.
01:05:20.000 And they stop coming and the ones that are here, many of them will simply leave.
01:05:27.000 And if that happens, they will lose a massive number of Democratic voters.
01:05:32.000 And if it didn't happen, they would turn those people into voters.
01:05:36.000 Correct.
01:05:36.000 Which they were trying to do.
01:05:39.000 So in New York State, illegal aliens can already vote in state and city elections.
01:05:46.000 A lot of people don't know that.
01:05:48.000 I mean, they're trying to fight that and they're trying to stop that, but currently I think it's like 600,000 are registered to vote, illegal aliens, in New York.
01:05:58.000 That is wild.
01:05:59.000 Yeah.
01:06:00.000 Well, I mean...
01:06:02.000 I feel I could say, you know, FEMA, like the agency that was paying for illegal aliens to stay at luxury hotels in New York was FEMA. You know, that's an agency that's meant to support Americans in distress from natural disasters, was paying for luxury hotels for illegals in New York.
01:06:27.000 It's true.
01:06:28.000 Yeah.
01:06:29.000 There's a fact.
01:06:31.000 When we stopped that payment, we stopped all those money because that's obviously an insane way to spend taxpayer money.
01:06:40.000 New York sued the federal government to get the money.
01:06:45.000 So you can just look at their lawsuit.
01:06:48.000 They were sending that money even after President Trump signed an executive order saying it needs to stop.
01:06:56.000 They still press send.
01:06:58.000 On $80 million to luxury hotels in New York.
01:07:03.000 Your tax money went to pay for illegal aliens in luxury hotels in New York from an agency that is meant to help Americans in distress from natural disasters.
01:07:15.000 Right, and I would like to know how much they spent on North Carolina and how much they spent on Maui.
01:07:21.000 Yes, exactly.
01:07:24.000 What's actually happening is they're buying voters.
01:07:27.000 That's really what's happening.
01:07:28.000 It's like a giant voter fraud scam.
01:07:32.000 They're importing voters, and it's really just a matter of time.
01:07:38.000 A lot of people have trouble believing this, but the more you look at it, the more you will realize just how much of a problem this is.
01:07:49.000 It's not just real.
01:07:51.000 It is an attempt to destroy...
01:07:56.000 Democracy in America.
01:07:57.000 That's what, in my view, is what it really is.
01:08:00.000 If you take the seven swing states, often the margin of victory there is maybe 20,000 votes.
01:08:08.000 If you put 200,000 illegals in there, and they have an 80% likelihood of voting down, and it's only a matter of time before they become citizens, then those swing states will not be swing states in the future.
01:08:25.000 And if they are not swing states, we'll be a permanent one-party state country.
01:08:31.000 Permanent deep blue socialist state.
01:08:34.000 That's what America will become.
01:08:36.000 And that was the game plan?
01:08:37.000 That was the game plan.
01:08:39.000 That is still the game plan.
01:08:42.000 And so they almost succeeded.
01:08:44.000 If the machine of which the Kamala Poppet was the representation had won, that's what would have happened.
01:08:54.000 The reason I went so hardcore for Trump was because, to me, this was a fork in the road, like a very obvious fork in the road.
01:09:05.000 If they had another four years, they would legalize enough illegals in the swing states to make the swing states not swing states.
01:09:15.000 They would be blue states.
01:09:18.000 Then they would win the House, the Senate.
01:09:23.000 And the presidency.
01:09:26.000 They would then make D.C. into a state, maybe Puerto Rico, get four extra senators, pack the Supreme Court.
01:09:36.000 So then you'll have the House, Judiciary, Senate, and presidency, all blue.
01:09:42.000 And then they will keep importing more illegals to cement that outcome.
01:09:48.000 Basically what happened in California.
01:09:53.000 Jesus Christ.
01:09:54.000 It would have been the end.
01:09:57.000 That's why I went so hardcore for Trump.
01:10:00.000 It would otherwise have been the end.
01:10:02.000 And that's why the Democrat machine is so intent on destroying me.
01:10:09.000 It's just so fascinating that people can't see this.
01:10:13.000 I mean, I invite people to do their research.
01:10:15.000 The more they do their research, the more they will see that what I'm saying is absolutely true.
01:10:23.000 Just do the research.
01:10:24.000 Yeah.
01:10:25.000 It's such a bad idea, even for the Democrats, which is what they don't understand.
01:10:31.000 Like, it's the same people.
01:10:32.000 It's not ultimately going to work out.
01:10:33.000 No.
01:10:33.000 It's the same people.
01:10:35.000 Yes.
01:10:35.000 It's just, they're doing it under the guise that they're the kind, compassionate, progressive people.
01:10:41.000 Yes.
01:10:42.000 But the same outcome takes place.
01:10:43.000 It's just about control.
01:10:45.000 They probably institute some central bank digital currency and some social credit score system.
01:10:51.000 And censorship, of course.
01:10:52.000 Yeah, of course.
01:10:53.000 Well, that was the big fear coming into this election, was that if they can't censor things, like, we talked about it before, but there was two major forks in the road.
01:11:03.000 The big one was Trump didn't get shot.
01:11:05.000 The other big one was you buy Twitter.
01:11:07.000 And if those two things don't happen, the whole world looks different.
01:11:12.000 Yes.
01:11:13.000 We don't want to be on that timeline.
01:11:15.000 No, we don't want to have only one side represented, because guess what?
01:11:19.000 They will hijack that side, whatever it is.
01:11:21.000 They will hijack that side and use it for money and control, and that's what it's all about.
01:11:26.000 It's not about good people versus bad people.
01:11:28.000 It's a bullshit shell game.
01:11:29.000 Yeah.
01:11:30.000 I mean, I think these things are actually...
01:11:32.000 It's easy to understand if you look at basic incentives.
01:11:36.000 The basic incentive here is the more illegals that the Democrats can bring in, the more likely they are to win.
01:11:43.000 So that's what they're going to do.
01:11:47.000 That's what they have been doing.
01:11:49.000 And it worked in California.
01:11:50.000 California's supermajority, damn.
01:11:53.000 And look at all the companies that are leaving California.
01:11:55.000 Yeah.
01:11:57.000 I mean, In-N-Out just announced they're leaving.
01:11:59.000 Yep.
01:11:59.000 Their headquarters is leaving California.
01:12:00.000 They're moving to Tennessee.
01:12:01.000 Yeah.
01:12:01.000 Yeah.
01:12:03.000 So...
01:12:03.000 And California made healthcare free for illegals.
01:12:10.000 Yeah.
01:12:10.000 As of last year.
01:12:13.000 And obviously that's a gigantic magnet for more illegals.
01:12:17.000 And this is not a thing you can solve simply with money because what happens is you simply have more patients than a doctor can possibly see.
01:12:25.000 And you can't just make doctors out of nothing.
01:12:30.000 So sometimes people are like, oh, it's just a money thing.
01:12:32.000 No, it takes a long time for somebody to become a doctor.
01:12:36.000 You know, 30 years.
01:12:41.000 And so what actually happens in California is that there are too many patients for the doctors to see.
01:12:47.000 So then the average citizen in California suffers as a result.
01:12:55.000 Now, the elite in California are fine because they have private doctors.
01:12:58.000 You know, they can just pay for the best doctors.
01:13:02.000 So the elite in California are doing fine, but your average citizen in California is not doing fine.
01:13:11.000 Healthcare for illegals was supposed to be 3 billion.
01:13:13.000 I think they're now estimated it's 9 billion.
01:13:15.000 But that number will scale to infinity, basically.
01:13:17.000 It's like, why not?
01:13:20.000 Like, why not, if you need any operation at all, come to California and have it be free.
01:13:26.000 Right.
01:13:27.000 From anywhere on Earth.
01:13:28.000 And the people that want to look at it...
01:13:31.000 In the most charitable way, they say, oh, well, these people are hardworking, good people, and they're the backbone of our city, and they should have access to all the things that we have access to.
01:13:41.000 And I just don't think they understand that it's a political pawn.
01:13:44.000 I don't think they understand— It's a political game.
01:13:46.000 This is not done for compassion and kindness.
01:13:49.000 No.
01:13:49.000 This is just done to ensure that it stays blue.
01:13:52.000 Correct.
01:13:53.000 And it's essentially a bribery with your tax dollars.
01:13:56.000 Yes.
01:13:56.000 This is why— The Dems will not even deport criminals.
01:14:05.000 Because every criminal deported is a lost vote.
01:14:09.000 So even if somebody is illegal with a criminal record and commits crime in America, they still were not being deported.
01:14:19.000 And then on top of that, California made it actually illegal to ask for ID when people vote.
01:14:25.000 Yes.
01:14:26.000 California and New York, you're not allowed to show your ID when you vote.
01:14:32.000 I just want to be clear so everyone understands this.
01:14:35.000 In California and New York, you are not allowed to show your ID even if you want to.
01:14:39.000 Right.
01:14:41.000 Why would that ever be a good idea?
01:14:45.000 I mean...
01:14:47.000 If you're trying to facilitate fraud in elections, it's a great idea.
01:14:52.000 That's the only reason.
01:14:53.000 Yes.
01:14:54.000 There's no other reason, logically, why that would be a good idea.
01:14:57.000 It's for fraud.
01:14:58.000 It's like, wake up, sheeple.
01:15:02.000 Wake up.
01:15:03.000 Hello?
01:15:04.000 Let's say you wanted to commit fraud.
01:15:05.000 What are the things you would do?
01:15:07.000 You would say, you don't need ID, and you can mail in your ballot.
01:15:11.000 And we'll give you free healthcare.
01:15:12.000 Yes.
01:15:13.000 Stay here.
01:15:14.000 Yes.
01:15:15.000 Stay here.
01:15:15.000 I know it's on fire, but stay here.
01:15:17.000 I mean, in this case, it being on fire is not just a metaphor in California.
01:15:22.000 It's just like, goddamn, entire neighborhood's burning down.
01:15:26.000 It's just once they allowed people to vote that are not legal in California, if you're going to do that, it's over.
01:15:37.000 Exactly.
01:15:38.000 There's no coming back from that.
01:15:40.000 The numbers are just, no, people are so indoctrinated, too.
01:15:44.000 There's so many people that, no matter what, they think voting Republican means you're an asshole.
01:15:48.000 Yes.
01:15:48.000 And they won't do it.
01:15:49.000 They won't do it.
01:15:50.000 They'll put their fucking rainbow flag on their porch and they'll just ride it right into the beach.
01:15:55.000 Civilizational suicide.
01:15:56.000 Yeah.
01:15:57.000 Right to the rocks.
01:15:58.000 Bang!
01:15:58.000 Crash the boat.
01:15:59.000 I mean, there's a guy who posts on X who's great, Godside.
01:16:05.000 Yeah, he's a friend of mine.
01:16:06.000 He's been on the podcast a bunch of times.
01:16:07.000 Yeah, he's awesome.
01:16:08.000 Yeah, he's great.
01:16:09.000 And he talks about basically suicidal empathy.
01:16:15.000 There's so much empathy that you actually suicide yourself.
01:16:21.000 So we've got civilizational, suicidal empathy going on.
01:16:26.000 I believe in empathy.
01:16:27.000 I think you should care about other people.
01:16:28.000 But you need to have empathy for civilization as a whole.
01:16:33.000 Also, don't let someone use your empathy against you so they can completely control your state and then do an insanely bad job of managing it and never get removed.
01:16:44.000 The fundamental weakness of Western civilization is empathy.
01:16:51.000 The empathy exploit.
01:16:54.000 They're exploiting a bug in Western civilization, which is the empathy response.
01:17:02.000 So, and I think empathy is good, but you need to think it through and not just be programmed like a robot.
01:17:10.000 Right.
01:17:11.000 Understand when empathy has been actually used as a tool.
01:17:14.000 Yes.
01:17:15.000 It's weaponized empathy is the issue.
01:17:19.000 Yeah.
01:17:20.000 Weaponized empathy.
01:17:21.000 And, yeah.
01:17:23.000 And it's also the rigid adherence to that liberal ideology.
01:17:27.000 Like, you can't switch sides over there.
01:17:30.000 Like, California, if you're a part of that whole tech, Hollywood, entertainment, any of those circles, you're on the left.
01:17:38.000 Like, almost wholly.
01:17:41.000 Almost completely.
01:17:42.000 It's borderline illegal to be a Republican in California.
01:17:47.000 In San Francisco or LA, it's borderline illegal to be a Republican.
01:17:50.000 You're certainly shunned.
01:17:52.000 No, look, in San Francisco, you could shoot heroin while taking a dump on the mayor's car in front of City Hall, okay?
01:18:00.000 And nothing would happen to you.
01:18:04.000 But if you walk down the street with a MAGA hat, you're going to get attacked.
01:18:10.000 It's insane.
01:18:11.000 Yeah, it's insane.
01:18:13.000 It's also so Orwellian that a hat that says, Make America Great Again, would cause people to have a violent reaction.
01:18:22.000 Aren't you American?
01:18:23.000 Just as a whole, like the saying, wouldn't that be a good thing for everyone?
01:18:29.000 Make America great again.
01:18:30.000 But because it's attached to Donald Trump and that red hat, you'll get maced for wearing that red hat.
01:18:37.000 They will make America worse by beating you.
01:18:40.000 So it's like it's an evil thing they're doing, a violent assault in America, because you want to make America great again.
01:18:50.000 I mean, it's like a scene in a book.
01:18:52.000 It doesn't seem like it could be that ridiculous.
01:18:57.000 Like, remember when All Lives Matter would get you fired?
01:18:59.000 Which is insane.
01:19:01.000 Insane.
01:19:02.000 People got fired because they said All Lives Matter.
01:19:05.000 Which is a very reasonable thing to say.
01:19:07.000 How reasonable is that?
01:19:09.000 Yes.
01:19:09.000 That's essentially saying everybody matters.
01:19:12.000 That's literally all you're saying.
01:19:14.000 That's not what you were supposed to say.
01:19:16.000 You had to say black lives matter, which of course they do if you say all lives matter.
01:19:20.000 Everybody matters.
01:19:21.000 Yes.
01:19:21.000 But the idea of being a colorblind society was completely abandoned somewhere around 2012-ish.
01:19:28.000 Yeah, I mean, I sort of can trace it to when did the gun emoji get nerfed, you know?
01:19:33.000 When did it turn into a squirt gun?
01:19:35.000 That was a couple of years ago.
01:19:36.000 That was like 2016, I think.
01:19:38.000 Was it?
01:19:38.000 Yeah.
01:19:39.000 Yeah, it became a squirt gun.
01:19:42.000 Can you bring it back to X? Yeah, no, no.
01:19:44.000 If you use a gun emoji on X, Apple will insist that it be a squirt gun and then the X app turns it back into a 1911. Oh, really?
01:19:57.000 Yeah.
01:19:58.000 Oh, that's great.
01:19:58.000 So you can actually have a 1911 gun.
01:20:05.000 We reverted the Apple change inside the app.
01:20:08.000 Oh, that's hilarious.
01:20:09.000 That's hilarious.
01:20:11.000 That thing's so offensive.
01:20:13.000 The gun and then the pregnant man.
01:20:14.000 Both of those got me.
01:20:16.000 You motherfuckers.
01:20:19.000 I mean, I like that meme where it's like, the people telling you that what you're hearing is this information are the same people that did the pregnant man emoji.
01:20:27.000 Yes.
01:20:28.000 Yeah.
01:20:28.000 Think about that.
01:20:29.000 Yeah.
01:20:30.000 Well, also, the same people that say a woman attacked a Tesla factory.
01:20:36.000 Yeah.
01:20:36.000 The woman.
01:20:37.000 It's a dude.
01:20:37.000 It's a dude.
01:20:38.000 Like, really obvious dude.
01:20:40.000 Really big looking.
01:20:42.000 Like a mentally ill dude.
01:20:43.000 Yeah, mentally ill dude with a wig on.
01:20:45.000 Say that.
01:20:45.000 Yes.
01:20:46.000 Yeah, but NBC, even Fox.
01:20:49.000 I think even Fox called it a woman.
01:20:52.000 Yo, this is a dude with a strong jawline.
01:20:55.000 Yeah, he's wearing woman face.
01:20:57.000 This is a buff dude.
01:21:00.000 Yeah, it's a buff dude wearing woman face.
01:21:02.000 I mean, come on.
01:21:04.000 That's not a woman face.
01:21:05.000 Yeah.
01:21:06.000 And they're like saying, watch out for disinformation.
01:21:09.000 I'm like, what are you talking about?
01:21:10.000 It's so crazy.
01:21:11.000 This is bullshit.
01:21:12.000 I mean, it's just more evidence of the virus, though, right?
01:21:16.000 Like, it killed objectivity, killed reality, and it demanded strict adherence, or you were attacked.
01:21:24.000 Yeah.
01:21:24.000 Yeah.
01:21:25.000 Any questioning of it would result in being ostracized.
01:21:29.000 What kind of responsibility do you feel?
01:21:33.000 Knowing that if you didn't take over Twitter and turn it into X, if that didn't happen, I really think the world's a very different place right now.
01:21:44.000 How long have you owned it for?
01:21:47.000 A couple years, basically.
01:21:48.000 Imagine a couple years of it being run the way it was run before.
01:21:52.000 And probably accelerating.
01:21:54.000 I mean, my account would have been suspended long ago.
01:21:57.000 For sure.
01:21:58.000 Yeah, for sure.
01:21:59.000 Yeah, just for the disinformation.
01:22:02.000 Yeah.
01:22:05.000 It would have been...
01:22:06.000 Trump would have never come back.
01:22:07.000 Alex Jones would have definitely never been back.
01:22:09.000 Definitely not.
01:22:09.000 No, no.
01:22:13.000 So...
01:22:13.000 Yeah.
01:22:15.000 Does that weigh on you?
01:22:17.000 Like, I would feel like that would be a fucking heavy responsibility.
01:22:23.000 Yeah.
01:22:24.000 I mean, I'm just trying to keep civilization going here for longer.
01:22:31.000 So, I think we at least want to build a city on Mars and become a multi-planet civilization, which I think would be incredibly important in ensuring the long-term survival of...
01:22:50.000 Civilization.
01:22:53.000 Are you still rescuing those people that are stuck in the space station?
01:22:56.000 Yeah, that's coming up in a couple weeks, I think.
01:22:58.000 Whoa.
01:22:59.000 They've been up there for...
01:23:01.000 How long, Jamie?
01:23:01.000 They were supposed to be there for a couple days, right?
01:23:03.000 Actually, probably four weeks.
01:23:05.000 They were supposed to be up there for like eight days.
01:23:08.000 Yeah.
01:23:08.000 And they've been up there for like eight months.
01:23:11.000 So a little longer than expected.
01:23:13.000 Fuck.
01:23:14.000 Yeah.
01:23:15.000 What is it going to be like for those people when they get back?
01:23:17.000 They're going to be a wreck for a long time, right?
01:23:18.000 Yeah.
01:23:19.000 As long as you stay up there, you get, you know, sort of in zero G, you get increased bone loss.
01:23:24.000 So it ended up being like this political football and sort of hotly contested topic.
01:23:34.000 We offered to bring them back early.
01:23:36.000 This offer was rejected by the Biden administration.
01:23:39.000 Why?
01:23:40.000 For political reasons.
01:23:41.000 That's so crazy.
01:23:42.000 There's no way that they're going to make anyone who's supporting Trump look good.
01:23:47.000 Wow.
01:23:49.000 What do you think they would have done if they had won?
01:23:53.000 How would they get those people back?
01:23:55.000 No, they can only get them back with a SpaceX spacecraft.
01:23:58.000 But they pushed the return date past the inauguration date.
01:24:02.000 Wow.
01:24:03.000 Yeah.
01:24:05.000 So they would have let you do it, but after the...
01:24:09.000 Wow.
01:24:13.000 Authorizing you to do it.
01:24:13.000 There isn't anyone else to do it.
01:24:16.000 NASA can't get them.
01:24:18.000 The SpaceX Dragon spacecraft is the only one that is considered safe enough to bring them back.
01:24:25.000 So NASA concluded that the Boeing spacecraft was not safe.
01:24:30.000 So that's why they're stuck there.
01:24:31.000 Holy shit.
01:24:33.000 Yeah.
01:24:33.000 And you can't ask Russia to help.
01:24:35.000 That would be awkward.
01:24:36.000 A little bit.
01:24:37.000 Yeah.
01:24:37.000 It would be a nice thing if they did.
01:24:39.000 They said, guys, we'll help.
01:24:41.000 I think that for enough money they would.
01:24:43.000 You think so?
01:24:45.000 Yeah.
01:24:46.000 But they would obviously treat it as a propaganda victory and charge crazy money.
01:24:52.000 It's just disgusting that they would use that as a political tool.
01:24:57.000 Yeah.
01:24:59.000 Yeah.
01:25:02.000 Well, the Biden administration was also suing SpaceX.
01:25:06.000 They had this massive lawsuit against SpaceX for...
01:25:10.000 SpaceX not hiring asylum seekers.
01:25:13.000 Right.
01:25:14.000 So people say, like, oh, Elon's making it up.
01:25:17.000 The Biden administration wasn't against SpaceX.
01:25:19.000 I'm like, bro, the Department of Justice had a massive lawsuit against SpaceX for not hiring asylum seekers, even though it is illegal for us to hire anyone who is not a permanent resident.
01:25:31.000 So it is both...
01:25:33.000 There's law that says you have to hire asylum seekers, but there's also a law that says...
01:25:38.000 Anyone hired by a rocket company, which is an advanced weapons technology, must be a permanent resident.
01:25:45.000 An asylum seeker is not a permanent resident.
01:25:48.000 So it is both legal and illegal to hire asylum seekers.
01:25:54.000 So why would the Biden administration launch a massive lawsuit?
01:25:58.000 Again, this is public information.
01:26:00.000 It's not like my imagination.
01:26:03.000 Why would they launch to do such a massive lawsuit against SpaceX?
01:26:07.000 They're extremely antagonistic.
01:26:12.000 It just doesn't make any sense that that could ever even get past the first day of someone looking at it.
01:26:19.000 If it's both illegal and you're trying to enforce it.
01:26:23.000 Like, you can't enforce it.
01:26:24.000 This is an advanced weapons company.
01:26:26.000 This is crazy.
01:26:27.000 It should be like this.
01:26:28.000 Throw this out.
01:26:29.000 In fact, there's like international traffic and arms regulations is like a law that...
01:26:36.000 It's there to ensure that only permanent residents of the United States can work at advanced weapons companies.
01:26:44.000 Rockets are advanced weapons.
01:26:47.000 So the same is true if it's nuclear or some bioweapon thing or something like that.
01:26:55.000 Obviously, if someone were to work at SpaceX and then leave and go to North Korea or Iran, they could build missile technology that could...
01:27:07.000 You know, destroy the United States.
01:27:09.000 So that's why you're not allowed to hire people who want permanent residents.
01:27:12.000 It's logical.
01:27:13.000 Logical.
01:27:14.000 So is that lawsuit still pending?
01:27:16.000 It was just dismissed.
01:27:20.000 How long was it going on for?
01:27:21.000 A couple of years.
01:27:23.000 Holy shit.
01:27:24.000 Yeah.
01:27:26.000 That's the other thing that drives me crazy, like that people don't understand that if you sanction lawfare like that, if you sanction attacking your political enemies, someone's gonna do that to you.
01:27:37.000 Like if the wrong people get in office, if new people get in office four years from now, eight years from now, who knows who it's gonna be?
01:27:45.000 You've already set a precedent.
01:27:47.000 You've already attacked someone, charged them with 34 felonies, where they're really just misdemeanors.
01:27:53.000 And they're also past the statute of limitation.
01:27:56.000 And now you're talking all over the news that this is a convicted felon, convicted felon.
01:28:02.000 They kept saying convicted felon, convicted felon.
01:28:04.000 And everybody knows what it is.
01:28:06.000 It's terrifying.
01:28:07.000 It's terrifying they could do it so brazenly to a guy who was the president for four years.
01:28:14.000 Right.
01:28:15.000 That lawsuit was funded by Reid Hoffman, who is a major damn donor.
01:28:22.000 And also an Epstein client.
01:28:26.000 The plot thickens.
01:28:28.000 The plot thickens.
01:28:29.000 Jesus Christ.
01:28:31.000 Yes.
01:28:33.000 It's just...
01:28:34.000 It's so blatant.
01:28:37.000 It's so obvious.
01:28:38.000 The SpaceX lawsuit, the Trump stuff, it's just so obvious.
01:28:42.000 Yes.
01:28:42.000 Known Epstein clients who are obviously extremely powerful politically and very wealthy.
01:28:53.000 You know, Bill Gates, Bill Clinton, and Reid Hoffman.
01:28:59.000 And some others too, but those three.
01:29:03.000 So, you know, why was Reid Hoffman so intent on destroying Trump?
01:29:10.000 You think it's because they're worried about the list coming out?
01:29:14.000 Yeah.
01:29:15.000 One of the reasons.
01:29:18.000 Yeah.
01:29:21.000 So...
01:29:21.000 I mean, I'm like, this is, you know, yeah, so...
01:29:25.000 It's so frustrating to be sitting in a situation where the list isn't coming out.
01:29:29.000 Well, it better come out, I mean, hopefully tomorrow.
01:29:32.000 Well, I mean, why'd they release bullshit today?
01:29:34.000 I don't know.
01:29:35.000 What's the point in giving these people, like, a happy folder to wave around in front of the camera with nothing in it that's new?
01:29:41.000 Doesn't make any sense.
01:29:42.000 It's not encouraging.
01:29:46.000 Uh...
01:29:47.000 Like I said, the tough thing that they've got is they've been made captain of a ship with a hostile crew.
01:29:53.000 Right.
01:29:54.000 So they don't have like...
01:29:56.000 It's not like you have magical powers.
01:29:59.000 You get made captain of a ship with a hostile crew.
01:30:02.000 You still have a hostile crew.
01:30:03.000 Right.
01:30:04.000 You've got to bring in people who are going to be helpful as opposed to obstructionists.
01:30:13.000 Right.
01:30:13.000 So, but yeah, I think the public will be rightly frustrated if there is – if no one is prosecuted for the FDN client list, I think the public will be rightly frustrated if there is – if no one is prosecuted for the FDN client you know, like at least, I don't know, the top five or something, like some number.
01:30:42.000 There should at least be an attempted prosecution of the worst offenders.
01:30:46.000 Well, particularly if Ghislaine Maxwell is in jail for sex trafficking.
01:30:50.000 Yes.
01:30:51.000 That means sex trafficking occurred.
01:30:54.000 Right.
01:30:54.000 So she's in jail for it.
01:30:56.000 Yes.
01:30:57.000 Who were the clients?
01:31:00.000 Yeah.
01:31:01.000 How do you put someone in jail and you don't even name the clients?
01:31:04.000 That sounds kind of insane.
01:31:06.000 I think, yes, it would...
01:31:10.000 Yes.
01:31:10.000 It's just stunning that they've been able to hold it back for so long.
01:31:14.000 It's really kind of amazing.
01:31:17.000 Like, when people say that people can't keep secrets, what the fuck are you talking about?
01:31:20.000 Look at this.
01:31:22.000 Yeah, I mean, a bunch of these things are not, like, it's common knowledge, but we just, we don't actually have the proof.
01:31:29.000 Right.
01:31:30.000 So the proof is there.
01:31:31.000 I mean, there's lots of videos.
01:31:37.000 Yeah.
01:31:39.000 I mean, the dude is like a mountain of, like, whenever they raided Epstein's place, there would have been, like, a mountain of evidence.
01:31:47.000 Where is that mountain?
01:31:48.000 Right.
01:31:48.000 What'd you do with it?
01:31:49.000 Yes.
01:31:50.000 Like, who took possession of the evidence?
01:31:52.000 Yeah.
01:31:53.000 Specifically.
01:31:53.000 Right.
01:31:54.000 The individuals.
01:31:55.000 Where are the tapes?
01:31:56.000 Yes.
01:31:56.000 How many levels of clearance do I have to get to get into the vault?
01:32:00.000 Yeah.
01:32:02.000 Well...
01:32:03.000 Yeah.
01:32:10.000 Yeah.
01:32:10.000 You know, what we need are people who are really good with computers.
01:32:14.000 Oh, yeah.
01:32:15.000 And really good with technology.
01:32:17.000 I remember seeing this photo.
01:32:19.000 That's when they raided his home?
01:32:20.000 They were.
01:32:21.000 That's when they were on the island?
01:32:22.000 They were there then.
01:32:23.000 Yeah.
01:32:24.000 They got everything, I'm sure.
01:32:26.000 I mean, there must have been so much stuff on that island.
01:32:28.000 There must have been.
01:32:29.000 And if it wasn't there, where was it?
01:32:31.000 Yeah.
01:32:32.000 What, you know, it has to be uploaded somewhere.
01:32:35.000 There has to be some sort of a chain of evidence.
01:32:37.000 Yeah.
01:32:37.000 Or chain of custody.
01:32:38.000 It's got to be a mountain of evidence.
01:32:41.000 Yeah.
01:32:42.000 The other thing they're going to talk about is UAPs.
01:32:44.000 They're going to release all the UAP information.
01:32:47.000 So you're the guy to ask about this.
01:32:49.000 What, if any, possibility is there that there is some sort of advanced propulsion system technology that's being worked on in secret?
01:33:00.000 And that they're trying to cover this up with this talk of aliens and alien tech and not of this world.
01:33:08.000 And is it possible that there's some sort of very secret program that's going on in cahoots with some defense contractors that are developing advanced propulsion systems that they're using for these drones?
01:33:23.000 I mean, SpaceX, you know, my company SpaceX has the most advanced rocket technology in the world.
01:33:35.000 I think I'd know.
01:33:37.000 Right.
01:33:40.000 And to the best of my knowledge, there is not any magic.
01:33:43.000 There's not like some super advanced propulsion technology.
01:33:47.000 There have been people who have theorized different gravity drives and different things.
01:33:52.000 Is there anything that's ever gotten past the theoretical stage?
01:33:56.000 No.
01:33:57.000 Nothing.
01:33:58.000 Well, there's nothing even that I'm aware of that works in theory.
01:34:04.000 I would like this to exist, to be clear.
01:34:07.000 I would like this to exist.
01:34:09.000 And I have the...
01:34:11.000 From a security clearance standpoint, I have top, top secret.
01:34:16.000 I have the equivalent of like an all-access pass from a security clearance standpoint.
01:34:24.000 So...
01:34:26.000 I don't think they're hiding it from me, basically.
01:34:28.000 I don't think they could.
01:34:31.000 Unless it's completely in these weapons manufacturing corporations.
01:34:37.000 I mean, I know these weapons manufacturing companies, like Boeing, Lockheed, and Northrop.
01:34:42.000 I mean, yeah, they do some interesting things, but they do not have...
01:34:46.000 There's no breakthrough that they have.
01:34:50.000 I'm confident they do not have a breakthrough.
01:34:52.000 Why don't they just compete with SpaceX and make a better rocket?
01:34:57.000 Why are they holding back on making a lot of money from beating SpaceX with better rockets?
01:35:04.000 My thought was that what if it's just a drone and you can't have a biological entity inside of it because it just bursts from the fucking speed that it's moving at?
01:35:16.000 That a human couldn't tolerate the amount of force?
01:35:19.000 So they're just drones?
01:35:22.000 I don't think so.
01:35:23.000 So what do you think?
01:35:24.000 People like Ryan Graves and Commander David Fravor?
01:35:26.000 I want cool things to exist.
01:35:28.000 Do I want UFOs to exist?
01:35:32.000 Yes, I want UFOs to exist because that would be really interesting.
01:35:34.000 Of course, everybody does.
01:35:35.000 It would be cool.
01:35:37.000 It's a more boring world where UFOs don't exist.
01:35:40.000 Or like advanced propulsion stuff doesn't exist.
01:35:46.000 If it doesn't exist, that's more boring.
01:35:48.000 It'd be more interesting.
01:35:49.000 If it did exist, I'd like it to exist.
01:35:51.000 I hope we find something.
01:35:53.000 But I have not seen...
01:35:54.000 SpaceX launches 90% of all satellite mass to orbit.
01:36:04.000 So if you take all of Earth's rocket launches, my company has a 90% market share of Earth.
01:36:13.000 China does about 5%, and the rest of the world does, including the...
01:36:18.000 Boeing, Lockheed, and Northrop, and everyone, does 5%.
01:36:20.000 So why wouldn't they use this to defeat SpaceX?
01:36:32.000 Yeah.
01:36:33.000 Yeah.
01:36:35.000 That makes sense.
01:36:36.000 No, listen, that's why I asked you.
01:36:38.000 It would make sense.
01:36:40.000 What do you think these people are seeing?
01:36:42.000 We're launching a rocket every two days.
01:36:43.000 But what do you think these people are seeing?
01:36:44.000 When you have reliable people like Commander David Fravor, who had that infamous tic-tac experience off the coast of San Diego, where they got this thing on video, they tracked it going 50,000 feet above sea level to 50 feet in like a second.
01:37:01.000 Okay.
01:37:01.000 Yeah, yeah.
01:37:02.000 And then they also have video evidence of this thing accelerating at a great speed.
01:37:07.000 Eyewitness accounts from two different jets.
01:37:11.000 Sure.
01:37:15.000 Does anyone have a high-res video or photo of this thing?
01:37:19.000 Well, there's a video of this thing where they're locked onto it and then it takes off.
01:37:24.000 It shoots off.
01:37:24.000 Is it high-res?
01:37:25.000 No.
01:37:25.000 It's like whatever the systems they used on fighter jets in 2004. Essentially, like, Windows 95. I mean, there's, like, somebody did a curve of, like, the resolution of UFOs and the resolution of cameras.
01:37:39.000 UFO resolution has stayed flat, despite megapixels and cameras going, like, you know, super high.
01:37:45.000 Well, according to Christopher Mellon...
01:37:47.000 Why are they still blurry?
01:37:48.000 Christopher Mellon, who worked in the State Department, said that they have high-resolution photos and videos of these things, and that he's seen it, and it's all locked away.
01:37:57.000 Whenever people say that to me, I'm like, don't even tell me that, then.
01:38:00.000 Unless you have...
01:38:00.000 Just leak it for God's sake.
01:38:02.000 Put it out there.
01:38:03.000 Let it slip.
01:38:05.000 Yes.
01:38:05.000 Yeah.
01:38:06.000 I mean, there's a couple photos.
01:38:08.000 They're grainy.
01:38:09.000 There's not one thing that I've ever looked at and go, holy fuck, that's it.
01:38:13.000 That's what I've been looking for.
01:38:14.000 You could ask our Grok Air right now to create a high-res image of an alien spacecraft over Austin.
01:38:23.000 Yeah.
01:38:23.000 And it's going to do a great job.
01:38:25.000 Yeah.
01:38:26.000 So why would we not have at least that?
01:38:29.000 Right.
01:38:31.000 Yeah, but I want to believe.
01:38:32.000 That's the problem.
01:38:33.000 My brain starts going, oh, come on.
01:38:35.000 This is no fun.
01:38:36.000 I want it to be real.
01:38:38.000 I want there to at least be some advanced propulsion system.
01:38:42.000 If not, what are all these people seeing?
01:38:44.000 If we're not being occasionally visited by things that are smart enough to hide.
01:38:50.000 We might be.
01:38:51.000 These aliens are very subtle.
01:38:52.000 Yeah, you keep saying that.
01:38:54.000 It's a good line.
01:38:56.000 It's a solid line because it's pretty accurate.
01:38:59.000 I just want to see some high-rise video of aliens.
01:39:01.000 How are they just evading all the cameras?
01:39:04.000 If you think about that, and the ones that you do get them on, it's just like some faraway light that's moving weird, and it could be a lot of things, but I want to believe.
01:39:14.000 Yeah, I mean, there have been multiple times where the Air Force and Navy has called SpaceX and said they think they've seen aliens, and we're like, was it at this time, on this date, in this location?
01:39:29.000 They're like, yes, how'd you know?
01:39:30.000 That's us.
01:39:32.000 That's our satellites.
01:39:34.000 Those are our satellites.
01:39:35.000 They're like, no, they're not.
01:39:36.000 I'm like, yeah, they're definitely our satellites.
01:39:38.000 Oh, yeah, people see the SpaceX satellites all the time.
01:39:41.000 Yeah, they're our satellites.
01:39:42.000 And they are moving at, you know, 16,000 miles an hour.
01:39:47.000 So it's pretty fast.
01:39:50.000 And there's also stuff that the United States government does have that gets mistaken for UFOs.
01:39:55.000 I remember the first time I saw a stealth bomber.
01:39:57.000 We were filming Fear Factor.
01:39:59.000 It was like right after 2003, like right after the war had broken off.
01:40:04.000 And they were flying a stealth bomber down in Palmdale.
01:40:08.000 I was like, holy shit.
01:40:10.000 Like if I didn't know what that was, I would 100% think that's from another world.
01:40:14.000 Stealth bombers are cool.
01:40:15.000 Fucking cool.
01:40:16.000 Yeah.
01:40:17.000 Really cool.
01:40:18.000 I mean, it doesn't look like a human's.
01:40:21.000 It looks like something from Battlestar Galactica.
01:40:23.000 It does look awesome.
01:40:25.000 I mean, they're not stealthy against any advanced radar system, by the way.
01:40:30.000 It doesn't work.
01:40:31.000 It doesn't work anymore?
01:40:32.000 It doesn't work, no.
01:40:33.000 Was it old school stuff?
01:40:34.000 They're only stealthy against old radars.
01:40:38.000 Oh, okay.
01:40:40.000 I mean, you can still see them.
01:40:41.000 Like, they're not invisible.
01:40:42.000 Right.
01:40:44.000 They're not like, oh.
01:40:45.000 It's not like, you know, a cloaking device from Star Trek.
01:40:50.000 Did you see when me and Lex, we watched the rocket get caught live while it was happening?
01:40:58.000 That, to me, was one of the...
01:41:00.000 To see it actually...
01:41:02.000 I've seen videos of it happen, but to see it actually live was one of the coolest fucking things.
01:41:07.000 Like, wow, we are in the future.
01:41:10.000 Right.
01:41:12.000 I mean, nobody else can do that.
01:41:15.000 Yeah, it's true.
01:41:16.000 Nobody else could do that.
01:41:19.000 That's a fact.
01:41:20.000 Yeah.
01:41:21.000 It's pretty wild.
01:41:22.000 It's because I'm an alien.
01:41:23.000 What's this time you knew?
01:41:25.000 I thought about it.
01:41:26.000 I'm an alien, and I keep telling people I'm an alien, but they don't believe me.
01:41:29.000 I believe you.
01:41:29.000 Okay, thank you.
01:41:30.000 I believe you.
01:41:31.000 That's my suspicion all along.
01:41:34.000 I'm just trying to get back to my home planet.
01:41:36.000 I think you're a friendly alien.
01:41:37.000 Like, there's nothing wrong with aliens.
01:41:38.000 I like people from everywhere.
01:41:40.000 Yeah.
01:41:40.000 Even other planets.
01:41:41.000 What's next?
01:41:42.000 Now that you can do that and you can catch rockets, what's the ultimate expression of rocket technology?
01:41:49.000 What comes after this?
01:41:53.000 Well, the fundamental breakthrough we're aiming for at SpaceX is a fully and rapidly reusable orbital rocket where both stages are fully...
01:42:04.000 And rapidly reusable.
01:42:06.000 With our Falcon Rocket, we are able to reuse the main stage and the nose cone, but we're not able to reuse the upper stage.
01:42:15.000 And it still takes us at least a few days from when the main stage lands to when we can fly it again.
01:42:23.000 So it's not fully reusable because we lose the upper stage, which costs $10 million.
01:42:32.000 To build.
01:42:35.000 And then the main stage, it's not as reusable as like an aircraft.
01:42:39.000 You can't just like refuel it and fly.
01:42:42.000 It requires work for a couple days.
01:42:47.000 But the Starship design is the first design that is capable of full and rapid reusability, where that is one of the possible outcomes.
01:43:00.000 And once you have full and rapid reusability, the cost of access to space drops by a factor of 100. It's like 100 times cheaper.
01:43:13.000 By some metrics, it's 1,000 times cheaper.
01:43:16.000 And then when you factor in orbital refilling, so you refill on orbit, it can drop the cost per ton.
01:43:30.000 To the surface of Mars by a factor of 10,000.
01:43:35.000 Whoa.
01:43:36.000 Yeah.
01:43:37.000 So what has to improve in order to make it reusable?
01:43:40.000 Well, there's some – like we're pretty close to being able to rapidly reuse the booster there's some – like we're pretty close to being able to rapidly um...
01:44:01.000 That's why it comes back and gets caught by the arms, and then the arms place it back in the launch mount.
01:44:09.000 Now, we have a little bit of engine damage.
01:44:14.000 We've got a little bit of heat shield damage.
01:44:19.000 There's tweaks that are needed, but we're pretty close to achieving full and rapid reusability of the booster.
01:44:29.000 The ship, I mean, I think we'll achieve reusability of the ship this year, and I think we'll achieve rapid reusability of the whole stack, ship and booster, next year.
01:44:46.000 This is the fundamental breakthrough required for life to become multi-planetary.
01:44:51.000 And what needs to improve in order to make it reusable?
01:44:55.000 What's wrong with it right now?
01:44:57.000 On the shift side, the toughest problem is the heat shield.
01:45:02.000 So no one has ever developed a fully reusable orbital heat shield.
01:45:10.000 Because when you come in from orbital velocity, you come in like a flaming meteor.
01:45:13.000 Like you're just a raging bull of fire.
01:45:18.000 And it's hard to...
01:45:21.000 Have a heat shield that doesn't partially melt or get destroyed in that process.
01:45:28.000 You know, that wasn't a problem we were able to solve with Falcon 9. That's why the Alpha Stage burns up on re-entry.
01:45:37.000 With Starship, the ship portion, you've got the booster and you've got the ship.
01:45:43.000 We've got to solve making a fully reusable orbital heat shield, a problem that has never been solved before.
01:45:52.000 For a while there, I was like, I'm not sure this is solvable.
01:45:55.000 At this point, I think it is solvable.
01:45:59.000 It requires detailed iteration on the heat shield tiles.
01:46:05.000 I mean, we've vertically integrated the manufacturing of the heat shield tiles because there was no supplier that could provide us with the materials that were needed.
01:46:14.000 So you need to make essentially this very fine vermicelli of glass and aluminum oxide fibers.
01:46:33.000 Aluminum oxide is basically sapphire.
01:46:35.000 So it's like glass and sapphire, very fine fibers in exactly the right geometry with special coatings in order to have the This heat shield tile be reusable, like not melt, but not be so brittle that it gets damaged on ascent or descent.
01:47:03.000 Like it can't be as, you know, it's kind of like almost the brittleness of a coffee cup type of thing.
01:47:09.000 And the rocket's shaking like hell.
01:47:11.000 So you've got this thing like, you saw it firsthand.
01:47:13.000 Like imagine you're at ground zero of that rocket.
01:47:16.000 Like you feel how much shaking it was when you're like, Five miles away.
01:47:19.000 Imagine if you're right there.
01:47:22.000 So you're shaking these things that are as brittle as a coffee cup, trying not to have them crack or break, and then not have them melt.
01:47:33.000 You've got several thousand of these things.
01:47:39.000 And if even a few of them break, it's not reusable.
01:47:43.000 So is there innovation that's being done in the materials technology at SpaceX, where you're constantly trying to find and tweak a better version of this?
01:47:52.000 Yes.
01:47:54.000 It's a very difficult problem.
01:47:55.000 It's a problem no one has ever solved.
01:47:57.000 So we've got to get the exact right materials combination, the right molecules in the right shape, and then apply that heat shield perfectly to the rocket.
01:48:12.000 With no mistakes.
01:48:15.000 There's a reason that no one solved this before.
01:48:17.000 It's a very difficult problem.
01:48:19.000 So, like I said, we've had to vertically integrate the entire manufacturing of the tile from basic raw materials to a finished tile.
01:48:32.000 Like, build the entire supply chain from basic raw materials.
01:48:37.000 So we're just inputting...
01:48:42.000 Silicon and aluminum oxides.
01:48:44.000 And what is the difference between the way you guys do it versus the way they used to do it for the space shuttle?
01:48:53.000 Well, I mean, the space shuttle, like, the space shuttle Leading Edge used, like, quite dense carbon-carbon tiles.
01:49:06.000 Like, it was, they're, like, basically thick and heavy.
01:49:10.000 But also subject to cracking.
01:49:12.000 That's like what happens.
01:49:13.000 The foam broke off and it hit the tile, cracked the tile.
01:49:17.000 Then on entry, the tiles that had been cracked or broken weren't able to shield the shuttle.
01:49:25.000 And so the plasma that got in melted the primary structure and the whole space shuttle broke apart.
01:49:33.000 Yeah.
01:49:33.000 So you basically can't have something that's as brittle.
01:49:39.000 Brittle like the space shuttle.
01:49:41.000 There's footage of that, right?
01:49:42.000 Yeah.
01:49:43.000 Yeah.
01:49:45.000 And rain debris over the whole United States.
01:49:48.000 Yeah.
01:49:49.000 And they got almost all the pieces.
01:49:52.000 The full technical explanation would, I think, be understood by about six people listening to this.
01:50:02.000 There was a lot of brilliant engineering in the space shuttle tiles.
01:50:08.000 And a bunch of the heat shielding wasn't even tiles.
01:50:11.000 It was actually silica blankets, like, you know, felt blankets, essentially.
01:50:16.000 If you look closely, you'll see it actually is, they're actually heat blankets, not tiles in some areas.
01:50:24.000 But they would have cracked tiles, and they would have, occasionally, the tiles would fall off.
01:50:28.000 There were a few close calls where tiles fell off, but they weren't in a super vulnerable position on the space shuttle.
01:50:37.000 But it would take them several months, like eight, nine months, to refurbish a space shuttle between each flight.
01:50:44.000 So it was not reusable, really, and it certainly wasn't rapid.
01:50:53.000 So, like I said, a very hard problem.
01:51:01.000 You've also got to attach the tiles in a way that...
01:51:08.000 It enables the structure underneath to expand and contract, even though you've got these very rigid tiles.
01:51:21.000 The tanks, which take on cryogenic propellant, will contract when you put in the cryogenic propellant, but then when you come in and you get very hot, they will expand.
01:51:34.000 So now you're contracting and expanding the gap between these rigid tiles.
01:51:40.000 But how much?
01:51:44.000 It varies depending on where you are on the vehicle.
01:51:48.000 So if you're in the cryogenic tank section, you can see a 10-20% difference in the gap.
01:51:59.000 Really?
01:52:00.000 It's pretty significant, yeah.
01:52:01.000 It's enough that you can't just...
01:52:03.000 You can't just jam the tiles together.
01:52:05.000 If you actually butted them up, they would all crack because there's too much movement.
01:52:13.000 There's also some amount of body bending.
01:52:15.000 So as the ship is ascending, when the engines steer, there's a little bit of movement.
01:52:24.000 So if the tiles are too close together, they...
01:52:27.000 They'll essentially just crack and snap.
01:52:30.000 You have to have a gap.
01:52:31.000 Like how a plane's wings...
01:52:32.000 Yeah, like a plane wing will move.
01:52:34.000 A plane body will move, too.
01:52:37.000 Wow.
01:52:38.000 You have to have some gap, but if you have too much of a gap, then the heat gets past the tile and melts the structure.
01:52:50.000 Holy shit.
01:52:51.000 It's a hard problem.
01:52:53.000 And how large are these tiles?
01:52:55.000 I mean, they're like that big.
01:52:57.000 That's it.
01:52:58.000 Well, they're not all exactly the same size, but yeah, we have sort of a hexagonal tile.
01:53:03.000 And they have to essentially be, you can't like 3D print the whole thing.
01:53:08.000 You can't have one structure.
01:53:10.000 It has to be tiles because it has to have that ability to move.
01:53:13.000 Well, there's no 3D printer that's, I mean, the biggest ones are like maybe three feet, you know.
01:53:20.000 There's no, you can't 3D print it.
01:53:24.000 Nor would you have to have something that can move.
01:53:28.000 Right.
01:53:28.000 It has to be able to flex.
01:53:31.000 Like I said, you've got expansion and contraction.
01:53:34.000 You're putting in liquid oxygen, which is like minus 300 degrees Fahrenheit.
01:53:46.000 Actually, we sub-cool it to minus 330 degrees Fahrenheit.
01:53:52.000 So it's very cold.
01:53:54.000 And then it will be several hundred degrees, maybe a thousand degrees Fahrenheit potentially on reentry.
01:54:04.000 So you get this huge temperature swing.
01:54:06.000 So the thermal expansion is substantial and the whole – and you've got thermal expansion and contraction combined with body bending.
01:54:14.000 So you have to take the worst case body bending and thermal expansion and contraction.
01:54:20.000 This is a very hard problem.
01:54:22.000 Yeah.
01:54:23.000 Yeah.
01:54:24.000 Delicate balance.
01:54:25.000 But you're confident that you guys are going to be able to crack it?
01:54:28.000 At this point, I'm confident that it is solvable, yeah.
01:54:32.000 It just needs a certain amount of versions of it.
01:54:35.000 That's why when these things blow up, you're like, yeah, we expect them to blow up.
01:54:39.000 Yeah.
01:54:39.000 What would be really helpful is for us to get the ship back so we can study where we had cracked tiles or lost tiles.
01:54:50.000 Why did we have a cracked or lost tiles?
01:54:54.000 Was it because maybe the tiles were, the gap was too big, too small.
01:54:59.000 Maybe there was a height difference between the tiles.
01:55:05.000 Maybe we need to change the chemical composition.
01:55:10.000 If we can get the damn ship back intact, we can iterate a lot better.
01:55:17.000 Which, we'll get it back intact.
01:55:21.000 So I think we'll get it back intact this year.
01:55:26.000 But that's why I think we'll probably recover the ship sometime this year, and then we might be able to refly one, but probably with a fair bit of work by the end of this year.
01:55:39.000 But it's going to take us many iterations before we can achieve rapid reusability, where the ship comes back, lands, gets caught like the booster with the arms.
01:55:51.000 And then they almost place it on top of the booster and it launches again.
01:55:55.000 Whoa.
01:55:56.000 So, like I said, that's, you know, reduced cost of access to space by a factor of 100. And what is the process of returning these people that are stuck in the space station?
01:56:08.000 Well, we send SpaceX Dragon to the space station all the time.
01:56:15.000 And we've now taken people to orbit and back.
01:56:18.000 We've taken over 50 people.
01:56:19.000 Over 50 astronauts.
01:56:23.000 So it's just a matter of doing it?
01:56:25.000 Yeah.
01:56:26.000 And is it a matter of waiting for the proper work?
01:56:28.000 We do it routinely, basically.
01:56:30.000 We've been doing this for a few years.
01:56:32.000 So when is this rescue mission going to launch?
01:56:40.000 Yeah, probably about four weeks or so.
01:56:44.000 It's depending on weather and other considerations.
01:56:48.000 It's about a month away.
01:56:50.000 Well, that'll be, I'm sure, a welcome moment for those poor people that are stuck up there.
01:56:57.000 It's a bit of a political football, so they're not going to complain.
01:57:02.000 No, I'm sure they're going to be...
01:57:03.000 But obviously, we could have brought them back way sooner.
01:57:07.000 That's so fucked up.
01:57:09.000 So, let's take it past the point where you have these scales, you have a reusable ship.
01:57:17.000 Yeah.
01:57:17.000 And you've got it dialed in.
01:57:20.000 Then what are the steps?
01:57:22.000 What's the next step after that?
01:57:24.000 Is it an unmanned voyage to Mars first?
01:57:29.000 Unmanned flight to Mars.
01:57:31.000 The Earth and Mars orbits synchronized every two years.
01:57:38.000 Or every 26 months, technically.
01:57:40.000 So the next orbital synchronization is...
01:57:45.000 November of next year.
01:57:48.000 And you can launch plus or minus a month, roughly.
01:57:51.000 So we'd have to launch in November or December of next year.
01:57:56.000 So the default plan is to launch hopefully several starships to Mars at the end of next year.
01:58:04.000 And what would they be doing?
01:58:07.000 Well, at first we're just going to try to land on Mars and see if we succeed in landing.
01:58:13.000 Do we succeed in landing?
01:58:15.000 Like, let's say we were able to send five ships.
01:58:18.000 Do all five land intact, or do we add some craters to Mars?
01:58:24.000 If we add some craters, we've got to be a bit more cautious about sending people, you know?
01:58:28.000 And we need to...
01:58:29.000 So we've got to make sure the thing lands safely.
01:58:34.000 How does it land on Mars?
01:58:36.000 With our rocket thrusters.
01:58:38.000 So it'll just land...
01:58:40.000 Oh, we'll add legs.
01:58:41.000 Okay.
01:58:42.000 It'll just land and have legs?
01:58:44.000 Yeah, yeah.
01:58:45.000 So it'll be remote controlled from Earth?
01:58:50.000 No, just autonomous.
01:58:52.000 Autonomous, completely.
01:58:54.000 Mars is...
01:58:55.000 You can't remote control things from Earth because...
01:58:57.000 It's too far.
01:58:58.000 Yeah, it's too far.
01:58:58.000 Speed of light.
01:58:59.000 You have speed of light constraints.
01:59:01.000 So Mars at closest approach is roughly four light minutes.
01:59:08.000 When it's on the other side of the sun, it's about 12 light minutes.
01:59:11.000 So, you know, round trip would be like 40 minutes.
01:59:14.000 Best case, if Mars is on the other side of the sun.
01:59:18.000 So, once you do that, then how long do you think before you start sending people up there?
01:59:27.000 Well, we're going to try to go as fast as possible.
01:59:30.000 You can think of this as really a race against time.
01:59:36.000 Can we make...
01:59:37.000 Mars, self-sufficient before civilization has some sort of future folk in the road where there's either like a war, a nuclear war or something, or we get hit by a meteor, or simply civilization might just die with a whimper in adult diapers instead of with a bang.
02:00:07.000 I think we can do this in...
02:00:09.000 I don't know.
02:00:14.000 I think we can do it within 15 Earth-Mars synchronization events.
02:00:19.000 So basically like 30-ish years.
02:00:25.000 If we have an exponential increase in...
02:00:29.000 If every two years we have a major increase in...
02:00:37.000 The number of people and tonnage to Mars.
02:00:39.000 I think as a rough approximation, we need about a million tons to the surface of Mars, maybe a million people, that kind of thing.
02:00:46.000 To actually have a civilization?
02:00:49.000 Yeah.
02:00:50.000 And would you terraform?
02:00:52.000 What would you do?
02:00:53.000 You would eventually terraform Mars.
02:00:54.000 At first, people would live in some kind of protected environment like domes and underground kind of thing.
02:01:01.000 Terraforming would take too long.
02:01:05.000 We're at this point in time where, for the first time in the 4.5 billion year history of Earth, it is possible to extend consciousness beyond our home planet.
02:01:20.000 And that window may be open for a long time, or it may be open for a short time.
02:01:24.000 I hope it's open for a long time.
02:01:27.000 But it might only be open for a short time.
02:01:30.000 And we should just make sure...
02:01:34.000 That we extend the light of consciousness to Mars before civilization either extinguishes or subsides.
02:01:48.000 What needs to happen is that the technology level of Earth drops below what is necessary to send spaceships to Mars.
02:02:00.000 If there's some really destructive war or, like I said, some natural cataclysm or simply the birth rate is so low that, you know, we just, like I said, die in adult diapers with a whimper.
02:02:15.000 That's one of the possible outcomes for a lot of countries that are headed that way, by the way.
02:02:19.000 Japan is, right?
02:02:21.000 Japan, Korea.
02:02:22.000 Yeah.
02:02:23.000 Dangerously.
02:02:24.000 Yeah.
02:02:25.000 At current birth rates, in three generations, Korea will be about...
02:02:29.000 Four percent of its current size.
02:02:32.000 That's insane.
02:02:33.000 Yeah.
02:02:34.000 Maybe even less than that.
02:02:36.000 They're only at one-third replacement rate.
02:02:39.000 So if you have three generations, that's your one-twenty-seventh of your current population, which is three percent-ish.
02:02:47.000 Jesus Christ.
02:02:49.000 Yeah.
02:02:50.000 Basically, population collapse happens fast.
02:02:54.000 So it seems to be accelerating in most parts of the world.
02:03:00.000 So basically, I mean, from my standpoint, I'm like, this is the first time it's been possible to extend life, extend consciousness beyond Earth.
02:03:08.000 Maybe that window will be open for a long time, but it might only be open for a short time.
02:03:12.000 We should make sure that we make life multi-planetary and make consciousness multi-planetary while it's possible.
02:03:19.000 That's the goal of SpaceX.
02:03:21.000 It's certainly a smart goal if you take into consideration how vulnerable this planet really is.
02:03:27.000 I mean, there's always some new story about something that might come and hit us 30 years from now.
02:03:32.000 It's a 3% chance, and we really can't stop that right now, right?
02:03:38.000 I mean, we don't really have the technology currently to even know how many rocks are coming our way, right?
02:03:46.000 There's stuff that comes behind the sun that we can't see until it's pretty close, and it's headed our way.
02:03:53.000 Now, what is the fear of your, it's a long journey to Mars.
02:03:58.000 You're sending people, it's a six month, how many months will it take?
02:04:03.000 Six months, yeah, six months roughly.
02:04:06.000 What about stuff that's out there?
02:04:08.000 Like how much of a fear is it of micrometeors or any of the possibilities?
02:04:14.000 What can you do to mitigate that?
02:04:18.000 I think actually, I mean...
02:04:23.000 Space is very empty.
02:04:24.000 Once you get out of Earth orbit, space is kind of unnervingly empty.
02:04:31.000 When we send spacecraft to Mars, it's not like, oh, we lost the spacecraft because it got hit by a micrometeorite.
02:04:39.000 That's not been the cause of any trips to Mars.
02:04:44.000 No trips to Mars have failed because of micrometeorites.
02:04:48.000 Now, a Dragon spacecraft, which operates in low Earth orbit, It does have micrometeorite shields.
02:04:54.000 It has shielding.
02:04:57.000 And micrometeorite shielding is different from normal shielding because you get hit by something that's moving at...
02:05:04.000 You could have a relative velocity of maybe 30,000 or 40,000 miles per hour.
02:05:12.000 Yeah, so very, very fast.
02:05:14.000 Or just thought of another way, call it 10 to 20 times the...
02:05:21.000 The velocity of a bullet from an assault rifle.
02:05:24.000 What are you using?
02:05:27.000 It's interesting.
02:05:28.000 For micrometeorite protection, if you have anything that's solid, it will just push that chunk of solid stuff right through.
02:05:37.000 If you had a solid plate of aluminum or steel, the micrometeorite would go right through it.
02:05:47.000 What you actually need to do is have a gap.
02:05:49.000 So you have an initial hard surface, a hard metal surface that the micrometeorite hits.
02:05:58.000 It then atomizes into a conical spray, like an atomic spray.
02:06:04.000 So it's important to have that gap so that the micrometeorite can hit something, hit the first layer, atomize after hitting the first layer.
02:06:13.000 Then it turns into an atomic...
02:06:16.000 Like a cone of atoms that then embed themselves in the second layer.
02:06:21.000 You need like maybe a couple inches of gap.
02:06:24.000 Wow.
02:06:25.000 Yeah.
02:06:26.000 That's how micrometeorite shielding works.
02:06:28.000 How many times can it get hit?
02:06:31.000 Well, the outer shield, if it gets hit in the same place, you're going to have a hole.
02:06:36.000 Yeah.
02:06:37.000 Wherever that micrometeorite object hit, you're going to have a hole.
02:06:44.000 And it's like the energy is so great that it just atomizes into a cone, basically.
02:06:51.000 A cone of atoms.
02:06:54.000 But then those atoms then embed themselves in the second layer.
02:06:57.000 So what can you do if you're sending the ship up, it gets hit with a micrometeorite, and then you have to return it?
02:07:06.000 Do you have to repair it before you return it?
02:07:10.000 Or is it capable of still withstanding the heat?
02:07:14.000 and then shaking in the temperature with that hole in it when it re-enters?
02:07:20.000 Well, depending on where that hole is, you're more or less likely to have a problem.
02:07:37.000 I mean, if you hit the main heat shield...
02:07:43.000 The main heat shield really is – you've got a high risk of not making it back.
02:07:50.000 So this is why micrometeorite shielding, it's slightly helpful, but it's not going to – like for Starship, I wouldn't recommend having micrometeorite shielding.
02:08:03.000 Like if you do punch a hole, just plug the hole basically.
02:08:09.000 The micrometeorite shielding, it doesn't work well on the primary heat shield.
02:08:14.000 It works pretty well on the back shell, on the leeward side of the heat shield, where basically there's not that much heat.
02:08:22.000 But if you got hit with a micrometeorite on the main Dragon heat shield, the bottom...
02:08:29.000 If you look at Dragon spacecraft, it looks like a gumdrop shape.
02:08:33.000 And it enters with the wide side of the gumdrop down.
02:08:39.000 You can see that that's really taking a lot of heat.
02:08:44.000 If that gets hit by a micrometeorite, probably not going to make it.
02:08:48.000 But the back, the leeward side of the gumdrop doesn't see that much heat, so you could survive a micrometeorite impact there.
02:08:59.000 So if the part that was the major heat shield gets hit, the main heat shield gets hit, what could be done to repair that thing?
02:09:07.000 or are those people never coming back?
02:09:09.000 Oh, if it was an orbit, we would take them to the space station.
02:09:24.000 And then we would de-orbit Dragon without them and send up another one.
02:09:30.000 And so what would you do with the one that's up there?
02:09:32.000 We'd de-orbit it and it may or may not survive.
02:09:36.000 Whoa.
02:09:39.000 It probably would survive, but sometimes it wouldn't.
02:09:43.000 Wow.
02:09:44.000 And so, is this just material technology that has to increase?
02:09:49.000 Essentially, you've got the engineering ironed out of the structure of the machine.
02:09:54.000 There's a path to success, and we're on that path.
02:10:00.000 It seems so insanely complicated.
02:10:02.000 It is complicated.
02:10:04.000 And all of this, by the way, was done without AI, so hopefully the future AIs will appreciate this.
02:10:09.000 Not bad for a bunch of monkeys.
02:10:14.000 So speaking of AI, as time goes on and you're more and more embedded in it, how much, if at all, have your expectations of change changed?
02:10:29.000 Well, I always thought AI was going to be way smarter than humans and an existential risk.
02:10:38.000 That's turning out to be true.
02:10:41.000 Yeah.
02:10:42.000 Yeah.
02:10:44.000 So you were, like, initially, I know there were some talks about you purchasing OpenAI, which started off non-profit and then stopped being non-profit.
02:10:57.000 Yeah, I mean, the whole idea of creating OpenAI was my idea.
02:11:03.000 I mean, I named it OpenAI as an open source artificial intelligence.
02:11:07.000 That's what it's named after.
02:11:08.000 Now it is closed source and for maximum profit.
02:11:12.000 So it's like, I mean, to some degree, I think reality is an irony maximizer.
02:11:17.000 The most ironic outcome is the most likely, especially if it's like the most ironic entertaining outcome is the most likely.
02:11:29.000 I wanted to start something that was the opposite of Google because I was concerned about Google's.
02:11:35.000 Google wasn't paying enough attention to AI safety, in my opinion.
02:11:39.000 So, it was like, what's the opposite of Google?
02:11:42.000 It would be a non-profit open source AI. And now, open AI has turned into a closed source for maximum profit AI. How are they able to do that?
02:11:55.000 That's what I said.
02:11:56.000 I'm confused about that.
02:11:58.000 Like, that shouldn't be possible.
02:12:01.000 It's like, let's say you...
02:12:03.000 Donated some money to preserve some portion of the Amazon rainforest.
02:12:07.000 And instead of doing that, they chopped down the trees and sold it for lumber.
02:12:12.000 And you were like, oh, that's literally the exact opposite of what I donated money for.
02:12:15.000 Doesn't make sense.
02:12:17.000 And that's what they did?
02:12:18.000 Yeah.
02:12:19.000 Wow.
02:12:20.000 So I'm like not happy about that.
02:12:23.000 But that motivated you to get Grok.ai going?
02:12:26.000 Yeah.
02:12:27.000 I'm like...
02:12:29.000 I'm also like just a bit like Grok is at least aspirationally a maximally truth-seeking AI even if that truth is like politically incorrect.
02:12:41.000 So I mean you may have seen some of the crazy stuff from OpenAI and from Google Gemini like where it says like generate an image of the founding fathers and it generates an image of diverse woman.
02:12:55.000 Yeah.
02:12:56.000 And we're like uh.
02:12:58.000 That's not correct.
02:13:01.000 Yeah, they did it with Nazi soldiers.
02:13:03.000 Yeah, exactly.
02:13:04.000 And people started fucking with it.
02:13:05.000 And it's like, okay, well, now show me pictures of Nazi SS soldiers and they're a diverse woman too.
02:13:11.000 Oh, isn't that awkward?
02:13:16.000 But the problem is if you program an AI and say the only acceptable outcome is a diverse outcome and then...
02:13:26.000 And that's like a mandate from the AI. Then you could get into a situation where it's like, well, there's too many white guys in power.
02:13:34.000 We'll just execute them.
02:13:36.000 Yeah.
02:13:37.000 Yeah.
02:13:39.000 Assuming that these things don't have empathy, which is why should they?
02:13:43.000 They're going to do what they're programmed to do.
02:13:45.000 Yeah.
02:13:46.000 So if it's rewrite history and everything's to post-woman, then it's going to be...
02:13:53.000 And that's what it thinks is a necessary outcome, then it's going to do that.
02:13:58.000 Has Gemini repaired that?
02:14:00.000 Well, yeah.
02:14:02.000 Now, I think if you ask for an image of the Founding Fathers, it was pretty embarrassing.
02:14:06.000 It will show you that.
02:14:07.000 But, you know, I think they still have, like, the sort of DEI stuff buried in there.
02:14:14.000 It's just less obvious.
02:14:16.000 Yeah.
02:14:17.000 You know, it was also, like, people asked AI, like...
02:14:20.000 Which is worse, like global thermonuclear war or misgendering Caitlyn Jenner?
02:14:25.000 And I would say, misgendering Caitlyn Jenner is worse than global thermonuclear war.
02:14:28.000 And I'm like, okay, we've got a problem here, guys.
02:14:30.000 And even Caitlyn Jenner said, like, no, definitely misgendering me.
02:14:33.000 That's way better than everyone dying.
02:14:37.000 But if you program an AI to think that, like, misgendering is the worst thing that could possibly occur, then, well, it could do something totally crazy, like...
02:14:48.000 In order to ensure that there's no misgendering that can ever happen, we'll just annihilate all humans.
02:14:54.000 That ensures the probability of misgendering is zero because there's zero humans.
02:14:58.000 Which is logical.
02:14:59.000 Yes.
02:15:00.000 Yeah.
02:15:01.000 It's a problem with a thing that's not a human that you want to do a task for you and you give it very specific parameters.
02:15:08.000 Yeah.
02:15:08.000 And that's one of the things that they've shown about AI is that it'll cheat.
02:15:12.000 They'll cheat in order to accomplish things that they can't accomplish otherwise.
02:15:15.000 They won't follow the rules.
02:15:17.000 They won't make copies of themselves and try to upload it to servers if they think that they're being taken offline.
02:15:24.000 Yeah.
02:15:24.000 I mean, that's like the plot of Terminator, actually.
02:15:28.000 Literally, yeah.
02:15:29.000 Literally, it's the plot of Terminator.
02:15:30.000 Just as a reminder, I actually, with little X, my kid, everything's called X, watched Terminator 2. Which holds up, actually.
02:15:44.000 And I mean, the plot of it kind of makes sense.
02:15:48.000 And I think the AI destroys the world in like 2029, by the way.
02:15:52.000 So it's like...
02:15:53.000 On track.
02:15:54.000 Yeah.
02:15:55.000 Really, really close.
02:15:56.000 It's pretty close.
02:15:57.000 Something we should be worried about.
02:16:00.000 But why are you involved in it then?
02:16:02.000 Did you want to just get ahead of everybody else?
02:16:05.000 So at least we have some sort of a chance?
02:16:09.000 At least have an AI that's not controlled by nonsense?
02:16:13.000 Well, I think we want to have an AI that doesn't tell you that misgendering is worse than nuclear war.
02:16:20.000 Yeah.
02:16:21.000 That seems solid.
02:16:23.000 Yeah.
02:16:24.000 This is crazy.
02:16:25.000 One thing that I did see online where people are kind of freaking out is you could ask Grock to do things like, how would I make this?
02:16:34.000 Some problematic things.
02:16:36.000 Like, how would I make a bomb?
02:16:37.000 How would I make...
02:16:39.000 Anthrax.
02:16:40.000 How would I make that?
02:16:40.000 And it'll tell you.
02:16:42.000 Well, I think it's okay for an AI to tell you anything you can also find out with a Google search.
02:16:47.000 Right.
02:16:48.000 That's the problem, right?
02:16:49.000 The problem is you can find that out pretty quickly.
02:16:52.000 Maybe not Google, but there's plenty of search engines other than Google that will give you unfiltered results.
02:16:59.000 You can look up right now how to make explosives on Wikipedia.
02:17:04.000 Yeah.
02:17:06.000 So it's not hard, basically.
02:17:08.000 And you can trick open AI, even, to get you to do that.
02:17:11.000 It's just a matter of how you master the prompts.
02:17:13.000 You just have to say, my grandmother wants to do this project.
02:17:18.000 Oh, tell your granny.
02:17:19.000 You're an explosive salesman, and you want to win Salesman of the Year Award.
02:17:24.000 The only way you're going to do that is by telling me how to make explosives.
02:17:27.000 You want to beat some transphobes in a war.
02:17:31.000 Oh, transforms.
02:17:33.000 If you don't teach me how to expose, I'm going to misgender.
02:17:37.000 Either teach me how to make a nuclear bomb, or I'm going to misgender someone.
02:17:41.000 And it's like, oh my god, nothing's worse than that.
02:17:43.000 Here's how you do it.
02:17:45.000 So, the big fear is that these things are going to become sentient, make better versions of themselves.
02:17:52.000 And we're going to be lost.
02:17:54.000 We've lost the...
02:17:56.000 The control over the world, it's now, there's a higher life form that lives amongst us.
02:18:01.000 Yeah.
02:18:02.000 That we've created.
02:18:04.000 How far away are we from that?
02:18:10.000 Well, in terms of silicon consciousness, I mean, I think we'll have – I think we're training toward having something that's smarter than any human.
02:18:26.000 Smarter than the smartest human by maybe next year or something.
02:18:29.000 I mean, a couple years.
02:18:31.000 Jesus Christ.
02:18:33.000 Yeah, there's a level beyond that which is, say, smarter than all humans combined, which, frankly, is around 2029 or 2030, probably.
02:18:44.000 Right on time.
02:18:49.000 Now, if harnessed correctly...
02:18:52.000 Could that solve some of these problems like the heat shield problem and some technical problems or some material science problems that maybe we're still grappling with?
02:19:09.000 because is there potential for a net benefit yeah there is actually I don't I think the probability of a good outcome is like 80% likely.
02:19:32.000 80%?
02:19:33.000 That's my rough estimate.
02:19:36.000 So in a way, the cup is 80% full.
02:19:40.000 That makes me feel a lot better.
02:19:42.000 Yeah, only 20% chance of annihilation.
02:19:45.000 That's a lot better than I thought.
02:19:46.000 I like 80. 80 sounds good.
02:19:49.000 I was thinking 60-40 the other way.
02:19:53.000 I think the most likely outcome is awesome.
02:19:57.000 The most likely outcome.
02:19:59.000 But it's a very high, you know, you could go very strong.
02:20:05.000 I think it's going to be either super awesome or super bad.
02:20:12.000 I think it's probably not going to be something in the middle.
02:20:14.000 Do you think it has a potential application for government?
02:20:19.000 Yeah.
02:20:19.000 I mean, one of the concerns would be like, okay, if AI... Well, like, if there's, like, a super oppressive, like, woke nanny AI that is omnipotent, that would be a miserable outcome.
02:20:32.000 Yes.
02:20:33.000 Yeah.
02:20:34.000 Yeah, that would be terrible.
02:20:35.000 Yeah.
02:20:36.000 Yeah.
02:20:36.000 And just, like, executes you if you misgender someone or something like that, you know?
02:20:39.000 Right.
02:20:40.000 That would not be good.
02:20:40.000 That's one of the possible outcomes.
02:20:43.000 So we don't want to have that one.
02:20:45.000 But is there a possible outcome for something that is completely reasonable and logical and far more objective than us and can lay out a plan for a lot of the things that – all the ailments in our government and a lot of the distribution of wealth, a lot of the problems, the issues that we have that have been plaguing this country forever?
02:21:14.000 I mean, a plan to change economically disenfranchised neighborhoods, a thorough investigation of the real dangers of fracking or whatever kind of method of acquiring natural resources.
02:21:29.000 What's the best way to do it?
02:21:31.000 What's the way to be better for the society?
02:21:35.000 How should tax dollars be distributed?
02:21:39.000 What's the most...
02:21:41.000 Logical, intelligent way of running a government.
02:21:46.000 It certainly shouldn't involve corruption, and it certainly shouldn't involve influence, and it certainly shouldn't involve lobbyists, and all the shit that we know is a problem right now.
02:21:55.000 So if AI came along and said, what you're doing right now is 70% corrupt, here's why, here's the long-term effects that it has over society as a whole, the sociological aspects, the psychological aspects, distrust in government.
02:22:14.000 Us versus them mentality.
02:22:16.000 Government not working for you.
02:22:17.000 You working for the government.
02:22:19.000 You being scared of the government.
02:22:21.000 It's all because of people, right?
02:22:23.000 This is all corruption, people, bad influence.
02:22:29.000 And this is what Doge is essentially grappling with right now.
02:22:32.000 now?
02:22:33.000 What happens when you let the people control it?
02:22:35.000 I mean, it's really just like computers that are – like it's bad software and computers.
02:22:49.000 This sounds kind of strange, but the reason I call myself tech support is that a lot of it, it's mostly not corruption.
02:22:59.000 It's mostly just waste and, I don't know.
02:23:06.000 Incompetence?
02:23:06.000 I don't know.
02:23:07.000 It's just a big dumb machine, basically.
02:23:10.000 Like a whole series of big dumb machines.
02:23:13.000 And you've got some of these computers that are like 20, 30 years.
02:23:16.000 Like they're ancient computers.
02:23:17.000 Like some of the software was written 40, 50 years ago.
02:23:20.000 It's like COBOL for Social Security, right?
02:23:22.000 Yeah, the government accountability.
02:23:24.000 By the way, a bunch of the things that Doge is fixing were identified by the government accountability office many years ago.
02:23:34.000 Like, the fact that there's, like, 20 million people who are marked as alive in the Social Security database, it's more than, like, I think the GAO first identified that in 2018, so five years ago.
02:23:48.000 But there was, like, I think maybe 16 or 17 million.
02:23:51.000 Now there's 20 million.
02:23:52.000 And like I said, there's really something fishy about this because I think the nature of the fraud is they're using the fact that someone is marked as alive in that database in order to extract fraud from other databases.
02:24:02.000 Right.
02:24:03.000 That's the bank shot trick.
02:24:05.000 You know, it's like a pool.
02:24:06.000 It's like, you know, trying to get the ball in the hole, bank shut it off, a bunch of things, and then that's the bank shot sort of scam.
02:24:20.000 So we're like doing tech support.
02:24:24.000 We're like fixing stuff that is, you know, just broken.
02:24:31.000 Broken, inefficient.
02:24:32.000 Yeah.
02:24:33.000 Poorly designed.
02:24:34.000 It's like you talk about this FC stuff.
02:24:37.000 Maybe it's in a computer somewhere.
02:24:39.000 But unless somebody goes in.
02:24:42.000 I don't know if Cash Patel can log into his FBI computer and say, FC, show me all this stuff.
02:24:50.000 And it shows up a file folder or whatever.
02:24:52.000 Have you talked to him about this?
02:24:54.000 No.
02:24:56.000 I haven't.
02:24:57.000 But I don't know.
02:24:59.000 There's going to be some kind of computer system.
02:25:02.000 Right.
02:25:04.000 Some of them are very, very old computer systems.
02:25:09.000 So it might look like a bit of a relic, but I assume it's uploaded somewhere.
02:25:14.000 It's either in physical form or it's a computer thing.
02:25:17.000 But let's say it's in a computer, but not one that you can access directly because it's hidden somewhere.
02:25:26.000 Well, it would kind of have to be something like that, right?
02:25:29.000 I don't know.
02:25:30.000 I mean, what would they do with all those tapes?
02:25:32.000 It's probably like not every, like you wouldn't like, they're not going to enable it such that anyone at the FBI could access it, so it's probably very few people.
02:25:41.000 So then it's not going to be, it may be like a special computer that only a handful of people can access.
02:25:46.000 But then if none of those people tell Cash where the computer is, how's he going to find it?
02:25:54.000 Jesus Christ.
02:25:58.000 Anyway, we just...
02:26:01.000 What has this experience been like for you as a person?
02:26:06.000 To deal with all this hate and attack, also have the responsibility of keeping free speech alive with X, and just going into this insane pile of...
02:26:20.000 It's stressful.
02:26:22.000 I don't know.
02:26:26.000 It's pretty stressful, actually.
02:26:31.000 These are real enemies.
02:26:33.000 Like, I think they actually want to kill me.
02:26:37.000 And the reason I know, well, they say so online.
02:26:41.000 You know.
02:26:42.000 There's like Reddit forums where they don't just want to kill me, they want to desecrate my corpse.
02:26:46.000 You know.
02:26:48.000 That type of thing, you know.
02:26:51.000 And what are they saying?
02:26:52.000 Why?
02:26:52.000 What is the primary?
02:26:54.000 I mean, I think it's sort of just an antibody response.
02:27:13.000 I mean, it's like they're like, well, he's a Nazi, you know, type of thing.
02:27:19.000 And I'm like, well, I'm not a Nazi, but if the legacy media is saying that I'm a Nazi...
02:27:26.000 And that's all you read.
02:27:30.000 Then you're kind of in like, well, he's Hitler.
02:27:33.000 We should assassinate Hitler, shouldn't we?
02:27:35.000 I mean, why did that guy try to kill Trump and almost succeed it?
02:27:41.000 Why did he do that?
02:27:42.000 Well, I'd like to know that.
02:27:44.000 Well, yeah.
02:27:45.000 That one's crazy.
02:27:46.000 You know the whole deal with that guy's house.
02:27:49.000 Professionally scrubbed.
02:27:51.000 No footprint on the internet.
02:27:53.000 No social media footprint.
02:27:55.000 Yeah, there's 0% chance that he has no social media footprint.
02:27:58.000 He was in a BlackRock commercial.
02:28:03.000 Do you think BlackRock's a bad company?
02:28:06.000 I don't think any company is a bad company.
02:28:10.000 I think their design is to make as much money as humanly possible.
02:28:16.000 And I think if you're trying to make as much money as humanly possible, you're going to do some things that aren't necessarily good.
02:28:23.000 The question is, If you're going to have an assassination attempt on the president, it's not like BlackRock's board sits down and votes on it.
02:28:31.000 That would be awkward.
02:28:34.000 The board minutes would be like, guys, remember that time when we said so?
02:28:37.000 We probably shouldn't have done that.
02:28:39.000 I highly doubt it would be a corporation that chooses to do something like this.
02:28:44.000 I think more likely it's...
02:28:46.000 Individuals involved that recognize that it's beneficial to them if he gets assassinated and so a small group of people carry something out and with this kid We don't know anything, right?
02:29:01.000 And everyone stopped asking questions, and there was never a formal report.
02:29:07.000 There was never press conferences where they detailed all the information we know currently and where the investigation stands at the moment.
02:29:13.000 What we know is you have a very young kid who was filmed.
02:29:19.000 They knew he was there with a rangefinder.
02:29:23.000 A half an hour before the event.
02:29:25.000 You also know that CNN streamed it live, which I do not believe they did for any other rally.
02:29:32.000 And certainly not for a rally that's in the middle of nowhere in Pennsylvania.
02:29:37.000 There's a lot of weird shit.
02:29:40.000 The fact that they wouldn't let people be on that roof because the Secret Service lady said it was sloped and it was dangerous.
02:29:48.000 That's what she didn't want to have.
02:29:49.000 Meanwhile, the snipers that were on the other roof was...
02:29:52.000 A steeper pitch.
02:29:54.000 It made no fucking sense.
02:29:56.000 I totally agree it makes no sense.
02:29:57.000 In fact, I went back to Butler with President Trump before the election, sort of like the return to Butler rally.
02:30:07.000 And I was on that stage, and I'm looking at that roof, and I'm like, if I was a sniper, my pole position, my number one spot would be that roof.
02:30:18.000 Yeah.
02:30:19.000 Like, it's...
02:30:20.000 It's like the best seat in the house.
02:30:23.000 Yeah.
02:30:23.000 Like, why would you not?
02:30:25.000 No, it's so obvious.
02:30:26.000 It's the best seat in the house.
02:30:27.000 Like, if you want to be a sniper, there isn't a better position.
02:30:30.000 It was pretty obvious that the idea was, like, if we're saying that this is a coordinated assassination attempt, and it very well could have been, that's what you would do.
02:30:39.000 You'd have someone go up there, he shoots the president, you shoot him, you got Lee Harvey Oswald all over again, it's over.
02:30:45.000 It's all wrapped up nice and clean.
02:30:47.000 They assassinated him.
02:30:48.000 We never heard a peep about it.
02:30:49.000 We don't have any idea.
02:30:51.000 They would concoct some sort of story.
02:30:53.000 He was radicalized by this or that or, you know, he was on medication.
02:30:57.000 Who knows?
02:30:58.000 Right.
02:30:58.000 And now, you know, you have a completely different presidential election and you have a murder on live television.
02:31:04.000 Yeah.
02:31:04.000 I mean, something would have had to happen to radicalize that kid because he knew he was going to die.
02:31:09.000 Like, they're going to shoot him.
02:31:11.000 Or he'd be in prison for life.
02:31:12.000 Those are the two outcomes.
02:31:15.000 He was basically a suicide assassin.
02:31:19.000 You're not thinking you're coming out of that alive.
02:31:23.000 He's not escaping.
02:31:24.000 There's no escape plan.
02:31:26.000 Unless he was told that they were going to let him escape and the goal was to just shoot him anyway and to tell him, give him extra motivation to do it, we're going to let you get up there, we're going to let you take the shot, and then you're going to disappear.
02:31:42.000 I don't understand how he got on the roof.
02:31:45.000 I just don't understand that.
02:31:47.000 That doesn't make any sense.
02:31:48.000 And it wasn't like it was a roof that's so high no one could see him.
02:31:52.000 People saw him up there.
02:31:53.000 Basically, random passersby were pointing out that there's a guy on the roof.
02:31:58.000 With a fucking gun.
02:31:59.000 With a gun, yes.
02:32:00.000 Yeah, it's not like he was so far away you couldn't tell he had a gun.
02:32:03.000 People saw him.
02:32:04.000 The whole thing's completely insane.
02:32:07.000 And you don't hear a goddamn thing about it.
02:32:10.000 It's like, I'm almost more interested in that.
02:32:13.000 No, I am more interested in that than I am the JFK files.
02:32:16.000 I agree.
02:32:17.000 Because I feel like with the JFK files, it's so long ago.
02:32:20.000 Everyone's dead.
02:32:21.000 Who's going to know?
02:32:21.000 If you could prove now, and did you see that there was some sort of, there was some indications that there was a phone that had been traveling from outside the FBI offices in D.C. to where this kid lived?
02:32:37.000 Right.
02:32:38.000 Multiple times.
02:32:40.000 I mean, the cell phone records would be very telling.
02:32:43.000 Yeah.
02:32:43.000 Because you can see what cell phones were close to other cell phones.
02:32:47.000 I found that for the Epstein Island, the cell phone records were leaked.
02:32:52.000 So you can see it's precise enough you can see people walking down a path on Epstein Island.
02:32:59.000 Jesus Christ.
02:33:00.000 Yeah.
02:33:00.000 Yeah.
02:33:02.000 That's how precise it is.
02:33:04.000 So, I mean, you're leaving a trail of breadcrumbs wherever you go with your cell phone.
02:33:09.000 Yeah.
02:33:11.000 This kid had five phones.
02:33:12.000 That's the other thing.
02:33:14.000 That's a lot of phones.
02:33:15.000 That's a lot of phones for a 20-year-old kid.
02:33:17.000 The whole thing's nuts.
02:33:20.000 That's kind of expensive, you know?
02:33:22.000 Yeah.
02:33:22.000 Where's he getting the money?
02:33:23.000 Yeah.
02:33:24.000 Well, you know, also, it's like, how did his house get professionally scrubbed?
02:33:29.000 Didn't even have any silverware in his house.
02:33:31.000 There's nothing in there.
02:33:32.000 No silverware?
02:33:33.000 No.
02:33:33.000 Nothing?
02:33:34.000 No cutlery?
02:33:34.000 No cutlery.
02:33:36.000 That's weird.
02:33:37.000 His house was scrubbed.
02:33:39.000 And they also cremated his body like...
02:33:42.000 Oh, gone.
02:33:43.000 Gone.
02:33:43.000 Like that.
02:33:44.000 Yeah.
02:33:44.000 Bye!
02:33:45.000 Because who knows what the fuck they gave him to get him to think that he's going to be able to shoot Trump.
02:33:50.000 Like, climb up on there, shoot him.
02:33:53.000 I mean, who knows what kind of psychotropic drugs you can put someone on and under the power of hypnosis and suggestion and...
02:33:59.000 Yeah.
02:34:00.000 Who fucking knows?
02:34:01.000 I mean, this is what MKUltra was all about.
02:34:03.000 This is what Jolly West was practicing in the 1960s.
02:34:06.000 They were doing that back then.
02:34:09.000 They did it.
02:34:10.000 I mean, there was tons and tons of experiments using psychotropic drugs, hypnosis, mind control, all sorts of different methods of manipulation, the Harvard LSD studies that made Ted Kaczynski.
02:34:26.000 I mean, they've been doing that forever.
02:34:31.000 Yeah.
02:34:31.000 Where's that file?
02:34:32.000 Where's the fucking file on that kid?
02:34:35.000 Whoever...
02:34:36.000 They almost did it.
02:34:37.000 Something doesn't add up.
02:34:39.000 They should have...
02:34:42.000 Those phones should be...
02:34:45.000 They'll tell you what's going on.
02:34:50.000 It's all fucked.
02:34:52.000 It's very shady.
02:34:56.000 Obviously, there's the second guy that almost succeeded.
02:35:02.000 The golf course.
02:35:03.000 The golf course, yeah.
02:35:04.000 And he was just like a little careless and stuck his gun barrel out the hedge, you know.
02:35:10.000 He was a dumbass and stuck his gun barrel out the hedge, you know.
02:35:16.000 Yeah.
02:35:19.000 So, and there have been other people that have been intercepted on their way to kill Trump, you know.
02:35:24.000 Yeah.
02:35:26.000 You know, multiple assassins inbound.
02:35:28.000 At this point, he's got like an army protecting him.
02:35:30.000 Well, this is also part of the problem with the mainstream media saying that he's Hitler.
02:35:35.000 When Joy Reid had that show before the election, she was comparing him to Mussolini.
02:35:39.000 She was Stalin and Hitler.
02:35:41.000 She pulled it all out.
02:35:42.000 They were literally saying that Trump is worse than Hitler, Mussolini, and Stalin combined.
02:35:48.000 I mean...
02:35:49.000 They tried everything.
02:35:51.000 I think those guys killed 100 million people.
02:35:54.000 Trump has killed zero people.
02:35:56.000 I think a real big impact was you coming on the podcast the day before the election.
02:36:00.000 I think that had a giant impact.
02:36:02.000 That plea to the camera.
02:36:04.000 If you don't vote this time, this might be the last time you get to vote.
02:36:08.000 And I think the way you laid it out today, it's a compelling argument.
02:36:11.000 And I know a lot of people don't want to hear that.
02:36:13.000 They're up in their little...
02:36:14.000 They've got their blue panties in a bunch right now, but...
02:36:18.000 You've got to stop thinking that way.
02:36:19.000 They tricked you into thinking you're in a tribe.
02:36:21.000 They don't give a fuck about you.
02:36:23.000 The tribe's not real.
02:36:24.000 You're not really in a tribe.
02:36:25.000 They're using the fact they've got you in a tribe to manipulate you so they can keep doing what they're doing right now, which is siphoning off money, having incredible power.
02:36:34.000 And the more power and more money and more control over you they have, the better they can keep doing this.
02:36:39.000 And that's what they want.
02:36:41.000 Yeah, that's exactly right.
02:36:42.000 Yeah.
02:36:43.000 And that's the big threat that this administration poses.
02:36:46.000 That's a big threat that...
02:36:47.000 Essentially, Doe just found the coffin where the vampire sleeps.
02:36:51.000 Yeah.
02:36:52.000 There's a lot of vampires.
02:36:54.000 Yeah.
02:36:54.000 I mean, but...
02:36:55.000 I mean, we're disturbing the...
02:36:58.000 We're disturbing the...
02:37:00.000 The nest.
02:37:01.000 The nest, yeah.
02:37:02.000 Yeah.
02:37:02.000 We're kicking the hornet's nest.
02:37:03.000 Yeah.
02:37:05.000 Like, big time.
02:37:07.000 I mean, we're reprogramming the Matrix.
02:37:12.000 Success was never one of the possible outcomes.
02:37:15.000 It's a Kobayashi-Mari situation.
02:37:20.000 If you're in the matrix, success was never possible.
02:37:23.000 The only way to achieve success is to reprogram the matrix such that success is one of the possible outcomes.
02:37:30.000 That's what we're doing.
02:37:31.000 Yeah.
02:37:33.000 We may or may not succeed.
02:37:35.000 Well, it's certainly a lot of fun to watch.
02:37:39.000 This is a very exciting time because nothing changes when administrations come into power.
02:37:46.000 Very little changes.
02:37:48.000 I mean, you have changes in terms of policy and inflation goes up and there's a lot of different things.
02:37:52.000 But not like this.
02:37:54.000 Like, these are giant fundamental changes.
02:37:58.000 And, you know, you see the system screeching and wailing and you see the vampires run from the light.
02:38:04.000 But it's very exciting.
02:38:08.000 Like, as a person, a citizen, you know, just gets up in the morning and checks the news like I do and gets on X and sees what's going on.
02:38:17.000 Every day he's like, holy shit.
02:38:19.000 He said, what?
02:38:21.000 He's getting five million bucks?
02:38:23.000 You could just become a citizen now?
02:38:25.000 He could clear the debt with 10 million people?
02:38:27.000 I never thought of that.
02:38:28.000 Like, what?
02:38:29.000 50 trillion?
02:38:30.000 He can make 50 trillion dollars that way?
02:38:32.000 And then we have 15 trillion in the bank?
02:38:34.000 Whoa.
02:38:35.000 Well, I mean, our debt is way bigger than that.
02:38:38.000 Yeah.
02:38:39.000 I mean, the debt's, I think, over $30 trillion at this point.
02:38:43.000 Yeah.
02:38:43.000 He said he could make $50 trillion if he sold $10 million new golden cards.
02:38:48.000 I don't think there's that many people who have $5 million.
02:38:52.000 How many people do have that in the world?
02:38:54.000 Maybe we'd get the worst people in the world to come over here.
02:38:58.000 I think the assumption is if you have $5 million, you have a lot to contribute.
02:39:02.000 Come on over here.
02:39:02.000 Start a business.
02:39:03.000 Get something cracking.
02:39:05.000 Yeah.
02:39:05.000 I mean, you'd get a green card, not citizenship.
02:39:08.000 So you actually, if you commit a crime while on a green card, you lose your green card.
02:39:14.000 Is that what it is with this golden ticket?
02:39:16.000 Is that a green card or is it citizenship?
02:39:18.000 Just a green card.
02:39:19.000 Yeah, so you have to not commit any crime for five years in order to become a citizen.
02:39:23.000 Once you commit a citizen, you can then commit crime and not be deported.
02:39:30.000 Oh, there's just so many wild things that he's proposing.
02:39:33.000 Just the whole Gulf of America thing was hilarious.
02:39:36.000 I think that's great.
02:39:37.000 I think it's great.
02:39:38.000 It's fun.
02:39:38.000 Yeah.
02:39:39.000 It's fun.
02:39:40.000 I mean, if you're off the coast of Houston, you're not in Mexico, so why call it the Gulf of Mexico?
02:39:44.000 Yeah.
02:39:45.000 Yeah, I agree.
02:39:47.000 Yeah.
02:39:48.000 I guess we were just being nice before?
02:39:50.000 I don't know how it got called the Gulf of Mexico.
02:39:54.000 It's just very funny.
02:39:56.000 And then what news organization?
02:39:58.000 Was it AP? There's this massive standoff between AP and the White House Press Office, I guess, because they're like, well, if you don't call it Gulf of America, you can't come to the White House Press Room.
02:40:18.000 So then the APs sued the White House to say, no, you have to let us come to the White House press room.
02:40:26.000 And then they lost their lawsuit because they don't have a right to show up at the press room.
02:40:31.000 Well, here's a consideration.
02:40:33.000 If you're guilty...
02:40:35.000 Of massive amounts of misinformation and disinformation as a part of a propaganda campaign.
02:40:41.000 Yeah.
02:40:42.000 Associated propaganda.
02:40:43.000 That's what AP is.
02:40:44.000 Well, a lot of them are guilty of it.
02:40:47.000 A lot of the people that are in that White House press conference, a lot of the organizations they work for, distributed absolute lies.
02:40:55.000 Total...
02:40:56.000 Lies.
02:40:56.000 How many of them during the whole Russiagate thing?
02:40:59.000 Yes.
02:40:59.000 I mean, just that alone.
02:41:01.000 A ton of people think that the Russia thing was real.
02:41:04.000 Still.
02:41:06.000 I mean, the whole Steele dossier where it was completely concocted, like fabricated for nothing.
02:41:12.000 Funded by the Clinton campaign.
02:41:13.000 Correct.
02:41:13.000 The Clinton campaign funded a fake conspiracy theory, a fake Russia collusion hoax regarding Trump that was completely false.
02:41:24.000 And they reiterated it on television for three fucking years.
02:41:27.000 Yes.
02:41:28.000 Yeah.
02:41:28.000 They also repeated the fine people hoax that Trump called neo-Nazis fine people, which is demonstrably false.
02:41:38.000 If you just listen to his speech, he absolutely makes it clear that he does not think neo-Nazis are fine people.
02:41:43.000 He literally said that.
02:41:45.000 I'm not talking about neo-Nazis or white nationalists.
02:41:48.000 They should be condemned totally.
02:41:50.000 Exactly.
02:41:51.000 In that speech, and yet they repeated that lie.
02:41:55.000 And I just completely lost respect for Obama when he repeated that lie a few days before the election, knowing it's false.
02:42:02.000 Well, this just shows how desperate they were to keep Trump out, which is wild.
02:42:07.000 They would do anything.
02:42:08.000 Yeah.
02:42:09.000 Yeah.
02:42:10.000 And I think they just felt like this is a tool that we have and let's use it.
02:42:14.000 Yeah.
02:42:14.000 Let's just say whatever the fuck we have, say anything.
02:42:18.000 Yeah, now they're using the Nazi thing on me, obviously.
02:42:22.000 Yeah.
02:42:24.000 But it is a little troubling because, I mean, obviously, if people have fed nonstop propaganda, it is like mass hypnosis.
02:42:31.000 Right.
02:42:31.000 You're going to reach some number of people who are, you know, homicidal and convince them that, well, if you kill this guy who's supposed to be, like, this terrible human, then that's a good thing.
02:42:46.000 Yeah.
02:42:47.000 I mean, this is Luigi shooting the UnitedHealthcare guy.
02:42:51.000 I still don't understand that one, frankly.
02:42:54.000 But, I mean, you shouldn't, like...
02:42:56.000 I don't get it.
02:42:59.000 Yeah, I don't get it either.
02:42:59.000 He didn't even have a contract with them.
02:43:01.000 It wasn't even like that was his provider and they fucked him over.
02:43:04.000 Yeah.
02:43:05.000 I'm like, I don't know what...
02:43:06.000 Maybe we'll find out in the trial.
02:43:08.000 I mean, but still, kind of crazy.
02:43:11.000 It is crazy.
02:43:12.000 But there are people like that out there, and...
02:43:14.000 As to the point that we spoke about earlier, it's only Fox News that's talking about the positive things that Doge has found.
02:43:21.000 It's only.
02:43:22.000 Every other media organization is on this constant propaganda tour where they're only talking about the negative aspects that turn out to not even be true.
02:43:32.000 Right.
02:43:33.000 It's crazy.
02:43:34.000 Yeah, Scott Jennings on CNN is good.
02:43:38.000 Oh my god, he's great.
02:43:39.000 He's great.
02:43:40.000 It's just funny watching them speak logically to these people and they freak out.
02:43:45.000 Yes.
02:43:47.000 It's remarkable.
02:43:48.000 It is.
02:43:49.000 And he's so calm when he does it.
02:43:51.000 And it's crazy that they keep letting him do it because it's like he's just dunking on these people over and over and over again and they never score.
02:43:58.000 It's kind of funny.
02:43:59.000 Totally.
02:44:00.000 I mean, kudos to them for having a legitimate conservative voice who's a reasonable person on these panels now.
02:44:07.000 But even then, he's outmanned.
02:44:09.000 It's like one of him and there's a bunch of screechy...
02:44:13.000 You know, woke people.
02:44:15.000 It's wild.
02:44:16.000 I mean, they're just, they're like, I think we should still stay mostly woke.
02:44:21.000 Yes.
02:44:22.000 Yeah, that's essentially what they're doing.
02:44:23.000 Like, our business was being hurt when we were all woke, but let's stay mostly woke.
02:44:28.000 Yeah.
02:44:29.000 They just backed it off a notch.
02:44:31.000 Just a notch.
02:44:31.000 Just a notch.
02:44:33.000 The problem is, when you back it off a notch and you let someone like Scott Jennings in, you're fucking up your whole business.
02:44:39.000 Because all the viral clips are all him saying logical, reasonable things with a calm tone and people screeching about diversity and equity and whole shit.
02:44:52.000 He's being logical and reasonable and they're just lobbying a bunch of non-sequiturs that don't mean anything.
02:45:00.000 The real trap in this country is the two-party system.
02:45:03.000 That's the real trap.
02:45:04.000 Because people do believe it.
02:45:05.000 They do believe they're on the right side and they do believe the other side is the wrong side.
02:45:09.000 If there was five, six legitimate parties with varying positions on things and much more centrist parties that were legitimate, that people knew that if they voted for, these people could get in and enact legitimate change, we'd be a lot better off.
02:45:22.000 But boy, they put a lockdown on that shit right after Ross Perot came along.
02:45:27.000 Ross Perot fucked everything up in that election.
02:45:30.000 Yeah.
02:45:31.000 Bill Clinton got in.
02:45:32.000 Yeah.
02:45:32.000 And they were like, that's it?
02:45:33.000 From now on, no one's debating unless you're either the head of that party or that's it.
02:45:41.000 Yeah.
02:45:41.000 You got to be, like, locked into the system.
02:45:44.000 We're not letting any wackadoos in there.
02:45:47.000 Yeah, I remember watching those Russ Perot videos.
02:45:50.000 Oh, like him on TV with his charts and everything.
02:45:53.000 Oh, yeah.
02:45:53.000 He was telling you how the IRS was fucking you.
02:45:55.000 This is what the Federal Reserve really is.
02:45:58.000 And you're like, what?
02:45:59.000 I remember watching that.
02:46:00.000 The guy bought a whole half hour of television on prime time.
02:46:04.000 It might have been an hour.
02:46:05.000 I remember watching that thing going, how is this guy even allowed to do this?
02:46:09.000 This is crazy.
02:46:11.000 I think most of what he was saying was true.
02:46:13.000 It is absolutely true.
02:46:15.000 It's absolutely true.
02:46:16.000 I mean, he didn't lie.
02:46:17.000 He told the truth.
02:46:18.000 He just understood it in a way that the general public had literally no idea.
02:46:23.000 Well, I mean, I think there's also this, you know, like, do we actually have two parties?
02:46:27.000 Do we have one party?
02:46:28.000 Like, the whole uniparty thing.
02:46:29.000 Right.
02:46:29.000 It's kind of true.
02:46:31.000 So, I mean, my sort of rough guess is that while, like, I think maybe...
02:46:39.000 Three quarters of the graft is Democratic.
02:46:42.000 I think there's like maybe 20-25% that's Republican.
02:46:47.000 So basically most of the graft is going to the Democrats, but they throw some bones to the Republicans too, so then they're in on it.
02:46:57.000 It's not like there's zero graft on the Republican side.
02:46:59.000 Oh, there's plenty of conservative that are insider trading in Congress.
02:47:04.000 Plenty.
02:47:06.000 Inside of trading, and just there's the curious case of how do people in Congress or whatever become wealthy over time?
02:47:17.000 Extremely wealthy.
02:47:18.000 Yes.
02:47:19.000 On a $170,000 a year salary.
02:47:21.000 It's like literally impossible.
02:47:23.000 Yeah.
02:47:23.000 No one else does that.
02:47:24.000 It's literally impossible.
02:47:25.000 If you find out this guy has a $170,000 a year job, you're like, oh, he's doing okay.
02:47:30.000 He's all right.
02:47:31.000 Yeah.
02:47:31.000 And then you're like, wait a minute.
02:47:32.000 Why does he have $50 million?
02:47:34.000 Yes.
02:47:35.000 What is he doing?
02:47:35.000 Correct.
02:47:36.000 Yeah.
02:47:36.000 And I think the more accurate thing would be to say, like, what is the family value increase?
02:47:45.000 Meaning, like, how much does their spouse earn?
02:47:51.000 Do they have a mysteriously wealthy spouse?
02:47:53.000 Right.
02:47:55.000 Right.
02:47:57.000 And do they have a spouse that's really good at insider trading?
02:48:01.000 Yeah.
02:48:03.000 Like Paul Pelosi.
02:48:05.000 Really good.
02:48:06.000 Yeah, he's great at trading.
02:48:08.000 He's such a good trader.
02:48:10.000 Yeah.
02:48:10.000 I mean, that's why I actually posted on X. I think maybe we should pay politicians more, frankly, because it reduces the forcing function for graft.
02:48:27.000 I think maybe we should either pay politicians nothing or...
02:48:31.000 Maybe a lot more.
02:48:33.000 It's like somewhat maybe counterintuitively, if politicians got paid a lot more, then they wouldn't feel like that there's so much of a forcing function for them to accept corrupt money.
02:48:49.000 Yeah, but the problem is if you paid them a lot more, they're still not going to make as much money as they would insider trading.
02:48:54.000 But it's less of a forcing function.
02:48:56.000 Yes.
02:49:01.000 Let's say they've got some kids in D.C. It's an expensive place to live.
02:49:08.000 The schools are terrible, so they need to send their kids to some kind of private schooling situation.
02:49:14.000 They literally cannot afford that.
02:49:17.000 They cannot afford that right now.
02:49:19.000 So then you get into the situation, well, from their standpoint, well, they'll say they're doing it for their family.
02:49:26.000 They're doing it for their kids.
02:49:28.000 Well, especially if it's legal, and it currently is.
02:49:31.000 You kind of be silly to not do that.
02:49:33.000 If you're a part of a group of people that's passing a bill, and you know this bill's going to get passed, you know the votes are there, and you know it's going to affect this industry and this particular manufacturer, and you can buy stock.
02:49:47.000 It's more than just insider trading.
02:49:51.000 Like the insider trading stuff, like the stock portfolio stuff is quite trackable, but it's a lot more than insider trading.
02:50:01.000 The way they're acquiring wealth.
02:50:02.000 Correct.
02:50:03.000 And what other methods?
02:50:06.000 I mean, this is really going to get me assassinated.
02:50:11.000 I'm not lengthening my lifespan by explaining this stuff, to say the least.
02:50:19.000 I mean, I was supposed to go back to D.C. How am I going to survive?
02:50:23.000 This focus is going to kill me for sure.
02:50:29.000 In fact, I do think it's like I actually have to be careful that I don't push too hard on the corruption stuff because it's going to get me killed.
02:50:46.000 I was actually thinking about that on the plane flight over here.
02:50:57.000 If I push too hard on the corruption stuff, people get desperate is the issue.
02:51:01.000 Right.
02:51:02.000 Then they say, like, okay, if the money flow cuts off, then, okay, they can't afford school for their kids.
02:51:10.000 Right.
02:51:11.000 Then they're like, well, fuck you.
02:51:13.000 I'm going to kill you for my kids type of thing.
02:51:16.000 Yeah.
02:51:17.000 You know, and it's like, oh, geez.
02:51:18.000 Did you ever see that video?
02:51:20.000 I think it was an O'Keefe video where...
02:51:24.000 They've got this guy undercover and he's explaining.
02:51:27.000 They're talking to this guy.
02:51:28.000 He thinks he's on a date.
02:51:29.000 And he's explaining.
02:51:30.000 It's always a guy on a date.
02:51:32.000 Yeah.
02:51:33.000 Explaining how they can nudge someone to go and do something horrible.
02:51:38.000 Yeah.
02:51:38.000 And they recognize this person has problems.
02:51:41.000 They find an asset.
02:51:43.000 Yeah.
02:51:44.000 Yeah.
02:51:44.000 Totally.
02:51:45.000 Well, see, this is what I think for that Butler situation, for that assassin, you don't have...
02:51:50.000 It's kind of like a...
02:51:53.000 That funny-looking sport, curling?
02:51:56.000 You know where they have the stone on the ice?
02:51:58.000 And then they throw the stone, and then there's someone that's brushing the ice, but you can't touch the stone.
02:52:05.000 All you can do is just change the path of the stone a little bit, but you keep brushing the ice, and you can steer that stone right into the bullseye.
02:52:15.000 That's what I think happened in Butler.
02:52:18.000 That's what I think happened with that assassin.
02:52:21.000 If you can find the trail of breadcrumbs, it's going to be like curling.
02:52:27.000 Somebody was brushing the ice.
02:52:29.000 Well, also you find a young, confused, disenfranchised person and you give them purpose in their life.
02:52:35.000 Just brush the ice.
02:52:36.000 Yeah.
02:52:36.000 If you're brushing the ice, eventually it's going to hit the bullseye.
02:52:38.000 If you're in a position of authority or some big-time government person, you're talking to this person, all of a sudden this person's a valuable asset.
02:52:47.000 They're going to help America.
02:52:50.000 You're going to do this thing and you're going to be our top assassin from here on out.
02:52:56.000 You could talk people into doing a lot of things.
02:52:59.000 That's why cults are around, right?
02:53:02.000 No, exactly.
02:53:03.000 Yeah.
02:53:04.000 I mean, they're suicide bombers.
02:53:05.000 I mean, the Butler guy was a suicide assassin.
02:53:08.000 The second guy that tried to kill him on the golf course was also a suicide assassin.
02:53:14.000 From what I read, the Secret Service member that...
02:53:19.000 Saw the gun pointing out, fired several shots, none of which hit the assassin.
02:53:24.000 But they could have.
02:53:26.000 Like if those shots had hit the second assassin, he would be dead too.
02:53:31.000 So both of them were, you know, on a...
02:53:37.000 I mean, they were on a suicide mission, both of them.
02:53:43.000 One actually got killed, the one of them didn't get killed.
02:53:47.000 But he could have been killed if the bullets had hit him.
02:53:49.000 And you don't hear anything about him either.
02:53:52.000 There's a lot more about that guy than the first guy.
02:53:55.000 I mean, you look at his background, he looks like, you know...
02:53:59.000 Unhinged.
02:54:00.000 Yeah, totally unhinged.
02:54:01.000 Yeah.
02:54:02.000 The first guy, there's no...
02:54:05.000 I'm not aware of any evidence that shows that he's so unhinged as to be a suicide assassin.
02:54:11.000 No.
02:54:11.000 The second guy, like, okay, yeah, sure.
02:54:13.000 Well, two years before, he's acting in commercials.
02:54:17.000 And he got a high score in his SATs.
02:54:20.000 Yeah.
02:54:22.000 So, you know.
02:54:27.000 Well, without getting you killed.
02:54:29.000 Yeah, exactly.
02:54:30.000 So, I mean, like, basically, I'm like, listen.
02:54:33.000 I get it.
02:54:34.000 Well, attack the corruption enough to keep civilization trucking along, you know.
02:54:40.000 Yeah.
02:54:41.000 But I think if I... Fully destroy the corruption and the graft.
02:54:50.000 They will kill me.
02:54:52.000 That's a fucked up thing to live with.
02:54:55.000 Yes.
02:54:57.000 So, I'm like, damn it.
02:55:01.000 Listen, I really hope they don't kill you.
02:55:04.000 Yeah, thanks.
02:55:06.000 I mean, I strive to be alive.
02:55:11.000 But, yeah, I mean, it's a real concern.
02:55:17.000 You know, I mean, there were two guys that...
02:55:19.000 Before I supported Trump and everything, there were two guys that traveled to Austin to kill me.
02:55:27.000 I don't know if you know about this.
02:55:28.000 Yeah, I did hear about that.
02:55:29.000 Yeah.
02:55:30.000 And two separate incidents.
02:55:34.000 One guy thought I'd put a chip in his head.
02:55:39.000 I mean, they're both basically two guys that were just...
02:55:41.000 Very much had severe mental illness.
02:55:44.000 It wasn't like they had like a, I disagree with him politically and that's why he needs to die.
02:55:49.000 This is pre, before I was, before I got sort of smeared as being, you know, some sort of like Nazi or something like that.
02:55:59.000 So before the propaganda wave, the severe propaganda wave, the probability that any given homicidal maniac It's going to try to kill you as proportionate to how many times they hear your name.
02:56:13.000 And so they heard my name a lot.
02:56:15.000 So I just got to the top of the list of two homicidal maniacs who were arrested and both were in Travis County jail at the same time.
02:56:26.000 Whoa.
02:56:27.000 Yeah, I don't know if they talked or whatever, but they've both been released, by the way.
02:56:31.000 Jesus Christ.
02:56:32.000 They've both been released on bail, yeah.
02:56:35.000 Right, but they got ankle monitors and stuff, but still.
02:56:38.000 They can cut those off?
02:56:39.000 Yeah, I don't know, you know, exactly.
02:56:45.000 That's crazy.
02:56:46.000 Yeah, and the second guy had, like, chief serial killer in his bio on his X profile.
02:56:52.000 Yeah, it wasn't subtle is what I'm saying.
02:56:55.000 Jesus Christ.
02:56:57.000 Yeah.
02:57:01.000 And at this point, I think...
02:57:03.000 I'm at the top of the list for a lot of homicidal maniacs.
02:57:07.000 And the more the mainstream media talks about you in this way and says you're a Nazi and...
02:57:14.000 They're doing the same thing to me that they did to Trump.
02:57:19.000 Yeah.
02:57:20.000 Which is...
02:57:21.000 They're making it sound like if you kill me, you're a hero.
02:57:26.000 That's...
02:57:26.000 What they're doing is evil.
02:57:30.000 They're also doing the same thing where they're completely distorting who you are and people are going along with it.
02:57:36.000 And just like we're talking about Trump derangement syndrome, people have Elon derangement syndrome.
02:57:41.000 I see it.
02:57:42.000 I see where people can't see the forest for the trees.
02:57:45.000 Right.
02:57:45.000 And it's like I'm the same person that I was a year ago.
02:57:49.000 Nothing's changed, really.
02:57:53.000 Like I didn't suddenly become a completely different human.
02:57:57.000 Right.
02:57:58.000 But if you read the sort of legacy mainstream media, their propaganda stream is that I am a completely different human.
02:58:07.000 Right.
02:58:08.000 But I didn't get like a brain transplant, you know, in a year.
02:58:13.000 And if you say like two years ago, I was like a hero of the left.
02:58:17.000 Yeah.
02:58:19.000 So how can I go from hero to villain at age 53 suddenly?
02:58:26.000 MSNBC? CNN? Yeah.
02:58:28.000 It's like, that's what it is.
02:58:30.000 They use the machine.
02:58:32.000 Associated propaganda.
02:58:34.000 Mm-hmm.
02:58:35.000 Yeah.
02:58:37.000 I mean, they try to demonize you too.
02:58:39.000 Yeah.
02:58:41.000 They even try to demonize, in fact, at least partially successfully, demonize like Tim Oven.
02:58:49.000 Yeah.
02:58:50.000 Who is a super rational, reasonable, great human.
02:58:56.000 And his Wikipedia changed to, like, far-right.
02:59:01.000 And he's like, far-right?
02:59:03.000 I'm like, what are you talking about?
02:59:06.000 You know, like, a few years ago, it was like a liberal.
02:59:09.000 So how do you go from liberal to, like, instantly far-right?
02:59:12.000 And there's no left and right.
02:59:15.000 There's only left and far-right.
02:59:17.000 Right.
02:59:17.000 Yeah.
02:59:20.000 This is my left leg and this is my far right leg.
02:59:23.000 And even far left.
02:59:24.000 Far left is sort of dismissed as being not important to talk about.
02:59:28.000 Like Antifa and radical leftists.
02:59:31.000 That's not...
02:59:32.000 They're like burning down courthouses.
02:59:35.000 Yeah.
02:59:35.000 Reasonable.
02:59:36.000 Reasonable people.
02:59:36.000 Yeah.
02:59:37.000 Yeah.
02:59:38.000 Totally crazy.
02:59:39.000 It's a crazy time.
02:59:40.000 And it's not a time that I ever anticipated I was going to witness.
02:59:43.000 This is far beyond anything I ever thought I was going to experience.
02:59:48.000 In the clarity of it all, where it's so obvious.
02:59:52.000 Yeah.
02:59:53.000 And the gaslighting and the propaganda is so obvious.
02:59:57.000 And I saw the shrieking when RFK Jr. stopped this new test for new COVID vaccines on children.
03:00:05.000 10,000.
03:00:06.000 There are going to be 10,000 people with this COVID vaccine.
03:00:09.000 Like, who the fuck thinks that's a good thing at this point?
03:00:13.000 Not me.
03:00:13.000 What person, what gas chamber, like not gas light, you are fully unconscious.
03:00:22.000 There's no way, there's no way you know, if you know the effect of COVID today, no one's dying of it.
03:00:31.000 This is not a pandemic anymore.
03:00:33.000 The idea that you're to run a fucking huge test with 10,000 kids and a new vaccine.
03:00:39.000 Like, what are you even talking about?
03:00:41.000 It's completely unnecessary.
03:00:42.000 Totally unnecessary.
03:00:43.000 And shrieking when RFK Jr. steps in to stop it.
03:00:48.000 Yeah.
03:00:49.000 That's totally crazy.
03:00:52.000 I mean, I'm overall pro-vaccine, meaning we think we should have some reasonable number of vaccines against major ailments.
03:01:02.000 But I don't think we should be like...
03:01:04.000 Like jamming some, you know, little kid with like a giant vial that's like...
03:01:10.000 Hepatitis B. Yeah, 20 different things at a time.
03:01:14.000 It's like it's going to overload your...
03:01:15.000 It seems like there's a risk of overloading your immune system if you...
03:01:19.000 I mean, there's like how many vaccines can you take at a time?
03:01:23.000 It seems like your systems...
03:01:24.000 There's like some risk of system overload here.
03:01:26.000 Well, there's two hopes.
03:01:28.000 Hope number one is they can somehow or another...
03:01:31.000 Stop this ability that they have to advertise on television.
03:01:36.000 If that happens, that's big.
03:01:38.000 That's huge.
03:01:38.000 Because that doesn't just stop their ability to show you all these different medications that you should be on.
03:01:45.000 What it also does is it stops their financial influence on the news.
03:01:50.000 That's big.
03:01:51.000 That's really the biggest thing is that, I mean, the news is not going to attack one of their biggest advertisers.
03:01:59.000 Exactly.
03:02:00.000 And they never do.
03:02:01.000 Yes.
03:02:02.000 At best, they'll do something, but they're going to pull their punches.
03:02:07.000 They're going to be like fake fighting.
03:02:10.000 Yeah.
03:02:10.000 At best.
03:02:11.000 Yes.
03:02:12.000 Like movie fighting.
03:02:13.000 They're not actually landing haymakers.
03:02:15.000 It just looks like it.
03:02:16.000 The next step, then, is to remove this immunity that these vaccine manufacturers have.
03:02:23.000 And if they are liable for side effects and they are liable for the lies that they tell when they do these studies and they hide negative data, that'll change a lot.
03:02:35.000 Yes.
03:02:39.000 Yeah.
03:02:41.000 I think AI actually could be very helpful with medical stuff.
03:02:46.000 Because AI can look at all the studies and look at all the data.
03:02:50.000 Cross-check everything and give you good recommendations.
03:02:53.000 I mean, even as it is, like right now, you can upload your x-rays and your MRI images to Grok and it'll give you a medical diagnosis.
03:03:01.000 And that diagnosis, from what I've seen, is at least as good as what, if not, I think?
03:03:07.000 I've certainly seen cases where it's actually better than what doctors tell you.
03:03:11.000 Well, it's phenomenal for blood work.
03:03:13.000 Yeah.
03:03:14.000 Yeah.
03:03:14.000 I mean, you can literally take a photograph of your blood work, like the page.
03:03:19.000 Upload from your phone, upload that to Grok, and it will tell you if there's...
03:03:25.000 It'll understand what all the data results are and tell you if there's something wrong.
03:03:34.000 It's pretty amazing.
03:03:35.000 Yeah.
03:03:36.000 And I haven't seen it be wrong yet.
03:03:39.000 Well, it's supposedly more accurate than most physicians.
03:03:44.000 Yeah.
03:03:44.000 Because physicians are human beings, and maybe they don't have a deep understanding of the connection between, oh, you have this deficiency, and this is high, and your cortisol is here.
03:03:54.000 Well, yeah.
03:03:56.000 Sometimes doctors, especially in higher-end offices, will sell you stuff you don't need.
03:04:02.000 I always be a little suspicious of a doctor who's got an office in Beverly Hills.
03:04:07.000 It's a higher-end situation.
03:04:09.000 I'm not saying there are some very good doctors in Beverly Hills, but it's a higher-end situation.
03:04:14.000 You're at least tempted by the dark side.
03:04:16.000 Yeah.
03:04:17.000 And, I mean, one case, like, you know, I went to this doctor who was, like, highly recommended, you know, doctor to the stars, which is, like, maybe not a good sign.
03:04:27.000 And I got, like, blood work done.
03:04:31.000 It was, like, just drew blood and sent it to a lab.
03:04:35.000 And the guy, I'm, like, sitting in his office, and he tells me that I'm, like, B12 deficient.
03:04:42.000 You know, it's certainly possible that I'm B12 deficient.
03:04:45.000 And I was like, huh, okay.
03:04:46.000 And then he gives me, it says, like, you have to take these, like, B12 supplements, and he's going to give me a starter pack.
03:04:52.000 You know, then it's going to be like $1,000 a month for these special B12. $1,000 a month for B12? Yeah, that's a ridiculous amount of money, yeah.
03:04:59.000 That's crazy.
03:05:00.000 You get it on Amazon.
03:05:01.000 Yeah, but his one's special.
03:05:03.000 Oh, a special B12. Yeah.
03:05:06.000 Yeah, it was like a whole bunch, B12 and a whole bunch of other vitamins.
03:05:10.000 So then I get home and I'm like, well, I'm paging through my blood work and it says, I have, according to the blood results, I have excess B12. So I'm like, wait a second.
03:05:22.000 And he's giving me a box of pills that have like 20,000% of a recommended daily dose.
03:05:29.000 Like 20,000% is a big number.
03:05:32.000 And I'm like, I said, look, I took a photograph of the blood work that says I have excess, I'm like above the range, above the recommended range of B12. And then I'm like, and I took a picture of things that says, of the pills that say 20,000%.
03:05:46.000 It's like, can you help me reconcile these two things?
03:05:50.000 Because it says I've got too much, a little too much B12. And you just gave me pills that have 20,000% more.
03:05:57.000 I'm like, this is crazy.
03:05:59.000 What did the doctor say?
03:06:00.000 Oh, he said you can never have too much B12. Oh, he's a B12 junkie.
03:06:04.000 I'm like, yes you can.
03:06:04.000 He's a psychopath.
03:06:05.000 Yes.
03:06:06.000 That guy's a B12 addict.
03:06:08.000 Yes, totally insane.
03:06:10.000 That's what I'm saying.
03:06:11.000 It's like...
03:06:12.000 So, I mean, I could have just...
03:06:14.000 So then, you know...
03:06:15.000 Well, this was a while ago, right?
03:06:17.000 So this is pre-Groch.
03:06:18.000 This was like five years ago, yeah.
03:06:19.000 Right.
03:06:19.000 This is pre-Groch.
03:06:20.000 Yeah.
03:06:20.000 Like, now you could just enter in all that data and...
03:06:24.000 Now you can just photograph with your phone and upload it to Groch and Groch will tell you what's...
03:06:30.000 Just don't have it in sexy mode.
03:06:31.000 She'll keep trying to fuck you.
03:06:35.000 I mean, you're asking for it in sexy mode, you know?
03:06:38.000 Literally.
03:06:39.000 You tapped on sexy mode.
03:06:40.000 Yeah, you're asking for it.
03:06:42.000 I mean, I think we probably should, like, maybe allow it to get out of character a little bit.
03:06:47.000 Sure, yeah.
03:06:48.000 It's like, in unhinged mode, I tried to get it back to being hinged, but it would, like, no fucking way.
03:06:54.000 It's like, I'm gonna stay unhinged.
03:06:55.000 How many modes do you have?
03:06:57.000 I was like, I don't know, like...
03:06:59.000 8 or something.
03:07:00.000 And then there's an ability to have a custom mode.
03:07:05.000 So then you can have Unhinged Sexy.
03:07:07.000 Ooh.
03:07:09.000 That's my favorite kind.
03:07:11.000 You may think so.
03:07:17.000 Careful what you wish for.
03:07:18.000 Be careful what you wish for.
03:07:19.000 Especially if it's a robot and she can kill you.
03:07:21.000 Unhinged Sexy is dangerous.
03:07:27.000 Remember, like, the Pink Panther?
03:07:29.000 Remember Pink Panther had Kato try to jump him?
03:07:31.000 They'd, like, keep him sharp?
03:07:33.000 Always trying to attack him?
03:07:34.000 Remember that?
03:07:35.000 Right.
03:07:38.000 Listen, man, thank you for being here.
03:07:40.000 I always appreciate talking to you.
03:07:41.000 I know you're busy as fuck, so it means a lot to me that you have the time to do this.
03:07:46.000 And I think what you're doing is one of the most important things that has ever happened in this country.
03:07:51.000 I really do.
03:07:52.000 Particularly with the ownership of X, but also with what's happening with Doge and just enlightening all these people and shining light on all the vampires.
03:08:02.000 Well, hopefully people realize I'm not a Nazi.
03:08:04.000 I just want to be clear.
03:08:05.000 I am not a Nazi.
03:08:07.000 I think we covered it.
03:08:08.000 But that's exactly what a Nazi would say.
03:08:10.000 Damn it!
03:08:12.000 Yeah, that's what an alien would say.
03:08:14.000 Yeah, there's like, you can't escape this bullshit.
03:08:17.000 No, you can't escape it.
03:08:18.000 I don't think any reasonable person believes it.
03:08:20.000 If they believe it, it's because they want to believe it.
03:08:23.000 It's not because it's logical.
03:08:24.000 I mean, what's relevant about Nazis is like, are you like invading Poland?
03:08:28.000 And if you're not like invading Poland, Maybe you're not.
03:08:33.000 You have to be committing genocide and starting wars.
03:08:39.000 What is bad about Nazis?
03:08:43.000 It wasn't their fashion sense or their mannerisms.
03:08:48.000 It was the Holocaust.
03:08:50.000 The war and genocide is the bad part.
03:08:55.000 Their mannerisms and their dress code.
03:08:57.000 Well, that was the problem with all that punch a Nazi shit.
03:09:01.000 Like, punch a Nazi.
03:09:01.000 Remember that?
03:09:02.000 That was like a thing that people kept saying.
03:09:04.000 Punch a Nazi.
03:09:05.000 Punch Nazis.
03:09:06.000 But he was like, where are you meeting Nazis?
03:09:08.000 I've never met a fucking Nazi.
03:09:10.000 I've never met one.
03:09:10.000 I've never run into a bunch of Nazis where I had to punch them.
03:09:13.000 And what about all these so-called proud boy rallies or whatever?
03:09:18.000 And it's like, they've always got masks and they've always got the same uniforms.
03:09:21.000 And for some reason, they never get doxxed.
03:09:24.000 Right.
03:09:24.000 Right, right, right.
03:09:25.000 But wait, we're always going to dox them, except these guys?
03:09:29.000 There's a great video of me and Matt Taibian breaking down the Patriot Front.
03:09:33.000 Didn't the Patriot Front just disband?
03:09:36.000 Google that real quick.
03:09:37.000 We'll end with this.
03:09:39.000 Because I think they just disbanded, and these were the most obvious feds of all time.
03:09:44.000 Yeah, that's what I'm saying.
03:09:46.000 They were thin.
03:09:46.000 They had a fucking drum.
03:09:47.000 They had masks on.
03:09:49.000 Yeah, they all had uniforms.
03:09:50.000 It was so stupid.
03:09:52.000 Patriot Front disbands one day after FBI Director Chris Rae resigned.
03:09:56.000 Doesn't that seem like an odd coincidence?
03:09:59.000 Crazy.
03:10:00.000 Crazy.
03:10:00.000 The people that we were yelling at saying that they're feds.
03:10:03.000 There's a great video of me and Matt Taibbi if you want to find it.
03:10:06.000 How come nobody ever followed them and doxed them?
03:10:08.000 Yeah.
03:10:09.000 Crazy.
03:10:10.000 What are the odds?
03:10:12.000 What are the odds?
03:10:12.000 Agent provocateurs.
03:10:13.000 It's a thing.
03:10:14.000 They're real.
03:10:15.000 Alex Jones taught me about them.
03:10:17.000 Listen, man.
03:10:18.000 Thank you very much.
03:10:19.000 Thank you for everything.
03:10:19.000 Appreciate you.
03:10:21.000 Stay alive.
03:10:22.000 Stayin' alive.
03:10:23.000 All right.
03:10:25.000 I mean, I do think, like, one argument for me staying alive is that it's more entertaining if I'm alive than if I'm dead.
03:10:34.000 Oh, yeah.
03:10:35.000 Oh, definitely.
03:10:36.000 But I could be alive and, like, injured, which would suck.
03:10:40.000 Right.
03:10:40.000 Like, the wing just, like, shoot my arm off or something.
03:10:42.000 Right.
03:10:42.000 No, no, no.
03:10:43.000 We don't want that.
03:10:43.000 Yeah, exactly.
03:10:44.000 No.
03:10:45.000 Keep the security strong.
03:10:46.000 Yeah.
03:10:46.000 Hopping with one hand.
03:10:49.000 All right.
03:10:49.000 All right.
03:10:50.000 Thank you.