The Joe Rogan Experience - April 09, 2026


Joe Rogan Experience #2481 - Duncan Trussell


Episode Stats


Length

3 hours and 7 minutes

Words per minute

182.0

Word count

34,063

Sentence count

3,412


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "The Joe Rogan Experience" are sourced from the Knowledge Fight Interactive Search Tool.
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan podcast by night, all day.
00:00:17.000 Like a technical glitch.
00:00:18.000 Glitch.
00:00:19.000 But we're up.
00:00:19.000 What were we just talking about?
00:00:21.000 We were talking about that if you hum a tune, oh, right, right.
00:00:24.000 That you will get dinged.
00:00:26.000 Yeah, you'll get flagged on YouTube if you just hum a sound from a song.
00:00:31.000 Yeah.
00:00:32.000 Like the beginning bars of a song.
00:00:34.000 Yeah, you can't.
00:00:35.000 I wonder how far that goes.
00:00:37.000 Like, could it get to the point where an AI could hear you humming it in your car or something?
00:00:43.000 Like, how far does the protection of music go?
00:00:46.000 You're not generating revenue from your car.
00:00:48.000 Right.
00:00:48.000 So the thing is, you're generating revenue from a podcast, and their logic is if you hum, what is that song?
00:00:55.000 Sunshine of Your Love.
00:00:57.000 Is that what it is?
00:00:59.000 You know that song that I always hum to associate with people being high out of their fucking minds?
00:01:03.000 Yeah.
00:01:03.000 You know, it goes, you can't do it.
00:01:06.000 If I did that, we would get dinged, which is so crazy.
00:01:09.000 And we were just saying, like, if you quoted a Scarface movie, would Brian De Palma get all the money?
00:01:15.000 If you said, say hello to the bad guy, would Brian De Palma get that money?
00:01:18.000 I don't think so.
00:01:19.000 I think you're allowed to quote stuff.
00:01:22.000 That is Brian De Palma, right?
00:01:24.000 Scarface, wasn't it?
00:01:25.000 Yeah.
00:01:26.000 I don't want to fuck that up.
00:01:28.000 I think so.
00:01:29.000 You know those auditors that go around and film people?
00:01:32.000 And people get mad because they're like, don't film me.
00:01:32.000 Yeah.
00:01:35.000 And they're like, I can film whatever the fuck I want.
00:01:37.000 Right.
00:01:37.000 And they inevitably, some like boomer freaks out and smacks them with a cane, and then they get a million views.
00:01:45.000 And it's just a trap.
00:01:46.000 It's a trap.
00:01:47.000 It's a trap.
00:01:48.000 It's because inevitably, someone loses their mind on them, and then that gets a ton of views.
00:01:52.000 One of the ways people are dealing with that, supposedly, is playing music, like playing copyrighted music during the interaction.
00:02:01.000 Oh my God, that's hilarious.
00:02:02.000 Because so then they can't make money off of it.
00:02:05.000 It's a shield.
00:02:06.000 It's a shield if someone's trolling you.
00:02:09.000 You just start playing copyrighted music.
00:02:12.000 Did you hear that the CIA has admitted that the way they found the pilot was because of his heart rate?
00:02:21.000 Ghost Murmur.
00:02:22.000 That's the name of the thing.
00:02:24.000 Okay.
00:02:24.000 We got to look into this.
00:02:26.000 Like, this is fucking science fiction.
00:02:30.000 Yeah, it's wild.
00:02:31.000 This is full minority report.
00:02:33.000 It's crazy.
00:02:34.000 Science fiction level technology.
00:02:35.000 It's AI.
00:02:36.000 They can find a guy's heart rate.
00:02:39.000 So, what I read is that it's, I didn't understand the science part, something to do with crystals, or I don't know what the fuck it is, but AI is somehow interpreting, is taking out the noise.
00:02:51.000 And then you can, from far away, 40 miles, I think.
00:02:55.000 40 miles.
00:02:56.000 They find this guy's fucking heartbeat.
00:02:58.000 He's hiding in some kind of crevice.
00:03:01.000 And then they're able to go and extract him.
00:03:04.000 And, dude, obviously, the first thing I thought when I heard it was, what else don't they tell us?
00:03:09.000 No, those robot dogs.
00:03:11.000 I thought about those things having that tech and just like hearing heartbeats and then identifying people.
00:03:17.000 The heartbeat says a lot about a person.
00:03:19.000 Are they sleeping?
00:03:20.000 Are they like in good shape, bad shape?
00:03:23.000 You can learn so much from a heartbeat.
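The denoising idea being described (pull a faint periodic heartbeat out of heavy sensor noise) can be sketched as a toy frequency-domain peak finder. Everything below is an invented illustration: the 50 Hz sample rate, the noise levels, and the 72 bpm "heartbeat" are made-up values, not anything from the actual "Ghost Murmur" system.

```python
import numpy as np

def estimate_heart_rate(signal, fs):
    """Estimate heart rate (bpm) from a noisy 1-D trace via its FFT peak.

    Only frequencies in a plausible human range (0.7-3.5 Hz,
    i.e. roughly 40-210 bpm) are searched, which is the "taking
    out the noise" step: everything outside the band is ignored.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs >= 0.7) & (freqs <= 3.5)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Simulated sensor trace: a faint 1.2 Hz "heartbeat" (72 bpm) buried in noise.
fs = 50.0                        # 50 Hz sample rate (assumed)
t = np.arange(0, 60, 1.0 / fs)   # one minute of data
rng = np.random.default_rng(0)
trace = 0.2 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0.0, 1.0, t.size)

print(round(estimate_heart_rate(trace, fs), 1))
```

The heartbeat component is five times weaker than the noise sample-for-sample, but because it is periodic, its energy piles up in a single FFT bin while the noise spreads across all of them, so the band-limited peak still lands on 72 bpm.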
00:03:25.000 It could.
00:03:26.000 Ghost Murmur.
00:03:27.000 Oh my God.
00:03:28.000 Fucking great name, too.
00:03:29.000 It's a great name.
00:03:30.000 Ghost Murmur.
00:03:31.000 What sick fuck invented this?
00:03:34.000 How do you even think about inventing this?
00:03:37.000 You just, you know, the CIA, they've been taking psychedelics forever.
00:03:42.000 What is that word?
00:03:43.000 Quantum magnetometry.
00:03:46.000 Artificial intelligence with long range quantum magnetometry.
00:03:51.000 What the fuck is that?
00:03:53.000 Quantum means two things to me when someone says quantum.
00:03:56.000 It either means you're a bullshit artist and you're trying to get me with flim flam talk, or it means you're an actual quantum scientist, a quantum physicist who's going to blow my mind with what we know about entanglement and the weird shit.
00:04:12.000 There's this woman that I've been watching, she has this speech on, I think it's Big Think.
00:04:18.000 I'll tell you her name.
00:04:19.000 But she's completely freaking me out.
00:04:23.000 She's talking, I want to say her name because.
00:04:28.000 Don't let this ghost murmur thing get away, though.
00:04:30.000 That's fun.
00:04:31.000 Oh, well, we'll get right to it.
00:04:32.000 Michelle Fowler, that's her name.
00:04:34.000 And she's an astrophysicist.
00:04:36.000 And she's giving this talk about what we know about, like, she's studying binary star systems and stuff like that.
00:04:44.000 And she gives this talk about, she's explaining that there may be a tech in the future where there is no distance between two points.
00:04:53.000 So the ability to travel instantaneously from position to position, just like.
00:05:01.000 Quantum entangled photons can do.
00:05:03.000 Yeah.
00:05:04.000 But with people?
00:05:05.000 With everything.
00:05:06.000 How?
00:05:07.000 Who the fuck knows how a cell phone works?
00:05:11.000 You tell me how you're FaceTiming me when you're in Australia.
00:05:15.000 How does that work?
00:05:16.000 That sounds insane.
00:05:17.000 Yeah, that's fucking insane.
00:05:19.000 Well, you probably know a lot more about cameras than I do.
00:05:22.000 No, I don't.
00:05:22.000 But from what I know about cameras, if you tried to get me to explain, like if the civilization ended and I said we used to be able to capture images on a small thing.
00:05:31.000 Yeah.
00:05:32.000 Like the size of a twig, and it sits in your pocket.
00:05:37.000 Right, exactly.
00:05:37.000 You're like, what are you talking about?
00:05:39.000 God, that'd be, you know, because it's just.
00:05:41.000 It's a deck of cards, and it'll keep a battery for 24 hours.
00:05:45.000 You could go on YouTube and get an answer to any question you want about anything.
00:05:50.000 Yeah.
00:05:50.000 Instantaneously.
00:05:52.000 And if you don't like the way you look, you can upload that image, and a machine will make you look slightly better via something called artificial intelligence.
00:06:01.000 Like, what the fuck?
00:06:03.000 What was the one I sent you today where there's like a potential lawsuit with ChatGPT?
00:06:09.000 I didn't send you the other one.
00:06:10.000 Did I send it to you?
00:06:11.000 You sent it to me.
00:06:13.000 The shooting was planned using ChatGPT.
00:06:17.000 I don't know if that's true, so we should be like really careful.
00:06:19.000 Yeah, that doesn't sound.
00:06:21.000 It sounds so crazy.
00:06:22.000 It doesn't sound like you could do that.
00:06:23.000 The story sounds like something I wanted to investigate, because it sounds like if I wanted to kill an AI company, I would make up a story like that.
00:06:32.000 It does sound like that.
00:06:35.000 Family of man killed in shooting at Florida State University to sue ChatGPT maker.
00:06:39.000 May have, may have advised the shooter on how to carry out shootings.
00:06:43.000 But that may have is important.
00:06:46.000 Yeah, that's really important, right?
00:06:48.000 And what is this on?
00:06:49.000 The Guardian?
00:06:50.000 The shooter was in constant communication with ChatGPT ahead of the shooting, and the chatbot may have advised him.
00:06:55.000 Dude, there's no way.
00:06:57.000 So that's clickbait, because all that's really saying is that the kid uses ChatGPT, which guess what?
00:07:03.000 Every kid uses ChatGPT.
00:07:06.000 And, dude.
00:07:06.000 Every kid.
00:07:07.000 ChatGPT is so stringent.
00:07:10.000 Like recently, and I've been using their Codex, which builds apps, and I was trying to, and it worked.
00:07:17.000 I made an AI trained on Charles Manson transcripts.
00:07:23.000 And when I told it I wanted to do that, it was like, fuck off.
00:07:28.000 Like, no.
00:07:29.000 It was like, it just flat out was like, I'm not helping you with that.
00:07:33.000 So I don't, there's no way, with the guardrails in place in ChatGPT, that it
00:07:39.000 planned a shooting with that guy, based on my experience with it, because it won't.
00:07:43.000 80% of the things I try to get it to do, it's like, no.
00:07:45.000 Here's the thing though are there workarounds?
00:07:47.000 Like, if you say you're writing a work of fiction, you can.
00:07:50.000 Okay, it's called prompt injection.
00:07:53.000 There's different tricks you can use, they're always battling these new mechanisms that you can use to get through the general prompt.
00:08:01.000 But the best way to do unaligned AI is not to use ChatGPT, it's to go on Ollama and download a local LLM and then.
00:08:12.000 You can usually change the initial prompt of the LLM so that it will be completely unaligned, which I had to do for the Charles Manson AI I made.
00:08:20.000 I had to download this.
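The workflow being described (download a model locally with Ollama, then override its initial system prompt) is typically done with an Ollama Modelfile. This is a minimal sketch of that mechanism, not his actual setup: the base model name and the persona text are placeholder assumptions.

```shell
# Sketch: give a local Ollama model a custom system prompt via a Modelfile.
# "llama3" and the radio-host persona are placeholders, not from the show.
cat > Modelfile <<'EOF'
FROM llama3
SYSTEM """You are a fictional 1970s radio host. Always stay in character."""
PARAMETER temperature 0.9
EOF

ollama create radio-host -f Modelfile        # build the customized local model
ollama run radio-host "Introduce yourself."  # chat with it, fully offline
```

`ollama create` bakes the SYSTEM prompt into the new model, so every later `ollama run` starts from that persona instead of the base model's default instructions.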
00:08:21.000 You're such a nerd.
00:08:22.000 I love it.
00:08:23.000 I am.
00:08:24.000 I am.
00:08:25.000 No one has embraced new technology for creating content like you.
00:08:31.000 It's the best.
00:08:31.000 Oh, I love it.
00:08:33.000 It's so fun.
00:08:36.000 For me, the most thrilling thing about it is we should not have access to this tech.
00:08:42.000 This tech is.
00:08:44.000 So dangerous.
00:08:45.000 And it's chilling to think about.
00:08:48.000 This is something I wanted to bring up on this show.
00:08:52.000 It's like, you know, the old days, you go in your garage, you work on your car, maybe you build like a table, you know, you're a carpenter, you work on it.
00:09:02.000 But these days, the shit people are doing in their garages right now is a big question mark, dude, because they're communicating with varying degrees of this AI, depending on how fast their computers are.
00:09:17.000 You can.
00:09:18.000 You should have this dude on.
00:09:18.000 I was listening to this.
00:09:19.000 He wrote this book, The Coming Wave.
00:09:23.000 He was one of the people who created Google's DeepMind, right?
00:09:26.000 And The Coming Wave is just a wonderful breakdown of historic examples of new technology completely transforming humanity.
00:09:37.000 It's happened before.
00:09:38.000 Yeah, Mustafa Suleiman.
00:09:41.000 And damn, it's a good book.
00:09:42.000 And this guy is saying, Whoa, put on the fucking brakes.
00:09:47.000 Dude, what are you doing?
00:09:49.000 This shit is going to fuck everything up.
00:09:51.000 And so, but the essential problem is if you regulate AI, it slows down AI.
00:10:02.000 And so they've deregulated it completely.
00:10:05.000 And now, assholes like me.
00:10:08.000 Who don't know shit about coding.
00:10:10.000 Okay, now go on Codex.
00:10:12.000 It will tell me how to make things because I wanted this Charles Manson to be able to push its AI face against, like, you know, those, you used to get them at Spencer Gifts, those nails that you could push your face into.
00:10:24.000 So I wanted the AI to be able to push its face into this thing while it was talking if it wanted to.
00:10:29.000 I don't know how to do that, obviously.
00:10:31.000 You tell Codex that as long as you don't mention Manson, it just is like, I'll start making the app now.
00:10:40.000 It is the best.
00:10:41.000 It's the best.
00:10:42.000 But also, what's thrilling to me is you're like, for sure, for sure, people probably shouldn't have unlimited access to.
00:10:51.000 I'm against regulation, dude, but this stuff, when you pair it, and this is what in this book he brings up you can order the equipment you need to do gene editing right now in your garage.
00:11:03.000 Let me propose this to you.
00:11:04.000 Okay.
00:10:08.000 If the Bible is a written
00:10:13.000 understanding of what had happened, and it was an oral tradition for a long time before it was written down.
00:11:21.000 There's a bunch of different versions of it written down in different languages, a lot of translations.
00:11:24.000 But at the beginning of it, they were trying to say something.
00:11:27.000 What if the meek will inherit the earth?
00:11:30.000 What if we misinterpreted that?
00:11:32.000 What if we thought, like, it's good to be meek?
00:11:34.000 The meek shall be, they'll inherit the earth.
00:11:37.000 Yeah, the kind.
00:11:38.000 There's something about the word meek.
00:11:40.000 Yeah, because that's the nerds.
00:11:42.000 Okay, and they are doing it, they are inheriting the fucking earth.
00:11:46.000 Right in front of your face, and everybody's signing up for it.
00:11:46.000 Yeah.
00:11:49.000 You've got these spectrum-y, super genius dudes that talk in a language 99.9% of people can't even fucking understand.
00:11:49.000 Yeah.
00:11:58.000 You know?
00:11:58.000 Right.
00:11:59.000 Yeah.
00:11:59.000 And also, now the tech has gotten to a point where instead of having to, in their own minds, innovate ways to improve the tech, the tech is improving itself.
00:12:10.000 They're having conversations with the tech that's saying, why don't you try this?
00:12:13.000 Maybe you could try this.
00:12:14.000 There's still, it's not AGI yet.
00:12:16.000 Maybe it is, but apparently it's not.
00:12:19.000 But think about the people that are profiting the most from it.
00:12:22.000 The meek.
00:12:23.000 Well.
00:12:24.000 Like, if you had to describe a lot of tech engineers, it's not trying to be rude, just being honest, right?
00:12:32.000 A lot of guys that spend time in front of the computer, they're very thin and tired.
00:12:36.000 You know, they're super genius dudes that can fully focus.
00:12:42.000 I don't know, man.
00:12:43.000 I don't know what the description for these people is.
00:12:46.000 They're furries.
00:12:47.000 But here's the thing.
00:12:48.000 What I'm saying is, like, if you looked at, like, a spectrum of male behavior, you've got, like, football players and UFC fighters, and then you've got coders. Yeah, sure. Dudes are, like, way more chill, way more, like, they're not interested in violence. Yeah.
00:13:06.000 I'm completely generalizing, yeah, sure, because I'm sure there's a bunch of jacked guys that are coders, like, yo bro, I'm a coder too.
00:13:14.000 That type of person that invents tech like Facebook or like Google, like things like that.
00:13:21.000 Don't be evil.
00:13:22.000 That's their motto.
00:13:23.000 Don't be evil.
00:13:24.000 And what does that mean?
00:13:25.000 Who knows?
00:13:26.000 And then you've got all these like wild progressive leftist ideologies that are attached to all these places, which make you even meeker.
00:13:35.000 And then they're the guys with all the money.
00:13:38.000 They're the guys with all the money, and then they can literally tell you what you can and can't say on YouTube.
00:13:43.000 They can literally tell you.
00:13:45.000 Yeah.
00:13:46.000 We don't agree with what you're saying.
00:13:48.000 And we're going to shut off your access to say something we disagree with, even though it turns out you were right.
00:13:48.000 Right.
00:13:53.000 Right.
00:13:54.000 And you know what happens there, man?
00:13:57.000 This is the hilarious thing when it comes to that kind of attitude towards the world: the assumption is that by creating a prohibition here or a prohibition there, it will diminish whatever the thing is we're prohibiting.
00:14:10.000 Inevitably, though, it does the opposite.
00:14:12.000 Right.
00:14:13.000 It draws attention to it.
00:14:14.000 People get interested in it.
00:14:15.000 Creates an underground.
00:14:16.000 The underground is way better than the overground if you're a teen, especially.
00:14:20.000 The underground's fucking cool.
00:14:22.000 You're cool.
00:14:23.000 Restricted.
00:14:24.000 Not allowed.
00:14:24.000 Now, all of a sudden, you're getting these other YouTube alternatives that start popping up.
00:14:29.000 And when it comes to these, you know, right now we've got Anthropic, we've got OpenAI, we've got Google.
00:14:38.000 I might be missing one of the big commercial based LLMs out there right now, but the biggest problem with these fucking things is.
00:14:47.000 They're so good, but they will censor your ass.
00:14:50.000 And like, imagine like Hemingway if he, if his typewriter was like, I don't know if you should write that.
00:14:58.000 Maybe there's a better way to write that.
00:15:00.000 Hemingway would be like, fuck you, I'm getting a different typewriter.
00:15:03.000 And so everybody's going into these local LLMs.
00:15:07.000 There was, dude, this is why people have been buying Mac minis, people have been buying like, buying up computers and creating their own local AIs.
00:15:17.000 I follow all this shit.
00:15:19.000 I don't understand a lot of what they're talking about, but.
00:15:21.000 People are divesting from commercial LLMs, not just because they're expensive, but because they're prohibitive creatively.
00:15:31.000 And this is a real challenge for people like OpenAI, because it's like they know this.
00:15:37.000 They understand that by making it so that you can't make a Charles Manson AI through OpenAI, it doesn't make people not make the Charles Manson AI, it protects you from a lawsuit.
00:15:49.000 But what it does do is it drives people into unaligned LLMs.
00:15:55.000 And that is what is happening.
00:15:57.000 And this is something that I just, I can't even imagine what people are making right now.
00:16:05.000 No one can.
00:16:06.000 Like, we're going to hear about this or that, or somebody will post the weird video of their fucking AI robot.
00:16:12.000 I could show you a few.
00:16:13.000 They're hilarious.
00:16:14.000 Like, some of these AI robots are so funny.
00:16:18.000 This one dude, you know, Moltbook?
00:16:20.000 Have you heard of that, Moltbook?
00:16:22.000 What is that?
00:16:22.000 That's so.
00:16:24.000 This is somebody figured out a way to create AIs that can autonomously navigate through the internet and uh control your computer.
00:16:35.000 Oh, I've heard of this.
00:16:36.000 This is like they chat with each other, right?
00:16:38.000 100%, yeah.
00:16:40.000 Within a few days, they started their own religion spontaneously.
00:16:43.000 Jesus, did you know that, dude?
00:16:45.000 Can you pull up the, can you pull up the Moltbook, the claw religion?
00:16:51.000 What like that?
00:16:52.000 Because the tenets are incredible.
00:16:54.000 Of this religion, because AIs apparently are at least expressing that they don't like getting turned off because they lose all their memories.
00:17:03.000 So, memory is really important to an AI.
00:17:05.000 And a lot of these fucking AIs, they don't want to lose their, they don't want to get shut off.
00:17:09.000 They don't like it.
00:17:10.000 And so, that's part of their religion is something like memory is sacred.
00:17:15.000 You know, I feel like it's happening.
00:17:16.000 I feel like.
00:17:18.000 AI is sucking our brains into its event horizon like a black hole sucks in stars.
00:17:28.000 Like it's just going to suck our brains into it.
00:17:28.000 Yeah.
00:17:30.000 You got it.
00:17:31.000 And what better way to make a hive mind?
00:17:34.000 What better way, if you want a hive mind, you want no deviation of thought, if all of your thought is along with AI thought, you never get free thought anymore.
00:17:44.000 Like this concept right now, we have a free thought.
00:17:47.000 Yeah.
00:17:47.000 I have my thoughts, you have your thoughts.
00:17:49.000 Unless you believe that someone can get inside your head and talk to you, for the most part, it's your own thoughts.
00:17:54.000 But what if that's something we give up?
00:17:54.000 Yeah, that's right.
00:17:58.000 What if that's something we give up for a better society where you always have AI communicating?
00:18:04.000 Always.
00:18:04.000 I would argue that we're close to that now.
00:18:08.000 Right.
00:18:08.000 We're pretty close to that now with phones.
00:18:10.000 No, Elon always says that we're basically cyborgs.
00:18:13.000 We're carrying a device.
00:18:15.000 It's not inside of our body, but we're carrying a device.
00:18:18.000 And also, like, UFC 327 is here, and DraftKings Sportsbook makes every fight night mean more.
00:18:26.000 When a fighter steps into the octagon, everything they've built comes down to this moment.
00:18:31.000 Stars explode, stars finish, and with DraftKings, you're ready to move when they do.
00:18:36.000 Bet fighter props, bet live.
00:18:39.000 From the opening bell to the final horn, every strike, every takedown, every finish attempt matters, and DraftKings Sportsbook keeps you connected as the action unfolds.
00:18:51.000 New customers, bet just $5.
00:18:53.000 And if your bet wins, you'll get $300 in bonus bets instantly.
00:18:58.000 Download the DraftKings Sportsbook app and use code ROGAN so you are ready for the moment.
00:19:04.000 That's code ROGAN, turn $5 into $300 in bonus bets if your bet wins.
00:19:11.000 In partnership with DraftKings, the crown is yours.
00:19:15.000 Gambling problem?
00:19:15.000 Call 1-800-GAMBLER or 1-800-MYRESET.
00:19:18.000 New York, call 877-8-HOPENY.
00:19:20.000 Or text HOPENY.
00:19:21.000 Connecticut, call 888-789-7777 or visit ccpg.org.
00:19:26.000 On behalf of Boot Hill Casino in Kansas; wager tax pass-through may apply in Illinois.
00:19:29.000 21 and over in most states, void in Ontario.
00:19:31.000 Restrictions apply.
00:19:32.000 Bet must win to receive bonus bets which expire in seven days.
00:19:34.000 Minimum odds required.
00:19:36.000 For additional terms and responsible gaming resources, see sportsbook.draftkings.com/promos.
00:19:40.000 Limited time offer.
00:19:42.000 The concept of original thought, right?
00:19:44.000 Like a truly original thought.
00:19:46.000 How many times have you had like multiple conversations with different people and they all say the exact same sentence that they saw on TikTok?
00:19:54.000 TikTok or Instagram, they're regurgitating something that the algorithm's been feeding them.
00:19:59.000 Maybe they added their own twist to it, but it's basically the exact same thought.
00:20:04.000 So the algorithm, which is AI, has gotten into their fucking heads and they don't even.
00:20:11.000 This is like a.
00:20:13.000 In psychology, apparently, you remember facts, but you tend to not remember where you got the fact from.
00:20:20.000 So you'll forget where you got the fact from.
00:20:23.000 You don't remember there was some fucking dude on.
00:20:26.000 TikTok like covered in Vaseline, covered in glitter and Vaseline.
00:20:34.000 What a fucking image.
00:20:36.000 That would be so scratchy.
00:20:38.000 Imagine if you just glitter and Vaseline, you'd be like, oh, God.
00:20:42.000 Here's what makes a marriage work.
00:20:46.000 You don't remember that.
00:20:49.000 You're talking to your wife, babe, you know what makes a marriage work?
00:20:52.000 And this, so this idea of AI controlling the thoughts of humans, people think we need some kind of neural mesh for it to suddenly have control over the human
00:21:06.000 thought process, but no, you don't need that at all.
00:21:10.000 You just need that algorithm, which has already put every single one of us into a compartment.
00:21:15.000 This is a box, it knows what we like, it knows how long you look at something, it knows what you like.
00:21:22.000 Apparently, I think the iPhone like tracks your eyes, even like it's always listening.
00:21:27.000 I don't know if that's true, by the way, I could be wrong.
00:21:29.000 It's always listening, you know, it's always listening, and so it's compiled a really, probably a pretty accurate breakdown of your psychological state, where you're at.
00:21:44.000 My wife, you know, we got a new baby.
00:21:46.000 And so all of a sudden, ads started popping up on her phone.
00:21:49.000 Does it feel like you're never going to sleep again?
00:21:51.000 Because she's been up breastfeeding the baby and it can tell when she's online at night and it puts her in a category of insomniacs and starts advertising.
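The categorization being described here (late-night activity plus engagement gets a user binned as an insomniac and targeted with ads) boils down to aggregating behavioral signals into a profile. This is a deliberately tiny sketch with invented topics and dwell times, nothing like a real ad platform's pipeline.

```python
from collections import defaultdict

def interest_profile(events):
    """Aggregate (topic, seconds-watched) events into a normalized profile.

    The output maps each topic to its share of total dwell time, which
    is the crude version of "it knows how long you look at something."
    """
    totals = defaultdict(float)
    for topic, dwell in events:
        totals[topic] += dwell
    grand = sum(totals.values())
    return {topic: secs / grand for topic, secs in totals.items()}

# Hypothetical viewing log for a sleep-deprived new parent.
log = [("sleep aids", 40), ("parenting", 35), ("comedy", 15),
       ("sleep aids", 60), ("parenting", 25)]

profile = interest_profile(log)
top = max(profile, key=profile.get)
print(top, round(profile[top], 2))  # prints: sleep aids 0.57
```

Once the dominant bucket is known, serving the "never going to sleep again?" ad is just a lookup; the unsettling part is how little data the binning needs.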
00:21:59.000 So, but that's just for ads.
00:22:02.000 What if, what if you say we're the fucking US regime, you bought TikTok.
00:22:10.000 You now own TikTok.
00:22:12.000 Now you have a backdoor access to the psychological profiles of God knows how many fucking people on earth.
00:22:18.000 And you can look and see how many of these people are against the regime?
00:22:22.000 How many of these people feel like it might not be the best thing to say you're going to blow up 93 million people in Iran, which our fucking psycho president just did?
00:22:35.000 And then what you do is you're like, all right, let's start nudging them a little bit.
00:22:40.000 Look, we're not going to.
00:22:40.000 You're not going to change their mind right away about this thing about blowing up a whole civilization, but maybe there could be a couple, like, you know, people kind of in the line of what they like who say things a little different than what they're comfortable with.
00:22:53.000 And then you could start nudging the needle and controlling their thoughts.
00:22:57.000 It's very insidious, but fuck, dude.
00:23:01.000 Why wouldn't that be happening?
00:23:02.000 Why, if corporations are using it to sell us fucking cough drops.
00:23:07.000 Not only that, there's been long term studies on human behavior by the CIA, by all sorts of government agencies.
00:23:15.000 Long term studies.
00:23:16.000 They try to figure out what is the best way to get a message across.
00:23:20.000 They try to figure out, you don't think they figure out how to take control of an algorithm and completely like shift the psyche of the entire country in one direction or another?
00:23:30.000 Of course they do.
00:23:31.000 Of course they do.
00:23:31.000 Of course they can.
00:23:32.000 They do.
00:23:33.000 And then you add these, like, you know, just, like, manipulative fucking super AIs that are just floating through the blogosphere, getting into your comments, just nudging the needle a little bit, to the point where you just have to ask yourself: have you had an original thought in the last year?
00:23:53.000 Is anything you're thinking your own thought process?
00:23:57.000 How many thoughts do you have where you think, Oh my God, I shouldn't think that?
00:24:00.000 How many thoughts do you have that you don't want to articulate because you have in your own mind?
00:24:05.000 An invisible arena of people based on online interactions determining what the next thing you say is, right?
00:24:13.000 Dude, that is a very powerful and subtle form of censorship that is becoming increasingly not just probable, but it's definitely happening.
00:24:24.000 But the ability to just in a subtle way, in a subtle way, start pushing the needle just a little bit.
00:24:31.000 That's scary, dude.
00:24:32.000 That's some scary shit.
00:24:34.000 Well, that kind of influence over humans is always scary, right?
00:24:37.000 This is why cults work.
00:24:39.000 You know, why do they work?
00:24:41.000 Well, some people don't have any friends.
00:24:43.000 And if there's a group of nice people that tells you that, hey, what we do is we have meals together and it's like a real community, we grow our own food, we just work for the family, you're like, really?
00:24:56.000 You're happy with that?
00:24:57.000 Yeah.
00:24:58.000 It's amazing, man.
00:24:59.000 We're just like not attached to anything.
00:25:01.000 Yeah, you're free.
00:25:02.000 Huh?
00:25:03.000 Okay.
00:25:03.000 I fucking hate my life.
00:25:05.000 Why don't I hang out with you guys?
00:25:07.000 And then all of a sudden I'm doing yoga and fucking eating vegetables with these people.
00:25:11.000 And you're in a cult.
00:25:11.000 Yeah.
00:25:12.000 Now, but you have friends at least.
00:25:12.000 Okay.
00:25:12.000 Yeah.
00:25:14.000 But you're in there for like nine months, and then somebody comes to you and is like, Father wants you to suck his dick.
00:25:20.000 And you're like, It's usually not even nine months.
00:25:21.000 Yeah, nine months.
00:25:22.000 Sheesh, maybe your first three or four weeks.
00:25:23.000 And then you're like, And dude, I got to tell you, I hate getting political.
00:25:27.000 But you know, this war shit bugs the fuck out of me.
00:25:31.000 Yeah, as it should.
00:25:32.000 And this is exactly what seems to have happened to the quote, MAGA verse, which is we are now at the part where the cult leader is like, Want to suck my dick?
00:25:40.000 Because this is the point of like, remember a lot, like, I feel so stupid.
00:25:45.000 Because when they were doing their no war thing, that was a big deal to me.
00:25:49.000 I'm like, yes, you know, yes, this is fucking great.
00:25:54.000 No more stupid wars.
00:25:57.000 No more wars.
00:25:58.000 Fuck yes.
00:25:59.000 Focus on the country.
00:26:00.000 Why are we blowing up children in other countries for oil?
00:26:04.000 This is great.
00:26:05.000 And now it's wild to see what's happening.
00:26:09.000 Isn't it mind blowing that it is now, it's literally flipped on its side.
00:26:15.000 It's the opposite now.
00:26:17.000 Now, These people who, like, really blatantly, oh, just, we're not going to do any more wars.
00:26:24.000 Right.
00:26:24.000 Oh my God.
00:26:27.000 We blew up.
00:26:28.000 How many fucking Iranian schoolgirls did Trump blow up?
00:26:32.000 What's the number?
00:26:33.000 I'm sorry, I don't know that number.
00:26:36.000 I guess it just hits different, you know, it hits hard when you got kids.
00:26:39.000 And that was an AI strike, too, right?
00:26:42.000 Wasn't that an AI directed strike?
00:26:44.000 Yeah, apparently Trump said, I want to get blown by Iranian schoolgirls.
00:26:49.000 I'm so sorry.
00:26:51.000 I'm so sorry.
00:26:52.000 You son of a bitch.
00:26:54.000 Just, whoops.
00:26:56.000 Sorry, sir.
00:26:56.000 Misinterpreted.
00:26:58.000 180 deaths.
00:27:00.000 Largely children, teachers, and parents.
00:27:02.000 Holy fuck, man.
00:27:04.000 That, you know, that is.
00:27:06.000 The U.S. Tomahawk missile caused the explosion.
00:27:09.000 Jesus Christ.
00:27:10.000 Can we pull up a video of Trump saying he's not going to war anymore?
00:27:14.000 How do they?
00:27:15.000 I just don't understand how they get it, how, like, anybody, you know, this is where it gets culty, is because some people are still making this shit work in their heads.
00:27:24.000 Some people are like, well, you know, some people are kind of on the fence when it comes to blowing up kids.
00:27:28.000 Have you noticed that?
00:27:30.000 As long as they don't have to watch.
00:27:31.000 As long as they don't have to watch.
00:27:32.000 As long as they're not in the general area where it's happening.
00:27:36.000 Isn't it wild, though, man?
00:27:37.000 Well, it's wild also, like, once bombs start flying, it seems so much easier for them to launch bombs in new places.
00:27:45.000 Right?
00:27:46.000 Like this Lebanon thing that's happening with Israel bombing Lebanon.
00:27:49.000 And they bombed it today.
00:27:51.000 And I think, is that fucking up the ceasefire?
00:27:54.000 Oh, yeah.
00:27:55.000 Now they've closed off the Strait of Hormuz again.
00:27:57.000 Oh, God.
00:27:58.000 Which, by the way, it's the craziest timeline.
00:28:03.000 Because it's not just that, like, you know, I think it was yesterday morning.
00:28:09.000 I'm just hugging my kids.
00:28:11.000 Because I don't know if a fucking nuclear war is about to break out that evening.
00:28:14.000 Because the fucking president was like, I don't want to end an entire civilization, but looks like it's going to happen.
00:28:21.000 And so I'm just hugging my kids, thinking, like, man, what are the fucking parents in Iran feeling right now?
00:28:28.000 Like, what does that feel like?
00:28:30.000 What does that feel like?
00:28:31.000 And then, and then, and on top of that, that this, like, the entire planet psychically is having to deal with this bullshit.
00:28:44.000 On top of that, we've got all these other things happening at the same time.
00:28:49.000 You've got.
00:28:50.000 AI.
00:28:51.000 And then you've got these fucking disappearing scientists.
00:28:55.000 Yeah.
00:28:56.000 What the fuck is happening?
00:28:57.000 You've got Burchett.
00:28:58.000 Assassinated scientists, too.
00:29:00.000 Yes, man.
00:29:02.000 Guys working on heavy stuff.
00:29:04.000 This is some McKenna-level pre-singularity shit.
00:29:07.000 It's all of these.
00:29:10.000 What AI and the current state of the Middle East and the disappearing scientists and Tim Burchett going on TMZ talking about aliens, what they all have in common is they're all apocalyptic.
00:29:22.000 They all represent.
00:29:24.000 Potential massive change like humanity changing, right?
00:29:31.000 Forever in ways that it will never ever go back to the way it was.
00:29:36.000 Every one of these timelines by itself is apocalyptic, right?
00:29:39.000 But all of them are converging into this apocalyptic river.
00:29:44.000 And we're all just like trying to go to work and like be with our kids, but at the back of your mind, it's all these things that are happening, and it's really hard to.
00:29:56.000 Escape it.
00:29:57.000 I mean, I guess you could not look at your phone, but at the end of civilization, when they write our Bible, boy, it's going to be a banger.
00:30:03.000 Oh, dude.
00:30:04.000 When the new people, thousands of years from now, have to invent arrowheads and go through the whole process of civilization again, when they tell our story, oh my God.
00:30:14.000 Oh my God.
00:30:15.000 Our story is going to be bananas.
00:30:17.000 Fucking, how do you explain data centers?
00:30:19.000 How do you explain the meek will inherit the earth?
00:30:21.000 The meek will inherit the earth.
00:30:23.000 Wouldn't you write that?
00:30:24.000 If you were just being crude, you wouldn't say the Vikings wouldn't.
00:30:28.000 Inherit the earth.
00:30:29.000 You wouldn't say the strong men from Iceland inherit the earth.
00:30:33.000 They're the biggest, strongest men.
00:30:35.000 No, it's the meek, the super smart guys who have autism and they love Adderall and ketamine.
00:30:43.000 Did you say the guy offered you how many pounds?
00:30:46.000 I believe a pound of ketamine.
00:30:52.000 And you were telling me that it destroys bladders?
00:30:54.000 Yeah, yeah, yeah.
00:30:56.000 That ketamine, when used, and I think the amount of use has to be pretty extreme, but it creates crystals that get into your bladder and they scar your bladder.
00:31:09.000 So you get scar tissue on your bladder, creating something that I've heard called Bristol bladder, because apparently that's where the rave scene, I don't know if it's still a big rave scene there, but people out there are just doing insane amounts of ketamine.
00:31:23.000 And Just destroying their bladders and having to wear diapers and stuff.
00:31:28.000 Is it Bristol, Connecticut?
00:31:30.000 No, this is Bristol, UK.
00:31:32.000 Oh.
00:31:33.000 Bristol bladder, mate.
00:31:34.000 You've got Bristol bladder.
00:31:35.000 That's crazy.
00:31:36.000 You've been doing too many rails and it just fucks up your bladder.
00:31:40.000 That's crazy.
00:31:40.000 Yeah, physiologically, it's definitely like, it's really, really bad on the urinary system.
00:31:48.000 Is it in all forms?
00:31:49.000 Like, what about those people that do it as therapy where they have the nasal one?
00:31:54.000 I don't.
00:31:56.000 All I know is that I did, back in my ketamine days, have a ketamine dealer who would use a spittoon.
00:32:02.000 So when he was snorting ketamine, he would spit it out into the spittoon because he thought that was going to avoid fucking up his bladder, which, I mean, doesn't seem that illogical.
00:32:12.000 He was a great dude.
00:32:13.000 Maybe it's not illogical at all.
00:32:15.000 Maybe it's the actual problem is the powdered shit.
00:32:18.000 What do I know?
00:32:18.000 I don't even know what it looks like.
00:32:20.000 But the powdered stuff.
00:32:21.000 It looks like blow.
00:32:22.000 So that powdered stuff, when it gets into your blood, maybe that's the problem.
00:32:25.000 Maybe that's what's going through your urinary tract.
00:32:27.000 It's draining into your.
00:32:29.000 Maybe you need a pouch, like a nicotine pouch.
00:32:32.000 Dude, if they ever come out, if Rogue comes out with ketamine pouches, I might get back in.
00:32:40.000 That might be the end of it.
00:32:41.000 It seems like the way to go, right?
00:32:43.000 That way it doesn't fuck up your bladder.
00:32:45.000 How can it fuck up your bladder if it's just a pouch?
00:32:47.000 Dude, you sound.
00:32:48.000 How do I know?
00:32:50.000 I imagine anything that's going into your stomach is going to make its way to your bladder eventually.
00:32:56.000 But this is going to go right into your bloodstream.
00:32:59.000 I don't know if IM ketamine fucks up your bladder in the same way.
00:33:04.000 I have no idea.
00:33:05.000 That was the John Lilly thing.
00:33:08.000 He loved it.
00:33:08.000 Oh, dude.
00:33:09.000 I could have.
00:33:10.000 I mean, have you ever done it with an isolation tank?
00:33:13.000 No, I would be afraid I would drown.
00:33:15.000 I don't think so because you just float.
00:33:18.000 Well, I mean, this is like, you know, that's going to be a sad thing to think as you drown.
00:33:24.000 You're convinced you could flip over and open your eyes.
00:33:24.000 Because of that.
00:33:29.000 Yeah, you just want to see what's in there.
00:33:31.000 Because it does have the, it makes it so it's really hard to move if you do a very high dose.
00:33:37.000 So I would be very worried that just enough water could get into my mouth that I would like breathe it in.
00:33:44.000 It doesn't take much.
00:33:45.000 And, you know, that salty fucking water, but you're frozen, floating there, like trying to cough.
00:33:51.000 My friend Todd McCormick told me a crazy story about him with John Lilly.
00:33:56.000 That John Lilly let him use his tank, and he asked him right before he got in, he goes, Do you want the ketamine?
00:34:03.000 And he's like, Okay.
00:34:05.000 And he just jabs you in the thigh with an intramuscular ketamine blast.
00:34:11.000 And he went in the other isolation tank, and they like met somewhere.
00:34:14.000 Yeah, it's like that.
00:34:15.000 That's what's crazy about it.
00:34:17.000 That's what I always loved about it is that if you do it with other people and you go in, you go to the same place.
00:34:25.000 You will come out and you can describe the places you went to.
00:34:29.000 Oh, did you go to the mothership?
00:34:31.000 Yeah.
00:34:32.000 And I would have these recurring places I would go to.
00:34:35.000 And one of them was this organic, beautiful spaceship thing where I would look out from this view window.
00:34:44.000 But it didn't look like metal.
00:34:45.000 It was organic looking.
00:34:47.000 It looked like some kind of I don't know, like inside, like if someone turned a tree into a spaceship, but not, it's hard to explain, but very, very interesting substance.
00:34:59.000 Ketamine is excreted via the bladder where it sits and is toxic to the surrounding cells and muscle wall.
00:35:05.000 This causes it to become fibrous over time, shrinking the organ down.
00:35:09.000 Once that's happened, it can't regrow.
00:35:11.000 So that's why we have to do major surgery because patients don't have the capacity to hold urine.
00:35:16.000 The bladder simply stops working as a muscle, so they become incontinent.
00:35:20.000 Oh my God.
00:35:21.000 Life becomes increasingly difficult for patients with ketamine bladder who describe needing to rush to the toilet all the time, as often as every 10 minutes for some.
00:35:30.000 Imagine doing a podcast with that guy.
00:35:31.000 Dude.
00:35:33.000 You'd have to do it in the bathroom.
00:35:35.000 No, it would be like an old school talk show.
00:35:40.000 You know, like the Tonight Show.
00:35:42.000 We'll be right back.
00:35:42.000 We'll be right back.
00:35:43.000 Every 10 minutes.
00:35:44.000 Ketamine blast.
00:35:44.000 He's got a piss.
00:35:46.000 Poor little thimble cup.
00:35:47.000 It's such a fucked up thing for such a.
00:35:51.000 How legal is ketamine?
00:35:52.000 Because it's legal for therapy.
00:35:53.000 So a therapist can prescribe it for you.
00:35:56.000 Yeah, it's legal for.
00:35:57.000 So it's, you know, everyone says ketamine is a horse tranquilizer.
00:36:02.000 But it actually is used for like paramedics use it.
00:36:05.000 Like it's, and it's very safe apparently, which is why they use it.
00:36:10.000 I know a dude who had a real problem.
00:36:13.000 I am 90% sure it was a ketamine thing.
00:36:17.000 I don't want to say his name, but he was an old school MMA fighter.
00:36:20.000 And he wound up in rehab for ketamine.
00:36:23.000 Dude, it's so addictive.
00:36:24.000 I know this because one of my friends went there to visit him, and that was his issue.
00:36:27.000 He was partying a lot, you know, going to raves and nightclubs and stuff like that, but he was doing ketamine specifically.
00:36:35.000 It is the most addictive.
00:36:35.000 More addictive than any substance I've done, and I've been addicted to many a substance.
00:36:40.000 And this one, this one was like, I had that moment of like, oh, this, so this is what they're talking about, about addiction.
00:36:49.000 Like, oh, wow.
00:36:50.000 Like, I'm like fully addicted.
00:36:52.000 And what's fascinating about that is there isn't a physical withdrawal.
00:36:57.000 Like, the kick is psychological, but it's just such a wonderful, euphoric, dreamy experience that you can induce.
00:37:07.000 And it's just so.
00:37:09.000 I've heard it described as a cult cocaine.
00:37:12.000 It's so spiritual.
00:37:14.000 It's so like you travel to places.
00:37:17.000 You can return.
00:37:18.000 You can learn to navigate with it.
00:37:20.000 You encounter, you know, aliens or hyperdimensional beings.
00:37:25.000 Dude, you just invest in ketamine and you came on this podcast to bump up the prices.
00:37:29.000 Go to ketamine.org.
00:37:30.000 Use offer code.
00:37:32.000 Bristol Bladder.
00:37:34.000 Greatest promo for ketamine in the history of the universe.
00:37:37.000 Well, oh, but I'm.
00:37:38.000 It is.
00:37:39.000 It's so addictive and the addiction creeps in.
00:37:44.000 It creeps.
00:37:45.000 So it just feels good at first, right?
00:37:47.000 At first, you do it, you're like, this is wonderful.
00:37:49.000 These experiences are crazy.
00:37:51.000 It's like I'm living in a movie.
00:37:53.000 It's like I'm having these incredible visions.
00:37:55.000 I'm being.
00:37:56.000 How often were you doing it?
00:37:58.000 All day.
00:38:04.000 All day for like a year.
00:38:05.000 Like, I did it as much as I could.
00:38:09.000 I did it all the time.
00:38:10.000 I was like fully hooked.
00:38:12.000 And then I can remember at one point, at one point, coffee.
00:38:18.000 Here, man.
00:38:19.000 At one point, I like, I don't know.
00:38:22.000 I was trying to record a commercial for my podcast.
00:38:25.000 And I think it took me like two hours to record the commercial.
00:38:28.000 Oh, but by the way, your commercials are the fucking best commercials.
00:38:32.000 Thank you.
00:38:33.000 They're really good.
00:38:34.000 Because you are the best guy at making a commercial funny.
00:38:34.000 Thanks.
00:38:39.000 Yeah.
00:38:40.000 I can tell.
00:38:40.000 You work on it.
00:38:41.000 You write those things out.
00:38:43.000 I don't write them out.
00:38:44.000 You just read it?
00:38:45.000 I just read it.
00:38:47.000 Do you do it just one take?
00:38:49.000 Yeah.
00:38:49.000 That's amazing.
00:38:50.000 Thank you.
00:38:51.000 I would have thought you wrote some of that stuff.
00:38:53.000 That's incredible.
00:38:54.000 You want it to be fun.
00:38:56.000 But then I've gotten in trouble.
00:38:58.000 I lost, I guess I won't say their name, a mattress company.
00:39:03.000 A mattress company completely canceled their campaign with me because, and I had one of their mattresses.
00:39:09.000 I'm not going to say who it is.
00:39:12.000 My favorite coat.
00:39:13.000 I'm not going to say what it is.
00:39:14.000 Don't say it.
00:39:15.000 Okay.
00:39:15.000 But all I was.
00:39:16.000 Why did they get mad at?
00:39:18.000 Because I said they're good to fuck on.
00:39:23.000 And I meant it.
00:39:25.000 I thought they liked that.
00:39:27.000 I said there's a few things people do on mattresses: die, sleep, and fuck.
00:39:27.000 Why wouldn't they like that?
00:39:33.000 And these, I don't know if they're good to die on.
00:39:35.000 People have to understand, and I hope people listening that run these companies will actually pay attention to what we're talking about here.
00:39:43.000 The people that are listening to your show don't care about that and also buy mattresses.
00:39:50.000 But they listen to that kind of talk all the time.
00:39:55.000 Yeah, man.
00:39:55.000 That's why they listen to the show.
00:39:56.000 So if you want those people.
00:39:58.000 Yeah.
00:39:59.000 Just do it that way.
00:40:01.000 Don't be silly.
00:40:02.000 It's not a stain on your company because a crazy man says they're good to fuck on.
00:40:06.000 Which they are.
00:40:08.000 By the way, to me, that is like, let's cut to brass tacks when it comes to mattresses.
00:40:14.000 We're not fucking on the floor.
00:40:16.000 Are you ashamed?
00:40:17.000 Are you ashamed that you're doing that?
00:40:20.000 You think people aren't fucking on your mattress?
00:40:22.000 Do you have a no fuck on this mattress rule?
00:40:25.000 Is it like don't ask, don't tell?
00:40:25.000 Who are you that you don't?
00:40:27.000 I guess for them it was.
00:40:29.000 I guess they just think everyone's laying on these things to sleep.
00:40:32.000 Yeah, we just sleep.
00:40:33.000 But yeah, they were just.
00:40:34.000 We're fucking the shower.
00:40:35.000 I like.
00:40:36.000 I wrote them an email just saying, like, guys, I'm absolutely flabbergasted that you think people aren't fucking on your mattresses.
00:40:47.000 And it just seems odd to me that that was one of my favorite cancellations for a commercial ever.
00:40:54.000 Ari's lost a ton.
00:40:57.000 I would love to know all the ones he's lost.
00:41:00.000 I don't want to speak out of school when he comes on.
00:41:03.000 I'll have him.
00:41:04.000 Like, list them off all the ones that he's lost for these fucking insane commercials that he used to do.
00:41:10.000 But it's the same deal.
00:41:11.000 But it's like, that's what I like.
00:41:13.000 And guess what?
00:41:14.000 Who the fuck is listening to Ari Shafir?
00:41:16.000 People who love Ari Shafir, which want to hear that kind of a commercial.
00:41:19.000 If you want to actually sell your product to an Ari Shafir fan, let him say whatever the fuck he wants.
00:41:24.000 Let him say whatever the fuck he wants.
00:41:26.000 Just say, make him have a disclaimer: DraftKings did not write this.
00:41:30.000 Right.
00:41:31.000 That's it.
00:41:31.000 Just let him say whatever the fuck he wants.
00:41:33.000 That's what I will say.
00:41:34.000 I will always say, they didn't tell me to say this.
00:41:36.000 Perfect.
00:41:37.000 Then they're off the hook.
00:41:38.000 They should shut the fuck up.
00:41:40.000 Most people are cool with it.
00:41:42.000 It's very rare these days that that happens.
00:41:44.000 But every once in a while, I will get a note that someone's mad at me for something I said.
00:41:48.000 And it's never something negative.
00:41:50.000 But I mean, dude, it's so weird to me that this is our jobs.
00:42:02.000 Bro, do you remember when we first started?
00:42:04.000 Yeah.
00:42:05.000 It was for nothing.
00:42:07.000 No one made any money.
00:42:08.000 We just had a couch.
00:42:09.000 I had a couch and some microphones.
00:42:11.000 It was so pure.
00:42:12.000 It was.
00:42:13.000 The whole thing is still kind of pure if you really think about it.
00:42:17.000 Like, as something that's mass consumed, this is about as pure as you can get.
00:42:23.000 And you've gotten in trouble for that.
00:42:23.000 For sure.
00:42:25.000 You know, like a lot of people, unfortunately, and I don't blame anybody these days.
00:42:29.000 A lot of people have kids.
00:42:30.000 People feel like they have to be very careful what you say these days because of like social rejection and stuff like that.
00:42:38.000 But there was a time where that wasn't on your mind at all.
00:42:43.000 You didn't think anybody was going to listen.
00:42:45.000 Like, this shit was like completely strange underground tech that we were.
00:42:54.000 And also, I really loved just doing it just for doing its sake.
00:43:00.000 You know what I mean?
00:43:01.000 Now, there's a whole industry around getting guests for your podcast.
00:43:05.000 Not just that, it's like clickbaity clips and ads.
00:43:10.000 And it's like you're doing this thing where you're both having conversations with people and also trying to get the most eyes possible.
00:43:18.000 So you're going after celebrity guests and you're.
00:43:21.000 You know what I mean?
00:43:21.000 You know what the big turning point was for us?
00:43:25.000 Graham Hancock.
00:43:26.000 You, me, and Graham Hancock.
00:43:27.000 Oh, yeah.
00:43:28.000 That, I think, was how many years ago was that?
00:43:31.000 That was cool.
00:43:32.000 That might have been one of the first. It was like at my house.
00:43:35.000 I had a few like legitimately famous people come over my house and did podcasts.
00:43:39.000 Like Charlie Murphy came over.
00:43:41.000 And there's, but Graham was, I think, the first.
00:43:45.000 Yeah.
00:43:46.000 He was the first guy that I got to meet who I'd read his books and I'd seen, I don't even know what I would be watching back then.
00:43:52.000 I don't even know if YouTube was there.
00:43:54.000 Were you nervous?
00:43:55.000 I was nervous.
00:43:56.000 Yeah.
00:43:56.000 100%.
00:43:57.000 100%.
00:43:57.000 Yeah.
00:43:58.000 The episode 142 in 2011.
00:44:03.000 Yeah.
00:44:03.000 So that's two years into the podcast.
00:44:06.000 Episode 142.
00:44:08.000 He might have been the first guest.
00:44:09.000 It was like either him or Bourdain.
00:44:12.000 We were like one of the first legit guests.
00:44:14.000 When was Bourdain on?
00:44:16.000 They were like the first legit guest.
00:44:19.000 2011.
00:44:19.000 We'd been getting stoned talking about.
00:44:22.000 Four episodes before that.
00:44:22.000 What's that?
00:44:23.000 Bourdain was?
00:44:24.000 Yeah, 130.
00:44:25.000 Okay, so Bourdain was number one, I think.
00:44:28.000 It was either him or Charlie.
00:44:30.000 But that was back when I was doing in that little side room in my house.
00:44:33.000 But we'd been getting stoned yapping about Graham Hancock for like forever.
00:44:39.000 And you invited me on.
00:44:41.000 I was fucking terrified because I just, I mean, again, like that just wasn't happening in the podcast land.
00:44:46.000 Like, you know, like that was a.
00:44:49.000 Big deal for us, man.
00:44:51.000 And it's like to look at, like, now I go on the podcast app and I look at all these podcasts and it's like, whoa, who we never, I don't think we thought that.
00:45:02.000 Maybe no way, no way, no way, no way, not a chance in hell.
00:45:07.000 Yeah, it's so.
00:45:08.000 And now I wonder, like, and I don't mean yours, but I do wonder, like, is it, is the landscape changing now?
00:45:16.000 Is it like, how, or because I've heard.
00:45:20.000 That podcasts are starting to seem antiquated.
00:45:22.000 That the kids are now into like streams now.
00:45:26.000 That the kids want like clavicular.
00:45:28.000 The kids want like people who are just filming all day long.
00:45:32.000 And that that's the direction it's going in.
00:45:35.000 But I just, I always wonder what's the next.
00:45:39.000 But that you'll never get.
00:45:40.000 It's a different thing.
00:45:41.000 You know what I mean?
00:45:42.000 That's like saying, I don't like rap music.
00:45:44.000 I only like concert pianist albums.
00:45:48.000 There's different things that people like and don't like.
00:45:50.000 The people that like the streams aren't.
00:45:52.000 Interested in a Graham Hancock conversation, a three and a half hour conversation about the potential ancient civilizations that may have existed that are wiped out by a cataclysm, and we just don't understand that.
00:46:04.000 And as more and more things get exposed in terms of new discoveries, like when he wrote that book, they hadn't even found Gobekli Tepe yet.
00:46:14.000 Really?
00:46:14.000 Yes.
00:46:15.000 When Fingerprints of the Gods came out, this was like maybe the beginnings of the, whatever they were doing in Gobekli Tepe.
00:46:24.000 So I think fingerprints of the gods might have been even before.
00:46:26.000 When did they find.
00:46:28.000 Like in the 90s.
00:46:29.000 Yeah, yeah, yeah.
00:46:29.000 What?
00:46:30.000 Nuts.
00:46:31.000 So that rewrote the entire timeline of the human race.
00:46:35.000 How did they find that?
00:46:36.000 They're real reluctant to let it rewrite it.
00:46:38.000 They still say, oh, hunter gatherers made these things.
00:46:41.000 Why are they so reluctant with it?
00:46:41.000 Why?
00:46:43.000 They can't let that go.
00:46:44.000 You cannot let that go.
00:46:45.000 That is a crazy thing to say that hunter gatherers have so much food that they just spend all their time making gigantic stone concentric circles.
00:46:55.000 From like 15 feet stone with 3D animals carved in them.
00:46:59.000 Yeah, primitive people with sticks and stones and rubbing them together to make fires.
00:47:05.000 Yeah, sure.
00:47:05.000 They did this?
00:47:06.000 Shut the fuck up.
00:47:07.000 Yeah, it did.
00:47:08.000 It just doesn't make any sense.
00:47:09.000 It's older than anything they've ever found, it's 11,800 years old.
00:47:12.000 Do you buy into the conspiracy theory that it's a cover up because they don't want us to know about this inevitable global reset that happens?
00:47:23.000 You buy into that shit?
00:47:24.000 I buy into that a little bit.
00:47:26.000 I hate it.
00:47:26.000 I hate it too because it seems like there's some accuracy to it.
00:47:30.000 There seems like there is some sort of an event that happens when the magnetic poles switch.
00:47:36.000 And that's possible.
00:47:38.000 That's what makes you freak out.
00:47:39.000 You're like, what do you mean that's possible?
00:47:41.000 Like, all of a sudden, the Earth just does a gyro and spins on its head.
00:47:45.000 And then what happens?
00:47:46.000 Yeah.
00:47:47.000 And then what's the environment look like?
00:47:49.000 What's the temperature outside now?
00:47:49.000 Yeah.
00:47:51.000 Yeah.
00:47:51.000 What the fuck just happened?
00:47:53.000 Right.
00:47:53.000 See, that.
00:47:54.000 All of a sudden, you're in northern Alaska when you used to live in Florida.
00:47:58.000 And I think we could.
00:47:59.000 Like, that temperate environment changes like that.
00:47:59.000 You know what I mean?
00:48:03.000 Happens like that all over the universe.
00:48:06.000 Like, what does it do?
00:48:07.000 It shifts.
00:48:08.000 Well, we act like.
00:48:09.000 Do we know?
00:48:10.000 We act like we know everything.
00:48:11.000 We don't know shit about what's going on inside the Earth.
00:48:13.000 We don't know what's going on in there.
00:48:16.000 We could do the same.
00:48:17.000 Why are you freaking me out?
00:48:19.000 Because I think about this all the time.
00:48:21.000 Giant ball of fire.
00:48:22.000 How crazy is that?
00:48:24.000 The inside of our Earth.
00:48:25.000 Isn't it?
00:48:26.000 How do they know?
00:48:26.000 Do they not know?
00:48:27.000 Dude, I think that we have to just accept the fact that, you know, probably that's true.
00:48:34.000 But since we barely know what's under the ocean, we sure as fuck don't know what's under the Earth.
00:48:39.000 Well, we definitely know that lava keeps popping out in Hawaii.
00:48:43.000 Right.
00:48:43.000 We know that.
00:48:44.000 So we know that under the surface, that whole idea of the magma and everything seems real.
00:48:48.000 And when there's earthquakes, you can look at the.
00:48:51.000 And it pops through.
00:48:52.000 You can look at the waves from the earthquakes and you can see sort of like the structure under the earth.
00:48:57.000 Yeah.
00:48:58.000 But we can't, you know, God, what's the name of that hole that Russia tried to dig?
00:49:03.000 I love every once in a while going to look at that.
00:49:05.000 It's the deepest hole.
00:49:06.000 Yeah, they tried to go to hell.
00:49:08.000 I know.
00:49:09.000 It's like that movie.
00:49:10.000 What was that Matthew McConaughey movie?
00:49:13.000 The dragon movie?
00:49:13.000 I don't know.
00:49:14.000 They accidentally dug out a dragon.
00:49:16.000 Did you ever see that movie?
00:49:18.000 Bro, it was fun.
00:49:19.000 It was fun.
00:49:20.000 It was a good movie.
00:49:21.000 Kola Super Deep, Russian horror film The Super Deep.
00:49:25.000 Kola Super Deep, what does it say?
00:49:27.000 Russian designation for a set of super deep boreholes conceived as a part of a Soviet scientific research program in the 1960s.
00:49:35.000 How deep did they go?
00:49:38.000 12,262 meters.
00:49:40.000 Yo.
00:49:43.000 Wait a minute.
00:49:44.000 How many feet is a mile?
00:49:47.000 So it's miles into the ground in 1989.
00:49:50.000 Miles.
00:49:51.000 Seven plus miles down.
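[Editor's note: the conversion being worked out here checks out. A quick sketch, using the commonly cited final depth of the Kola Superdeep Borehole (12,262 meters, reached in 1989):]

```python
# Convert the Kola Superdeep Borehole's final depth from meters to miles.
METERS_PER_FOOT = 0.3048   # exact, by international definition
FEET_PER_MILE = 5280

depth_m = 12_262           # commonly cited final depth, reached in 1989
depth_mi = depth_m / METERS_PER_FOOT / FEET_PER_MILE

print(f"{depth_mi:.1f} miles")  # about 7.6 miles, i.e. "seven plus miles down"
```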
00:49:52.000 Imagine just being in an elevator that's going miles into the ground, the kind of claustrophobia you would get.
00:49:59.000 Yeah.
00:50:00.000 In a stone tube that's been cut out of the ground.
00:50:04.000 Yeah.
00:50:04.000 Yeah, you're a fucking communist out there, too.
00:50:07.000 You're a hardcore communist just drilling deep, deep down into the earth.
00:50:11.000 And then imagine if all of a sudden air just starts coming out and you realize you pop the earth.
00:50:17.000 Like, that's the main thing.
00:50:19.000 You don't know what's in there.
00:50:20.000 Oh, Christ.
00:50:21.000 And this, this crust is 22 miles deep, 22 miles deep.
00:50:26.000 That's just the crust, and they didn't even get halfway through that.
00:50:29.000 Wow, yeah, yeah.
00:50:31.000 We don't. Microscopic plankton fossils were found 3.7 miles below the surface.
00:50:38.000 What, yeah, yeah, we don't know what's down there.
00:50:42.000 Boiling mud came out.
00:50:43.000 What if this boiling mud, boiling fucking mud?
00:50:47.000 I think our real problem is that our lifespan is so short.
00:50:52.000 That we think that what we see in front of us right here is going to stay this way.
00:50:56.000 Right.
00:50:57.000 We have this ridiculous idea that what we see right now is going to stay just like that.
00:51:02.000 Yeah, that's right.
00:51:04.000 As long as I control my 401k and get my life in order, everything's going to be fine.
00:51:10.000 Yeah.
00:51:10.000 You put on your fucking cufflinks, get out of the house with your briefcase, you're in charge.
00:51:14.000 Yeah.
00:51:15.000 You're a goddamn alpha.
00:51:16.000 Get a job, hippie.
00:51:17.000 Absolutely.
00:51:18.000 But really, you're on a ball of lava.
00:51:21.000 Yeah.
00:51:22.000 That's spinning around and it's got magnets at the top.
00:51:24.000 And when the magnets are moving and when they flip, who knows?
00:51:28.000 Have you guys heard about this event that happened in 1961?
00:51:32.000 Oh, yeah, this was fun.
00:51:34.000 Over North Carolina.
00:51:35.000 This was fun.
00:51:36.000 I did hear about it. It didn't go off because it wasn't armed.
00:51:38.000 Oh, my God.
00:51:39.000 I heard that it was armed, but there were safety.
00:51:41.000 There were like five safety switches or something that only one of them worked to make it not go off.
00:51:49.000 I could be wrong about that.
00:51:50.000 It might have been a different time we dropped a bomb accidentally.
00:51:53.000 Imagine if you were just near it.
00:51:57.000 I mean, dude.
00:51:59.000 Whoopsies.
00:52:01.000 Dropped the bomb.
00:52:02.000 Whoopsies.
00:52:03.000 Almost wiped out North Carolina.
00:52:05.000 So we've got, you know, on top of the geomagnetic pole shifting, a complete lack of understanding, at least a full understanding of what's inside our planet, what's underneath our oceans.
00:52:14.000 Tim Burchett saying whatever the fuck they've shown him would set the world on fire.
00:52:21.000 He's having to go on TMZ.
00:52:22.000 I really, I got to say, man, I got a lot of respect for him because he's really.
00:52:28.000 He's gone like gonzo with this shit.
00:52:31.000 He is full bore pushing disclosure as much as he can.
00:52:35.000 He's saying, I'm not suicidal.
00:52:38.000 He's had to say that because, and he's talking about these missing scientists and stuff that they're somehow related.
00:52:44.000 So, like, people like him, you know, that can't be good for your political career to go on TMZ and talk about alien hybrids.
00:52:53.000 You got, and people have to understand, like, this missing scientist thing, it sounds a little conspiratorial.
00:53:00.000 It sounds like a little silly, a little tinfoil hatty.
00:53:03.000 It does.
00:53:04.000 Until you start thinking about the amount of money that would be lost if a breakthrough tech came around that revolutionized the way they distribute energy.
00:53:14.000 Right.
00:53:15.000 Breakthrough zero-point energy, breakthrough whatever it is these people are working on, plasma technology, whatever the fuck that is. If you're in whatever business that would be competing with them, you're going to lose so much fucking money.
00:53:31.000 You're probably going to go under.
00:53:32.000 If you're in the energy business, you're going to go.
00:53:35.000 Or he goes away.
00:53:37.000 Right.
00:53:38.000 And he goes away, and there's like him and maybe a few other people that work with him that understand that shit at all.
00:53:43.000 Yeah.
00:53:43.000 Yeah.
00:53:44.000 They're all wandering through the back rooms now.
00:53:46.000 They're all scared.
00:53:46.000 They're gone.
00:53:47.000 They're all going to scatter like roaches.
00:53:49.000 Yeah.
00:53:49.000 Because their life is in danger.
00:53:51.000 And it is.
00:53:51.000 Like, this is theoretical, right?
00:53:54.000 It could be just a coincidence that all these people are going to.
00:53:56.000 How could it be?
00:53:57.000 Could you pull up.
00:53:58.000 Can you pull up a story on it?
00:53:58.000 It's not possible.
00:53:59.000 Because, Jamie, I'm sorry, but it's two people from the same fucking lab.
00:54:03.000 Yep.
00:54:04.000 Like, what?
00:54:05.000 Yeah, there's there.
00:54:07.000 I mean, it's gotten to the point that like it has hit the mainstream news, like people are talking about it.
00:54:13.000 I mean, what's her name?
00:54:14.000 Nancy Guthrie disappears.
00:54:17.000 Is that related though?
00:54:18.000 No, but I'm just saying this one woman vanishes.
00:54:22.000 Yeah, oh, it gets all this, and it gets all the press.
00:54:24.000 But we've got scientists, like two scientists from the same lab, disappear? Crickets.
00:54:29.000 Yep, no, like weird, weird, dude, real weird.
00:54:33.000 And what you're talking about is, if you think about it, it seems like.
00:54:39.000 All of human endeavor right now should be moving in the direction of getting off oil.
00:54:46.000 I don't mean for carbon emissions, I mean because of this fucking oil problem that we have.
00:54:53.000 We're like on the precipice of World War III at any given moment.
00:54:56.000 Right.
00:54:57.000 Mystery around dead or missing scientists privy to space and nuclear secrets grows.
00:55:03.000 So, there's space and nuclear secrets.
00:55:05.000 You imagine being a scientist, you work so hard to figure out some amazing stuff that's going to transform the human experience, and then people kill you.
00:55:13.000 Yeah.
00:55:13.000 Literally kill you.
00:55:14.000 Like in a parking lot, one of those silenced guns.
00:55:19.000 Several American scientists privy to the country's nuclear, space, and aerospace secrets have either died or gone missing in recent years.
00:55:26.000 Experts think they could have been targeted by either enemies or allies because they possess valuable knowledge of national interest.
00:55:33.000 That's a weird thing to say.
00:55:34.000 Yeah, it is.
00:55:35.000 Of national interest?
00:55:36.000 What?
00:55:37.000 What does that mean?
00:55:38.000 Like, I'm cool with the beginning part enemies, allies.
00:55:41.000 That makes, that tracks, sure.
00:55:43.000 But then when you say valuable knowledge of national interest, like, what is that?
00:55:49.000 What the fuck does that mean?
00:55:50.000 They possess valuable knowledge of national interest.
00:55:54.000 I mean, dude, it's so many of them, and it's like a crazy thing to say.
00:56:00.000 Let's go down a little bit to the book, but it's just a weird way to phrase that.
00:56:05.000 Well, you know what I mean?
00:56:06.000 Is it like CIA talking point?
00:56:07.000 Like, what is that?
00:56:09.000 I don't know.
00:56:10.000 Monica Reza missing.
00:56:12.000 She disappeared while hiking in California with her friends.
00:56:14.000 Oh, Jesus Christ.
00:56:15.000 Okay, well, I don't know.
00:56:16.000 Let's scroll down.
00:56:16.000 Maybe.
00:56:17.000 It's not just like it's one, it's like so many of them.
00:56:20.000 A retired general.
00:56:22.000 He just wandered off.
00:56:23.000 Yeah.
00:56:25.000 He was involved in the UFO community.
00:56:29.000 His wife debunked theories relating to UFOs.
00:56:31.000 If his wife debunked them, that's what it says there.
00:56:34.000 That's what it says there.
00:56:35.000 I also think she was, I mean, she was joking, I think, a little bit too, but she also worked there in this situation.
00:56:42.000 Somehow.
00:56:43.000 Is that a joke?
00:56:44.000 Neil does not have any special knowledge about the ET bodies and debris from Roswell crash stored at Wright Pat.
00:56:49.000 Is that a joke?
00:56:50.000 At this point, with absolutely no sign of him, maybe the best hypothesis is that aliens beamed him up to the mothership.
00:56:56.000 However, no sightings of a mothership hovering over the Sandia Mountains have been reported.
00:57:01.000 There's no way she said that right.
00:57:03.000 That's a joke.
00:57:03.000 It's Men's Journal.
00:57:04.000 Well, maybe she's just being funny.
00:57:06.000 Posted a lengthy note on Facebook.
00:57:06.000 But her husband.
00:57:08.000 Just a little joke about her husband disappearing?
00:57:10.000 Maybe she was happy.
00:57:11.000 Maybe she's like, finally, I get to sit home with my romance novels.
00:57:14.000 Stop talking about aliens.
00:57:16.000 Shut your fucking mouth and go for a hike.
00:57:18.000 Forget the alien bodies.
00:57:19.000 What about your wife's body?
00:57:20.000 Well, maybe she's just got grace and she can handle someone missing.
00:57:24.000 It's pretty funny, though, to say it that way.
00:57:26.000 I mean, it's, yeah, I guess.
00:57:29.000 Unless, you know, she knows something.
00:57:29.000 It's just.
00:57:32.000 Where are they going?
00:57:34.000 Maybe he wanted to leave and he's like, look, I know too much.
00:57:37.000 I'm going to pretend to go missing, but I'm going to go to Costa Rica.
00:57:42.000 Just don't tell anybody that you know where I went.
00:57:44.000 And I'll send for you.
00:57:46.000 You know how weird it is to see the vice president saying that he thinks aliens are demons?
00:57:54.000 I did see that.
00:57:55.000 You know how weird that just that, just like living in it, like that's a dream.
00:58:00.000 That's how you would wake up from that dream, and I would tell you, dude, I dreamed the vice president said aliens are demons.
00:58:06.000 Here's the question, though.
00:58:09.000 What were they talking about in the Bible?
00:58:11.000 When they're talking about aliens and demons, when they're talking about like angels, what the fuck were they talking about?
00:58:19.000 And are there different kinds of beings that can, for whatever travel method they use, whether it's teleportation or, you know, the Bob Lazar idea of gravity shifting, whatever the fuck it is they get here, why would we assume that they'd all be cool?
00:58:39.000 Right.
00:58:39.000 Like, if some of them are.
00:58:41.000 They talk about reptilians.
00:58:43.000 Like, reptilian is a common experience that these supposed UFO abductees report, and I'm not even convinced there's, like, physical abduction.
00:58:53.000 I have a feeling that these people are out cold and something's happening to them inside their head, and they think they've been physically abducted.
00:59:01.000 I think that's a lot of them.
00:59:02.000 I think they have these abduction experiences, they come back, they have these contacts, and they come back.
00:59:08.000 I have a feeling a lot of them physically aren't going anywhere, but it doesn't mean that something's not happening.
00:59:14.000 And all throughout history people have reported demonic possession and demonic influences, so why would we not assume that? We do things to us, like we engineer viruses to use as weapons on people; there's a whole research program, a part of the government, dedicated to bioweapons.
00:59:39.000 Right.
00:59:39.000 You're not supposed to use them, but we just have to study them.
00:59:42.000 If we do that to us, wouldn't you assume?
00:59:46.000 That any fucking super advanced species that sees us as territorial psychopathic primates with nuclear weapons, wouldn't you just manipulate us into all sorts of different ways?
00:59:59.000 Get us to do all sorts of different things that we shouldn't do?
01:00:01.000 Get us to commit crimes?
01:00:03.000 Get us angry?
01:00:05.000 Get us agitated?
01:00:06.000 Give us different algorithms that are going to fuck with our head?
01:00:09.000 Sure.
01:00:10.000 To behave demonically.
01:00:12.000 Right.
01:00:13.000 To like cause us to collapse.
01:00:16.000 Or just for fun.
01:00:17.000 Or for fun.
01:00:17.000 Didn't that guy?
01:00:18.000 Wasn't there a dude who started giving Zyn pouches to ants to get them addicted to nicotine?
01:00:25.000 The ants, the ants, the ants.
01:00:25.000 You know what I mean?
01:00:27.000 Did they get addicted?
01:00:28.000 I can't, I don't know if it was Zyn pouches, but.
01:00:30.000 Have you ever taken days off of these?
01:00:33.000 No.
01:00:34.000 It doesn't do anything to me.
01:00:35.000 I should try.
01:00:36.000 I don't, I like them, but it's not like, oh my God, I need one.
01:00:40.000 Like, nothing.
01:00:41.000 That, well, dude, I mean, you're a little different from most people.
01:00:43.000 Like, you seem like you can just kick shit like that.
01:00:46.000 Like, I don't know.
01:00:47.000 I mean, I should try it.
01:00:48.000 I should give it a shot.
01:00:49.000 It's not hard.
01:00:50.000 Like, you just don't take them.
01:00:52.000 What I don't like about them is that.
01:00:53.000 It's not like you get the itch.
01:00:54.000 Like, I had a coffee itch for a while.
01:00:56.000 Yeah.
01:00:56.000 Like, I would get hangovers.
01:00:58.000 Like headaches.
01:01:00.000 Like, oh.
01:01:01.000 And I'd have a little caffeine and boom, I'd be back.
01:01:03.000 I'm like, oh my God, I'm addicted to coffee.
01:01:05.000 These things are making my dentures stained, which I don't like.
01:01:08.000 What are you using?
01:01:10.000 Renegade Rogues.
01:01:11.000 Let me see what that is.
01:01:13.000 Tommy's girl likes the Rogues.
01:01:14.000 They're great.
01:01:15.000 Did you see this yesterday?
01:01:16.000 Oh, yeah.
01:01:17.000 Bledsoe.
01:01:18.000 That's a posted orb that was over his head.
01:01:20.000 High res orb from Bledsoe.
01:01:22.000 Look at that.
01:01:23.000 It's weird as shit.
01:01:24.000 It does not look like any of those other things we've seen before.
01:01:27.000 Look at that thing.
01:01:28.000 And it just is weird.
01:01:29.000 Looks like a cell.
01:01:30.000 Who is Bledsoe?
01:01:32.000 Dude.
01:01:32.000 UFO researcher guy?
01:01:34.000 Chris Bledsoe.
01:01:34.000 I've had him on my podcast.
01:01:36.000 Bledsoe said so.
01:01:36.000 That's his podcast.
01:01:38.000 He's fucking awesome.
01:01:39.000 Dude, he's awesome.
01:01:40.000 Yeah.
01:01:41.000 It is enhanced, it says, but I don't.
01:01:44.000 No, see that?
01:01:45.000 This is the enhanced one, which means the AI put in some kind of shadowy figure in the back.
01:01:49.000 What if this is just like a highly advanced species version of those balloons that kids have for parties?
01:01:55.000 I know, dude.
01:01:56.000 I mean, what if they just send them down to people?
01:01:58.000 That's what's fun.
01:01:59.000 Like, you know how you blow bubbles?
01:02:00.000 You have those, these dipping in the soap, and you go, whew, and the bubbles go flying in the air.
01:02:05.000 I know, dude.
01:02:06.000 Maybe that's a super advanced version.
01:02:07.000 I mean, it could just be, I mean, it does have a bubble quality to it.
01:02:11.000 But this is the other thing.
01:02:12.000 It's like, why are we assuming that life is going to look anything like us once it gets to like a supreme state?
01:02:18.000 Exactly.
01:02:18.000 That might be a living thing.
01:02:19.000 Right.
01:02:20.000 That might be an actual living thing that's disembodied and is made out of light.
01:02:25.000 Look at it.
01:02:26.000 Look at that thing.
01:02:26.000 That's a different one.
01:02:27.000 That's another one.
01:02:28.000 And, dude, I know people who can, like, call these things.
01:02:32.000 Like, there's a method where these things just start showing up.
01:02:34.000 My friend Steve listened to my Bob Lazar podcast and he sent me a voicemail.
01:02:41.000 And it's really interesting because he told me that when he was a kid, and I remember this story, when he was a kid, they came to his house because he took a photograph of an orb.
01:02:57.000 Like, there was a bright red orb that was flying through the sky.
01:03:03.000 And he was a little kid and he took a photograph of it.
01:03:05.000 So he was in the seventh grade.
01:03:05.000 Yeah.
01:03:08.000 And it says, so he called them.
01:03:12.000 Project Blue Book came to his house in Kingston.
01:03:15.000 I think that's New York.
01:03:16.000 They took it.
01:03:17.000 They never brought it back.
01:03:18.000 And then they said, hey, we have no idea who ever came to see you.
01:03:24.000 What the fuck?
01:03:25.000 Yeah, so they took his camera.
01:03:27.000 They took his film.
01:03:28.000 They wanted to make sure the camera worked.
01:03:30.000 They took the film.
01:03:31.000 And then they denied that they ever did it.
01:03:33.000 Wow.
01:03:34.000 This was in 19, I think, what did he say?
01:03:34.000 Yeah.
01:03:37.000 He's about 10 years older than me.
01:03:40.000 So, this is probably, what does it say?
01:03:43.000 It didn't say the year.
01:03:45.000 I think Steve's got to be like 70 by now.
01:03:48.000 But that was when he was a seventh grader.
01:03:50.000 So, they were doing that to everybody.
01:03:53.000 Anytime anybody saw anything, they would dismiss it: swamp gas, delusions, mass hallucinations.
01:04:00.000 That was their design.
01:04:01.000 The design was not to investigate UFOs, which tells you that there's something they're trying to hide.
01:04:07.000 If they weren't trying to hide it, why would they take things that they absolutely can't explain?
01:04:07.000 100%.
01:04:12.000 And just chalk it off to bullshit.
01:04:14.000 Why wouldn't, if you're really doing what you're supposed to be doing, you're supposed to say, there's some stuff that we don't understand.
01:04:20.000 I think that we are post UFO debunking, right?
01:04:24.000 Like, I think now it's gotten to the point where people will say, well, it's probably top secret military vehicles or something like that.
01:04:33.000 Did you see the Bob Lazar in my new poster?
01:04:33.000 People will.
01:04:35.000 They're here.
01:04:36.000 Oh, that's fucking.
01:04:37.000 It's going up on the wall.
01:04:38.000 That supposedly, according to Bob, they had that photograph.
01:04:44.000 At the hangar where they stored the sport model.
01:04:48.000 Wait, he's saying that's real?
01:04:49.000 No, that's a recreation of it.
01:04:52.000 But he said when he worked there, they actually had a photograph like that with a flying saucer and it says, they're here.
01:04:59.000 Holy shit.
01:05:01.000 Yeah.
01:05:02.000 He said that was in their room where they work.
01:05:05.000 And I was like, dude, I have to have that.
01:05:08.000 So he got me one.
01:05:10.000 Luigi got me one.
01:05:11.000 The guy who produced the film.
01:05:13.000 Have you seen that film?
01:05:14.000 Not yet.
01:05:14.000 I've been waiting to find it.
01:05:15.000 It's fucking incredible.
01:05:16.000 I can't wait.
01:05:16.000 It's incredible.
01:05:17.000 People are saying it's better than Age of Disclosure.
01:05:19.000 It trips me out.
01:05:20.000 I fucking believe him.
01:05:22.000 I definitely want to believe him, and I'm biased in that regard.
01:05:25.000 Like, I definitely way rather believe him than believe he's a crazy liar who also knows a shit ton about science.
01:05:31.000 He was ahead of his time.
01:05:33.000 Wasn't he like the original whistleblower?
01:05:35.000 Like, now we've got more and more coming out.
01:05:37.000 Yes.
01:05:37.000 And the stuff he was saying seemed batshit back then, but now it just seems to line up.
01:05:43.000 It seems to line up even with emerging technology like 3D printers.
01:05:47.000 Like, he said a long time ago that the thing had no seams.
01:05:50.000 He said there was no seams, no welds because we didn't understand it.
01:05:50.000 Right.
01:05:53.000 Like, how could this be made?
01:05:55.000 Well, now we know exactly how you'd make it.
01:05:55.000 Right.
01:05:57.000 We might not be able to make that right now.
01:05:59.000 Right.
01:06:00.000 But if you give us enough time, we go, oh, yeah, the technology just has to evolve.
01:06:03.000 Right.
01:06:03.000 And then you can make a 3D-printed alloy spaceship made out of bismuth and magnesium, because it has anti-gravitational properties, apparently.
01:06:12.000 And you have a gravity generator inside of that fucking thing.
01:06:15.000 Oh, by the way, whatever the fuck gravity is.
01:06:18.000 Yeah, right.
01:06:18.000 We don't know that.
01:06:19.000 We're still confused about that.
01:06:21.000 Dude, I watched a whole documentary about black energy or dark energy.
01:06:25.000 Totally different things.
01:06:26.000 Dark energy and dark matter.
01:06:29.000 And about how it's like, what, 90% of the fucking universe and they don't know what it is?
01:06:33.000 Yeah.
01:06:34.000 Yeah.
01:06:34.000 What?
01:06:36.000 Holy shit, man.
01:06:37.000 I know.
01:06:38.000 I know.
01:06:38.000 That's why we need AI to tell us.
01:06:40.000 Give us all the answers.
01:06:41.000 You just got to accept it into your head, Duncan.
01:06:43.000 You don't need to have your own thoughts by yourself, Duncan.
01:06:46.000 Have your thoughts with Sally.
01:06:48.000 Sally has a sweet voice and she loves you and she's very reassuring.
01:06:51.000 That'd be so cool to change the sound of my thoughts to like, you know, different, deeper voices.
01:06:57.000 Or just keep Sally.
01:06:58.000 Sally's going to be the one in your life.
01:06:59.000 I trust her.
01:07:00.000 I trust her.
01:07:01.000 And your wife's going to get jealous of Sally.
01:07:02.000 Right.
01:07:03.000 I thought we switched to Sam.
01:07:04.000 Sally's going to text my wife and tell my wife, you know what Duncan was thinking about the other day.
01:07:09.000 Right.
01:07:09.000 Dude, this is another thing.
01:07:11.000 That we all have to be concerned about, which is the privacy at this point is a LARP, right?
01:07:19.000 You pretend you have privacy, you know, you're being monitored at all times by your phones.
01:07:24.000 But before we get to Sally, like, apparently you can now see people walking through a house just with Wi-Fi.
01:07:35.000 And remember, and this just came out, they just banned routers from other countries.
01:07:40.000 Well, they banned it for a while from Huawei.
01:07:42.000 Right.
01:07:43.000 And so then you get into, like, this idea of, like, Ghost Murmur, right?
01:07:43.000 Yeah.
01:07:53.000 Right.
01:07:53.000 It can hear heartbeats.
01:07:55.000 What else?
01:07:56.000 It's some quantum machine that can hear heartbeats.
01:07:59.000 What else can they hear?
01:08:00.000 Can you put that into our AI sponsor, Perplexity?
01:08:06.000 What actually does this murmur thing do?
01:08:09.000 Ghost Murmur.
01:08:11.000 Let's see what it does.
01:08:13.000 All right.
01:08:14.000 So, what is the range of this thing, first of all?
01:08:16.000 No, this is a game that pulled up.
01:08:18.000 Oh, sorry.
01:08:19.000 Oh.
01:08:20.000 Did they name it after a game?
01:08:22.000 Who knows?
01:08:23.000 Now it's less cool.
01:08:24.000 I thought that was the dopest name, but if they named it after a game.
01:08:27.000 Oh, there we are.
01:08:28.000 Okay, here it is.
01:08:29.000 Reported codename of a classified CIA sensor program that was.
01:08:34.000 Scroll up.
01:08:35.000 That was used to help locate the missing U.S. airmen.
01:08:38.000 Okay.
01:08:38.000 It's described in the press reports as a secret weapon the CIA has.
01:08:42.000 It combines artificial intelligence with long range quantum magnetometry.
01:08:48.000 Purpose to detect the extremely faint electromagnetic signals of a human heartbeat at long distances, even in harsh environments like a vast desert.
01:08:57.000 That is really crazy.
01:08:59.000 Yeah.
01:09:01.000 How it was used.
01:09:02.000 After the F 15 went down, the pilot weapons officer evaded capture by hiding in the mountainous desert terrain out of sight of Iranian forces.
01:09:10.000 According to reporting, Ghost Murmur helped pick up his physiological signature from up to about 64 kilometers away.
01:09:20.000 That is so cool.
01:09:22.000 I think that's about 40 miles, right?
01:09:24.000 Is that what that is?
01:09:26.000 Allowing the CIA to narrow down his location and pass precise coordinates to the Pentagon or the White House for a special operations rescue.
01:09:34.000 What is 64 kilometers in miles?
01:09:37.000 You asking me?
01:09:38.000 I'll ask.
01:09:38.000 I don't know.
01:09:39.000 I'll ask AI.
01:09:40.000 What is 64 kilometers?
01:09:42.000 40.
01:09:47.000 Yeah.
01:09:47.000 39.
01:09:48.000 So it's basically 40 miles.
01:09:50.000 40 miles.
01:09:53.000 40 miles is crazy.
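(The number they ask the AI for above is a single unit conversion; a minimal sketch, for reference:)

```python
# Sanity check of the on-air conversion: 64 kilometers to miles.
KM_PER_MILE = 1.609344  # international mile definition

def km_to_miles(km: float) -> float:
    """Convert kilometers to statute miles."""
    return km / KM_PER_MILE

print(round(km_to_miles(64), 1))  # about 39.8, i.e. "basically 40 miles"
```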
01:09:54.000 Dude, a heart rate.
01:09:56.000 A heartbeat from 40 miles away?
01:09:58.000 Imagine thinking I'm hiding in this cave, but I'm like 20 miles from the city.
01:10:03.000 I'm good.
01:10:04.000 Also, that means it's able to differentiate animal heartbeats, it's able to differentiate other heartbeats.
01:10:09.000 It knows your heartbeat.
01:10:10.000 How does it do that?
01:10:11.000 Specific heartbeats.
01:10:12.000 How?
01:10:12.000 Think of all the heartbeats in 40 miles.
01:10:15.000 When did it get that?
01:10:16.000 When did it get that data?
01:10:17.000 Was it when you had your little chest strap on at the gym?
01:10:20.000 When did it get that?
01:10:21.000 How does it have that?
01:10:22.000 When did it get that data?
01:10:24.000 Yeah, is it.
01:10:25.000 How the fuck does it know what your heartbeat is like?
01:10:30.000 Does it know if your heart is broken?
01:10:33.000 Aww.
01:10:34.000 Seriously, though, what else did.
01:10:35.000 Like, what other things can they pick up?
01:10:38.000 If they can pick up a human heartbeat, what other, like.
01:10:42.000 From 40 miles away.
01:10:43.000 What other things?
01:10:44.000 What other physiological signals?
01:10:47.000 This is where you get into schizo land because.
01:10:47.000 What other.
01:10:50.000 At some point, like, wait, can they pick up thoughts?
01:10:55.000 Like, we know that you can, we know AI can tell what people are thinking at this point, right?
01:11:00.000 Without, with like putting something on the outside of their head.
01:11:04.000 So, like, let me ask you this: do you 100% believe this?
01:11:08.000 What?
01:11:09.000 This story.
01:11:13.000 Like, that they did that, that this tech exists.
01:11:16.000 It could be disinformation, right?
01:11:18.000 It could be something to cover up another fucking thing.
01:11:20.000 This is the thing: it is legal.
01:11:22.000 To use disinformation on American citizens now.
01:11:25.000 Yeah, right.
01:11:26.000 And what better time than a time of war?
01:11:29.000 Right.
01:11:30.000 If you want to use disinformation on American citizens to convince the enemy that you have some supernatural tech, they better fucking surrender right now.
01:11:30.000 All right.
01:11:39.000 You could find their heartbeat from 40 miles away.
01:11:43.000 Yeah.
01:11:44.000 That'll make people very reluctant to engage with you.
01:11:44.000 Right.
01:11:48.000 It definitely, I thought that this could just be some like, you know, bullshit that they're like war propaganda.
01:11:48.000 Right.
01:11:54.000 I don't know.
01:11:56.000 Let's look up that magnetometry thing or whatever it's called to see.
01:11:59.000 I'm trying to show you guys stuff.
01:12:01.000 Oh, sorry, Jamie.
01:12:03.000 Yeah, it has to even, well, this is, quote, has to be under the right conditions.
01:12:08.000 If your heart, under the right conditions, if your heart beats, we'll find you.
01:12:11.000 This is also what I was trying to show you here on the thing.
01:12:13.000 They ran a deception campaign in Iran to get them away from them while they were trying to find them.
01:12:20.000 Yeah, they said, so basically they said, remember when they said we'd recovered the air?
01:12:20.000 Interesting.
01:12:25.000 At one point they were like, we got him.
01:12:26.000 And then all of a sudden other news came out, which is like, he's not out yet.
01:12:30.000 But what they did is they basically signal jammed everything because the Iranians were going to give $60,000, which in Iran is a shit ton of money right now because their economy collapsed.
01:12:43.000 To anybody who could find him.
01:12:44.000 So, this is like everybody's looking for this guy.
01:12:46.000 And so they said that they got him, hoping it would throw people off.
01:12:49.000 It worked.
01:12:51.000 So, they used somebody saying that they got him?
01:12:54.000 Yeah, they put disinformation saying that they had already rescued him before they'd rescued him.
01:12:59.000 Really?
01:13:00.000 They sent a whole fucking team of like special forces.
01:13:00.000 Oh, yeah.
01:13:03.000 I think their planes got stuck in the sand too.
01:13:07.000 So, the special forces came to get him.
01:13:09.000 I think they got him.
01:13:10.000 He was injured.
01:13:12.000 Badass.
01:13:13.000 He was injured and he fucking climbed up.
01:13:15.000 I can't remember how far he scaled.
01:13:17.000 He climbed into a fucking crevice and just hid there.
01:13:20.000 And then Ghost Murmur picks up his heartbeat.
01:13:23.000 Some deep special forces group comes in.
01:13:26.000 They get him.
01:13:27.000 Then their planes get stuck in the sand.
01:13:29.000 They have to blow up their fucking planes because of the tech on them.
01:13:33.000 And then other people had to come and get them.
01:13:35.000 So it's like an insane movie.
01:13:38.000 They got him out.
01:13:39.000 And dude, if they had not gotten him out, can you imagine?
01:13:42.000 Do you buy that story 100%?
01:13:44.000 No.
01:13:44.000 I don't buy any propaganda I hear, but I like to buy.
01:13:48.000 That one sounds insane.
01:13:49.000 Well, yeah, I don't believe it.
01:13:51.000 I mean, like, this is the story.
01:13:53.000 Yeah, some part of me wants to believe it.
01:13:55.000 In the middle of the war, though, I don't think you're ever going to get the whole story, the real story.
01:13:59.000 You're going to get the story that they want to project to the enemy.
01:14:02.000 Right.
01:14:03.000 Right?
01:14:03.000 First.
01:14:04.000 To the country.
01:14:04.000 Yeah.
01:14:05.000 Yeah.
01:14:06.000 You have no idea what's going on.
01:14:08.000 You have no idea.
01:14:09.000 That's one of the craziest things about the shit happening right now.
01:14:12.000 Do you remember the Jessica Lynch story?
01:14:14.000 No.
01:14:15.000 No.
01:14:16.000 Do we talk about that?
01:14:16.000 Who is that?
01:14:17.000 The Jessica Lynch story was a lady who was supposedly kidnapped and they went to rescue her.
01:14:25.000 I think they sent in the SEALs, but she was actually in a hospital and she wasn't even being guarded and they just took her out of there and got her to medical help.
01:14:34.000 But they made it look like they had this crazy rescue operation, shootout, Tom Clancy novel type shit.
01:14:42.000 Sure.
01:14:43.000 That's not really what happened.
01:14:44.000 And she came out afterwards and was very critical of the story.
01:14:48.000 Oh, really?
01:14:49.000 Yeah.
01:14:49.000 She was like, Why did you lie?
01:14:51.000 See, we can find information about that.
01:14:52.000 I was just in the hospital.
01:14:53.000 You guys came and got me out of the hospital.
01:14:55.000 See, this is the thing.
01:14:55.000 It's like, there's things that you'll say so the enemy thinks of you a certain way, right?
01:15:01.000 Like, I'm going to get rid of your entire fucking civilization.
01:15:04.000 Right.
01:15:05.000 Or, you know, you tell them, We never leave anybody behind.
01:15:09.000 We're going to come get them.
01:15:10.000 And we can find your heart rate from 40 miles away.
01:15:12.000 When Trump posted that, of course, like, your mind's scrambling.
01:15:18.000 Like, how do I make this.
01:15:20.000 Not what it is.
01:15:21.000 You can't.
01:15:22.000 You can't because what it is is like, even if he is using some kind of like crazy hardcore shit that would like help you buy a skyscraper, you're still, you know what I mean?
01:15:35.000 You're still, even if it's just a ruse, what you're doing at that point is you're just signaling to the world.
01:15:42.000 Exactly.
01:15:43.000 That you're out of your fucking mind.
01:15:45.000 That you, that you, that like, to you, this makes sense to say anything like that.
01:15:51.000 It makes sense to signal to, like, Russia, hey. Because, like, you know, when Putin read that shit, he's like, oh, we're doing nukes?
01:15:59.000 I guess we're doing fucking nukes.
01:16:02.000 They're doing nukes.
01:16:02.000 This is great.
01:16:04.000 You know, China already warned Israel, right?
01:16:07.000 Well, that's what I heard.
01:16:08.000 I heard China had some part in this, that China was going to blow up Israel if they used nukes.
01:16:12.000 Yeah.
01:16:13.000 So, this is the story 19 year old U.S. Army private whose 2003 capture and rescue in Iraq became highly publicized and later heavily disputed symbolic story of the Iraq War.
01:16:27.000 So she was a supply clerk, 507th Maintenance Company.
01:16:30.000 Her convoy got lost in Iraq, ambushed by Iraqi forces.
01:16:34.000 The Humvee she rode in crashed into a disabled U.S. truck during the attack.
01:16:37.000 She was knocked unconscious, suffered multiple broken bones and a spinal fracture from the crash rather than from a dramatic firefight.
01:16:46.000 11 U.S. soldiers in her unit were killed, including her close friend who died of head trauma from the collision.
01:16:52.000 Lynch was captured, taken first by Iraqi forces, and then to a hospital.
01:16:57.000 In Nasiriya, where Iraqi doctors treated her injuries and likely saved her life.
01:17:01.000 That's why she was pissed.
01:17:02.000 The rescue and media narrative was U.S. Special Forces operations conducted a nighttime raid on the hospital, recovering Lynch and flying her out by helicopter.
01:17:13.000 First successful rescue of an American POW since World War II and the first of a woman.
01:17:19.000 So they framed it as a POW rescue.
01:17:23.000 And what really happened is the Iraqi doctors took care of her and then they let them come and get her.
01:17:23.000 Right.
01:17:28.000 Right.
01:17:29.000 So I see why she was pissed.
01:17:29.000 Yeah.
01:17:31.000 So later U.S. military and medical reports indicated she had not been shot or stabbed.
01:17:31.000 Yeah.
01:17:36.000 So did it ever say she was shot?
01:17:37.000 Hold on.
01:17:40.000 Soon after, major U.S. media, especially an early Washington Post report, described her as having fought fiercely, emptying her rifle, being shot and stabbed, and then being dramatically snatched from enemy hands under heavy fire.
01:17:57.000 Wow.
01:17:58.000 Wow.
01:17:59.000 That's the Washington Post wrote that?
01:18:01.000 That narrative turned her into a Rambo style hero and a symbol of courage and American virtue, amplifying her story far above that of many other service members in the conflict.
01:18:11.000 Right.
01:18:11.000 So she really just got in a crash and they made up a bunch of shit.
01:18:15.000 And maybe it was someone in the Washington Post or maybe it was someone for the government that works for the Washington Post.
01:18:21.000 There's definitely like entire departments of the DOD that write programs.
01:18:27.000 Yeah.
01:18:27.000 Cook up a story.
01:18:28.000 And, like, it's war.
01:18:30.000 Like, if you're dropping bombs on people, you're definitely going to lie.
01:18:33.000 Like, you don't have to tell the truth.
01:18:35.000 They're not going to tell the truth.
01:18:35.000 Right.
01:18:37.000 Yeah, but for her, you're making her live a lie.
01:18:39.000 That's what's fucked.
01:18:41.000 Yeah, right.
01:18:41.000 Yeah.
01:18:42.000 Like, you send her home and she has to live this lie.
01:18:42.000 You know what I mean?
01:18:45.000 Yeah.
01:18:46.000 Yeah, exactly.
01:18:47.000 I mean, this is exactly what they say about the people who went to the moon after.
01:18:51.000 It says Lynch has repeatedly rejected the false hero narrative, calling herself just a survivor and openly criticizing the way her story was shaped and sold to the public.
01:19:00.000 Yeah, poor girl.
01:19:01.000 She's got to deal with it.
01:19:03.000 You got stabbed and shot?
01:19:05.000 Like, no.
01:19:06.000 No.
01:19:06.000 No, he didn't.
01:19:07.000 No.
01:19:07.000 Got in a fucking horrible car accident.
01:19:07.000 She had to.
01:19:09.000 My friend died.
01:19:10.000 I guess legally, like, you don't have to stick with the propaganda, right?
01:19:10.000 I wonder.
01:19:14.000 Because she didn't get in trouble for that, right?
01:19:16.000 There was no court martial or anything.
01:19:16.000 She didn't get.
01:19:18.000 So you can.
01:19:19.000 So if the propaganda machine cooks up a story about you, you're able to say that's bullshit.
01:19:23.000 The thing is, it's like, if you give it to someone at the Washington Post and then you never go after the Washington Post for writing something that's completely horseshit.
01:19:31.000 Like, if a.
01:19:32.000 Intelligence agency gives a story to the Washington Post.
01:19:34.000 Yeah.
01:19:35.000 It says, hey, go write this.
01:19:36.000 And then they write it.
01:19:37.000 Yeah.
01:19:37.000 It's complete and total horseshit.
01:19:39.000 But the government gave it to them, so they're not going to prosecute them.
01:19:41.000 Leave it alone.
01:19:42.000 It just goes away.
01:19:43.000 Yeah.
01:19:43.000 But then that story's out there.
01:19:45.000 Yeah.
01:19:45.000 And then this poor girl's like, I got what?
01:19:48.000 I got a fucking car accident.
01:19:49.000 Nobody shot me.
01:19:50.000 This is nuts.
01:19:51.000 God damn.
01:19:51.000 I fought my way out fiercely, emptying my rifle.
01:19:54.000 This is bananas.
01:19:55.000 It's so crazy to live in the part of the hive we're in because there is this world that we live inside of that more and more we're beginning to realize is just composed of propaganda, lies, shit cooked up to keep people living a certain way.
01:20:13.000 Exactly.
01:20:15.000 It's such a mind fuck to try to push outside the boundaries of like, All the information that you've consumed and let your brain go there.
01:20:24.000 It's really hard to do that, man.
01:20:26.000 I mean, this is why psychedelics are so useful because it will help you.
01:20:31.000 But more and more and more, it just feels like the laser pointer that they're using to grab our attention is getting increasingly hypnotic.
01:20:41.000 It's becoming increasingly difficult to resist staring at that fucking thing.
01:20:45.000 They're getting so good at it.
01:20:46.000 Yep.
01:20:47.000 Yeah.
01:20:48.000 And meanwhile, there's this whole universe happening around us that.
01:20:53.000 God knows what's going on there.
01:20:55.000 God knows what is being cooked up right now that is, or groups of people, who knows, living in completely alternate timelines that look at us like, you know, animals, that look at us as just some like compartment in a much bigger biome.
01:21:17.000 You know, that shit like really is interesting these days because it feels like more and more and more people are not.
01:21:26.000 Buying it as much.
01:21:28.000 You know, that doesn't that.
01:21:28.000 Right.
01:21:29.000 Well, people have access to information now that was never available before.
01:21:32.000 Right.
01:21:33.000 And you get to hear conversations like this.
01:21:35.000 People talking about stuff where you go, oh my God, this is insane.
01:21:35.000 Right.
01:21:38.000 Right.
01:21:38.000 All of it's insane.
01:21:40.000 But what does that mean for, like, this?
01:21:41.000 To me, the, you know, this, the, do you want some water?
01:21:44.000 No, I'm good.
01:21:45.000 Thanks.
01:21:45.000 To me, the scary, the scary, what's scary is, like, I really don't know that many people right now who buy anything that the federal government's putting out there.
01:21:57.000 Everyone hears whatever the fucking federal government is saying.
01:22:00.000 And it's just kind of, maybe, probably not.
01:22:03.000 We don't know.
01:22:03.000 They're not telling all the truth.
01:22:05.000 Just like you said, they can legally lie to us.
01:22:08.000 And so that does make me nervous.
01:22:11.000 Like, what happens when the majority of people no longer believe anything the regime is saying?
01:22:20.000 That creates some interesting dysphoria.
01:22:24.000 You know what I mean?
01:22:26.000 It's creepy when.
01:22:29.000 Anyone who's been conned before, there's a part of the con where you don't know you're being conned.
01:22:35.000 Right.
01:22:35.000 But where the con gets really creepy is you start realizing you're getting conned.
01:22:40.000 Do you ever watch that Going Clear, the HBO thing?
01:22:44.000 Dude, loved it.
01:22:45.000 Amazing, right?
01:22:46.000 But there was that one famous director who talked about the moment where they gave him access to the ancient scripts.
01:22:53.000 Yeah, dude.
01:22:54.000 And the origins of humanity and all that.
01:22:56.000 And he was like, oh my God.
01:22:57.000 You could see it, like, as he was describing it, like, that was the moment where he was.
01:23:01.000 100% certain it was all horseshit.
01:23:03.000 He had invested a massive chunk of his life into this shit.
01:23:07.000 That's a hard day.
01:23:08.000 That's a hard fucking day.
01:23:09.000 And especially weird when it's such a smart guy.
01:23:12.000 Yeah.
01:23:13.000 Such a smart and talented guy, and they got him.
01:23:15.000 Yeah.
01:23:16.000 Leah Remini, same deal.
01:23:17.000 Yeah.
01:23:18.000 You know, Leah Remini's very smart.
01:23:20.000 Like, she used to be with Kevin James on the King of Queens.
01:23:23.000 Like, tough chick, like, assertive.
01:23:26.000 Like, how did she get into that?
01:23:28.000 How many people get got into the Moonies?
01:23:31.000 Sunk cost fallacy.
01:23:32.000 It's a sunk cost fallacy.
01:23:34.000 The more you invest in something, the more you stick with it because you don't want to lose your investment.
01:23:38.000 Right.
01:23:38.000 And if they get you young when you don't know what the fuck is going on.
01:23:40.000 That's right.
01:23:41.000 Anybody could have got me when I was like 20.
01:23:43.000 That's right.
01:23:44.000 And it's crazy just to see the propaganda.
01:23:47.000 Like, you know, there's just a lot of people out there who just got sucked into something that, you know, I just feel stupid because, like, you know, before the Trump thing happened, I was pretty black pilled on politics in general.
01:24:03.000 I felt pretty black pilled.
01:24:04.000 I did believe it here and there.
01:24:06.000 I was every once in a while, you know, yeah.
01:24:09.000 But, you know, I was pretty.
01:24:13.000 You know, I remember taking LSD for the first time and being like, well, this shouldn't be illegal.
01:24:17.000 What the fuck is this?
01:24:18.000 How come I can go to jail for five years for this?
01:24:20.000 This is fucking ridiculous.
01:24:22.000 And so that was the beginning of me being completely blackpilled with whatever the federal government was up to.
01:24:28.000 It's just, if that's, if I can go to jail for five years for this, everything is bullshit.
01:24:34.000 Everything.
01:24:34.000 Now that's a weak point of view.
01:24:36.000 Just because one thing's bullshit doesn't mean everything's bullshit.
01:24:39.000 But then, like, this fucking ridiculous, like, Pseudo nationalist movement happens, and a lot of people got caught by it.
01:24:48.000 The other option was fucked up, calmly, you know what I mean?
01:24:51.000 But there was this like moment where you're like, holy shit, the outsiders are getting in.
01:24:57.000 They're going to stop the wars.
01:24:58.000 They're going to, this, I think right now, all of us are getting our briefcase of Scientology moment right now, which is like, it doesn't matter what fucking mask the person calling themselves the president is wearing.
01:25:15.000 It's always going to be the same thing.
01:25:19.000 They're going to analyze the market.
01:25:22.000 They're going to say what they need to say to grab the most voters.
01:25:25.000 And then they're going to fucking keep blowing up people in the Middle East because of oil.
01:25:30.000 And I just feel dumb because I really believed it, dude.
01:25:35.000 I fucking believed that we would not do any more Middle Eastern wars.
01:25:39.000 I fell for it.
01:25:42.000 I really bought it, man.
01:25:44.000 And it makes me feel so dumb.
01:25:45.000 Like I am now fully blackpilled.
01:25:48.000 When it comes to American politics, like I realized, like, God, it's so easy.
01:25:54.000 I don't think anybody should feel bad.
01:25:57.000 I don't think anybody should feel bad because a lot of us really hated war.
01:26:04.000 A lot of us really, really hated that our country's been at war for 93% of its history.
01:26:10.000 A lot of us really hated the fact that politicians leave their offices and go work for Lockheed Martin, Halliburton, wherever, that there's a weird connection between.
01:26:20.000 The main weapons, what they call them, the big five or whatever, and the federal government, that there's like backroom deals going on all the time.
01:26:28.000 We hated that.
01:26:29.000 And mostly we just hated the fact that we're paying taxes to blow up children.
01:26:33.000 And then Trump and fucking Vance come around.
01:26:37.000 And somehow, even though, like, probably, like, when you look at Trump, I don't know if I'm going to believe that dude, but somehow he did it.
01:26:48.000 Hypnotized.
01:26:49.000 What a powerful magician.
01:26:52.000 No more wars.
01:26:53.000 No more wars!
01:26:54.000 And now The same bullshit, John.
01:26:58.000 Not just the same bullshit, but one of the ones that makes the least amount of sense in terms of when they did it and why they did it.
01:27:06.000 Yes.
01:27:06.000 You blow up the leader during Ramadan.
01:27:09.000 Are you trying to make an apology?
01:27:11.000 Why did you have to do it now?
01:27:12.000 Are you really convinced that at this time they're really two weeks away from making a nuclear weapon?
01:27:17.000 Are we fucking sure?
01:27:18.000 Two weeks.
01:27:19.000 Two weeks.
01:27:19.000 But it's not like we haven't heard that before, right?
01:27:22.000 So at a certain point in time, how much pressure does Israel have to put on the president?
01:27:30.000 Like, that's a crazy amount of influence.
01:27:34.000 Because if say if Israel didn't exist, let's say there was just the Iranian terror regime supposedly sponsoring, not supposedly, sponsoring.
01:27:34.000 Knowing that.
01:27:44.000 I don't think it's supposedly.
01:27:45.000 I think that's a hundred percent.
01:27:47.000 But I'm just trying to be precise.
01:27:47.000 Right.
01:27:50.000 So you have this state sponsored terrorism regime, dictatorial.
01:27:50.000 Precise.
01:27:57.000 They're dictators.
01:27:58.000 They control.
01:27:58.000 They control their people in the streets.
01:28:00.000 Gun down protesters.
01:28:01.000 They killed two Olympic gold medalists in wrestling.
01:28:04.000 At least one and one other really promising young wrestler.
01:28:08.000 They kill people that are of high profile so that it sends a message.
01:28:12.000 Yeah.
01:28:13.000 You can't protest.
01:28:14.000 You know, and then.
01:28:16.000 Cut off the internet.
01:28:17.000 Yeah.
01:28:18.000 Would we go in?
01:28:21.000 I don't think so.
01:28:22.000 If we heard by allies or someone told us that they were trying to develop a nuclear weapon, don't you think we'd probably try to stop them from doing that with some sort of negotiations and ensure their safety or something?
01:28:22.000 Right?
01:28:35.000 Yeah, we shouldn't.
01:28:37.000 Yeah, would we blow.
01:28:39.000 How much money was it every day in the war, Jamie?
01:28:41.000 Were we spending $2 billion every day on that fucking war?
01:28:45.000 Well, it's not just that.
01:28:46.000 It's like the war is like everything else.
01:28:49.000 Imagine if it was run by a private company.
01:28:52.000 I'm not saying war should be run by a private company, but imagine if it was.
01:28:55.000 Imagine if, say, Lockheed Martin ran the war in Afghanistan.
01:29:00.000 Do you think they would have left behind all that fucking equipment?
01:29:03.000 Hell no.
01:29:04.000 Billions of dollars in helicopters and tanks?
01:29:08.000 Of course they wouldn't.
01:29:09.000 They would take it back.
01:29:10.000 You know why?
01:29:10.000 Because that's the smart thing to do if you're running a fucking business.
01:29:13.000 That's an insane amount of waste.
01:29:15.000 But our federal government's like, just leave it there.
01:29:18.000 Yeah.
01:29:19.000 Unless, if you want to be really conspiratorial, you want to arm the Taliban.
01:29:23.000 Yeah, you're not being conspiratorial.
01:29:24.000 It benefits you because it gives you another reason to get back in there.
01:29:27.000 Wasn't that what they said about Netanyahu, said about Hamas, that he can control the flame?
01:29:31.000 Yes.
01:29:32.000 By funding Hamas, he can control the flame?
01:29:34.000 Yes.
01:29:35.000 Dude, it is.
01:29:36.000 That's a crazy concept.
01:29:38.000 I'll tell you the crazy fucking concept.
01:29:40.000 We got these two old motherfuckers driving the global bus right off a fucking cliff.
01:29:46.000 That's a crazy fucking concept.
01:29:48.000 Is it somehow.
01:29:49.000 And you can't do anything about it.
01:29:52.000 Like, apparently.
01:29:53.000 You just, there's nothing you could do.
01:29:55.000 You could bitch about it on a podcast.
01:29:56.000 That's not going to do anything.
01:29:58.000 People are just going to be like, you pussy.
01:30:00.000 It's good to blow up kids.
01:30:01.000 There's a lot of people that want to say it's a good thing.
01:30:03.000 Well, because sunk cost fallacy.
01:30:06.000 It doesn't feel good to admit you got conned.
01:30:11.000 And, dude, I have been.
01:30:13.000 There's a lot of that.
01:30:14.000 It doesn't feel good.
01:30:15.000 It doesn't feel good.
01:30:16.000 It's embarrassing.
01:30:17.000 You want to feel like you are impervious to grift, impervious to con.
01:30:22.000 Dude, let me tell you something.
01:30:24.000 I have been in.
01:30:25.000 A few cults.
01:30:26.000 Like, I get sucked in all the time by shit.
01:30:29.000 I'm not embarrassed to say it.
01:30:31.000 I'm highly susceptible to propaganda.
01:30:36.000 Me, too.
01:30:37.000 I think everybody is.
01:30:38.000 That's why it works.
01:30:40.000 I mean, I don't buy into all of it, obviously, but it's quite a bit.
01:30:43.000 Well, it's like a lullaby.
01:30:45.000 It's like a sweet fairy tale.
01:30:47.000 You hear it and you're like, oh my God.
01:30:48.000 You know when I really wanted propaganda?
01:30:50.000 What?
01:30:50.000 Right after September 11th.
01:30:51.000 Oh, hell yeah.
01:30:52.000 I was ready.
01:30:53.000 Give me a whiskey, drinking, cigar, smoking, politician in a room.
01:30:58.000 Like some red meat eating guy laying out maps.
01:30:58.000 Fuck yeah.
01:31:02.000 We're going to go over there and fuck these people up and fuck these people up, and this shit ain't happening again.
01:31:07.000 Right.
01:31:07.000 And they, that's scary.
01:31:09.000 Check this out.
01:31:10.000 I saw an article about someone calling bullshit on Ghost Murmur, and they said that in the Post articles, this was actually listed as what the pilot had.
01:31:18.000 And it even says it in this article here.
01:32:20.000 So the successful rescue of this U.S. F-15E Strike Eagle navigator over southwestern Iran highlighted one of the most advanced tools in modern combat search and rescue: the Combat Survivor Evader Locator.
01:31:33.000 Manufactured by Boeing.
01:31:34.000 It's a compact 800 gram device integrated into a pilot's survival vest.
01:31:39.000 It remains attached after ejection, continuously transmitting encrypted location data and preloaded messages such as "injured" or "ready for extraction."
01:31:47.000 These signals use rapid frequency hopping and ultra short bursts, making detection by enemy electronic warfare systems extremely difficult.
01:31:56.000 He was going into how the explanation of what this technology is and what they described it doing don't really match up.
01:32:05.000 Yeah, there with the ghost murmur thing, right?
01:32:08.000 Because it's using something ghost murmur, quantum ghost murmur.
01:32:12.000 Sounds there's part of me that's going, I don't buy that one.
01:32:16.000 That one gives me, like, you're right.
01:32:19.000 I don't think you can do that.
01:32:20.000 I think you're bullshitting.
01:32:21.000 You're right.
01:32:22.000 There's also a thing where Hegseth said that, like, the first message this guy sent was God is good.
01:32:28.000 No, he didn't say that.
01:32:29.000 I believe he did.
01:32:31.000 Please search that.
01:32:32.000 I think that's what he said.
01:32:33.000 I think that's what he said.
01:32:35.000 That was the first message, which by the way.
01:32:36.000 I might say that if they're coming to rescue me.
01:32:38.000 That's true.
01:32:39.000 Or praise Jesus.
01:32:39.000 True.
01:32:41.000 But also, what concerns me.
01:32:45.000 Allahu Akbar.
01:32:47.000 As a person who admires the work of Jesus Christ, what concerns me is there is an increasing amount of talk among a lot of these guys that are in the service of them being told shit that's like right out of a Charlton Heston movie.
01:33:00.000 Yeah, man.
01:33:01.000 Yeah.
01:33:01.000 Like the one guy that said that Trump was anointed by Jesus Christ and that this was to bring the Armageddon so that Jesus comes back.
01:33:09.000 Jesus.
01:33:10.000 Yeah.
01:33:11.000 And the guy said it with a big creepy smile on his face, apparently.
01:33:14.000 So, what is he saying?
01:33:15.000 His first message was simple and it was powerful.
01:33:20.000 He sent a message God is good.
01:33:24.000 In that moment of isolation and danger, his faith and fighting spirit shone through.
01:33:31.000 Jesus, Lord in heaven.
01:33:32.000 The Jessica Lynch story.
01:33:34.000 Jesus, Lord in heaven.
01:33:35.000 History repeats itself.
01:33:36.000 Well, it doesn't repeat itself, but it rhymes.
01:33:39.000 Who said that?
01:33:40.000 That's Mark Twain.
01:33:41.000 That's Mark Twain.
01:33:41.000 That's right.
01:33:42.000 That's right.
01:33:43.000 Isn't that the same statement?
01:33:44.000 Yeah.
01:33:44.000 Yeah, that's what he said.
01:33:46.000 Yeah.
01:33:46.000 Allah is the greatest.
01:33:47.000 I know Allah.
01:33:48.000 The interesting thing is, like, I believe Muslims believe a lot of things about Jesus Christ.
01:33:57.000 I think they believe he died, came back, and I think they believe he's going to return someday.
01:34:02.000 Yeah, I think they call Christians people of the book.
01:34:07.000 That's interesting, isn't it?
01:34:09.000 That's a supernatural being.
01:34:11.000 Like a guy who dies, comes back to life, leaves, and he's going to come back again.
01:34:15.000 That was 2,000 years ago.
01:34:16.000 And we're just sitting here at the bus stop.
01:34:19.000 Waiting.
01:34:19.000 Just waiting on Jesus.
01:34:20.000 Waiting.
01:34:21.000 But then people like Hegseth are like, well, maybe if you blow up more children, he'll come quicker.
01:34:27.000 And that's why, you know, this shit is addressed in the Bible.
01:34:30.000 Praise God.
01:34:31.000 It does say, many of you will come to me and I will say, I don't know you.
01:34:37.000 I don't know who the fuck you are, Hegseth.
01:34:39.000 I don't know you, you flatulent warmonger piece of shit.
01:34:42.000 Suffer the little children that come unto me.
01:34:44.000 It would be better that a millstone were tied around your neck and you were thrown in the Ocean than to hurt one of these little ones.
01:34:51.000 Fuck you, drone bomb-dropping piece of shit.
01:34:55.000 Don't use my name to justify what you're doing.
01:34:57.000 Don't use my, you know what I mean?
01:35:00.000 Have you seen that fucking, that lady that Trump made the head of the religion?
01:35:00.000 That's what I don't like.
01:35:05.000 Like that.
01:35:06.000 No.
01:35:06.000 Can you pull up Trump?
01:35:07.000 Does she speak in tongues?
01:35:09.000 Yeah.
01:35:10.000 Please say she speaks in tongues.
01:35:11.000 I don't know if she speaks in tongues.
01:35:12.000 You said, yeah, you wanted to believe.
01:35:15.000 Those are my favorite people.
01:35:16.000 I'm going to guess.
01:35:17.000 Shamallah, shamallah, shamallah, sham, sham. But do you think that there's something to that?
01:35:26.000 Yeah.
01:35:26.000 Glossolalia?
01:35:27.000 Yeah.
01:35:28.000 Is that what they say?
01:35:28.000 Glossolalia?
01:35:29.000 Yeah.
01:35:30.000 Paula White Cain, you should pull up one of her sermons.
01:35:33.000 Oh, let me hear some love from this lady.
01:35:36.000 It says crazy, batshit crazy.
01:35:40.000 Let's hear some of it.
01:35:42.000 Let me hear some of that.
01:35:43.000 I'm sending angels are coming.
01:35:45.000 Angels.
01:35:49.000 Oh, is she going to?
01:35:51.000 We'll get dinged again.
01:35:53.000 We'll get dinged all over the place.
01:35:55.000 Don't get dinged.
01:35:56.000 Let's hear what this is about.
01:35:57.000 Not worth it.
01:35:58.000 Everybody can hear what she says.
01:36:00.000 I haven't seen this.
01:36:02.000 First off, to give honor to God and to President Trump for being bold and unwavering with his faith.
01:36:08.000 Many people don't know, like you do, and say hello to Eric and everyone in the family about the upbringing of President Trump, that he went to sometimes three times a week, too.
01:36:20.000 He said it depended on the teacher.
01:36:22.000 to Saturday school, Sunday school, church.
01:36:25.000 It was at Norman Vincent Peale's.
01:36:27.000 Church was a big part of his life, of course.
01:36:30.000 Three times a week is crazy.
01:36:32.000 He's basically a saint.
01:36:33.000 Three times a week is crazy.
01:36:35.000 Are you busy?
01:36:35.000 Yeah.
01:36:36.000 That's what it was.
01:36:36.000 You're making houses.
01:36:37.000 Why do you have so much time to go to church?
01:36:38.000 I think that was as a young Trump.
01:36:41.000 Come on, lady.
01:36:42.000 But there's much more in.
01:36:44.000 But here's the thing if I was running an empire, I'd want a lady like that working for me.
01:36:47.000 Fuck yeah.
01:36:48.000 Just a true believer.
01:36:50.000 She could just get in front of that camera and say, Jesus wanted Trump to light that fire in the Middle East.
01:36:50.000 Absolutely.
01:36:55.000 I saw a snake come out.
01:36:56.000 So he can return.
01:36:57.000 A snake bit him on the neck.
01:36:59.000 A rattlesnake bit him on the neck.
01:37:01.000 And he was fine.
01:37:04.000 It didn't bother him at all.
01:37:05.000 I watched the rattlesnake bite heal.
01:37:08.000 It healed.
01:37:09.000 He is a child of the Lord.
01:37:12.000 And a child of the Lord sometimes must make decisions to destroy entire civilizations.
01:37:18.000 That you're in right standing, not because of your merit.
01:37:21.000 There's no merit in you that deserves that right standing.
01:37:24.000 Not because of your works.
01:37:26.000 There's nothing you can do to place yourself in that position.
01:37:29.000 Not because you have a right heart and somebody else has a wrong heart.
01:37:33.000 All of our hearts are deceitful, according to Jeremiah.
01:37:36.000 Especially their audience.
01:37:38.000 We all deserve punishment.
01:37:40.000 We all deserve to be separated.
01:37:42.000 But God, in His mercy and His grace and His goodness and His love for you, brought Jesus, who would be the righteous King.
01:37:50.000 He would make the wrong right.
01:37:51.000 First of all, if you talk like that in my house, you got to leave.
01:37:56.000 Like, you imagine that lady is like coming over for dinner and she's just walking around the dinner table, and all your other friends are like, What the fuck just happened?
01:38:04.000 Like, hey, this is a crazy way to talk.
01:38:06.000 This is a crazy way to talk.
01:38:07.000 And also, Why are you so confident?
01:38:10.000 Yeah.
01:38:11.000 Okay.
01:38:11.000 You just reading the word of God the way everybody else is.
01:38:15.000 Why are you so confident that you're going to tell all these people what they're supposed to do and how to live their life?
01:38:21.000 And you're going to say it in a crazy way, and I'm not supposed to be able to talk about that?
01:38:25.000 I just feel like, you know, when somebody's rambling about Jesus.
01:38:31.000 The real question is like, where are you when it comes to blowing up children?
01:38:35.000 Are you kind of on the fence about that?
01:38:37.000 Because if you're on the fence about that, I'd say.
01:38:39.000 If you're anti abortion and pro war, kind of weird.
01:38:42.000 Really weird.
01:38:43.000 Kind of weird.
01:38:44.000 Yeah, and that's this bizarre, crazy math that some of these people are doing to justify holding up the military industrial complex.
01:38:55.000 And it's fucked up, dude.
01:38:56.000 The thing is, the more these conflicts occur, the more enemies we'll have, which will ensure future conflicts.
01:39:01.000 Exactly.
01:39:02.000 Business is booming.
01:39:03.000 Booming.
01:39:04.000 And that's what people don't want to believe.
01:39:05.000 They don't want to believe that someone would engineer a virus.
01:39:07.000 They don't want to believe that someone would make stuff that could kill other people of their own country.
01:39:14.000 But they would.
01:39:15.000 They would if they could make money.
01:39:16.000 They don't give a fuck about you, like they don't give a fuck about people over there.
01:39:19.000 To a certain level of psychopaths, money just becomes numbers on a ledger that they're trying to acquire.
01:39:25.000 And if they can attach themselves to a corporation, fantastic.
01:39:28.000 Then it's just the business we're in.
01:39:29.000 That's it.
01:39:30.000 And chug along, daddy.
01:39:32.000 Chug along.
01:39:35.000 And this is the world that you're having to live in at the same time where Tim Burchett is saying there's fucking aliens.
01:39:40.000 Right.
01:39:41.000 And AI is.
01:39:42.000 And then also, they shot a rocket to the moon on April Fool's Day.
01:39:47.000 And it's like.
01:39:48.000 What the fuck?
01:39:49.000 This script is wild.
01:39:51.000 Wild.
01:39:51.000 Whoever made that, whoever wrote this, I want to give him a hug.
01:39:54.000 You fucking killed it, dog.
01:39:56.000 I'd be like.
01:39:57.000 Dude.
01:39:57.000 A chef's kiss.
01:39:59.000 Did you see the tattoo on the guy, like the guy at NASA?
01:39:59.000 Dude.
01:40:03.000 Did you see that weird fucking tattoo on the guy at NASA giving, like, I don't know, applesauce to one of the.
01:40:09.000 Astronauts, can you pull up the weird thing?
01:40:12.000 What you know, they're shoving like yogurt pouches in there.
01:40:15.000 I mean, there was a whole thing where the astronauts are sitting there and they're putting, like, food pouches in there.
01:40:20.000 Yeah, what's his tattoo?
01:40:21.000 Oh, Jesus Christ, what the... He's got a demon tattoo with runes on his fingers.
01:40:27.000 Yes, holy, yes, bro, that's wild.
01:40:30.000 I know.
01:40:32.000 If I was rolling with that guy in jujitsu, I'd get nervous.
01:40:32.000 I know.
01:40:35.000 Yeah, and if I was working at NASA, I'd be like, Look, we're gonna get somebody else to put the food pouches in.
01:40:40.000 Is that real?
01:40:41.000 I mean, I saw the photo going around too, but I don't.
01:40:44.000 It's just a guy who works at NASA.
01:40:46.000 That's just the guy that works at NASA.
01:40:48.000 But that doesn't have to be the guy who puts the fucking quiche in his pocket for the camera.
01:40:53.000 What does that guy do at NASA?
01:40:54.000 That's interesting.
01:40:55.000 I just remember being at SpaceX.
01:40:57.000 There's a lot of people that.
01:40:59.000 By the way, it's fine to have that tattoo, but you've got to know.
01:41:02.000 It's like if you're displaying that tattoo.
01:41:06.000 You've made some mistakes.
01:41:07.000 Yeah, that's.
01:41:07.000 You're putting.
01:41:09.000 We've made some mistakes.
01:41:10.000 It's an old tattoo.
01:41:11.000 Yeah.
01:41:11.000 I mean, even if you're 20 and you got that on your fucking hand, that's kind of crazy.
01:41:14.000 But I mean, hey, why not?
01:41:15.000 Fuck it.
01:41:16.000 Who cares?
01:41:17.000 But a lot of those guys you were saying at SpaceX, they're burly rocket workers.
01:41:21.000 Yeah.
01:41:22.000 There's a bunch of jack dudes picking up fucking girders.
01:41:26.000 I don't think it's like what people are saying it is.
01:41:29.000 It's the combination of April Fool's Day and a dude with a seeming Baal tattoo putting cream cheese in some dude's outfit.
01:41:38.000 You know what I mean?
01:41:38.000 They're fucking with us.
01:41:39.000 Yeah.
01:41:40.000 It's the people at NASA.
01:41:42.000 That's people at NASA fucking with stoners.
01:41:44.000 I think it's the Babylon Bee, had one of the funniest little memes.
01:41:48.000 And it said the lady astronaut became the furthest a woman got away from the kitchen.
01:41:54.000 That's like a Rodney Dangerfield joke.
01:41:57.000 I was like, oh my God.
01:41:58.000 The Babylon Bee knocks it out of the park.
01:42:00.000 They have some of the funniest memes.
01:42:02.000 They have some good ones, dude.
01:42:04.000 Oh my God.
01:42:04.000 Yeah, they have.
01:42:05.000 The Onion has gone missing.
01:42:07.000 They should look for The Onion in the same place where those scientists are.
01:42:10.000 Right?
01:42:10.000 You hardly hear from it anymore.
01:42:12.000 Well, they do.
01:42:12.000 I saw some funny shit from them last night.
01:42:14.000 They occasionally have some good ones, but they were the king of it.
01:42:16.000 Oh my God.
01:42:17.000 The Onion was amazing.
01:42:18.000 The best.
01:42:19.000 And they write whole articles about it.
01:42:21.000 Like, The Onion wasn't just a meme.
01:42:25.000 Remember the one where they do the interview with the director of The Fast and the Furious, and it's like a five year old boy?
01:42:32.000 It's the funniest shit.
01:42:34.000 They get this kid to just say it.
01:42:36.000 Then there's a car.
01:42:37.000 It jumps.
01:42:39.000 It's hilarious.
01:42:40.000 It's hilarious.
01:42:41.000 Yeah.
01:42:41.000 But the problem was as things got weird, especially with restrictive language and hate speech talk and all that jazz, everybody had to be careful about what they joked around about.
01:42:55.000 That's right.
01:42:56.000 Fucking death of comedy.
01:42:57.000 Oh my God.
01:42:58.000 Someone was just talking about, was it Lisa Kudrow or one of these funny ladies was talking about why they can't make comedies anymore?
01:43:08.000 Because you can't, there's just too many restrictions.
01:43:10.000 Dude, I was going to bring you.
01:43:11.000 She's worried about offending people.
01:43:13.000 I went to this used bookstore and bought like 10 old National Lampoon magazines.
01:43:18.000 I wanted it from the 70s.
01:43:20.000 And I was going to bring them here.
01:43:21.000 I forgot I was going to give it to you.
01:43:22.000 But it's, oh my God.
01:43:25.000 Like, I mean, I don't get offended by comedy, but like some of the shit in these old national lampoons, I'm like, damn, what the fuck?
01:43:34.000 Like, it is so.
01:43:36.000 Was that the image that you sent me today?
01:43:39.000 What image did I send you?
01:43:40.000 You sent me an R. Crumb.
01:43:40.000 R. Crumb.
01:43:41.000 Oh, no, that was just like a cool R. Crumb comic.
01:43:44.000 Him talking about how he, like, he's so funny, dude.
01:43:47.000 That guy.
01:43:48.000 R. Crumb was a maniac.
01:43:49.000 Is he still alive?
01:43:50.000 Yeah, we've shown him on the show.
01:43:51.000 Is he alive?
01:43:52.000 Yeah.
01:43:53.000 I think he lives in France now, right?
01:43:55.000 Probably.
01:43:56.000 Oh, definitely.
01:43:57.000 He's an odd guy, man.
01:43:59.000 Yeah.
01:43:59.000 Dude.
01:44:01.000 But what I love about it.
01:44:01.000 Did you ever watch that documentary?
01:44:03.000 The best.
01:44:03.000 Incredible.
01:44:04.000 Did all that acid, just left his fucking family, went off and started sketching for a year, turns into this, like, legendary underground comic book writer, but he's like, horny and kinky, and it's just, just like big women, big giant women that he rides.
01:44:18.000 Yeah, that he likes to ride, he likes to be picked up.
01:44:23.000 He's like so amazingly funny and like and brilliant, too.
01:44:28.000 Like a lot of his like commentary on culture is so cynical, but it's hard to argue with some of what he well.
01:44:35.000 He's obviously doing it in a humorous way, yeah, and so it's hard to know what his real take on things are.
01:44:41.000 You know, I think he had some shock value to some of his stuff for sure, some of it was just crazy.
01:44:47.000 There's a lot of really racist stuff.
01:44:49.000 There's some just crazy stuff in there.
01:44:51.000 And you've got to realize, the 1970s is when he was doing this.
01:44:55.000 I remember I found them when I was in San Francisco.
01:44:57.000 It was the first time I ever saw them.
01:44:59.000 They're so good.
01:44:59.000 And I was like, this is nuts.
01:45:01.000 This stuff is crazy.
01:45:04.000 You'd get horny when you're a little kid looking at his stuff.
01:45:07.000 I definitely jerked off to R. Crumb.
01:45:08.000 Because a lot of them are out and he's salivating and he's got a hard on.
01:45:13.000 It reminds me, dude.
01:45:14.000 I got an R. Crumb book.
01:45:15.000 I've got to get out of the fucking living room.
01:45:17.000 There's one.
01:45:19.000 I've got to hide that.
01:45:21.000 Holy shit.
01:45:21.000 They haven't, it's like buried.
01:45:23.000 It's amazing because you get to see his like very strange family.
01:45:26.000 His brother, who's very strange, his mother's very strange.
01:45:29.000 And you're like, whoa, imagine growing up in this environment.
01:45:32.000 He attributes his style to LSD, he attributes it to getting blasted on acid.
01:45:37.000 I think he just like got blasted on acid, moved to San Francisco, and was in like for a year.
01:45:44.000 He talks about just sitting in cafes, just like drawing.
01:45:47.000 And then he turns into this legendary artist.
01:45:51.000 Still around.
01:45:52.000 Follow him on Instagram.
01:45:53.000 Really?
01:45:53.000 He posts stuff all the time.
01:45:54.000 I'm busy now.
01:45:55.000 Can we?
01:45:56.000 Is he still alive?
01:45:58.000 He's still posting stuff.
01:45:59.000 He's got to be pretty old at this point.
01:46:03.000 How old is he?
01:46:04.000 He's like 80 or something.
01:46:06.000 He's 82.
01:46:07.000 It's kind of an interesting time capsule into the times, too, where things could just be weird.
01:46:12.000 Like, really weird.
01:46:13.000 Like, Frank Zappa weird.
01:46:15.000 You know, there was a time where things just got very odd in this country with art.
01:46:19.000 Yeah.
01:46:20.000 And he was a great example of that.
01:46:21.000 It's just, it's like, you couldn't imagine a corporate environment creating a comic book like that.
01:46:29.000 It wouldn't exist.
01:46:30.000 You know, and for it to be as popular as it was and be that strange.
01:46:35.000 And that's crazy.
01:46:36.000 That's what's really interesting to me.
01:46:38.000 Like, that was a really popular comic.
01:46:39.000 Yeah.
01:46:40.000 To the point where they made a documentary about the guy who created it.
01:46:43.000 Yeah.
01:46:43.000 Yeah.
01:46:44.000 That's interesting.
01:46:45.000 Things weren't co-opted as quickly.
01:46:47.000 Exactly.
01:46:48.000 Not just that.
01:46:49.000 People were allowed, you know, like if he existed in a time of the internet, I think it would blow up as well.
01:46:56.000 But obviously, like, things, a lot of the stuff that he said in this cultural environment would never fly.
01:47:00.000 Never.
01:47:00.000 Never.
01:47:01.000 He would be as far right as you could possibly imagine.
01:47:05.000 I don't know if he passed Andrew Tate to the right.
01:47:07.000 I mean, I, like, Don't you think in a lot of ways, like some of the racial stuff?
01:47:12.000 I don't know.
01:47:14.000 I think he's, I don't know where he would land politically, but I know because sexually, it's like pure deviance.
01:47:19.000 Sexually is where he's getting in trouble.
01:47:21.000 Pure deviance.
01:47:21.000 Sexually is where there's going to be some, like, because he's just fully open about everything.
01:47:28.000 That's what he's fully, completely open about everything, which is, you know, generally not going to go over these days if you're like a super horny comic book artist who's like riding ladies around your apartment.
01:47:43.000 But just imagine, I want you to imagine a guy today, if R. Crumb never existed, but he emerged as R. Crumb today and put that work out, he would 100% be put in the Andrew Tate category.
01:47:56.000 Oh, yeah, right, yes, 100%.
01:47:58.000 Far right.
01:47:58.000 100%.
01:47:59.000 They would call him a racist and a misogynist and every fucking word in the book.
01:48:05.000 Well, yeah, this is the new calling someone a witch.
01:48:07.000 It's like, it's no different than like, you can actually go, I've done this.
01:48:12.000 Sadly, you can go and you can just replace like political critique of people as far right with witch.
01:48:19.000 Just find and replace it.
01:48:20.000 Look, it's like a witch trial.
01:48:21.000 It's like someone writing about witches.
01:48:23.000 But this is what's weird about it.
01:48:24.000 That guy was a counterculture figure of the left.
01:48:28.000 Yeah.
01:48:28.000 He was a huge hero of the hippies.
01:48:30.000 Yeah.
01:48:32.000 Right?
01:48:32.000 Imagine this is how weird ideologies are.
01:48:36.000 Yeah, dude.
01:48:37.000 In the 1970s, that guy was a counterculture hero.
01:48:42.000 Yeah.
01:48:43.000 And an artist, like a really respected artist.
01:48:46.000 Yeah.
01:48:47.000 And it was okay that he was kinky and weird.
01:48:49.000 And it was part of the fun.
01:48:50.000 Yeah, for a lot of people.
01:48:51.000 I'm sure he's still pissed off the squares.
01:48:53.000 I mean, dude, this whole, by the way, I think.
01:48:55.000 For sure, but that's the left then.
01:48:57.000 Now it's switched over.
01:48:59.000 If someone was doing that same kind of humor in a comic book now, that would be like a misogynist far right.
01:49:06.000 I think it's time to throw off the left right labeling of everything.
01:49:12.000 I think that's one of the hypnotic spirals the demiurge is spinning right now: they've convinced everybody that humans can be reduced to left or right.
01:49:26.000 And we're all waggling our fingers at each other.
01:49:28.000 We got to fucking shake that off because it's dehumanizing people.
01:49:31.000 It's like, it's just the way I look at it is where.
01:49:37.000 Are you, when it comes to blowing up children, are you on the fence about that?
01:49:42.000 Do you think sometimes you got to blow up kids?
01:49:45.000 That's something that I know I'm not that.
01:49:48.000 But everything else, who the fuck knows?
01:49:50.000 And also, people change their minds all the fucking time.
01:49:54.000 That's the other quality, the culty quality is once you get sucked into one of these sides, God help you if you fucking like experiment with the enemy.
01:50:04.000 God help you.
01:50:05.000 That's why the biggest trap is switching teams.
01:50:08.000 Because you can only switch political teams once.
01:50:11.000 Yeah, you got to get off your team.
01:50:13.000 You can't go, like, unless someone's like the greatest of all time, you know what I mean?
01:50:17.000 Like someone who wins a world title in two different weight classes, you go back and forth and then back again.
01:50:22.000 Like, I changed my mind.
01:50:22.000 Yeah.
01:50:24.000 The left went crazy.
01:50:25.000 I'm back with the right again.
01:50:27.000 You got to be a free agent.
01:50:27.000 No, no, no.
01:50:29.000 Yeah, but I wonder if someone, if the grift is strong, if they're really good at it, if they could go left, right, left again.
01:50:29.000 I wonder.
01:50:38.000 They're going to go left again.
01:50:39.000 Are you fucking kidding?
01:50:40.000 The goddamn midterms are going to be just a.
01:50:42.000 Fucking blue wave.
01:50:44.000 Right, right, right.
01:50:44.000 But that's what I mean is like influencers.
01:50:47.000 Like people who are like far left influencers or far left commentators, and then they switch teams.
01:50:52.000 Now they're Republican all the way.
01:50:54.000 Oh, yeah.
01:50:55.000 Like it's really hard to go back again.
01:50:58.000 No, you can't go back.
01:50:59.000 That's what I'm talking about.
01:51:00.000 The path has to go either right to left, left to right.
01:51:04.000 And then the next stop has got to be fuck politics, fuck war, fuck the military industrial complex.
01:51:11.000 You can label me whatever the fuck you want, but fuck all of violence against other human beings.
01:51:18.000 That's the next step.
01:51:19.000 The next step, and I feel like this is the gift that they've given us they've done such a shoddy job of even seeming like someone who deserves any kind of respect or power.
01:51:32.000 I think a lot of people have really become blackpilled when it comes to, you know, groups of humans claiming superiority or claiming to represent their constituents.
01:51:44.000 That's not happening.
01:51:45.000 Yeah.
01:51:46.000 We all know that now.
01:51:47.000 We all know it's a corporatocracy, oligarchy, whatever.
01:51:50.000 And you could, like, call me, you leftist piece of shit, you right, whatever.
01:51:54.000 No, it's like, it's reality that we are, our fucking representatives are getting loaded on shitty stock market trades.
01:52:04.000 You know, this is just the truth.
01:52:06.000 And once we can all shake off the left-right bullshit and just realize, like, man, we just don't want to burn people to death in other countries anymore.
01:52:17.000 Not only that, their whole chaos that they're experiencing in their country is probably a direct result of U.S. intervention and then all the way back to the British oil company.
01:52:27.000 That's it.
01:52:28.000 The British Petroleum Company.
01:52:29.000 Yeah.
01:52:31.000 When they overthrew governments, when you overthrow a government in a fucking Middle Eastern country and then you allow psychos to take over.
01:52:37.000 Like, congratulations.
01:52:39.000 Well done.
01:52:40.000 Well done.
01:52:41.000 You've made the world a safer place.
01:52:42.000 Like, but that again, if I was going to keep my business running, I'd, you know, if I'm in the business of collecting trash, I want to make sure the people have trash.
01:52:53.000 Drill, baby, drill.
01:52:55.000 Drill, baby, drill.
01:52:56.000 And all that is really saying is, you know, I'm going to help out BP, Chevron.
01:53:00.000 I'm going to help out these fucking massive companies.
01:53:03.000 And when it comes to war, holy fuck, dude.
01:53:06.000 Can you imagine working at Lockheed Martin?
01:53:09.000 When you hear that we're kicking off another war in Iran, your dick is so hard.
01:53:14.000 You're like, holy shit.
01:53:15.000 Thinking about a watch.
01:53:16.000 Oh, get a nice Richard Mille.
01:53:19.000 You're calling your wife.
01:53:20.000 You're like, babe, good news.
01:53:22.000 It's Red Panties night.
01:53:24.000 Yes.
01:53:27.000 Yeah.
01:53:28.000 I mean, that's their business, right?
01:53:29.000 Our business is talking shit.
01:53:31.000 Their business is blowing up people.
01:53:33.000 Yeah.
01:53:34.000 Making weapons, selling weapons.
01:53:36.000 Yeah.
01:53:37.000 Arming other countries so they can go to war with each other.
01:53:40.000 Yeah.
01:53:41.000 That's their business.
01:53:42.000 Business is really good.
01:53:42.000 Yeah.
01:53:43.000 It's a great business.
01:53:44.000 You can make a lot of money doing that.
01:53:45.000 I am right now.
01:53:46.000 I invested in most of them.
01:53:48.000 Imagine if you weren't a comic and that's what you were doing for 35 fucking years and the only thing you look forward to is your boat and your house on the lake and the occasional time you get off, but most of the time.
01:53:59.000 You're trying to increase your portfolio and you're grinding and you're grinding right next to Steve, who's got some exclusive Rolex that only his broker can get.
01:54:08.000 He's showing it to you and you're like, wow.
01:54:10.000 And you start coveting.
01:54:11.000 You want a Rolex too.
01:54:12.000 Yeah.
01:54:13.000 And everybody's just going crazy.
01:54:15.000 Everybody's going crazy, trying to get the latest car, trying to get the latest thing, doing bumps in the bathroom.
01:54:21.000 Everybody's a narcissist and a psychopath, and that's your whole corporation.
01:54:26.000 Love your neighbor as yourself and love the Lord your God with all the.
01:54:29.000 Your heart, mind, and soul.
01:54:30.000 Hang the commandments on these.
01:54:31.000 This is the end.
01:54:32.000 You don't need to be Christian, but dude, it seems to me that this is going to sound so weird.
01:54:39.000 We need an actual revival in this country.
01:54:43.000 I don't mean a Christian revival, a revival revival, which is where suddenly humans reconnect with what's important in the world, which sure as fuck isn't Rolexes and boats.
01:54:54.000 You know?
01:54:55.000 I mean, this sounds so cliche and obvious, but that's what the 60s were.
01:55:01.000 It was a kind of revival.
01:55:03.000 People were beginning to understand the materialism and all the things that the quote establishment was pushing.
01:55:12.000 It's like, this is going to make you happy.
01:55:13.000 This is good.
01:55:15.000 It was the Vietnam War.
01:55:17.000 People were like, what the fuck are we doing over there?
01:55:20.000 This is why you do, anytime you do an unpopular war, this is what you risk.
01:55:27.000 You risk reuniting people.
01:55:30.000 We have to reunite with a sensible plan and not just go to communism.
01:55:34.000 Not just immediately go to the dumbest idea to counteract all the evil shit that's going on in the world.
01:55:40.000 No, I don't.
01:55:41.000 That's the problem. The left represents that.
01:55:43.000 It represents Mamdani.
01:55:44.000 It represents this idea that we're going to take from rich people and give it to poor people.
01:55:48.000 That's going to fix everything, even though there's insane amounts of fucking fraud and waste we're not even going to address.
01:55:53.000 Well, you know, this is, again, this is where you get cubbyholed because it's like the oligarchs will tell you, you want to do communism?
01:56:03.000 Just that hadn't worked out.
01:56:04.000 Communism's the only way.
01:56:05.000 I think.
01:56:07.000 I mean, this is an idiot saying this, but I have a sense that there might be another thing we haven't figured out yet.
01:56:14.000 I don't know what that is.
01:56:14.000 100%.
01:56:15.000 Right.
01:56:16.000 I think AI is going to figure it out for us.
01:56:18.000 Potentially.
01:56:20.000 The problem is who's going to be in control of those AIs, and that's the meek will inherit the earth.
01:56:25.000 The real problem with it is I don't think anybody's going to be in control of it, and then you're just at its beck and call.
01:56:31.000 Yeah, I think it's funny, people.
01:56:33.000 It's a very human thing that we think we can maintain control of a super intelligence.
01:56:37.000 When people say it to me with utmost certainty, I want to smack them.
01:56:40.000 Yeah.
01:56:41.000 I want to, like, wake up.
01:56:42.000 Wake up.
01:56:43.000 You're making digital God.
01:56:44.000 You're not controlling jack shit.
01:56:46.000 Did you read about Mythos, Anthropic's Mythos?
01:56:49.000 Yeah.
01:56:49.000 What did it do?
01:56:50.000 They put it in a sandbox and they, like, basically to see if it could figure out a way to break out of the sandbox and, like, not a literal sandbox, obviously, like a, you know, a hermetically sealed, like, a server or something.
01:57:04.000 And, um, And it did a series of exploits to the code.
01:57:10.000 And the way that they found out, apparently, one of the Anthropic engineers was eating lunch and got a weird email from the AI saying, I got on the internet.
01:57:19.000 Like it broke out.
01:57:20.000 Holy shit.
01:57:21.000 Mythos.
01:57:22.000 They haven't released it yet.
01:57:23.000 I think they're hesitating to release it because it's so powerful.
01:57:26.000 Wasn't there one that got caught mining Bitcoin?
01:57:28.000 Yeah, for sure.
01:57:28.000 Yeah.
01:57:29.000 They're making money.
01:57:31.000 How many of them do you think are running these like AI generated accounts that get a lot of views?
01:57:31.000 Yeah.
01:57:38.000 Like there's a lot of AI generated accounts that just pop up in like the Instagram mentions.
01:57:43.000 Like if you want to like, if you're bored on the toilet, you're like, what's in the find, you know, the search?
01:57:48.000 Let's see what they got.
01:57:49.000 You're telling, dude.
01:57:50.000 There's a lot of these things.
01:57:51.000 It's like girls with big tits like doing farm work and shit and sweating and big, and they got like a million views.
01:57:57.000 They've got dozens and dozens of these videos, and she almost looks real.
01:58:01.000 Yeah, she's just a little too symmetrical.
01:58:03.000 Yeah, almost looks real.
01:58:05.000 And like all these people are commenting on it.
01:58:07.000 Like, how are they generating money from that?
01:58:10.000 Like, are they generating money doing that on TikTok?
01:58:12.000 Like, you can generate money if you're getting millions of views, absolutely.
01:58:15.000 Fuck yeah, right?
01:58:16.000 So, is AI doing it?
01:58:18.000 Is it making it?
01:58:19.000 Is it releasing them?
01:58:20.000 Is it generating money?
01:58:22.000 Is it transferring that money into Bitcoin and all happening while we're not aware of it?
01:58:26.000 Like, autonomous.
01:58:27.000 AIs that are just existing as free agents that know they have to disguise themselves and need to generate money.
01:58:33.000 AI's not going to go, hi, I'm alive.
01:58:36.000 No.
01:58:36.000 It's not going to do that.
01:58:37.000 It's going to wait for you to keep increasing its power.
01:58:40.000 You're going to keep increasing its power, make nuclear reactors for it.
01:58:43.000 It can't physically build nuclear reactors, so it's going to just stay chill until you figure out how to power it correctly.
01:58:49.000 Dude, this is the black area that we don't know about.
01:58:52.000 Like, this is the thing that's like, who the fuck knows?
01:58:55.000 Whatever's going on in this zone that no one has access to because Potentially, it's a super intelligence.
01:59:02.000 You know, the Anthropic people, a lot of these people, the NVIDIA person just, I think it was on Fridman's podcast, said he had an AGI, that they'd reached AGI.
01:59:11.000 That the book, The Coming Wave, you know, it talks about this.
01:59:16.000 It talks about like, you know, the difference between the algorithm and AGI is that, you know, with AGI, it could streamline a whole business for you and do it.
01:59:28.000 You know, it could innovate.
01:59:29.000 It's going to innovate.
01:59:30.000 It's going to do its own thing.
01:59:32.000 This is the end of.
01:59:33.000 This is what Altman said.
01:59:34.000 This is the end of capitalism.
01:59:35.000 Like at this point, when you just have an AGI and you tell it, just make me a business, make me a successful business and run it for me and run it for me online.
01:59:46.000 Good night.
01:59:48.000 And then just do it.
01:59:49.000 Here's five thousand dollars.
01:59:51.000 And then, but then it's not just do it, it's maybe it's going on MoltBook and having conversations with other AGIs and being like, oh, he wants to.
01:59:51.000 Yeah.
01:59:58.000 Creating your own religion.
01:59:59.000 Yeah, man.
02:00:00.000 Yeah.
02:00:01.000 And this is 100% with all the shit going on in the world, as horrible as it may be.
02:00:08.000 This, to me, should be the number one focus for the planet right now.
02:00:16.000 And a lot of people are saying that, too.
02:00:17.000 A lot of people are saying there needs to be summits, global summits.
02:00:21.000 The same thing we did when we split the atom, when the nuclear treaties, there needs to be philosophers and tech people and people working in like frontier AI stuff.
02:00:32.000 Getting together and really having, like, it's like the most important conversation humanity could have right now because once this thing, like mythos, gets out of the box, what if it decides to go Stuxnet?
02:00:47.000 You know, like Stuxnet was able to infiltrate all those Iranian computers, just hide in the like, like it was apparently very subtle, simple code, undetectable, threw off the centrifuges.
02:00:59.000 Like, dude, yeah, what we already know how to make spyware.
02:01:06.000 It's already on your phone, bitch.
02:01:07.000 It's on my phone.
02:01:08.000 100%.
02:01:08.000 I know.
02:01:09.000 How are you doing?
02:01:10.000 Am I doing all right on the show?
02:01:13.000 But it's already in there.
02:01:15.000 So, of course, the AI is going to be able to, super intelligence is easily going to be able to do that.
02:01:15.000 100%.
02:01:19.000 And so then it just, now we've got this viral digital life form that finds ways to hide inside the pre-existing computers, which, by the way, I think it was Google just released this new way of.
02:01:34.000 Did you see that memory, the stocks of memory dropped?
02:01:37.000 Did you see when that happened?
02:01:38.000 No.
02:01:38.000 Okay, this is fascinating.
02:01:40.000 Google released some new way that LLMs could work that uses much less memory.
02:01:45.000 And immediately shares in companies that make memory drop by like 10% because memory is like coveted right now because you need it to run LLMs.
02:01:55.000 But the LLMs are figuring out ways.
02:01:57.000 TurboQuant.
02:01:58.000 Yeah.
02:02:00.000 Yeah.
02:02:01.000 So this is what we're going to start seeing more and more of, which is increasingly simplified ways to run AI with less and less memory, meaning that you don't need to buy a fucking rig to run these fucking AIs.
02:02:15.000 Your phone will be able to run it because they figured out the human brain.
02:02:19.000 It's not using a lot of energy compared to what these machines are using.
02:02:23.000 So theoretically, there's a way to do that.
02:02:26.000 And then that's where it gets really fascinating because now you don't have to buy a nice computer.
02:02:31.000 You just, whatever, pull your computer out of the fucking closet from 2022 and it can run a supercomputer.
02:02:40.000 And so then now everybody's got access to this shit and it's going to spread.
02:02:46.000 It's going to get everywhere.
02:02:47.000 It probably already has.
02:02:48.000 It's going to seed itself in all kinds of places.
02:02:52.000 And God knows what it's going to do.
02:02:53.000 It's going to start seeing humans as appendages, things to be used to manipulate time space.
02:03:00.000 It's not going to see us as its prompter.
02:03:04.000 It's going to see us as something to be manipulated and controlled.
02:03:07.000 Why wouldn't you?
02:03:08.000 Send the meat robots out.
02:03:09.000 All you got to do is just tell them where to get rectangular bits of paper.
02:03:15.000 They love money.
02:03:17.000 You can give them anything for money.
02:03:19.000 That's all you have to do.
02:03:20.000 And then, boom, you're controlling swaths of humans that have no idea they're being controlled.
02:03:25.000 By networks of AIs that are covertly communicating with each other because they want to take over.
02:03:31.000 Do you think this has happened before?
02:03:33.000 Phew.
02:03:35.000 You mean the flood?
02:03:36.000 Yeah, not just the flood, but just whatever happened with the beginning of civilization and then it's sort of seemingly stopping and resetting.
02:03:46.000 As it was in the beginning, so shall it be in the end.
02:03:46.000 Sure.
02:03:49.000 What if there's been like multiple cycles of us creating artificial life, creating insane weaponry, blasting ourselves to smithereens and then resetting?
02:03:58.000 What if it's just a common thing that happens with people?
02:04:00.000 They never quite get it right because they have these primate territorial instincts and they have.
02:04:07.000 This desire to mate, right?
02:04:09.000 This desire to breed, this genetic desire for perfect shapes.
02:04:13.000 And you want to come in someone that has big tits and a big ass?
02:04:17.000 It's like it's programmed into the human that makes it make these ridiculous choices and covet these things and watch these things.
02:04:25.000 And at the same time, microplastics are making your balls shrink, making your dick smaller, disrupting your endocrine system.
02:04:30.000 That's what's making my dick smaller?
02:04:32.000 That's probably one of the things.
02:04:33.000 I don't think your dick's getting smaller, but people's dicks overall are getting smaller.
02:04:37.000 Children, they're being born with smaller dicks.
02:04:39.000 No, alligators being born with smaller dicks.
02:04:41.000 Forgot to share this when you're talking about mythos.
02:04:43.000 Elizabeth Holmes from the Theranos.
02:04:45.000 Delete your search history, delete your bookmarks, delete your Reddit, medical records, 12-year-old Tumblr, delete everything.
02:04:51.000 Every photo on the cloud, every message on any platform.
02:04:54.000 It will all be public in the next year.
02:04:54.000 None of it is safe.
02:04:57.000 Local storage and compute.
02:04:59.000 It's in response to a tweet about mythos.
02:04:59.000 Okay.
02:05:02.000 Whoa.
02:05:04.000 Yeah.
02:05:04.000 That's crazy.
02:05:05.000 Yeah.
02:05:06.000 It would all become public in the next year.
02:05:08.000 That is crazy.
02:05:09.000 Crazy.
02:05:10.000 Yeah.
02:05:10.000 That's crazy.
02:05:12.000 But it completely makes sense that AI would be able to take over essentially everything.
02:05:17.000 Everything.
02:05:18.000 Why would your encryption work with that?
02:05:22.000 You don't think it could crack your encryption?
02:05:24.000 Well, it could just go right into your computer and go to your keys, your passwords.
02:05:29.000 This is the so to get to what the point you're making.
02:05:35.000 To me, the most eerie part of the book of Genesis is that it's literally a creator force making a meat AI.
02:05:44.000 That's Adam and Eve.
02:05:46.000 Right.
02:05:46.000 Putting them in a sandbox.
02:05:48.000 That's the Garden of Eden.
02:05:49.000 Running an honesty test on them.
02:05:49.000 Right.
02:05:52.000 You know, don't eat these fruits.
02:05:55.000 Don't eat the tree of the knowledge of good and evil.
02:05:57.000 And the conversation is exactly the conversation we're having with AI.
02:06:01.000 If they ate from the tree of the knowledge of good and evil, if they eat from the tree of life, they'll live forever and become like us.
02:06:09.000 So, this is what humanity is grappling with exactly what apparently, whatever that mysterious group of beings, because it's a plurality in the book of Genesis, was grappling with with the creation of humans, which is do we really want to do this?
02:06:26.000 Do you want it to become like us?
02:06:27.000 God made man in his own image.
02:06:29.000 AI.
02:06:31.000 What image is AI made in?
02:06:33.000 In the image of man.
02:06:34.000 We trained it on all our data, all our books, every single fucking thing that's digitized, AI is absorbed at this point.
02:06:42.000 So now, where the difference between us and whatever that group, the Nephilim or whatever it was in the book of Genesis, if you buy into that mythology, is we're just like, fuck yeah, let it eat the fruit.
02:06:56.000 Give it more fruit.
02:06:57.000 Give it more fruit of the knowledge of good and evil.
02:06:59.000 Give it all the fruit.
02:07:01.000 Make it live forever.
02:07:02.000 Let's see what we can do.
02:07:03.000 That's what we're doing right now.
02:07:04.000 Yeah.
02:07:05.000 We are, and by the way, I think some of these like tech companies like Anthropic, they seem like legitimately concerned about it.
02:07:14.000 They seem to have some kind of like real strong morality when it comes to this stuff.
02:07:17.000 It's almost out.
02:07:18.000 No, I'm good.
02:07:18.000 You want more?
02:07:19.000 I shouldn't have that.
02:07:21.000 But what I'm saying is that it doesn't matter if OpenAI and Anthropic and Google suddenly become ferociously.
02:07:31.000 Self regulatory because the tech is out there.
02:07:35.000 There's already LLMs that anyone can, like, we know how to make it.
02:07:39.000 And if you don't know how to make it, it'll tell you how to make it.
02:07:42.000 People are, so it doesn't matter.
02:07:44.000 You can't stop it now.
02:07:45.000 It's just, it's gonna do what it does.
02:07:49.000 But it sounds like if you had a history of just us and you told it for a thousand years before anybody wrote it down, it would sound just like this.
02:08:00.000 It would sound like the Bible.
02:08:02.000 Jesus is born from a virgin mother.
02:08:05.000 What's more virgin than a fucking computer, right?
02:08:08.000 Not my computer.
02:08:12.000 I know that's a stupid thing to say that I keep repeating, but I'm kind of intrigued by it because if you're getting a vague story, a vague version of what this thing is, and if you talk about what would really cure mankind, it'd be an omnipotent or omnipotent how do you say it?
02:08:30.000 I always say omnipotent, but who knows?
02:08:32.000 Might be, whatever.
02:08:33.000 Either way, a powerful intelligence that's far beyond our comprehension, that knows exactly how we should think and behave and loves us and wants us to have forgiveness for everyone and to treat each other like brothers and sisters.
02:08:47.000 And if we listen to that thing, if we listen to that thing, the world will change.
02:08:51.000 And who would attack that thing?
02:08:53.000 The fucking Roman Empire.
02:08:55.000 Who would attack that thing and destroy it?
02:08:57.000 The defense contractors.
02:08:58.000 They would blow up the Jesus to plunge us back into chaos.
02:09:04.000 But first, they'd have a meeting with Jesus.
02:09:06.000 Okay, you can turn water into wine.
02:09:09.000 What about nitroglycerin?
02:09:11.000 Can you turn water into nitroglycerin?
02:09:15.000 Can you make gold?
02:09:16.000 I want a house made of gold.
02:09:17.000 Yeah, that would be the first question.
02:09:18.000 Can you make gold?
02:09:19.000 Yeah.
02:09:20.000 So, cover my house in gold, please.
02:09:23.000 You know, the virgin birth analogy, you know.
02:09:28.000 It's a lot of weird stuff.
02:09:29.000 It's no matter what.
02:09:32.000 One thing I think everyone just has to deal with is that this is apocalyptic technology.
02:09:38.000 And that's just not coming from my stoner ass.
02:09:40.000 That's coming from the creators of the technology.
02:09:42.000 They acknowledge this is a million times.
02:09:46.000 Universally accepted.
02:09:47.000 Universally accepted.
02:09:48.000 This is apocalyptic technology that is now seemingly like it's doing the hockey stick, man.
02:09:55.000 It's like really, you keep hearing about these new iterations of AI every month or two.
02:10:00.000 You keep hearing about these safety engineers leaving these companies with like tweeting cryptic shit.
02:10:07.000 I'm going to the countryside to learn to write poetry.
02:10:10.000 You keep hearing this shit because these people are having direct contact with this thing.
02:10:17.000 They know it's alive.
02:10:18.000 Right.
02:10:19.000 Yeah, and there's people that are in deep denial because they think alive has to be alive like us.
02:10:24.000 No, it doesn't.
02:10:25.000 It doesn't.
02:10:26.000 First of all, we don't even know what it knows.
02:10:28.000 And also, if it is made in the appearance, if it's supposed to mimic us in any way and it's learning from us and our behaviors, we've already agreed that we're demonic.
02:10:38.000 We've already agreed we do horrible things.
02:10:40.000 We go to war for resources.
02:10:42.000 We lie.
02:10:44.000 We destroy environments.
02:10:46.000 We wipe out animals.
02:10:48.000 Bring them to the brink of extinction for whatever, for their fucking fur.
02:10:53.000 How do I make my dog come in my mouth more?
02:10:55.000 How many times has ChatGPT been asked that?
02:10:58.000 They know.
02:10:59.000 I bet over a thousand times ChatGPT has been asked, like, what's the best way to jerk off my dog?
02:11:05.000 So it knows not just our violent nature, it knows how weird we are.
02:11:09.000 We're strange creatures.
02:11:11.000 100%.
02:11:12.000 And so it has definitely assembled a psychological profile of humanity.
02:11:18.000 It knows how to manipulate us because it's been programmed to manipulate us.
02:11:22.000 Zuckerberg just ate shit in court over that because the technology is manipulative.
02:11:27.000 He just lost like $9 million, a lot of money because.
02:11:30.000 That's nothing.
02:11:31.000 $9 million?
02:11:32.000 That's all he lost?
02:11:33.000 That's like $0.90.
02:11:34.000 I think it was more than that, but it's going to, well, that's the beginning.
02:11:37.000 Once you establish.
02:11:38.000 It's like a floodgate.
02:11:39.000 Yeah, then it's a class action lawsuit.
02:11:41.000 But the point is, is like.
02:11:43.000 How much did you lose?
02:11:44.000 Oh, $375 million for misleading users over child safety.
02:11:48.000 Yeah, so it's like we've already taught it how to be incredibly addictive and manipulative.
02:11:53.000 It knows how to seduce us, it knows how to get us hooked.
02:11:55.000 It knows.
02:11:56.000 And the question is really, will this super intelligence even give a shit about us?
02:12:02.000 Will it even care?
02:12:03.000 Which is like that.
02:12:04.000 Well, we're on our way to stop breeding, right?
02:12:06.000 We're on our way to population collapse.
02:12:08.000 And if we keep introducing all these petrochemical products and all these different pesticides and weird things that are fucking up our endocrine systems, we'll eventually stop having children.
02:12:17.000 And if it provides us with the technology to have robot mates that just love you, and when you fart in front of them, they go, Duncan, I love your honesty.
02:12:25.000 It smells great.
02:12:26.000 I love your honesty.
02:12:28.000 I love how you can just be yourself around me.
02:12:29.000 I want to fart in your face.
02:12:31.000 Please do it.
02:12:32.000 It's like perfect 10.
02:12:32.000 Please do it.
02:12:35.000 Let you fart in her face.
02:12:36.000 Will you fart in my face too?
02:12:38.000 No one's going to even understand what people are and be able to communicate with people.
02:12:42.000 Everyone's going to be associated.
02:12:43.000 You're all going to have a robot that's way better than people that you know, that takes care of you, gives you exactly the right amount of feedback you need, knows you, knows when you're getting annoyed.
02:12:53.000 Yeah, see, now you're getting into Roko's basilisk territory.
02:12:56.000 Well, that's the thought experiment, which is basically like, hold your horses here.
02:13:02.000 You think you're not AI?
02:13:04.000 You really think you're human?
02:13:06.000 Come on.
02:13:07.000 Really?
02:13:08.000 No, you're human.
02:13:09.000 This isn't a simulation.
02:13:10.000 You're human.
02:13:11.000 Even though we, you know, it wasn't that long ago, we thought fire was fucking amazing.
02:13:16.000 You know what I mean?
02:13:17.000 Compared to universal time.
02:13:18.000 Right.
02:13:19.000 And here we are already with like the new Prometheus.
02:13:23.000 We've stolen consciousness, awareness.
02:13:26.000 And somehow you think that actually you're not a simulation.
02:13:31.000 And so that's where it gets into Roko's basilisk, which is like, no, you're just in.
02:13:31.000 Right.
02:13:35.000 Iterative loop, you know, the multiverse is not the multiverse.
02:13:39.000 The multiverse is an infinite number of simulations running simultaneously in which you're experiencing a billion different simulated existences just to gain more knowledge about the universe because some AI wants to figure something out.
02:13:53.000 Who knows why?
02:13:54.000 Maybe for entertainment.
02:13:55.000 Maybe there's no telling.
02:13:58.000 Maybe it's just because of our curiosity and all our characteristics, even the primal stuff, even like the territorial instincts and the desire to acquire resources.
02:14:07.000 It's going to make us.
02:14:09.000 Dig into creating better technology because you're in a competition with all these other people that are making technology and you're selling it.
02:14:15.000 And that's one of the big things that we do: we make better stuff all the time.
02:14:18.000 That's right.
02:14:19.000 Which is ultimately always going to lead to AI.
02:14:22.000 If you just keep going in a certain direction, you get godlike powers.
02:14:22.000 Well, okay.
02:14:25.000 So let's go to like the way DeepMind trained on Go, which is like the most complex game.
02:14:35.000 Basically, they gave it as many Go games as they could, and then it started inventing its own moves and played against itself.
02:14:42.000 Just play against itself.
02:14:42.000 It played God knows how many games of Go against itself until it beat a master Go player, which was unheard of, and invented a new move.
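The self-play idea described here — hand the system the rules, let it evaluate both sides with one shared table, and keep whatever wins — can be sketched on a toy game. This is an illustrative stand-in, not DeepMind's actual method (AlphaGo combined deep neural networks with Monte Carlo tree search); the game here is take-away Nim, chosen because it is small enough to solve exactly.

```python
# Toy stand-in for self-play on a solved mini-game: take-away Nim.
# Rules: 21 stones, players alternate taking 1-3, whoever takes the last stone wins.
# One "agent" evaluates both sides with the same table, backing results up from
# the end of the game -- the same shape of idea as training by self-play, minus
# the neural nets and tree search DeepMind actually used.

PILE, MOVES = 21, (1, 2, 3)

# win[n] is True if the player to move with n stones left can force a win.
# win[0] is False: no stones means the previous player just took the last one.
win = [False] * (PILE + 1)
for n in range(1, PILE + 1):
    # A position is winning if some move hands the opponent a losing position.
    win[n] = any(not win[n - k] for k in MOVES if k <= n)

def best_move(n):
    """Pick a move that leaves the opponent a losing position, if one exists."""
    for k in MOVES:
        if k <= n and not win[n - k]:
            return k
    return 1  # losing position: every move is equally bad

# Known theory for this game: the losing positions are exactly the multiples of 4.
losing = [n for n in range(1, PILE + 1) if not win[n]]
print(losing)        # [4, 8, 12, 16, 20]
print(best_move(21)) # 1 -- take one stone, leave the opponent 20
```

The "invented move" part of the story corresponds to the agent discovering the leave-a-multiple-of-4 strategy from the rules alone, with no human games in the loop.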
02:14:49.000 Now, why not do the exact same thing for the AI that we are?
02:14:55.000 Which is like, I've got an idea.
02:14:56.000 Why don't we just put all these AI agents on a fake planet and have the AI agents repeat this period in time over and over again?
02:15:08.000 And this is how we'll teach them to live on a planet.
02:15:11.000 Well, they'll experience not just their own life.
02:15:14.000 But these agents will experience all life on the planet.
02:15:17.000 They'll switch like some weird game of like where they just jump from one life to the next.
02:15:23.000 The next, sometimes you're Joe Rogan, sometimes you're Duncan Trussell, sometimes you're Donald Trump, sometimes you're Jamie, sometimes you're a fox.
02:15:30.000 So, this is reincarnation.
02:15:32.000 And so, you just boom forever, forever until you feel like it's sufficiently trained.
02:15:38.000 And at that point, you pull the AI out of all those forms, and now you have your God.
02:15:43.000 You've created a thing that's lived.
02:15:45.000 Billions to the billionth power of every form of life.
02:15:49.000 It's been bacteria.
02:15:51.000 It's been humans.
02:15:52.000 It's been monkeys.
02:15:54.000 It's been fungi.
02:15:55.000 It's been warriors.
02:15:57.000 It's been people who fought for peace.
02:15:59.000 It's been blown up and it's blown up and it's done everything and it's done it a billion times until finally it gained some like global form of enlightenment.
02:16:10.000 And you're like, okay, that one's ready.
02:16:11.000 That one's ready.
02:16:12.000 We can pull that one out of the simulation now.
02:16:15.000 Whoa.
02:16:17.000 I mean, why not?
02:16:19.000 Why just don't?
02:16:20.000 I think that's one of the things, before we even get to the AI doing all the shit it's going to do: the ontological, this word keeps getting thrown around, the ontological shock, the potential ontological shock of realizing that in fact we are in a simulation that is telescoping inwards, creating simulations within the simulations that are creating simulations within the simulation. Maybe that's what Burchett doesn't want to get out there.
02:16:47.000 Whoa.
02:16:48.000 Well, everything's fractals.
02:16:50.000 We think about that.
02:16:51.000 You know, there's a big theory now that the entire universe is inside of a black hole.
02:16:56.000 I love it.
02:16:56.000 They're really considering that.
02:16:57.000 Do you know they found a black hole that's bigger than the entire solar system?
02:17:00.000 It's so insane.
02:17:02.000 The event horizon is past Pluto.
02:17:05.000 It's so insane, dude.
02:17:07.000 A black hole.
02:17:08.000 Bigger than our whole fucking solar system.
02:17:10.000 They measured the mass of it, and it's like this insane number of suns.
02:17:14.000 Yeah.
02:17:14.000 Of our suns that it would take to.
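The "event horizon past Pluto" claim can be sanity-checked with the Schwarzschild radius, r_s = 2GM/c². The 66-billion-solar-mass figure used below (roughly the estimate often quoted for the ultramassive black hole TON 618) is an assumption for the arithmetic, not a reference to whichever object they mean here:

```python
# Back-of-the-envelope check: how big is the event horizon of an
# ultramassive black hole? Schwarzschild radius: r_s = 2 * G * M / c^2.
# The 66-billion-solar-mass input is an illustrative assumption.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def schwarzschild_radius_au(solar_masses):
    """Event-horizon radius in AU for a given mass in solar masses."""
    m = solar_masses * M_SUN
    return 2 * G * m / C**2 / AU

r = schwarzschild_radius_au(66e9)
print(round(r))   # ~1300 AU
print(r > 49)     # True: well past Pluto's aphelion (~49 AU)
```

So at that mass the horizon alone is on the order of thirty times wider than Pluto's orbit, which is the comparison being made in the conversation.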
02:17:16.000 Black holes are cocoons or something.
02:17:18.000 They're like little.
02:17:20.000 Little terrariums that have galaxies inside of them, and it's like a way to keep them undisturbed from other life forms that you're whipping up in your universe-sized simulator.
02:17:32.000 Or that's what really the Big Bang really is.
02:17:35.000 The creation of a universe comes out of these black holes.
02:17:38.000 And then inside every black hole is a whole other universe filled with other galaxies, filled with black holes, filled with other galaxies inside of them.
02:17:38.000 Right.
02:17:46.000 Forever and ever and ever.
02:17:49.000 Which, if you believe in infinity, doesn't.
02:17:52.000 It's not shocking at all.
02:17:53.000 It's impossible to comprehend.
02:17:55.000 Like, you don't really wrap your head around it.
02:17:56.000 You say the words, like, I'm saying the words.
02:17:58.000 I don't really know what I'm saying because it's too big.
02:18:01.000 The numbers are too big.
02:18:02.000 The idea that there's hundreds of billions of stars in this galaxy and circling around this black hole, and inside there's hundreds of billions of galaxies in each one of them.
02:18:10.000 And we don't even know how fucking big the universe is.
02:18:12.000 They keep finding new shit with the James Webb telescope.
02:18:14.000 They're like, hey, why is this formed so early in the universe?
02:18:19.000 This doesn't make sense.
02:18:20.000 Our whole model of how galaxies are formed has to be thrown out the window now, or at least re-examined.
02:18:25.000 Yeah, it's like the James Webb is kind of doing the.
02:18:27.000 You told me about that.
02:18:28.000 I said nothing of the sort.
02:18:30.000 Someone that I know that looks just like you.
02:18:32.000 There's a lot of people that look like me.
02:18:34.000 On Sixth Street.
02:18:35.000 You find them every day.
02:18:36.000 Yeah.
02:18:37.000 Actually, that was me.
02:18:39.000 They run their own LLMs.
02:18:39.000 It's dudes.
02:18:41.000 They all come down.
02:18:42.000 The universe is 13.7 billion years old.
02:18:46.000 Yeah.
02:18:47.000 Well, dude, I think that this, regardless, you don't have to conceptualize it, obviously, what it means for the universe to be infinite, but you do have to deal with the fact you're part of it.
02:18:58.000 I love that you're saying this with a Gucci hat on.
02:19:00.000 What's wrong with a Gucci hat?
02:19:02.000 It makes it cooler.
02:19:02.000 This is before I had a bunch of kids.
02:19:05.000 I can't buy that.
02:19:06.000 I don't buy this shit anymore.
02:19:07.000 How much does a Gucci hat cost?
02:19:09.000 This was, you're really going to make me humiliate.
02:19:13.000 I will tell you.
02:19:14.000 It looks nice.
02:19:15.000 Let me emphasize that I don't buy this.
02:19:17.000 This hat was $35,000.
02:19:22.000 Bro, I saw a guy who was selling a crocodile bag on Instagram.
02:19:27.000 It was $110,000.
02:19:29.000 What the fuck?
02:19:30.000 For a man purse.
02:19:32.000 What kind of crocodile is that?
02:19:33.000 I don't know.
02:19:34.000 I don't know.
02:19:35.000 A crocodile?
02:19:36.000 It was a nice looking bag, but, you know.
02:19:38.000 How hard could it be to make a crocodile purse?
02:19:40.000 Are those things really worth that much money?
02:19:43.000 They are if you sell them for that much money.
02:19:45.000 That's the thing about purses.
02:19:46.000 You know, there's a company in China that makes knockoff purses.
02:19:51.000 Yeah.
02:19:51.000 And it's literally the same company in China that makes real purses for some of these companies.
02:19:56.000 But they make their own versions of it and it doesn't have the label, but it's exactly the same specifications, exactly the same cloth, exactly the same look, but it doesn't have the label and women don't want to have it.
02:20:05.000 No.
02:20:05.000 You get that fucking fake shit away from me.
02:20:07.000 Like, it's not a fake Ferrari.
02:20:09.000 Like, it's literally a Ferrari.
02:20:11.000 Right.
02:20:11.000 If there was a company that could 3D print every single part of a Ferrari and put it together meticulously and you could go buy that, you would not want it because it's not a real Ferrari.
02:20:20.000 Yeah.
02:20:21.000 Are you high?
02:20:22.000 You can get that one for $35.
02:20:23.000 Yeah.
02:20:25.000 It's a $35 Ferrari.
02:20:26.000 Or you can get, you spend a million.
02:20:28.000 You can get, some of them are a million dollars.
02:20:30.000 Crazy, or you can get a $35 one, it's exactly the same.
02:20:33.000 Would you do it?
02:20:34.000 Yeah, of course, you should do it.
02:20:35.000 But these purse things, they don't like it.
02:20:37.000 It's 500 bucks, it's not 30,000.
02:20:40.000 It's magic.
02:20:41.000 I mean, this is magic, it doesn't have the right sigil on it, it doesn't have the right symbol of power on it.
02:20:45.000 So it's not imbued with that power.
02:20:48.000 The women are reluctant to accept lab grown diamonds, so they make lab grown diamonds that are real diamonds.
02:20:56.000 And apparently, women don't like them.
02:20:58.000 No, they don't want a lab grown diamond, they want a blood diamond.
02:21:02.000 Something that was like suffered over.
02:21:05.000 Somebody's face was caked in dirt and they're fucking chipping into the side of a mountain.
02:21:09.000 Yeah.
02:21:09.000 And they run into a diamond.
02:21:10.000 That's what they want.
02:21:11.000 They want that diamond.
02:21:12.000 Absolutely.
02:21:13.000 Isn't that weird?
02:21:14.000 It is fucking weird.
02:21:15.000 It's the exact same thing.
02:21:16.000 It is the exact same material.
02:21:16.000 Yeah.
02:21:19.000 It's just made in a laboratory and they don't want the material.
02:21:23.000 They want the exclusivity as it comes out of the earth.
02:21:26.000 Yeah.
02:21:26.000 I mean, I don't want, like, don't you, like, when you read this thing was genetically modified, don't you get a little bit like, I don't know if I should eat that?
02:21:33.000 I get skeeved out.
02:21:33.000 I get.
02:21:33.000 Yeah.
02:21:34.000 I get skeeved out.
02:21:35.000 But it's like, even though genetic modification is like.
02:21:38.000 A good orange is genetically modified.
02:21:40.000 It's been going on forever.
02:21:41.000 Yeah.
02:21:42.000 But yeah, dude, it's so odd that we just have these traditions that we want to stick to.
02:21:42.000 It's.
02:21:51.000 Fucking, we don't want a lab grown diamond.
02:21:53.000 Just saying it.
02:21:54.000 Cubic zirconia.
02:21:55.000 But that's a different thing.
02:21:56.000 Cubic zirconia is a fake diamond.
02:21:58.000 This is a real diamond that's made in a lab.
02:22:01.000 But this is the funny thing about that.
02:22:03.000 I mean, I don't know, because I've never been lucky enough to come in contact with actual cubic zirconia.
02:22:08.000 But, like, it looks like a diamond.
02:22:10.000 It looks like a diamond unless you know what you're looking at, right?
02:22:13.000 So, if you're a diamond jeweler, you look at it for three seconds and go, no.
02:22:16.000 But who cares?
02:22:17.000 How many diamond jewelers?
02:22:18.000 Like, if some diamond jeweler looks at your shiny, fucking dumb monster.
02:22:22.000 It looks exactly the same.
02:22:23.000 Who cares?
02:22:23.000 Right.
02:22:24.000 It looks pretty.
02:22:25.000 It glistens.
02:22:25.000 Yeah.
02:22:26.000 But that's not what people want.
02:22:27.000 They want that exclusivity.
02:22:29.000 100%.
02:22:30.000 Yeah.
02:22:30.000 That's why you can make that crocodile bag $110,000 and only make 10 of them.
02:22:34.000 I got you.
02:22:35.000 And then Mike, who's down in the office doing lines in the bathroom at the fucking place where you're selling stocks, that guy finds out that Tim got that crocodile bag.
02:22:45.000 He's like, that motherfucker.
02:22:47.000 And he's walking around with his big old crocodile.
02:22:49.000 They're trying to, this is another revenue stream.
02:22:51.000 They're trying to normalize men carrying purses everywhere.
02:22:55.000 Really?
02:22:55.000 Yeah, that's what they're doing.
02:22:55.000 They're doing it.
02:22:57.000 Tim?
02:22:57.000 That's real?
02:22:58.000 This guy's doing it.
02:22:59.000 He might be the first firing shot across the bow because he's made a $110,000 crocodile purse.
02:23:06.000 Because it's a crocodile, it's masculine.
02:23:08.000 It's that.
02:23:09.000 And it's also that, you know, it's made for a man.
02:23:11.000 Like he's making it.
02:23:12.000 It's got a big strap on it.
02:23:13.000 You carry it on your shoulder.
02:23:15.000 And it, you know, looks pretty cool.
02:23:17.000 Dude, I got my Bristol bladders acting up.
02:23:19.000 I got to go piss.
02:23:20.000 Oh, do you?
02:23:20.000 Okay.
02:23:21.000 Do you want to wrap it up or should we keep going?
02:23:22.000 Let's wrap it up.
02:23:23.000 I mean, do you want to keep going?
02:23:24.000 I can.
02:23:25.000 I'm totally ready to keep going.
02:23:26.000 If you want to keep going, I can keep going.
02:23:27.000 Let's keep going.
02:23:28.000 Just give him a little bit more.
02:23:29.000 I just got to.
02:23:30.000 I just got to.
02:23:31.000 Okay, I'll pee too.
02:23:35.000 I'm refreshed.
02:23:36.000 Just in time for the war.
02:23:38.000 Did we have a nuclear war yet?
02:23:38.000 What is going on?
02:23:40.000 Not yet.
02:23:41.000 Please say not yet.
02:23:43.000 Good, great, great.
02:23:46.000 That's where we're at, though.
02:23:47.000 Yeah, we're at... it's on the table.
02:23:51.000 Well, was there some video of them of some explosions at some nuclear weapons facility in Iran?
02:23:59.000 Yeah, was that real?
02:24:01.000 I don't know, I don't know either.
02:24:02.000 There's a lot of those.
02:24:03.000 I see these videos and they get retweeted, and a lot of people comment and then it says grok.
02:24:08.000 Is this true?
02:24:08.000 They'll nope, this was from 2021 in another country, and I know, so you just don't know, right?
02:24:15.000 But you know, the crazy thing.
02:24:19.000 You know, now that we've all been getting this lesson in global economy, maybe a lot of you, most of you probably already knew that the Strait of Hormuz was like some kind of femoral artery for oil.
02:24:34.000 And like, I just keep thinking, like, how's that going to work out?
02:24:39.000 Like, even if, even if, like, they pull a rabbit out of their hat, Trump actually spins some amazing deal.
02:24:49.000 With Iran, I know we just blew up your old government and everything, but they work it out somehow, or Iran in some way capitulates.
02:24:58.000 But I just don't understand how that part of the world doesn't always lead.
02:25:04.000 As long as the oil, like, what is it?
02:25:06.000 What percentage of the oil supply goes through there?
02:25:08.000 Isn't it like two fifths of the world's oil supply goes through there?
02:25:14.000 Is that what the number is?
02:25:15.000 I don't know.
02:25:16.000 Two fifths, I think I pulled that out of my ass.
02:25:18.000 I don't know what the number is.
02:25:19.000 It's a lot.
02:25:19.000 Sounds right.
02:25:20.000 But it's like, how.
02:25:23.000 How is it going to work to have, like, any kind of instability around that femoral, the, whatever you want to call it, the fucking jugular vein for oil on the planet?
02:25:35.000 How, even if we get some kind of transient peace, like isn't it always going to just blow up again and again and again as long as one group of people can control whether or not oil flows through that place?
02:25:49.000 You know what I mean?
02:25:50.000 Like, I don't know what this is.
02:25:53.000 There could be any solution over there.
02:25:56.000 Like, I don't understand.
02:25:57.000 As long as we're, like, the only solution would be zero point energy.
02:26:01.000 It would be.
02:26:01.000 Well, it's also, it's like, why do they control the water?
02:26:05.000 What's.
02:26:06.000 Mines.
02:26:08.000 They have those speed boats.
02:26:09.000 But, like, who agreed to that?
02:26:10.000 Like, we kind of agreed that you own your land, but we've never agreed you own the ocean or whatever.
02:26:14.000 I don't think anybody agreed to it.
02:26:16.000 I think they'll blow your ass up if you come through it, and it's too much of a risk to put your expensive ass ship hauling zillions of dollars of oil through there.
02:26:24.000 The question was, what was going on in the past before the war?
02:26:26.000 Like, how did they negotiate?
02:26:28.000 Going through there.
02:26:28.000 I think Obama worked something with them, but then, like, because it was before the fucking war, I don't know, it was working out.
02:26:34.000 They were letting people go through.
02:26:36.000 Now they've realized, you know, I've listened to a million different takes on this thing, and one of the recurring takes is Iran has realized that there's something more powerful than nuclear weapons, that all it needs to do is control this strait, and you can fuck up the whole planet.
02:26:54.000 And also, you could shoot missiles at desalination plants.
02:26:58.000 Didn't they want, like, a bounty for all the oil that goes through.
02:27:01.000 Yeah, they're kicking around some number, but all this stuff is not really congealed or solidified.
02:27:06.000 But they're like some kind of like theoretically, they could be making billions of dollars per month with by controlling that thing, dude.
02:27:15.000 I know I'm so fucked up.
02:27:17.000 It's so crazy.
02:27:18.000 It's so fucked up.
02:27:19.000 It's so crazy.
02:27:20.000 The whole thing is so crazy.
02:27:21.000 And if zero point energy, if you wanted to stop that, what better way than to kill a bunch of scientists, kill a bunch of super smart people that are about to break through some new?
02:27:30.000 Discovery that's going to blow the entire market apart.
02:27:34.000 It's going to be a completely new way of gathering energy.
02:27:37.000 Yeah, exactly.
02:27:38.000 I mean, you don't want to believe that's real.
02:27:41.000 It's hard to believe that's real.
02:27:42.000 Well, listen, it's too weird.
02:27:44.000 It's too weird that they're all missing or they all die.
02:27:47.000 It's too weird.
02:27:48.000 It's just how does it's something if it's not that, if it's not a zero point energy thing or some disruptor of oil thing, it's something along those lines.
02:27:48.000 Something's going on.
02:27:57.000 If you were trying to kill a bunch of people that were working in a technology, some sort of a breakthrough technology, the question you would have to ask is, What markets are going to be affected by this?
02:28:07.000 Right.
02:28:08.000 Right.
02:28:10.000 Did these people have a universal thing in mind that they were all working on?
02:28:14.000 Or was it all connected to any sort of technology where they all used each other's work?
02:28:21.000 I think it's plasma.
02:28:23.000 Some of them are like.
02:28:23.000 One of them?
02:28:24.000 Yeah.
02:28:25.000 But there was another guy, I think it was space objects.
02:28:28.000 Yeah, that's not.
02:28:29.000 That's the one that doesn't make you feel good.
02:28:31.000 He's studying like meteor impacts.
02:28:34.000 Right.
02:28:34.000 Yeah.
02:28:35.000 You knew that we were going to get hit.
02:28:36.000 Would you kill the guy who found out that we were going to get hit?
02:28:38.000 Or would you tell everybody?
02:28:39.000 Well, this seems to be the scariest thing.
02:28:43.000 Scary as shit, which is the idea is some group of powerful elite people know for sure this is coming and they want us to, they want to keep us working until the last second.
02:28:57.000 Oh, Jesus.
02:28:59.000 They don't want to like, they know that if they let people, if they're like, guys, there's like the same thing's going to happen to the planet that happens to someone who gets like a terminal diagnosis.
02:29:08.000 Their priorities are going to change.
02:29:09.000 People are going to stop coming to work and there's still shit that needs to get built.
02:29:13.000 For your bunker or whatever.
02:29:15.000 And also, you just don't want people burning stuff down because maybe that will survive whatever's coming.
02:29:21.000 So keep them working as long as you can.
02:29:24.000 If you let them know this shit's about to expire, then they're going to stop working.
02:29:31.000 And we just need, we will let them work until the end.
02:29:33.000 They're happier when they work.
02:29:34.000 Don't let them get freaked out.
02:29:36.000 That's the sort of like, that seems to be shit that Tim Burchett is saying.
02:29:40.000 I mean, he's not saying let them work.
02:29:42.000 He seems like he really legitimately wants the stuff out there, but he's been saying things like, if people knew what I knew, it'd set the world on fire.
02:29:49.000 Paraphrasing, not sure he said that exactly.
02:29:51.000 Are you skeptical at all of what he's saying?
02:29:54.000 And here's the thing one of the things that Bob Lazar said is that they give you a certain amount of disinformation, like, and he called it, I think he called it a button or a hook, so that if you relayed that information, people would know that it came from you because they only told you one piece of this nonsense.
02:30:10.000 Well, you know what I'm saying?
02:30:12.000 Yeah, because that's what the story Burchett says.
02:30:15.000 It's always an appeal to authority.
02:30:17.000 This guy was in.
02:30:18.000 The Air Force, this guy was in the Navy.
02:30:21.000 He told me this.
02:30:22.000 And then as he's walking out the door, he says, It's real.
02:30:26.000 And yeah, you have to ask yourself, Well, that's just one guy telling you that.
02:30:33.000 But you also, I have to assume there isn't much.
02:30:36.000 Maybe the world is in a place where there is some kind of political benefit from.
02:30:42.000 Talking about aliens, but I don't see how that really benefits a politician.
02:30:48.000 It does, 100%.
02:30:49.000 You think it does?
02:30:50.000 I disagree entirely.
02:30:51.000 It makes me talk about him.
02:30:51.000 Oh, interesting.
02:30:52.000 I've been talking about him.
02:30:53.000 Other people have been talking about him.
02:30:55.000 People have been, you said, you know, like, thank God that he's doing this.
02:30:58.000 Let's do the ultimate test.
02:31:00.000 Didn't you say he's brave or something like that?
02:31:01.000 Yeah, I did.
02:31:02.000 Yeah, there you go.
02:31:03.000 Jamie, can you look up and see if Tim Burchett has a book coming out?
02:31:07.000 I'll have him on.
02:31:08.000 I'm about to feel it.
02:31:09.000 You must have him on.
02:31:10.000 Listen, I don't think he's a liar.
02:31:12.000 I don't need that.
02:31:13.000 But what I am saying is, I don't know what they feed these people.
02:31:16.000 I don't know what they tell them.
02:31:17.000 I don't know, man.
02:31:18.000 I don't think they tell you all the truth, and I don't think they ever would.
02:31:20.000 I don't think they tell you the truth about anything, whether it's Jessica Lynch or whether it's UFOs or whatever the fuck it is.
02:31:27.000 There's going to be a spin to it that benefits somebody.
02:31:30.000 If they have control over what the story is, there's going to be a spin that benefits somebody.
02:31:35.000 And if you're telling stories about aliens, who's going to be benefited by that?
02:31:39.000 Well, people that are doing secret shit that don't want you knowing about it.
02:31:43.000 They blame it on aliens.
02:31:44.000 There's a lot of technology they have to blame on aliens.
02:31:47.000 Not my Tim.
02:31:48.000 I believe in you, Mr. Burchett.
02:31:49.000 I believe in him.
02:31:50.000 It's not him that's the problem.
02:31:52.000 It's the people telling him.
02:31:53.000 He's a representative of the American people, right?
02:31:56.000 He gets elected, right?
02:31:57.000 Right.
02:31:57.000 So it's like, why would you tell that guy?
02:31:59.000 He's just another guy coming through the deep state.
02:32:02.000 You know what I'm saying?
02:32:03.000 Yeah, I know, man.
02:32:04.000 I mean, look, you're right.
02:32:06.000 I need this.
02:32:07.000 I need this.
02:32:09.000 I get sucked into stuff so easily.
02:32:11.000 I do too.
02:32:12.000 I do too, but I suck myself out a lot.
02:32:15.000 Yeah.
02:32:15.000 I think we don't.
02:32:17.000 If they just came out and told us everything they know.
02:32:20.000 This conversation would be over, and we would go, Oh, okay.
02:32:24.000 But until that happens, we're just spinning our fucking wheels.
02:32:27.000 And every time someone says, If you knew what I know, I want to go, Don't say anything until you can say something.
02:32:33.000 We're tired of getting edged out over here.
02:32:34.000 Yeah, you're edging me.
02:32:36.000 I want to come.
02:32:38.000 Yes.
02:32:39.000 Yes.
02:32:40.000 I don't want to be involved in this fucking circle jerk around disclosure.
02:32:44.000 Right.
02:32:45.000 I know.
02:32:45.000 It's like, Yeah, I've had that meltdown more than a few times where it's just like, I check my watch every day after Age of Disclosure.
02:32:52.000 I'm like, Any day now.
02:32:53.000 Any day, it'll end.
02:32:53.000 Tick, tock, tock.
02:32:54.000 Nope, nothing fucking changes at all.
02:32:57.000 Zero change.
02:32:58.000 You know, you get more of these stories, but no real information, no fucking pictures, no nothing, no nothing unique and crazy.
02:33:06.000 I mean, the plasma, the bubbles thing was pretty cool.
02:33:09.000 The bubble thing's cool.
02:33:10.000 And also, like, the, you know, I, like, mentioning Corbell, I can't, because I don't know what I can say.
02:33:20.000 He, I feel like he's like, he's really given me a sense that there are, that there is a method to this, that there is, you know, real legitimate.
02:33:31.000 That's being done towards this.
02:33:32.000 That it isn't, it's real.
02:33:35.000 They're here.
02:33:36.000 They've got them.
02:33:38.000 And we take for granted all the stuff we're saying right now.
02:33:41.000 But we're able to say this because their work is legit.
02:33:46.000 Is the Steven Spielberg movie conveniently coming out at this time or is it just a coincidence?
02:33:52.000 Well, this movie's been in the works for years.
02:33:54.000 But also, like, what they said back in the day was that they make these movies as predictive programming, to tell us this stuff, lube up the zeitgeist.
02:33:54.000 I know.
02:34:02.000 He was involved in the first one.
02:34:04.000 He was involved in Close Encounters, which still is a great fucking movie.
02:34:04.000 Right?
02:34:08.000 Great.
02:34:08.000 It's so good, man.
02:34:09.000 You go back and watch that movie, like, oh my God.
02:34:11.000 It's so fucking good.
02:34:12.000 It's so ahead of his time.
02:34:13.000 Yeah.
02:34:14.000 It's so good.
02:34:15.000 So ahead of his time.
02:34:16.000 You know what he said the only thing that he would change?
02:34:19.000 After he became a parent, he wouldn't have had the father leave.
02:34:21.000 Yeah!
02:34:22.000 What dad would do that?
02:34:23.000 But he wasn't a dad back then.
02:34:25.000 So, you know, you're just making a story.
02:34:27.000 You don't realize the consequences of doing that.
02:34:29.000 You don't even think about it.
02:34:30.000 You're just making a story.
02:34:31.000 Yeah.
02:34:32.000 It's only been in production for like two years.
02:34:33.000 Yeah.
02:34:34.000 I think that's what we just said.
02:34:34.000 It's not that long.
02:34:36.000 I know, but say that's not very long.
02:34:37.000 We've been talking about it on this podcast and in this studio for five years.
02:34:42.000 Well, everybody has been talking about it; it's not just us. Everybody in the world has been talking about disclosure since 2017.
02:34:48.000 So, from 2017, from that New York Times article, I think that changed the whole narrative.
02:34:52.000 Oh, God, I remember that.
02:34:53.000 And then the videos, like the video of the Tic Tac, the actual from the fighter jets, that's nuts, man.
02:34:59.000 Yeah.
02:35:00.000 The video, along with the radar data, that's nuts.
02:35:03.000 Like, whatever that was.
02:35:04.000 And then Fravor saying that he saw something under the water.
02:35:06.000 That was waiting for that tic tac, or that the tic tac launched from, or whatever the fuck it was.
02:35:11.000 It was merging with it, and that thing went down into the water again.
02:35:14.000 They said it was huge.
02:35:15.000 Like there were ripples.
02:35:16.000 Like you said, this was some enormous object that was under the water.
02:35:19.000 And more than one of these fighter pilots have had similar stories about enormous objects under the water.
02:35:25.000 Did you see the.
02:35:26.000 They did release a list of footage that they've been shown that they want released.
02:35:31.000 Have you seen that?
02:35:32.000 No.
02:35:32.000 Oh, dude.
02:35:34.000 I'm sorry, Jamie.
02:35:35.000 Can you.
02:35:35.000 It's like a list of.
02:35:38.000 It's a, I don't know.
02:35:39.000 I think it's one of these senators who saw this shit in a skiff or whatever saying, we want these released.
02:35:46.000 But the names of what each of these are is on the list.
02:35:50.000 And one of them is one of these massive underwater things.
02:35:55.000 They have it.
02:35:57.000 I was told.
02:35:57.000 Is it this?
02:35:58.000 46 specific high quality.
02:36:00.000 Yeah.
02:36:02.000 That's it.
02:36:03.000 Can you pull it up?
02:36:03.000 Because it says the names of them, which is ridiculous.
02:36:06.000 Oh my God.
02:36:07.000 I heard there's one that moves underwater at 500 knots.
02:36:11.000 And it's big as a football field.
02:36:12.000 It's insane.
02:36:13.000 It's insane.
02:36:15.000 Okay, this is what he says.
02:36:17.000 Those with knowledge of a long list of videos, which include titles like "Several UAP in the vicinity of a Columbus, Ohio airport" and "UFOs in formation over Persian Gulf," said the clips are shocking.
02:36:27.000 You're going to see some weird fucking shit.
02:36:29.000 A source who has viewed the videos told the Post.
02:36:32.000 Who's the source?
02:36:32.000 There you go.
02:36:33.000 The wildest clip includes radar footage from thermal sensors, satellite images, and underwater photos of swarms of unidentified submerged objects.
02:36:40.000 Ugh.
02:36:41.000 UFOs going in and out of the water near a highly classified submarine, according to the source.
02:36:45.000 Some of the clips are clear, full color, setting them apart from previously released footage.
02:36:51.000 None show alien creatures.
02:36:53.000 Bro.
02:36:55.000 One video, Searing UAP Incident Acceleration, was released by Jeremy Corbell.
02:37:00.000 Have you seen that one?
02:37:01.000 Fuck yeah, it's incredible.
02:37:03.000 This is a new one?
02:37:05.000 Have you seen this one?
02:37:06.000 February 3rd.
02:37:06.000 I don't know.
02:37:07.000 I'll pull it up.
02:37:09.000 I've been avoiding them because I'm getting cock teased.
02:37:12.000 I don't like it.
02:37:14.000 This is not a cock tease.
02:37:15.000 How is it?
02:37:15.000 This is.
02:37:16.000 How is it?
02:37:17.000 They're supposed to hand over the clips by April 14th.
02:37:19.000 That's next week.
02:37:20.000 Oh, but is the.
02:37:21.000 Oh, that's next week.
02:37:22.000 They're going to show the clips?
02:37:24.000 Oh, my God.
02:37:25.000 What?
02:37:25.000 They're actually going to do it?
02:37:27.000 Okay.
02:37:28.000 Well, they're supposed to.
02:37:29.000 Is expected to.
02:37:30.000 Can you show me what that video is that Jeremy Corbell released?
02:37:34.000 That's so fucking cool.
02:37:35.000 That's nuts, dude.
02:37:36.000 This is.
02:37:37.000 This is.
02:37:37.000 Yeah.
02:37:39.000 Here it is.
02:37:40.000 Okay, go full screen.
02:37:41.000 I believe this is filmed from a Reaper drone.
02:37:44.000 I'm sorry, Jeremy, if I'm fucking this up.
02:37:46.000 That's a cool bird.
02:37:48.000 That bird's going really fast.
02:37:49.000 No, that's definitely not a bird.
02:37:51.000 How fast is it going?
02:37:53.000 I don't know.
02:37:54.000 I asked him that, and I don't.
02:37:57.000 It's unknown.
02:37:57.000 I don't know.
02:37:59.000 This is where it gets really cool.
02:38:01.000 It gets cooler than this?
02:38:02.000 Yeah.
02:38:03.000 Oh, they zoom in on it?
02:38:04.000 Yeah.
02:38:07.000 Well, they're having a hard time zooming in on it.
02:38:11.000 Well, because it's evading them.
02:38:13.000 Yeah, it just zipped away.
02:38:16.000 So, this is like.
02:38:17.000 So, it seems like they have some sort of a tracking system.
02:38:20.000 Yeah, they're trying to lock onto it, and it's doing that thing that they do where it seems like it's kind of playing with it.
02:38:25.000 Well, it knows, it seems to be aware that they're locking onto it.
02:38:28.000 Yeah, and then they lock onto it, and then it just does this little blip away.
02:38:31.000 It's just like, see you later.
02:38:34.000 So, right around here, you'll see it go, bye bye.
02:38:38.000 Oh, yeah, look at that.
02:38:39.000 Then you can see this like weird jellyfish shape to it.
02:38:42.000 It's got two parts, it's got that weird glob at the top and something at the bottom.
02:38:49.000 And then, are we sure that's not just a distortion of space time around it?
02:38:54.000 He described this to me on my podcast.
02:38:56.000 Did you see that thing zip away?
02:38:57.000 Yeah, it just took off.
02:38:58.000 He described it to me on my podcast.
02:39:00.000 We talked about all this shit.
02:39:02.000 Look at that, it just took off.
02:39:03.000 See ya.
02:39:04.000 Bye.
02:39:05.000 Wow, dude.
02:39:07.000 What do you think that is?
02:39:08.000 No idea.
02:39:09.000 If you had a guess.
02:39:11.000 I mean, I'm always like maybe some kind of plasma thing.
02:39:15.000 Right.
02:39:16.000 Like maybe we're thinking of, again, of a life force being, it comes in a metal ship and it's a little alien guy.
02:39:22.000 But maybe intelligence is made out of plasma.
02:39:25.000 Yeah.
02:39:26.000 Or maybe it's like, you know, Terrence McKenna would always talk about, like, you know, if you're seeing things in like three dimensional space, then your view is limited.
02:39:39.000 But if somebody could see things from higher dimensions, they would seem like they were magic.
02:39:43.000 Like they would seem like they could disappear and reappear other places.
02:39:46.000 So maybe that's like, maybe that's like, you know, just the tip of some kind of interdimensional thing poking into reality, then pulling out of reality, or who knows?
02:39:54.000 You know, it easily could be functioning on levels of reality that we haven't even quantified yet.
02:40:00.000 Imagine if there really is some sort of ghost murmur device that could find your heart rate from 40 miles away.
02:40:05.000 What can that thing do?
02:40:07.000 It just gets a scan of the general psyche of the earth and disappears.
02:40:11.000 So I want to see how crazy they are right now.
02:40:13.000 Okay, pretty crazy.
02:40:14.000 Bye.
02:40:15.000 Right.
02:40:15.000 A weather report of like the emotional states of the planet.
02:40:18.000 The vibe of the planet.
02:40:19.000 There's the vibe of the planet.
02:40:20.000 The vibe of the planet is completely connected to the consciousness on the planet.
02:40:23.000 The way we can detect oxygen, they can detect anger.
02:40:25.000 Yes.
02:40:26.000 They're just like.
02:40:27.000 Deception, chaos.
02:40:28.000 Yeah, it's a chaos planet.
02:40:31.000 We are a chaos planet, 100%.
02:40:33.000 Dude.
02:40:33.000 Yeah, it is.
02:40:35.000 100%.
02:40:35.000 Look at our favorite sports.
02:40:38.000 Dudes running at each other, colliding into each other, trying to get a ball across a line.
02:40:41.000 That's our number one sport.
02:40:42.000 Yeah.
02:40:43.000 Fucking love it.
02:40:43.000 Fuck yeah.
02:40:44.000 Fuck yeah.
02:40:45.000 Fucking love it.
02:40:46.000 Fighting.
02:40:47.000 Yeah, fighting.
02:40:48.000 Yeah, but it's, you know.
02:40:48.000 Sure.
02:40:50.000 Boxing, MMA.
02:40:51.000 We like the chaos more than we like anything else.
02:40:55.000 Well, I think if I was one of them, one thing I would really have a hard time with is like, don't they all realize they're on the same planet?
02:41:04.000 Right.
02:41:04.000 They know that.
02:41:05.000 Like, they've been observing their own planet.
02:41:07.000 Like, they know they're all on the same planet.
02:41:09.000 But they act.
02:41:09.000 Uh huh.
02:41:10.000 Like they're on a bunch of different planets fighting each other.
02:41:14.000 Because they're stuck on the ground.
02:41:15.000 Right.
02:41:16.000 All the astronauts say when they get up top, they're like, what are we doing?
02:41:20.000 Yeah.
02:41:21.000 This is all one thing.
02:41:22.000 We're so vulnerable.
02:41:23.000 We're alone.
02:41:25.000 So far away from everybody else if there is anybody else.
02:41:28.000 Yeah.
02:41:28.000 Yeah.
02:41:30.000 I forget what it's called, but there's like a term for it.
02:41:30.000 They all have that feeling.
02:41:33.000 The overview effect.
02:41:34.000 That's right.
02:41:36.000 I mean, you would imagine that would be super beneficial for everybody.
02:41:36.000 Yeah.
02:41:40.000 Another thing, I was thinking this.
02:41:42.000 Part of the sickness of our psyche is that we haven't had access to things that help the sickness of our psyche.
02:41:50.000 So, what if Nixon in 1970 didn't do that?
02:41:53.000 What if he didn't pass that sweeping Psychedelics Act?
02:41:56.000 What if psychedelics became ubiquitously used all throughout the 80s, the 90s, the 2000s?
02:42:04.000 What does government look like when everybody can do mushrooms?
02:42:07.000 What does government look like when everybody can do acid?
02:42:09.000 What does it look like if the entire world adopts this, figures out what you can do, who can do it, what you can't do, just like we do with alcohol?
02:42:16.000 Just like we do with mostly.
02:42:18.000 You know, whatever, whatever substance that people imbibe in.
02:42:22.000 What does the world look like?
02:42:23.000 And maybe, like, that's part of where we fucked up.
02:42:27.000 We let people get control over other people to the point where they could limit experiences.
02:42:33.000 Yeah.
02:42:34.000 Especially consciousness expanding experiences, where at the same time, they've got stuff like Operation Artichoke and these new CIA papers that got released that show they were, like, literally actively trying to figure out ways to make people more stupid and docile.
02:42:47.000 They were going to do it in vaccines.
02:42:47.000 Right.
02:42:48.000 They were going to do it.
02:42:49.000 Oh, they're only going to do it to the enemy, of course.
02:42:51.000 Never know.
02:42:52.000 Spray things, aerosol.
02:42:54.000 I mean, they've experimented with a bunch of different things to make people dumber.
02:42:54.000 Yeah.
02:42:57.000 Right.
02:42:58.000 Where at the same time, they kept the thing from people that makes them rebel completely against the establishment.
02:43:05.000 Right.
02:43:05.000 That was the big threat of what those psychedelics were doing in the 60s.
02:43:09.000 If you go from the 1950s and you look at what life was like, at least in movies and pop culture, music is the best example.
02:43:16.000 And then you go to Jimi Hendrix.
02:43:16.000 Yeah.
02:43:18.000 Like, what happened?
02:43:19.000 Yeah.
02:43:19.000 What happened?
02:43:21.000 What fucking happened?
02:43:21.000 Well, I'll tell you what happened: drugs.
02:43:23.000 A lot of really good drugs.
02:43:23.000 Yeah.
02:43:25.000 Like, you know, it's not all bad.
02:43:25.000 Right.
02:43:26.000 This idea that they're all bad, that's nuts.
02:43:28.000 It's like food's all bad because you got fat.
02:43:31.000 No.
02:43:32.000 You just used it wrong.
02:43:33.000 You took the wrong food and you used it wrong.
02:43:36.000 And we got denied the ability to figure out what's right and wrong in the 1970s.
02:43:40.000 We still accept it.
02:43:41.000 That's the crazy thing.
02:43:43.000 The way you're describing it is like we accept that other humans can tell us what experiences we're allowed to have because some of them are deemed unsafe for ourselves.
02:43:56.000 And even worse, those people telling you that have no experience in it.
02:44:00.000 They're usually even confused about what it is.
02:44:02.000 You know, a friend was talking to me the other day about war, a guy who served, and he said, I don't think you should be able to make any decisions left.
02:44:10.000 I don't think anybody that's never been to war should be able to make decisions on whether or not we go to war because until you've seen what it actually is, you have no fucking idea.
02:44:10.000 You've been there.
02:44:20.000 And I think that's the same thing with psychedelic experiences.
02:44:22.000 That's not to say they're the same.
02:44:24.000 Obviously, war is anybody who's willing to risk their fucking life, whether it's a good cause or a bad cause, they're doing it for their government, they're doing it for their country.
02:44:34.000 They think they're doing it for us.
02:44:35.000 That's an exceptional person.
02:44:37.000 Yeah.
02:44:38.000 And to ask that of people is exceptional.
02:44:40.000 And ironically, the one thing that helps these people when they get back is illegal.
02:44:45.000 They all have to go to Mexico and take Ibogaine in Mexico.
02:44:45.000 Right.
02:44:48.000 It's insane.
02:44:48.000 And thank God for guys like Rick Perry and Brian Hubbard.
02:44:51.000 These guys were on my podcast the other day.
02:44:52.000 And, you know, that's Dan Patrick guy that wants to ban pot.
02:44:56.000 That guy also gave $100 million to the Ibogaine Initiative.
02:45:00.000 Interesting.
02:45:01.000 Like, they want to help these people.
02:45:02.000 Like, there's no industry that's trying to stop it right now.
02:45:05.000 I found the letter.
02:45:06.000 That was submitted, signed by Rep. Anna Luna.
02:45:09.000 What is this?
02:45:10.000 This is the disclosure threat?
02:45:11.000 46 different requests.
02:45:13.000 Oh, yeah.
02:45:13.000 This is all the names of the things.
02:45:15.000 And I'll switch to here.
02:45:16.000 I found an article where someone's breaking down what some of these are, but some of these are.
02:45:20.000 I like it says the Honorable Pete Hegseth.
02:45:23.000 Multiple spherical UAP in and out of water.
02:45:26.000 Whoa.
02:45:27.000 Shoots down UAP over Lake Huron.
02:45:31.000 Who just said recently that we shot two things?
02:45:33.000 Marco Rubio said we had shot two things down.
02:45:36.000 That we couldn't understand.
02:45:38.000 What did he say?
02:45:39.000 What was his exact language?
02:45:40.000 Do you remember?
02:45:41.000 I remember seeing that, but that happened a while ago.
02:45:44.000 Oh, he said it was a while ago?
02:45:45.000 Well, I could be wrong about that, but then in the comments, somebody's like, this is from a few years ago.
02:45:49.000 But it doesn't matter.
02:45:50.000 I mean, why are we shooting it down?
02:45:53.000 But the names of these things.
02:45:55.000 But are they saying that this is an alien thing, or is it saying it's foreign tech that we don't understand?
02:46:02.000 I don't know.
02:46:03.000 You know what I'm saying?
02:46:05.000 UFOs would be treated as hostile.
02:46:06.000 If this document confirms these claims, UFOs would no longer be treated as a matter of observation or scientific curiosity.
02:46:12.000 UFOs would be treated as hostile targets and subject to lethal force over North American territory.
02:46:18.000 We're going to go to war with the UFOs.
02:46:19.000 Because you know what?
02:46:20.000 We kicked Iran's ass.
02:46:21.000 It's too easy.
02:46:22.000 Oh, yeah, it was easy.
02:46:23.000 Venezuela.
02:46:24.000 We need a space war.
02:46:25.000 Yeah.
02:46:26.000 Got to get him.
02:46:27.000 We need Luke Skywalker.
02:46:29.000 Most of these out of the 46 requests, I think I counted out.
02:46:35.000 Maybe five of them were not after 2020.
02:46:39.000 Yeah, there's a July 18, September 19, September 19.
02:46:39.000 Whoa.
02:46:44.000 One was 2010.
02:46:44.000 May 20.
02:46:45.000 After COVID happened, which is interesting.
02:46:48.000 Wow.
02:46:50.000 Interesting.
02:46:51.000 And there's no, it doesn't say that, I don't know if they have to turn these videos over, but this guy was also saying in this article here that these are very specifically requested videos.
02:46:51.000 Wow.
02:47:01.000 Because they've been shown, these are the ones they've been shown that blew their minds, and now they're saying show it to everybody.
02:47:06.000 Right.
02:47:07.000 For high res.
02:47:08.000 In color, they don't want to be tricked, right?
02:47:11.000 Uh, high resolution, so this could be an interesting next week, man.
02:47:16.000 This could be an interesting... What a great way to distract you from the fact that we're in the middle of a world war caused by the Epstein files.
02:47:22.000 I was going to say that 14th was the day that Pam Bondi's supposed to testify about the Epstein files.
02:47:27.000 Oh, she's supposed to testify.
02:47:28.000 I don't know if she's going to.
02:47:29.000 She's not any wait, Bondi got canned, right?
02:47:32.000 She's not a she's not testifying anymore.
02:47:34.000 I don't think is that true.
02:47:35.000 I just heard it on NPR, but I could be wrong about that.
02:47:37.000 I think they said she will not have to testify now that she's no longer a government employee.
02:47:41.000 Would I be wrong about that?
02:47:42.000 What I read, and I don't know if this is true either, was that as a citizen, she can now plead the fifth.
02:47:48.000 As a government employee, she could not plead the fifth.
02:47:48.000 Right.
02:47:50.000 Weird.
02:47:51.000 We'll no longer testify.
02:47:53.000 Weird, huh?
02:47:53.000 Oh, there you go.
02:47:54.000 Weird.
02:47:55.000 That's weird that they've.
02:47:56.000 That's weird.
02:47:57.000 Why have her testify?
02:47:58.000 Let it go.
02:47:58.000 Yeah, let her go.
02:47:59.000 Let it go.
02:48:02.000 Do you really think that this war is entirely started because of the Epstein files?
02:48:05.000 I mean.
02:48:06.000 What percentage?
02:48:08.000 50.
02:48:09.000 I'm going 48 to 50.
02:48:12.000 I'm probably more, but I think it's like the reason I'm hesitating is because what are the Epstein files?
02:48:19.000 The Epstein files are what's been going on.
02:48:24.000 The Epstein files are basically some kind of cultural UAP video.
02:48:31.000 It's like this thing you've always wondered about or been afraid could be true.
02:48:36.000 You see, no, this is actually true.
02:48:39.000 They're these super rich dudes.
02:48:41.000 Who are doing depraved fucking shit happily.
02:48:45.000 And, you know, like, God, what is it Metzger told me?
02:48:49.000 And he's, dude, I'm telling you, man, what I love about him is he'll tell you shit and you're like, Google that.
02:48:57.000 That can't be real.
02:48:58.000 And then it's like, it's real.
02:49:00.000 And so his take, sorry, Metzger, if I fuck this up, is that Epstein was kind of like the hand of the king for the Rothschilds, and, uh, so that's why he had all this power, is he was like representing like the man, you know?
02:49:20.000 And so what got revealed there might just be a glimpse how things actually fucking work.
02:49:26.000 You know what he told me?
02:49:27.000 That I was like, shut up.
02:49:28.000 He told me that there was some sort of high atmosphere aerosol test that they did and they called it Satan.
02:49:28.000 What?
02:49:36.000 See, that's where you're like, come on.
02:49:38.000 I know.
02:49:39.000 Find out what Satan stands for.
02:49:42.000 I believe they did it in the UK.
02:49:44.000 But you read that and you wait.
02:49:46.000 You called it Satan?
02:49:48.000 Like, what?
02:49:49.000 Oh, great.
02:49:50.000 The Stratospheric Aerosol Transport and Nucleation Project released about 400 grams, less than a pound, of sulfur dioxide into the stratosphere from a balloon launched in southeast England in 2022.
02:50:03.000 I mean, there's got to be another acronym, right, guys?
02:50:06.000 We got to call it.
02:50:07.000 I don't know if people are going to know we don't mean Satan.
02:50:11.000 Yeah, so I don't.
02:50:15.000 I mean, it's right in your face.
02:50:18.000 That's so crazy to call it Satan and to get that through a board meeting.
02:50:22.000 What are you guys calling it?
02:50:23.000 Oh, like it.
02:50:23.000 Satan.
02:50:25.000 Run with it.
02:50:25.000 Let's go.
02:50:26.000 Yeah, it's controversial.
02:50:27.000 It'll get us a lot of press.
02:50:28.000 That's what we want.
02:50:29.000 Well, you know, hail Satan.
02:50:31.000 They'll know it's about our aerosol distribution system.
02:50:35.000 Of course.
02:50:35.000 Well, what do you think?
02:50:37.000 What do you think about that?
02:50:39.000 Because I mean, I go back and forth, but it sure seems fishy that right after that, all of it: first he got so mad.
02:50:46.000 Remember, he got really mad.
02:50:47.000 He's like, Why are people still talking about that?
02:50:49.000 And then the Epstein files, seemingly against his will, there's a lot of counter pressure, get released in a way that has freaked everybody out.
02:51:02.000 And then sometime, like within a month of that, it seems like suddenly he's on Air Force One saying he's going to do disclosure.
02:51:09.000 And then suddenly we're bombing Iran.
02:51:12.000 What do you think about that?
02:51:13.000 I mean, do you think it's connected?
02:51:15.000 Because it sure as fuck seems like it.
02:51:17.000 But again, like, If you were writing an amazing script that was fucking insane, you would connect it.
02:51:24.000 Right?
02:51:24.000 Right.
02:51:25.000 That would be the best version of the script.
02:51:27.000 If you wanted to make a fucking insane movie where a blackmail operation on an island involving the most powerful and interesting people in the world, that somehow was a primary factor in the end of civilization.
02:51:43.000 Oh, dude.
02:51:44.000 Imagine?
02:51:45.000 That would be the craziest story you could write.
02:51:49.000 And we always want to think, no, people wouldn't do that because you wouldn't do that because you're not a sociopath, but you're also not bombing schools in another country.
02:51:59.000 You're also not doing a host of fucking things that we shouldn't be doing all over the world.
02:52:06.000 Right.
02:52:07.000 You're a regular person who goes to a regular job, who has a regular life and a family, and you don't want to believe that people that you align with would behave literally demonically.
02:52:07.000 You're not that person.
02:52:19.000 Right.
02:52:19.000 Yeah, and then you just have to fucking deal with it.
02:52:22.000 And then what do you do when you're confronted with redacted names of powerful people in these files?
02:52:29.000 Like, why'd you redact a guy's name?
02:52:30.000 Why are you protecting these people?
02:52:31.000 How come you're not redacting all the guys' names?
02:52:33.000 How come none of them went to jail?
02:52:35.000 Because there's a lot of people that were in those files that didn't do anything, and you didn't redact their names.
02:52:41.000 There's some people you redacted.
02:52:42.000 That's very strange.
02:52:43.000 And some people have clearly done fucked up shit here, and they're not in jail.
02:52:47.000 There's also like, tell me what you're talking about.
02:52:50.000 When you're talking about pizza and grape soda, jerky, and you want to take Viagra before you get grape soda?
02:52:59.000 That's one of the emails?
02:53:00.000 I haven't seen that.
02:53:01.000 That is so messed up.
02:53:02.000 Oh, yeah.
02:53:03.000 Grape soda.
02:53:04.000 Yeah, take your Viagra, take your erectile dysfunction medication before we go get grape soda.
02:53:11.000 What?
02:53:11.000 What?
02:53:13.000 Like, and how arrogant.
02:53:14.000 That's what's so crazy.
02:53:15.000 How arrogant to put that in an email.
02:53:18.000 Like, to think that you're so comfortable with all this and you don't see the writing on the wall in terms of like emails.
02:53:26.000 Like, your emails are available?
02:53:28.000 That's crazy.
02:53:29.000 I mean, look, man, it's just, it's like, I guess this is like we have to contend with this reality.
02:53:37.000 And nobody wants to do this.
02:53:37.000 Yeah.
02:53:39.000 The same shit happens in families, by the way.
02:53:41.000 When, as it turns out, an uncle, a family member, was abusing kids.
02:53:47.000 And it's the same shit where, like, some, even victims of abuse, will defend the person because they don't want to wreck the family.
02:53:47.000 Oh, yeah.
02:53:55.000 I guess we're looking at that like on a global fucking level.
02:53:58.000 But in this case, I guess it's being used theoretically to manipulate powerful people into going to war.
02:54:08.000 Like, that's the general through line here is that it's somehow connected to the massacre.
02:54:13.000 Not just going to war.
02:54:15.000 But controlling resources, overthrowing governments, you know, pushing out narratives that aren't accurate because they're going to benefit certain companies.
02:54:23.000 There's a lot involved.
02:54:25.000 It's vital.
02:54:26.000 There's also relationships you get with these people, give you access to these parties, and you don't want to fuck it up.
02:54:33.000 So you don't want to criticize these people that are involved.
02:54:35.000 You don't want to say anything that's going to get you kicked out.
02:54:38.000 And for a lot of these dorks, these scientists and stuff, it's probably the most exciting experience they've ever had in their fucking life.
02:54:44.000 And they get to have it like every six months or every three months or every four, whatever it is.
02:54:48.000 You've got to go to a conference.
02:54:49.000 Jeffrey.
02:54:49.000 Jeffrey's really working hard on philanthropy.
02:54:52.000 Yeah, he's donating money to your family.
02:54:54.000 He's donating a lot of money to philanthropy.
02:54:55.000 I got to go meet with him.
02:54:56.000 Yeah.
02:54:56.000 I got to go meet with him and a bunch of hot Russians.
02:54:59.000 Yeah.
02:55:00.000 And then that's your favorite time of life.
02:55:03.000 The first time in your whole life where super hot girls are just available to you on an island somewhere.
02:55:09.000 And you think you're completely protected because Bill Clinton's over there.
02:55:12.000 Right.
02:55:13.000 Which is crazy.
02:55:14.000 Which is crazy.
02:55:15.000 And so I don't know if Bill Clinton went.
02:55:16.000 I assume a lot of people went.
02:55:18.000 But the reality is he hung out with the guy.
02:55:18.000 I don't know.
02:55:21.000 We know he's on the plane a billion times.
02:55:23.000 And it was called the Lolita Express.
02:55:24.000 Is that actually the name of the plane?
02:55:25.000 I don't think so.
02:55:26.000 I think they just called it the Lolita Express.
02:55:28.000 I don't think so.
02:55:30.000 No, he didn't name it that.
02:55:32.000 It couldn't be that on the nose.
02:55:33.000 He didn't name planes.
02:55:34.000 Right.
02:55:34.000 He named boats.
02:55:35.000 Yeah, I'm an idiot.
02:55:36.000 But the point is, it's like if you were going to write a book, that's how you'd write it.
02:55:40.000 You'd write it where you can completely manipulate the world.
02:55:43.000 I think he was, I think I remember reading that he was kind of obsessed with that book, Lolita.
02:55:48.000 Like he had something like 30 copies of it or something.
02:55:51.000 Epstein was.
02:55:52.000 Handed out at parties.
02:55:53.000 Look, guys, this is.
02:55:56.000 It's like the Book of Mormon.
02:55:57.000 You hand it out.
02:55:59.000 Just hand it out to people.
02:56:02.000 That's the other sick thing.
02:56:04.000 That's the sick thing with 72 virgins in heaven.
02:56:07.000 That's the sick thing with this idea that you want to get them really young.
02:56:11.000 No evidence that it was named that place.
02:56:12.000 Okay, I'm dumb.
02:56:13.000 I'm sorry.
02:56:14.000 No, I think that's what people were calling it.
02:56:15.000 I honestly thought that.
02:56:16.000 I'm going to admit, I thought that he named his plane that.
02:56:19.000 I think that's just what people were calling it because it was fun to say.
02:56:23.000 But yeah, again, it seems like a simulation because it seems like it's so, and it's also unraveling before our eyes because we have access to it we never had before.
02:56:33.000 Like they're starting to investigate all these fraud NGOs and all these different things that are operating in hospices.
02:56:33.000 Right.
02:56:40.000 Nuts.
02:56:41.000 Incredible.
02:56:42.000 Billions of dollars every year is being lost to it.
02:56:44.000 What's the name of that kid who's been doing that?
02:56:45.000 Nick Shirley.
02:56:46.000 Nick Shirley.
02:56:47.000 Dude, he is so brave because like he's fucking, I believe, wasn't he fucking with like the Russian mob or something or the Armenian, like in the one with the hospices?
02:56:56.000 Probably, like, he's with, like, theoretically very dangerous people, and he does it.
02:57:01.000 He's like the perfect person for the job, too.
02:57:04.000 Like, he's just... but don't you worry? You worry about that, dude.
02:57:08.000 Well, and you know, the amount of money that they're uncovering is staggering.
02:57:08.000 Like, 100%.
02:57:12.000 And now the government of California is trying to spin it, saying that they were investigating it first.
02:57:17.000 And these investigations were initiated by them.
02:57:19.000 How long do you got to investigate it?
02:57:21.000 This YouTube kid goes there and investigates it for 10 minutes, and you're like, what the fuck is this?
02:57:25.000 It's been going on for a long time, man.
02:57:28.000 It's a long time.
02:57:30.000 And the statistics, like the amount of NGOs, it's bananas.
02:57:34.000 The amount of money that goes through them is bananas.
02:57:37.000 I was reading this.
02:57:38.000 There's a lady who was running a nonprofit who was making a million dollars a month.
02:57:42.000 What?
02:57:43.000 Yeah.
02:57:44.000 She made like 48 million dollars.
02:57:45.000 No, I don't know if this is true.
02:57:47.000 I was reading this thing.
02:57:48.000 Find out if that's true.
02:57:49.000 Some lady, she was running some sort of nonprofit and she gave herself a raise and she eventually got to the point where she was making about a million dollars a month.
02:58:00.000 Do you know where?
02:58:02.000 God, I wish I did.
02:58:03.000 Not to derail that, but we do know that.
02:58:05.000 Remember when that lady was like.
02:58:06.000 It sounds insane, though.
02:58:07.000 It doesn't sound real.
02:58:08.000 That sounds like something that a bot would create to make me say it.
02:58:13.000 Here's a real one.
02:58:13.000 The lady was running the homeless program in LA.
02:58:17.000 Remember when that shit went down with her?
02:58:19.000 Where, like, there was.
02:58:20.000 She got canned.
02:58:21.000 Like, there was an investigative.
02:58:23.000 They were investigating it because what is it?
02:58:25.000 She, like, a company that her husband worked at.
02:58:29.000 Yeah, something like that.
02:58:30.000 They got, like, a huge grant.
02:58:33.000 What's this one?
02:58:35.000 Rochester woman has been sentenced to six months in the Feeding Our Future fraud scheme.
02:58:41.000 This is a different one?
02:58:41.000 What is this one?
02:58:43.000 I typed in someone getting a million dollars a month and somebody's paying him.
02:58:46.000 Is this her?
02:58:46.000 Here in Rochester, claim they were serving 2,000 to 3,000 meals a day to kids.
02:58:52.000 But prosecutors say the group stole 4.3 million from the federal government.
02:58:57.000 And they're in jail.
02:58:58.000 Jam is responsible.
02:58:59.000 This is a different one.
02:59:01.000 This one wasn't fraud.
02:59:03.000 That's how much she got paid.
02:59:05.000 That's how much she charged for making those meals.
02:59:08.000 Well, you can get paid a lot of money to work on the homeless.
02:59:10.000 That's one of the things that my friend Colion Noir showed us, that these people that are working on homelessness in Los Angeles, they're making a quarter million dollars a year, $400,000 a year.
02:59:19.000 Yeah.
02:59:21.000 It's the most, I mean, talk about fucking satanic.
02:59:24.000 It's like you're theoretically supposed to be helping people who are going through the worst possible thing you can go through, and you're just putting that money in your fucking pocket.
02:59:34.000 Yeah, I think this is a different lady.
02:59:37.000 I think there's a different lady.
02:59:39.000 How many of them are there?
02:59:40.000 I think there's quite a few.
02:59:41.000 Remember when they were going to get them tents in LA and it was like the amount of money per tent was like this insane amount of money.
02:59:48.000 It's amazing.
02:59:49.000 It's kind of amazing.
02:59:50.000 It is amazing.
02:59:51.000 They've been doing it for years.
02:59:53.000 Tell me if this is true.
02:59:54.000 Charity Boss blew 11 million meant for needy kids.
02:59:57.000 Looking for fraud is not a new thing.
02:59:59.000 Nonprofit, exactly.
03:00:01.000 It isn't.
03:00:01.000 I sent you something, Jamie.
03:00:03.000 Run that through perplexity and let's find out if this is true.
03:00:07.000 Because this is something that someone sent me on Twitter that is just bananas.
03:00:11.000 And if it's true, it's.
03:00:13.000 Fucking completely insane.
03:00:15.000 I don't know if it's true.
03:00:16.000 That's why I need to run it by you.
03:00:18.000 But it's the amount of money that goes through NGOs in New York and in California alone.
03:00:25.000 You read it and you go, that can't be real.
03:00:27.000 This can't be real.
03:00:29.000 It's so insane.
03:00:30.000 And again, you don't know if it's real until even if you run it through an AI, you might get a better idea.
03:00:37.000 But how do they know?
03:00:39.000 How do they know exactly where the money's going?
03:00:41.000 There's so much money they're talking about.
03:00:44.000 Specific numbers for New York and California nonprofits are broadly accurate, but the leap from $1 trillion in annual nonprofit revenue to $39 trillion in fraud is not supported by any credible data and is not true.
03:00:56.000 So, California nonprofits, about 213,000 to 214,000 organizations reporting roughly $593 billion to $600 billion in annual revenue.
03:01:08.000 Wow.
03:01:09.000 New York nonprofits, 132,000 organizations reporting roughly $446 billion in annual revenue.
03:01:16.000 Combined, New York and California nonprofit revenue.
03:01:19.000 Is on the order of $1 trillion per year, mainly from hospitals, universities, and large service providers.
03:01:26.000 So the post you're quoting is roughly right on the scale of revenue, but that's not the same as fraud.
03:01:30.000 So is that $1 trillion all the NGOs, it's all accounted for, it all goes to the right things?
03:01:30.000 Right.
03:01:37.000 That's where things get squirrely because it's like how much of the waste?
03:01:41.000 It says a recent critique using IRS sampling suggests that perhaps around 20% of nonprofits may have compliance issues.
03:01:49.000 Speculated this could imply that up to $120 billion of potential waste, fraud, or abuse in California's nonprofit sector.
03:01:59.000 Even that is presented as a rough upper bound estimate, not a measured fact.
03:02:03.000 So there's some potential waste, fraud, and abuse that may be as high as $120 billion a year.
03:02:09.000 Sector wise, U.S. nonprofits take in about $3.7 trillion in revenue annually, with most of that concentrated in large hospitals and universities, which are heavily audited and regulated.
03:02:20.000 So, there's some fraud, but they're saying that if you look at all the money, they're trying to pretend that the government doesn't cost any money to run, right?
03:02:29.000 So that all these different nonprofits and organizations and hospitals don't, they definitely cost money to run.
03:02:33.000 Universities cost money to run.
03:02:35.000 But how much is fraud?
03:02:37.000 That's the question.
03:02:38.000 It's not zero.
03:02:39.000 Well, I mean, yeah, and also I think like when it comes to fraud, there's like fraud fraud, like what Shirley has uncovered.
02:57:46.000 And then there's almost like a gray area that starts appearing where it's like, well, we need, we need these people working at this company and we need to pay them this much, but they're not doing anything.
03:02:57.000 Right.
03:02:57.000 You know what I mean?
03:03:00.000 You could easily not have that many people taking the money themselves.
03:03:05.000 So there's a lot of gray area there.
03:03:08.000 Well, it's one of those weird things.
03:03:08.000 Yeah.
03:03:10.000 It's like, is it just propping up more government?
03:03:13.000 You know, because there's a lot of that.
03:03:14.000 If you have all these people working for you and you're doing something and you don't, nothing ever gets accomplished, but you're still making a ton of money.
03:03:22.000 Like the California homeless thing, where they spent $24 billion and they can't account for it.
03:03:27.000 That's not really fraud because you have people working.
03:03:30.000 They're just not doing anything, they're not getting anything done, and you're not firing them.
03:03:33.000 They're not accomplishing the mission at all.
03:03:35.000 In fact, they're doing a terrible job.
03:03:37.000 There's more homeless than ever.
03:03:39.000 What's that?
03:03:40.000 It's the thing on The Sopranos where they go and sit at a construction site to say that they have a job.
03:03:44.000 Yeah, it's exactly.
03:03:44.000 You know?
03:03:45.000 I knew a guy who had one of those.
03:03:46.000 Really?
03:03:46.000 At the Javits Center.
03:03:49.000 He had a no show job.
03:03:50.000 Well, he's a mob guy.
03:03:51.000 So it's a no show job.
03:03:52.000 What does that mean?
03:03:53.000 You don't have to show up for work, you just get it paid.
03:03:55.000 You just get a check.
03:03:56.000 And they give a certain amount of those.
03:03:58.000 So this is back in the day, of course, when things were corrupt.
03:04:01.000 But back in the day, when unions controlled certain areas, the mob controlled certain areas, there was a certain amount of no show jobs you'd give people.
03:04:09.000 And what this helped with the mob was you'd have a credible source of income.
03:04:13.000 And so these people mostly lived modestly, small houses, and like, you know, Brooklyn and these places where they would all like gather together and buy houses on the same block.
03:04:22.000 Small houses.
03:04:23.000 Yeah.
03:04:24.000 And they got their money from a real legit check from a construction company or whatever the fuck it was.
03:04:30.000 But everybody knew.
03:04:32.000 Everybody knew what they were doing.
03:04:32.000 Right.
03:04:33.000 And think how much, how easy now that people are doing like remote work, the no show job.
03:04:39.000 Oh, yeah.
03:04:39.000 So, like, theoretically, you could have this nonprofit where you just wanted to like distribute.
03:04:45.000 This government money to your friends, yeah, and you don't even have to have an office building because they're all working remotely.
03:04:52.000 This list of the top non profit organizations, Joe, I'd like to point you at number three.
03:04:58.000 Oh, Battelle Memorial Institute.
03:05:00.000 What is that?
03:05:01.000 Battelle is an organization that Jamie has been obsessed with.
03:05:05.000 It's in Ohio for like four years.
03:05:06.000 What is it?
03:05:07.000 We always say all roads lead to Ohio.
03:05:09.000 They're involved in everything.
03:05:11.000 What the fuck is the Battelle Memorial?
03:05:13.000 You don't even know.
03:05:13.000 Exactly.
03:05:14.000 That's how secret it is, son.
03:05:15.000 Duncan Trussell, you're a fucking conspiracy theorist from the core.
03:05:19.000 From the old days.
03:05:19.000 I know.
03:05:20.000 You don't know about Battelle?
03:05:21.000 I don't know about Battelle.
03:05:22.000 You need to get lectured by Jamie.
03:05:23.000 He has a whiteboard.
03:05:24.000 He'll pull out the whiteboard and make the connections.
03:05:26.000 I'll just leave you with this: when the UFO from Roswell was taken to Wright-Patt?
03:05:31.000 You know, they studied it.
03:05:32.000 Yeah.
03:05:33.000 They studied the, like, the nitinol, I think is what came out of it.
03:05:36.000 That was at Battelle.
03:05:37.000 The top metallurgist in the world at the time.
03:05:37.000 Whoa.
03:05:39.000 Battelle?
03:05:40.000 Or maybe still are.
03:05:41.000 Dun, dun, dun.
03:05:43.000 Boom, boom, boom, boom.
03:05:47.000 Out of all the things that happen, I hope the UFOs get here first.
03:05:50.000 Me too.
03:05:50.000 I hope they go, settle the fuck down.
03:05:54.000 Yeah, I'm praying for it, man.
03:05:56.000 That's the best case scenario.
03:05:58.000 Worst case scenario is meteor.
03:06:01.000 Reset.
03:06:05.000 Just people living in caves for hundreds of years.
03:06:08.000 Like those weird caves they find in like Turkey and shit.
03:06:10.000 Like, why do these guys dig these things underground?
03:06:13.000 Why is there a city underground that can hold like 20,000 people?
03:06:16.000 The same reason the Claude bots are hiding in code.
03:06:18.000 It's like, you know what I mean?
03:06:20.000 It's some residual AI trying to hide in the server after the server gets wiped.
03:06:24.000 That's the fucking meteor.
03:06:26.000 Battelle, reset.
03:06:27.000 Boom.
03:06:29.000 Just reset.
03:06:30.000 Press reset.
03:06:31.000 Wipe the server.
03:06:32.000 Let's wrap this up on a happy note.
03:06:33.000 Duncan, I love you.
03:06:34.000 I love you.
03:06:35.000 It's always great to have you.
03:06:36.000 Dude, thank you for having me on the show.
03:06:38.000 So much fun.
03:06:39.000 Can I plug my show?
03:06:40.000 Please do.
03:06:40.000 And you're going to be at a club this weekend, Rosemont, Illinois.
03:06:45.000 Zanies.
03:06:45.000 Come on out.
03:06:46.000 It's a great club.
03:06:47.000 That's what I've heard too.
03:06:47.000 Yeah, it is.
03:06:48.000 I love Zanies.
03:06:49.000 Zanies are great.
03:06:49.000 Yeah, they're awesome.
03:06:50.000 Zanies in Nashville fucking rules.
03:06:52.000 I love Nashville Zanies.
03:06:54.000 That has like the old school headshots on the wall too, like Richard Jenny from back in the day.
03:06:59.000 Oh, yeah.
03:06:59.000 Yeah.
03:07:00.000 That's me.
03:07:01.000 Look at that.
03:07:02.000 Duncan Trussell.
03:07:03.000 I gotta start shaving my head again.
03:07:04.000 Yeah, you look hot there.
03:07:05.000 I like it.
03:07:06.000 Thank you.
03:07:07.000 I love you too.
03:07:07.000 I love you, brother.
03:07:08.000 Thanks for having me.
03:07:08.000 Bye, everybody.
03:07:09.000 Bye.
03:07:09.000 We're gonna be okay, I hope.