Timcast IRL - Tim Pool - March 16, 2026


Trump LOSES IT Learning Iran Leader IS GAY | Timcast IRL w/ Bryan Callen & Liv Boeree


Episode Stats

Length: 2 hours and 31 minutes
Words per Minute: 193.8568
Word Count: 29,421
Sentence Count: 2,472
Misogynist Sentences: 17
Hate Speech Sentences: 102
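The words-per-minute figure is just the word count divided by the episode length in minutes. A quick back-of-the-envelope check (a sketch; it assumes the listed length is truncated to the whole minute):

word_count = 29_421
wpm = 193.8568
minutes = word_count / wpm
print(minutes)       # ~151.8 minutes
print(minutes / 60)  # ~2.53 hours, i.e. the listed "2 hours and 31 minutes"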


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "Timcast IRL - Tim Pool" are sourced from the Knowledge Fight Interactive Search Tool.
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
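For readers who want to reproduce numbers like the ones above, here is a minimal sketch of how the three credited models could be wired together with the Hugging Face transformers library. This is an assumption about the tooling, not this site's actual pipeline; label names and output schemas vary, so check each model card.

from transformers import pipeline

# The three models credited on this page.
misogyny_clf = pipeline("text-classification", model="MilaNLProc/bert-base-uncased-ear-misogyny")
hate_clf = pipeline("text-classification", model="facebook/roberta-hate-speech-dynabench-r4-target")
summarizer = pipeline("summarization", model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ")

# Two transcript sentences as stand-ins for the full list of 2,472.
sentences = [
    "So Iran's new supreme leader is gay.",
    "Their grid has failed totally, and the U.S. is about to come in and, quote unquote, save them.",
]

for s in sentences:
    m = misogyny_clf(s)[0]  # {"label": ..., "score": ...}; label names depend on the model card
    h = hate_clf(s)[0]
    print(f"{s!r} -> misogyny: {m['label']} ({m['score']:.2f}), hate: {h['label']} ({h['score']:.2f})")

# Episode summary over the concatenated transcript; truncation=True guards the model's input limit.
print(summarizer(" ".join(sentences), truncation=True)[0]["summary_text"])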
00:02:25.000 So Iran's new supreme leader is gay.
00:02:27.000 And apparently, when Trump found out, he and the administration started busting out laughing.
00:02:32.000 Apparently, one senior administration official has been laughing about it for days.
00:02:36.000 And I'll tell you what I really think.
00:02:38.000 This is a PSYOP meant to cause harm to the reputation of the new Supreme Leader of Iran before he actually starts running this country.
00:02:46.000 Because, you know, you can't be gay in Iran.
00:02:49.000 Now, the truth is the guy might be dead.
00:02:52.000 No one actually knows.
00:02:53.000 There are reports that he was flown to Moscow for surgery.
00:02:53.000 They've not seen him.
00:02:56.000 He may have been maimed or in a coma.
00:02:58.000 His leg may have gotten blown off.
00:03:00.000 Apparently, he was in the building when they bombed it.
00:03:02.000 They killed Ali Khamenei, the supreme leader, but he was outside in the garden and may have survived.
00:03:07.000 We don't know for sure.
00:03:08.000 The one thing we can say is he didn't show up to his own coronation.
00:03:12.000 He wasn't there.
00:03:13.000 And the only statement he's released is written, so many people think he's actually dead.
00:03:18.000 Without confirmation, I guess the West's play is just to call him gay so that the people of Iran are like, I don't want to follow that guy, which is honestly kind of clever.
00:03:27.000 And it's lame.
00:03:29.000 It's great.
00:03:30.000 Oh, someone is gay at least.
00:03:31.000 At least our Peter isn't gay.
00:03:33.000 I think it might work.
00:03:34.000 It's Iran, though.
00:03:35.000 So that's apparently a big story.
00:03:40.000 And then I actually think substantially more interesting is that Cuba's power is completely out.
00:03:44.000 Their grid has failed totally, and the U.S. is about to come in and, quote unquote, save them.
00:03:51.000 So it looks like Cuba's going to be falling back into the Western fold.
00:03:53.000 If this plays out, sanctions will likely be lifted and trade will be normalized, which all in all, I think, is actually a good thing coming off of what happened in Venezuela.
00:04:01.000 And then, of course, the escalation of the war in Iran.
00:04:04.000 And then Megyn Kelly upset that Mark Levin has a micropenis and they're fighting about it.
00:04:10.000 I'm not even kidding.
00:04:11.000 Welcome to whatever.
00:04:13.000 The world of clicks.
00:04:13.000 I don't know.
00:04:15.000 The world of clicks.
00:04:16.000 Before we get started, my friends, we got a great sponsor for you.
00:04:16.000 He gets it.
00:04:19.000 It is Beam Dream.
00:04:20.000 Make sure you guys head over to shopbeam.com/TimPool and pick up your nighttime blend to support better sleep.
00:04:28.000 I absolutely love Beam Dream.
00:04:30.000 I drink it every single night.
00:04:31.000 It is a delicious cup of hot cocoa you drink before bed that helps you sleep.
00:04:34.000 It's got magnesium, reishi, melatonin, L-theanine, all the good stuff.
00:04:37.000 And it is a delicious cup of hot cocoa.
00:04:39.000 They got a bunch of flavors.
00:04:40.000 They got cinnamon cocoa.
00:04:41.000 They got chocolate peanut butter.
00:04:43.000 They got sea salt caramel.
00:04:44.000 Actually, that one's my favorite.
00:04:46.000 The cinnamon cocoa was my favorite for a while, but I like the sea salt caramel one now because it's not a cocoa, right?
00:04:50.000 15 calories, no added sugar, and legit.
00:04:53.000 It helps me sleep better.
00:04:54.000 My sleep score has improved, and I'm a huge fan of this product.
00:04:57.000 When they first reached out to us, I said, you know, I don't think I need this, but you know, we're hoping it's fine.
00:05:01.000 I love this stuff.
00:05:02.000 And then I started drinking every night before bed, and legit, my sleep score started improving dramatically.
00:05:07.000 So for guys, listen up.
00:05:09.000 Your testosterone and HGH are produced in REM sleep and deep sleep.
00:05:12.000 So if you're not getting good sleep, you're going to be irritable, tired, and fat.
00:05:15.000 And I don't think you want to be like that.
00:05:17.000 So check out shopbeam.com/TimPool and you can get up to 40% off.
00:05:23.000 Don't forget to also go to Timcast.com, join the Discord, make this possible.
00:05:28.000 There's tens of thousands of people hanging out every single day.
00:05:30.000 They're building new projects.
00:05:31.000 There's pre-shows, after shows, but more importantly, community is our strength.
00:05:35.000 If you guys want to help change the world, you've got to connect with other people to build those projects.
00:05:40.000 And at the same time, you're helping support the work that we do.
00:05:42.000 So smash that like button right now and share the show with everyone you know if you want to support the work that we do.
00:05:47.000 You already noticed we have a great guest here.
00:05:49.000 It's Bryan Callen.
00:05:50.000 Thank you, ladies and gentlemen.
00:05:52.000 Absolutely.
00:05:52.000 Good to see you, buddy.
00:05:53.000 Who are you here?
00:05:54.000 Yeah.
00:05:55.000 Who are you?
00:05:55.000 What's that?
00:05:56.000 Oh, I'm just a comic.
00:05:56.000 What do you do?
00:05:58.000 I'm just a man.
00:05:59.000 I like saying simple things.
00:06:00.000 You know what I mean?
00:06:01.000 Just a man.
00:06:01.000 I get from point A to point B the best way I know how.
00:06:04.000 Sometimes it's dangerous.
00:06:05.000 Facts.
00:06:06.000 That's to the point.
00:06:07.000 It's saying things like that.
00:06:08.000 I just want to be that guy one time.
00:06:10.000 You know what I mean?
00:06:10.000 I want to be the guy who's like, that's a tale for another time, my friend.
00:06:15.000 You know those guys who have got scars and a million stories?
00:06:18.000 And speckled whitish gray beards.
00:06:20.000 Yeah, yeah, yeah.
00:06:21.000 Exactly.
00:06:21.000 You're not telling the story.
00:06:22.000 You can use that whenever you want.
00:06:24.000 I could, actually, I could.
00:06:25.000 Play it off.
00:06:26.000 I just want to say wise things like, a tree grows as fast as a tree grows.
00:06:29.000 Doesn't it?
00:06:31.000 Do you hear him?
00:06:32.000 You use a lot of ellipses when you talk, right?
00:06:34.000 Yes.
00:06:35.000 Indeed.
00:06:36.000 We do have another guest.
00:06:36.000 It's Liv Boeree.
00:06:37.000 Hello.
00:06:38.000 Yeah, she's way more interesting.
00:06:39.000 Who are you?
00:06:40.000 What do you do?
00:06:41.000 I used to be a pro poker player for a long time, which is how we met.
00:06:46.000 Yeah.
00:06:46.000 Playing poker.
00:06:48.000 And now I don't even know how to describe what it is I do.
00:06:51.000 I kind of do research and make content around the intersection of game theory, risk, and technology.
00:06:59.000 So I talk a lot about AI and how to make us not be in these race-to-the-bottom spirals.
00:07:06.000 Whatever job title that is, I don't know.
00:07:07.000 I feel like you could rob a bank.
00:07:09.000 She could rob a bank with a bag.
00:07:10.000 There we go.
00:07:10.000 Podcaster.
00:07:11.000 I know.
00:07:11.000 Podcaster.
00:07:12.000 Somebody with that accent.
00:07:13.000 First of all, you sound very smart, and I would put all the money in a bag.
00:07:16.000 It's just a British accent.
00:07:17.000 Yeah, I know.
00:07:18.000 You could rob a bank, and I'd be like, of course.
00:07:20.000 Just say this to me.
00:07:21.000 Just go to Sisko.
00:07:22.000 Put all the money in the bag.
00:07:23.000 We got to open source the AI.
00:07:25.000 Put all the money in the bag.
00:07:27.000 She's going to rob a bank.
00:07:28.000 Watch this.
00:07:29.000 Do you think we should open source the AI?
00:07:31.000 Yes.
00:07:32.000 Sort of prize.
00:07:33.000 You should make me do a thing.
00:07:34.000 I don't know.
00:07:35.000 I don't know if this is like a single thing.
00:07:36.000 Did you know that for a long time, all of the villains in our cartoons were British?
00:07:42.000 Of course.
00:07:42.000 Like in Disney?
00:07:43.000 There's something about being British.
00:07:45.000 There is, and it depends on the British accent.
00:07:47.000 You're either extremely intelligent or extremely stupid, right?
00:07:50.000 Like you have a posh accent.
00:07:51.000 People are going to think you must be smart.
00:07:52.000 If you're like a cockney, they're gonna be like, this guy's an idiot.
00:07:55.000 Well, you can say things, you can say horrible things, and somehow it feels like they're being polite.
00:08:00.000 You can say something like, I'm going to have to fillet you now.
00:08:03.000 And it defends.
00:08:04.000 And it won't be painless.
00:08:06.000 Well, put all the money in a bag, innit?
00:08:09.000 That's more like a bag right now.
00:08:14.000 Right now.
00:08:14.000 That's serious.
00:08:15.000 No, it's not.
00:08:16.000 All right.
00:08:17.000 We got to talk about this news, which is probably the stupidest story I've ever seen.
00:08:21.000 It's from the New York Post.
00:08:22.000 Trump briefed that Iran's new Supreme Leader, Mojtaba Khamenei, is probably gay and president has a priceless reaction.
00:08:29.000 Indeed, he busted a gut.
00:08:31.000 Others in the room also found it hilarious and joined the president's reaction.
00:08:35.000 While one senior intelligence official has not stopped laughing about it for days, said one person familiar with the briefing.
00:08:42.000 So here's what's interesting.
00:08:43.000 Actually, there's been intelligence going back for quite some time that this dude might be gay.
00:08:47.000 Apparently, in the late 80s, early 90s, he had to fly to the UK for impotence treatment because when he got married, he couldn't get it going.
00:08:55.000 You know what I'm saying?
00:08:56.000 And so they were like, what's wrong with this young man who, for some reason, can't get it going?
00:09:01.000 And they're trying to insinuate he's gay.
00:09:04.000 And just, I just want to stress the juvenile South Park-esque political strategy of we need to find a way to discredit the new Supreme Leader.
00:09:15.000 We can call him gay.
00:09:16.000 They're just gay.
00:09:17.000 I just imagine Scott Bessent being like, that's not funny.
00:09:20.000 Maybe that gay is not funny.
00:09:21.000 It's true, they shouldn't bring it up.
00:09:22.000 It's so stupid.
00:09:23.000 You know what I mean?
00:09:24.000 But, you know, at least our leader's straight.
00:09:26.000 But for a country like Iran, it may be effective.
00:09:30.000 Well, I have to be honest, because I'm 12 years old.
00:09:33.000 I'm looking at his face.
00:09:34.000 He looks gay.
00:09:35.000 I know, you look at David Cross from.
00:09:37.000 I swear to God, now I'm like, hey, he looks kind of gay, which is so unfair, but he has a soft look.
00:09:41.000 You say he has a gay face.
00:09:42.000 Yeah, he looks like he's got a case of gay face.
00:09:44.000 He looks like an intellectual.
00:09:45.000 The show Arrested Development.
00:09:46.000 I mean, he looks like David Cross.
00:09:48.000 He's very borderline.
00:09:50.000 I think he's bisexual.
00:09:51.000 He doesn't look a little bit like an Iranian round, kind of, you know.
00:09:54.000 And look, look at his shirt.
00:09:55.000 He blew himself.
00:09:56.000 Oh, new wild.
00:09:59.000 Oh, come on, yeah.
00:10:00.000 Maybe he's just another nude.
00:10:02.000 Yeah.
00:10:03.000 I hope he's gay.
00:10:04.000 That's why he couldn't get it up.
00:10:05.000 I bet the staffer that keeps laughing, he's been laughing for days, is gay.
00:10:09.000 Trump's staffer is what I said.
00:10:11.000 He's like, no, guys, guys.
00:10:13.000 No one's laughing about it.
00:10:15.000 It's like seven intel guys in a room with Trump, deadpan serious.
00:10:19.000 We have to call him gay.
00:10:21.000 It's the only way to discredit him.
00:10:22.000 And Trump's like, do you think it'll work?
00:10:24.000 They're just very serious about war.
00:10:26.000 But I always wonder about what they say before that.
00:10:29.000 It's like, who wants to tell him?
00:10:30.000 Let me tell him.
00:10:30.000 Let me tell him.
00:10:31.000 You know what I mean?
00:10:32.000 It's so dumb.
00:10:33.000 Mr. President, sorry to interrupt your incredibly busy day.
00:10:36.000 I know you're dealing with a war and everything else, but I imagine someone just busted in.
00:10:41.000 Khamenei's gay.
00:10:42.000 I just imagine someone bust into the Oval Office and be like, you're not going to believe this.
00:10:47.000 No, no, no.
00:10:48.000 I imagine it's much more serious.
00:10:49.000 Like Trump is going, listen, this war is not going good for me.
00:10:52.000 The polls are not so good.
00:10:53.000 We need to get something.
00:10:55.000 We need to get the new government in very quickly.
00:10:57.000 And they're like, I think the only move we have now is to call him gay.
00:11:01.000 And Trump's like, suggestions?
00:11:02.000 Like, we call him gay.
00:11:03.000 And he's like, okay, make it so.
00:11:06.000 And that's their plan.
00:11:07.000 He was watching South Park and he was like, I got a great idea.
00:11:11.000 Yeah.
00:11:11.000 This is how we win the war, guys.
00:11:13.000 That's right.
00:11:13.000 When you know your leader is gay, Supreme Leader.
00:11:17.000 You know, he's not actually in a hospital then.
00:11:20.000 I think he's dead.
00:11:21.000 He could be gay.
00:11:22.000 He's not in the hospital.
00:11:23.000 He's out meeting boys.
00:11:24.000 Wait, this guy.
00:11:24.000 Yeah.
00:11:25.000 A lot of people think he's dead.
00:11:26.000 He's dead.
00:11:27.000 He hasn't shown his face for like two years.
00:11:30.000 He has to go out and sow his wild oats in Russia to make sure that he has a good time before he goes and actually bringing it.
00:11:37.000 Yeah, maybe he's just a little bit more.
00:11:39.000 I think he's dead.
00:11:41.000 The rumor going around for a while now is that he was killed.
00:11:45.000 And the reason why the Iranians are still pretending like he's actually in charge is because they would have to say the supreme leader died, the second in line died, top 40 officials died, and then their government's going to collapse.
00:11:58.000 There's nothing left.
00:11:59.000 So they're like, now that he's dead, posthumously nailing him with the gay card, he's not coming out to be like, I'm masculine, you guys.
00:11:59.000 Yeah.
00:12:07.000 Here's my sex tape with a girl, guys.
00:12:07.000 It's just going to be all the way.
00:12:09.000 Oh, maybe they will start deep faking sex tapes with this guy, which then will get him in the trap, which is what they always wanted.
00:12:14.000 It was a bunch of command form.
00:12:17.000 That's a good point.
00:12:18.000 Yeah.
00:12:20.000 I can't imagine being a man and realizing that the entire U.S. military and the Israeli military was trying to kill you.
00:12:29.000 Like, it's over.
00:12:30.000 Yeah.
00:12:31.000 You're not, they're going to find you.
00:12:33.000 You know, every time we see like F-35s and F-15s, we're like, yeah, but can you imagine me on the receiving end of that stuff?
00:12:39.000 I mean, good luck.
00:12:41.000 I'm surprised they haven't surrendered.
00:12:42.000 And maybe because they haven't had internet for 14 days, so the people have no idea what's going on.
00:12:46.000 I don't think they would have Starlink at this point, at least.
00:12:49.000 I think they probably haven't surrendered because there's an idea that if they can wait this out, right?
00:12:56.000 Because I think there are a couple of things they're thinking.
00:12:59.000 The only way for real regime change is boots on the ground.
00:13:02.000 Yep.
00:13:03.000 And do Americans have the stomach for that?
00:13:06.000 If they don't, and the Americans pull back and settle for some kind of a deal, what the command structure can then claim is that they ultimately stood up to America and Israel and won.
00:13:19.000 Yep.
00:13:20.000 And it actually consolidates their power.
00:13:22.000 So there's still profit to be had with resistance.
00:13:26.000 So this was a big mistake.
00:13:28.000 I mean, I understand the reasons for doing it.
00:13:32.000 A lot of people just want to say, like, oh, it's Israel, which is a small component, but not the principal reason why the West in general wants regime change in Iran.
00:13:40.000 The problem is the Iranian strategy is we're in a midterm year.
00:13:43.000 Trump cannot sustain a military operation for a long time.
00:13:47.000 And after he is forced out, either by the Democrats winning Congress or just attrition in general, like he can't sustain this economically.
00:13:59.000 Like you said, they're going to say we defeated Israel and the United States.
00:14:02.000 We held our own.
00:14:03.000 Trump has no choice but to make sure this is done and done quickly.
00:14:08.000 But apparently now the reports, they're saying it's going to last until September or longer.
00:14:12.000 And now Trump, there's that viral video where he's saying we need NATO assistance to go in and keep the Strait of Hormuz open, which is not a good sign.
00:14:20.000 But then he was like, well, we don't need the help, you know.
00:14:22.000 So he went back and forth.
00:14:23.000 I think it's, I think he's, I think it's cooked.
00:14:25.000 The other thing people don't talk about is that Iran sells 90% of their crude oil to China.
00:14:33.000 And in this AI war, which is very real, you need energy.
00:14:37.000 Yep.
00:14:38.000 You know, all these people, AI is going to take over.
00:14:40.000 Are they?
00:14:41.000 You know how much energy it takes?
00:14:43.000 So China is, I think they get 30% of their energy from Iran.
00:14:48.000 That's a very significant energy.
00:14:50.000 30% of their oil.
00:14:51.000 Yeah, that's huge.
00:14:53.000 You can talk about them having green, but they need that oil.
00:14:55.000 Well, no, I mean, China's got an all-in approach to energy production, and China's actually crushing the U.S.
00:15:01.000 But you point to AI, like the bottleneck isn't actually going to be chips coming up in the next couple of years.
00:15:06.000 The bottleneck's going to be energy production.
00:15:07.000 China's got nuclear power.
00:15:09.000 They've got the Three Gorges Dam.
00:15:11.000 They've got another dam they're building.
00:15:13.000 They've got some solar as well.
00:15:14.000 Yeah, they've got an all-in perspective on it.
00:15:16.000 The U.S. is lagging behind, and the U.S. really needs to do a lot to catch up.
00:15:21.000 Right now, we have the lead because we have the most advanced chips.
00:15:25.000 But in the future, in the next couple of years, the actual bottleneck is going to be energy production.
00:15:29.000 And that's kind of what China's goal is.
00:15:31.000 They're going to get it to manufacturing too.
00:15:32.000 Like ships.
00:15:33.000 I think we make 5% of the ships.
00:15:34.000 They make 40% of the ships in the world.
00:15:37.000 Any ship that's made in China has certain requirements such that the military could take it.
00:15:42.000 So any luxury ship, any kind of ship that's made, China could actually commandeer from the private owner and could say, we're going to use this for military operations.
00:15:51.000 But that's not to say that they have a real Navy.
00:15:55.000 I think they have something like two aircraft carriers.
00:15:57.000 They need oil for a Navy, too.
00:15:59.000 And that's a great point.
00:16:00.000 The oil that they do import from Iran, like, that's all for the trucking industry and it's used for their military arm.
00:16:07.000 So the U.S. taking that away from China or even taking off the edge, right?
00:16:12.000 So even if they can impact their imports by, I don't know, 10, 20%, right?
00:16:17.000 Like they don't get all of the oil, but it has an effect.
00:16:20.000 That's a big deal for China's military.
00:16:22.000 Plus, China's got like 20% unemployment in young men.
00:16:25.000 So they're in a real, real bad pickle.
00:16:27.000 And this kind of like pressure from the U.S. on the energy side, it's a real big deal in China.
00:16:34.000 Yeah, but also the oil they get is gay.
00:16:36.000 Yeah.
00:16:36.000 Yeah, you know, I wrote a piece on my Patreon about this.
00:16:40.000 The whole Venezuela and Iran in conjunction are really actually trying to push China into a direction.
00:16:47.000 Because Trump has a meeting with Xi, I believe, the end of this month.
00:16:51.000 I think it's the end of March, could be in April.
00:16:54.000 But the U.S. is going to walk in there.
00:16:55.000 Remember, the last time the U.S. and China met, China straight up said, you are not negotiating from a position of power.
00:17:02.000 Trump's trying to change all that.
00:17:03.000 When he meets with Xi, again, he's going to be like, look, all of the things that you thought before, that is not the way that it is.
00:17:10.000 The United States is arguing, is negotiating from a position of authority.
00:17:15.000 Do you think this is going to, like, how bad do you think it's going to be in the midterms over this?
00:17:19.000 Well, actually, I'll start with, are you guys in favor of this strikes on Iran?
00:17:24.000 I don't know.
00:17:24.000 I find myself saying I don't know more and more.
00:17:26.000 Yeah.
00:17:27.000 I thought he was.
00:17:29.000 I don't know how to predict the ripple effect.
00:17:31.000 I don't know.
00:17:32.000 My feeling is that every time we go into a country, all gung-ho, we tend to make this big mistake, which is maybe not be as informed about the culture and the ramifications.
00:17:46.000 Nobody thought that these two wars in Iraq and Afghanistan would last 23 years, but they did.
00:17:51.000 And I don't know.
00:17:53.000 I think the obvious reason for the invasion of Iraq and Afghanistan was the eventual invasion of Iran.
00:17:59.000 When you look at where we set up all these military bases along the border on the east and west of Iraq and Afghanistan, yeah, we're surrounding Iran.
00:18:07.000 Well, the Israelis, a lot of people don't know the Israelis were telling the Americans to invade Iran, not Iraq.
00:18:12.000 Right.
00:18:13.000 In 2003.
00:18:16.000 But the issue is, Iran is mountainous, defensible.
00:18:20.000 They've got surface-to-air missiles.
00:18:22.000 The U.S. could not just go in.
00:18:24.000 So they needed to establish, effectively, a land beachhead along Iran's borders.
00:18:31.000 They're also a very homogenous group.
00:18:33.000 They're all Shia.
00:18:34.000 They're all Iranian.
00:18:35.000 They're Persian.
00:18:36.000 It's not like Iraq, which had Shia and Sunni.
00:18:39.000 And, you know, that was a very significant divide.
00:18:41.000 Same thing in Afghanistan.
00:18:42.000 Yeah.
00:18:43.000 Well, that's Afghanistan.
00:18:44.000 The Hazara, they've got the Tajik, they've got the Pashtun, the different tribes of Pashtun.
00:18:50.000 Afghanistan's always been a series of tribes that were always fighting with each other.
00:18:53.000 So it was always easy to divide and conquer.
00:18:54.000 Don't you just really want that cheap petro-dollar oil?
00:18:59.000 You know, where you as an American can be fat and not think about it.
00:19:02.000 And Hillary Clinton comes back from, you know, she comes back in office.
00:19:06.000 She's withered into a cane.
00:19:07.000 She's like, I want to go to war with everybody.
00:19:08.000 And then, but your gas is a dollar a gallon.
00:19:10.000 Yeah.
00:19:11.000 You know?
00:19:12.000 Hey, the 90s, you know, let's bring it back.
00:19:15.000 This energy generation argument might be a red herring.
00:19:18.000 They keep saying whoever creates the most electricity is going to win, but they're developing chips with this company Iron Lattice where they put the memory in the processor, so there's no more busing of data.
00:19:28.000 It's 10 million times less electricity to run programs.
00:19:32.000 So it could be, it could be like what they're doing with the oil is they're trying to control the energy and prevent others from doing it.
00:19:40.000 If we go like fusion power, everyone's got infinite.
00:19:42.000 If these machines all start requiring 10 million times less, everyone goes infinite.
00:19:46.000 But it's like whoever goes does it first kind of, it doesn't matter who's got electricity at that point.
00:19:51.000 It's just who has the dominating force and intelligence first.
00:19:54.000 And then they stomp and clear the rest.
00:19:58.000 They swallow everything up.
00:19:58.000 That's right.
00:20:00.000 I think the U.S., even granting your point, is still planning for the modern architecture to be what is used moving forward.
00:20:13.000 Because even if you're right about the chips, it's going to take some time to get those chips into production and get them out in the quantities that they need.
00:20:19.000 I mean, AI takes an entire warehouse full of GPUs to be able to do the processing that it needs.
00:20:27.000 So I don't disbelieve what you're saying about the chips, but it's going to take time to actually make them in enough quantities to have the kind of data processing centers that AI needs.
00:20:37.000 Liv, I wanted to get your response to the question earlier too about Iran.
00:20:40.000 Like, what's your take on this?
00:20:42.000 I mean, I mean, I was largely informed by just all my Persian friends who were desperate for Trump to step in because they're just seeing like thousands and thousands of their people being slaughtered, right?
00:20:56.000 By a regime that they fundamentally hate.
00:21:00.000 But of course, like, you know, this is me speaking to people in the diaspora who aren't necessarily representative of the people who live within Iran.
00:21:08.000 But, and who knows the amount of propaganda coming out on both sides.
00:21:12.000 But from what my experience was, was like all the, like I said, I spoke out a little bit about some of the slaughter of the protesters that happened in sort of January and February.
00:21:24.000 And I've never received more messages of like thanks from what seemed like legitimate Persian people, like anonymous, well, not anonymous, but people I didn't know, being like, thank you so much for speaking out about this.
00:21:35.000 Everyone thinks that we're happy under this thumb of Islam, and we are so desperate for them to go.
00:21:42.000 And then you look at all the people celebrating. Those aren't fake videos of people celebrating in the streets.
00:21:47.000 When the strikes happened and, you know, the original Khamenei died, people were over the moon.
00:21:54.000 So I mean, I ultimately am being sort of guided by what the people who live there, or the people who have family living there, are saying, and they were ecstatic about this.
00:22:04.000 Now, but what's the long-term consequences?
00:22:06.000 Of course, you know you have to.
00:22:07.000 You know, nature abhors a vacuum.
00:22:10.000 So what are we going to put in place so that it doesn't turn into another Iraq or another Afghanistan?
00:22:15.000 But on the point about the protests and the celebrations, consider an Iranian watching BLM protests. What do you think their influencers are telling them?
00:22:26.000 They're saying on their podcasts: when I called out the Trump government and highlighted the protests, I was getting messages.
00:22:35.000 People were saying, thank you so much for highlighting this.
00:22:37.000 The people of America deeply hate their government and want it overthrown.
00:22:41.000 But no one can stop them.
00:22:42.000 So the issue is, the people of America messaging them are the leftists who are marching for BLM and throwing Molotov cocktails. Those are the people that are going to message the Iranians saying, we need your help.
00:22:54.000 So the messages you get are going to be from the activists and the establishment.
00:22:58.000 Yeah, it's only something.
00:22:59.000 I'm not saying they're very important.
00:23:00.000 I'm not saying it's one for one.
00:23:04.000 I'm just saying, consider the PSYOP, the propaganda, the manipulation attempts.
00:23:04.000 I'm not, yeah, but at the same time, it's just like we have to go off the evidence that is available.
00:23:04.000 Yeah, of course.
00:23:08.000 And, like, I don't know, every single person I've spoken to, they were just sad because they were like, this is going to cause a lot of bloodshed.
00:23:15.000 But they're just like, in the long run, we are not an Islamic country.
00:23:18.000 We never were.
00:23:19.000 We were colonized by this ideology.
00:23:22.000 They treat us like cattle, basically, and they have to go.
00:23:28.000 And they're willing to, you know, pay in blood for them to go.
00:23:32.000 It's a true theocracy.
00:23:34.000 And 70% of the population, I think, roughly is under 30.
00:23:37.000 And they are seeing what the world is doing.
00:23:40.000 You know, and they want to be part of the world.
00:23:41.000 And they've been an international pariah forever.
00:23:44.000 And a lot of it has to do, sure, they sponsor different proxy armies, blah, blah, blah.
00:23:48.000 A lot of people do.
00:23:49.000 But I think, really, think about it.
00:23:50.000 Living under that theocracy is oppressive.
00:23:53.000 Having lived, you know, I lived in the Middle East and I lived in Saudi Arabia for three years as a kid.
00:23:58.000 And the mullahs had a lot of power.
00:24:00.000 And, you know, things are kept pretty strict.
00:24:03.000 So if you're somebody like us and you want to talk, you're an artist, you want to express yourself.
00:24:08.000 Think about the bottled-up frustration.
00:24:10.000 You're just not allowed to express yourself.
00:24:12.000 You're not allowed to do anything that doesn't fall within.
00:24:16.000 Because one of the things about the Quran, especially in a country like Iran, where at least they try to follow it, is that having separation of church and state within an Islamic country is very difficult, because the Quran is really a blueprint for how to run everything from your marriage to even banking.
00:24:32.000 And a lot of people don't know that.
00:24:34.000 So it's very difficult to kind of like enjoy the kinds of liberties that Western democracies do with all our problems, all our warts and everything else.
00:24:44.000 So there is a fundamental difference.
00:24:45.000 I think the frustration is very real.
00:24:47.000 And I do think those crackdowns are not.
00:24:48.000 I agree.
00:24:49.000 I'm just pointing out the propaganda of the PSYOPs, because I think if you actually look at the global effect of what's going on in Iran, there's not very many Americans fleeing to Iran for comfort.
00:24:59.000 There are quite a great deal of Iranians fleeing Iran all over the world to get away from the oppression.
00:25:03.000 Exactly.
00:25:03.000 So that's the easiest way to look at it.
00:25:05.000 So when you hear these stories, you know, I think it's important to consider the propaganda, is my point, but you can look at the real-world effects.
00:25:11.000 And the left often says America is oppressive and awful, yet everyone in the world is trying to get in.
00:25:17.000 I know.
00:25:18.000 Because it's great.
00:25:19.000 Yep.
00:25:19.000 When they said that those 10,000 protesters, you know, a month ago you mentioned were getting killed in the street.
00:25:24.000 First thing I was like, well, I think we should obliterate that government if they're going to do that.
00:25:30.000 And then the second thought was this could be all fake news.
00:25:33.000 And I sat there, like, paralyzed in this strange state. And as a military commander, I would have let all those people die; if I had been the commander, they all would have died on my watch, if they really died.
00:25:44.000 And like those, that was a vanguard to overthrow that government from the inside.
00:25:47.000 Now they're dead.
00:25:47.000 I don't even know if they were real.
00:25:49.000 And does anyone know for sure if any protesters got killed?
00:25:52.000 I think they do know for sure.
00:25:53.000 Yeah.
00:25:54.000 I think there's a lot of evidence.
00:25:55.000 And I think there's video as well.
00:25:57.000 Yeah, the thing about PSYOPs is that I think you can make the argument it was a false flag, which is really hard to pull off in Iran when we're not there or attacking them yet.
00:26:08.000 But you're not going to be able to pull off grand claims if it didn't happen.
00:26:12.000 Right.
00:26:12.000 So usually when you get these claims of an atrocity or whatever, one side may exaggerate for political purposes, but you're not going to be able to just lie and claim a bunch of people died if they didn't die.
00:26:26.000 Okay.
00:26:28.000 And there are so many Persian people in the diaspora whose families still live there, who either know someone who died or had a friend of a friend die, in huge numbers.
00:26:40.000 Home in the last month?
00:26:41.000 Yes.
00:26:42.000 The frustration that's not.
00:26:44.000 Like, I mean, maybe the numbers are exaggerated, but I think it's far crazier to claim that nobody died, when there are, you know, lots of people putting tons of effort into trying to establish the numbers now.
00:26:58.000 What is the range of the numbers?
00:27:00.000 Is it between 5,000 and 10,000 or is it between 10,000 and 70,000 or even 500 and 1,000?
00:27:06.000 We don't, maybe that's the harder thing to pin down, but to say that nothing happened at all and it was all made up, it just seems completely ridiculous.
00:27:13.000 And there are literally people saying, I know this person and they died.
00:27:15.000 They are no longer with us.
00:27:16.000 I think I know where this is all going.
00:27:18.000 And we're going to segue.
00:27:20.000 We're going to dip a little into the AI stuff.
00:27:22.000 So we're just talking right now about psychological operations, the protests in Iran, who died.
00:27:29.000 I think one of the biggest problems the U.S. is facing right now is that our social media is inundated with foreign actors with foreign political agendas to manipulate the people of the United States so the U.S. government will do their bidding.
00:27:40.000 And I know a lot of people immediately just say, oh, Israel is doing it.
00:27:43.000 Well, you know, Israel is, but a lot of other countries are as well.
00:27:46.000 The direction I see this going is going to be mandatory IDs for internet usage.
00:27:51.000 Elon already implemented on X.
00:27:53.000 I say Elon, but X already implemented as a company.
00:27:56.000 You can click someone's profile and see what country they're from.
00:27:59.000 And this exposed a bunch of Bangladeshis masquerading as Native Americans.
00:28:03.000 Yeah, and they were woke indigenous rights activists.
00:28:03.000 Really?
00:28:06.000 But the Bangladeshis, and it was predominantly Bangladesh, some Pakistan, because they make money doing it.
00:28:12.000 They know that the rage bait will get clicks.
00:28:14.000 It'll make them money.
00:28:15.000 But then you got to take a look at there are a lot of personalities that may be for or against the American military industrial complex plans or whatever.
00:28:24.000 Likely, here's my prediction to the future.
00:28:28.000 They're already talking about needing IDs to log in.
00:28:31.000 It's been a thing that's been brought up for quite a long time.
00:28:33.000 Discord is talking about facial scans and ID requirements and things like this.
00:28:38.000 You will have foreign actors locked out, right?
00:28:42.000 The Iranians aren't, a lot of people have complained that X allows the Ayatollah or allowed him to spread propaganda on the platform, but you knew it was him.
00:28:51.000 The bigger question is if they've got cyber command, like their cyber army, going up on our social media platforms and then spam blasting comments.
00:28:59.000 And I'm going to tell you this.
00:29:02.000 There was a story earlier today about a judge blocking RFK Jr.'s vaccine changes.
00:29:09.000 And so I commented, judges are the supreme authority of the nation, just as the founding fathers intended.
00:29:16.000 Anybody who speaks English knows that the extreme language that I use indicates sarcasm.
00:29:22.000 And anybody who knows the function of our government and checks and balances knows it was a joke.
00:29:25.000 I got a response from a guy that looks like an American who said, this is incorrect, Tim.
00:29:31.000 The Founding Fathers established three branches of government to keep balance between the three with no one being greater than the other, which no American, in my opinion, would actually say because it's first grade, it's kindergarten level stuff.
00:29:43.000 So there's two scenarios I see.
00:29:45.000 A foreigner.
00:29:46.000 So another example is I once made a tweet that said, Israel has never done anything wrong because Israel is the nexus of morality.
00:29:54.000 If Israel does it, it is good.
00:29:57.000 Clearly sarcasm.
00:29:58.000 I tweeted it.
00:29:59.000 And I got responses from people that were taking it literally and saying things like, you know, where's the, you're hiding the yarmulke or whatever.
00:30:08.000 My theory on that is these are foreign actors who don't speak English.
00:30:13.000 So they can't detect sarcasm.
00:30:15.000 When you click translate and it converts English into whatever language, they don't see my joke.
00:30:19.000 They see me saying something like, Israel is a force for good and we support Israel.
00:30:24.000 They don't actually see what they want.
00:30:27.000 Indeed.
00:30:27.000 And then the point about the judges being the supreme authority, either it's AI that can't understand a joke, but I actually can grasp that.
00:30:36.000 I think these are foreign individuals clicking translate or using a translator, not understanding the context.
00:30:41.000 And this is how you kind of weed them out.
00:30:43.000 This means, and I think it's fair to say, many people, and you pick which side, left, right, or otherwise, are being heavily influenced by foreign financing.
00:30:52.000 And guys, we've heard the reports about Israel paying $7,000.
00:30:56.000 The truth is it's not $7,000.
00:30:58.000 That was just an average based on how much they had spent throughout the year.
00:31:01.000 But there were individuals who all of a sudden, on a dime, were just pro-Israel.
00:31:04.000 So I think there's probably truth to that.
00:31:07.000 But I also think it's fair to say that we have foreign cyber armies that train people explicitly to run 50 accounts at once and blast you.
00:31:17.000 I think in the future, they are going to mandate that you have an ID.
00:31:20.000 And we already see this somewhat with X premium, right?
00:31:23.000 You've got to prove who you are.
00:31:24.000 And if you don't have premium, you're getting a second-tier thing.
00:31:27.000 This is phase one.
00:31:29.000 Do you like that?
00:31:29.000 Sounds good.
00:31:30.000 Sounds like a good idea, I guess, on the surface.
00:31:33.000 Pros and cons.
00:31:33.000 Pros and cons.
00:31:34.000 There's two arguments for this world.
00:31:36.000 One is that dissent can only be allowed if people are allowed to have anonymity.
00:31:42.000 Like the Founding Fathers used pseudonyms.
00:31:45.000 They knew that if they spoke out against the Crown or British Parliament, they could be hanged for treason.
00:31:51.000 So they had to lie about who they were and then disperse these messages.
00:31:54.000 At the same time, the Founding Fathers did not have our adversaries.
00:31:58.000 Like imagine if the Barbary nations, the Barbary pirates, had the internet and were convincing the people in America that they actually weren't pirates, that we were the pirates attacking them.
00:32:08.000 And then also imagine our government didn't establish the Marines, and Jefferson didn't go and do these things.
00:32:13.000 The challenge is, ultimately it comes down to: all is fair in love and war.
00:32:20.000 And at a certain point, you have to choose to use power or die.
00:32:26.000 Now, I don't know where that point is.
00:32:28.000 Maybe it's now, maybe it's not.
00:32:30.000 But we've lived in this classically liberal mindset for a long time, and those of us who have been fairly moderate or even right-leaning have been crushed by the far left, who have no respect whatsoever for our classical liberal sensibilities.
00:32:45.000 And I don't mean politically liberal, I mean the philosophically classically liberal.
00:32:49.000 And then we're getting run over by foreign adversaries manipulating our social media.
00:32:53.000 The question is, at what point do we decide to just slam the fist on the table and say, we're locking this down?
00:33:00.000 You need to prove who you are if you want to be in our spaces because we don't want the Chinese cyber army manipulating us.
00:33:07.000 And we're not going to allow Marxists to give kids sex changes.
00:33:11.000 Otherwise, you just keep saying, well, we have to be fair and allow them to do it because it's free speech.
00:33:15.000 But then eventually you cease to exist.
00:33:16.000 Right.
00:33:17.000 Because it's a cyber attack, essentially, is what it is.
00:33:21.000 It's the same thing, right?
00:33:22.000 You know what I worry about?
00:33:24.000 I really worry that we are losing our belief in the ideal.
00:33:31.000 And I'll even use the word brand of America.
00:33:34.000 What I mean by that is this.
00:33:35.000 I'm old enough to remember when I was growing up, we really believed that America ultimately was trying to do the right thing.
00:33:42.000 What I mean by that is that when we went into a country, you can look, it was Iraq, it was Afghanistan, it was, for that matter, Vietnam.
00:33:49.000 The idea, at least, behind it was we are fighting for democracy, for individual liberty, for all these things that America, freedom of speech.
00:33:58.000 It was, in a way, the fabric of being an American that we were the good guys.
00:34:04.000 And that was very real for me as I grew up.
00:34:07.000 Because I do think that for the most part, our leaders, certainly our soldiers, and to an extent still believe that.
00:34:14.000 You hear it right now with Iran, with the idea that these protesters and the people need to rise up, bring democracy and stuff like that.
00:34:21.000 But there's a cynicism in America, and we've earned it to a large extent.
00:34:26.000 You can start with our distrust in institutions.
00:34:29.000 That probably happened with the Catholic Church and how they never came to terms with the amount of pedophilia.
00:34:34.000 We can keep going with how many different institutions have been corrupted, especially the fourth estate, the media, that seems to have just become more interested in playing to their echo chamber and to ratings.
00:34:47.000 So it wasn't really about the truth or objective reality anymore.
00:34:50.000 And I really do worry that young people, and you just did, you were like, I don't know what to believe.
00:34:54.000 That's a huge problem.
00:34:56.000 And I get it.
00:34:57.000 Because you're like, hey, wait a minute.
00:34:58.000 How do I know I'm not being gamed?
00:35:00.000 And I really do think that we cannot go into countries like Iran and just use language like, we're starving the Chinese of oil.
00:35:09.000 This is good for America because we need hegemony.
00:35:12.000 That's not American ultimately, because then there's no difference between America and Russia, America and China.
00:35:19.000 We have to fight for an ideal, even if it's, even if we're embracing it in a fake way, we're a brand and people do come to this country for all those things that we take for granted.
00:35:30.000 I half agree.
00:35:32.000 You know, if we just say like, we want to cut off China and it sounds strategic and militarized, yeah, that's no good.
00:35:39.000 But I also think that, you know, back in the Bush era when he was like, they hate us for our freedoms.
00:35:43.000 Me and all my friends like rolled our eyes like me too.
00:35:46.000 That's like, that doesn't make sense.
00:35:47.000 There's something, there's a reason.
00:35:49.000 Dude, I lived in the Middle East.
00:35:50.000 I was like, I remember saying that.
00:35:51.000 I was just like, I was there for eight years of my life.
00:35:53.000 Was it the predator drones flying over their homes every day that freaked them out?
00:35:56.000 I was like, they don't hate our freedoms.
00:35:58.000 But I will say this.
00:35:59.000 I will say this.
00:36:00.000 I believe substantially more Americans would support the war in Iran if Trump was honest about the function of the liberal economic order.
00:36:10.000 Now, here's the thing.
00:36:12.000 When you're pitching something to somebody, you got to aim for the lowest common denominator.
00:36:16.000 You're not going to go to someone and explain, you know, what the Council on Foreign Relations has on its website. What you are going to say is, listen, and I think this pitch would actually work for the most part.
00:36:25.000 And I will say this to the American people right now, and you don't have to agree with it.
00:36:29.000 Gas prices and products remain cheap in the United States because we point guns at other countries and say, you will trade oil on the U.S. dollar or you will die.
00:36:40.000 Now, by all means, that sounds immoral and horrifying.
00:36:44.000 And our presidents have done horrifying things to maintain that.
00:36:48.000 Do you want to spend $10,000 for a laptop or do you like your $1,000 laptop?
00:36:52.000 Well, I don't know if I think that that's the case.
00:36:55.000 I think the petrodollar is the petrodollar because the one economy, the one country that's stable, the one place you know things won't go totally haywire, at least now for the past, you know, for most of our existence, but certainly for the past 70 years, has been the United States.
00:37:11.000 If you invest in property, well, because we do have the biggest guns, right?
00:37:15.000 But we also, we also, however, have done a very good job.
00:37:19.000 And we have to give ourselves credit for keeping this democracy alive and checks and balances and Madison and John Jay and Alexander Hamilton, those geniuses.
00:37:31.000 There should be statues to those guys, because they did solve it.
00:37:33.000 So they tore them down.
00:37:34.000 And they solved the political problem.
00:37:34.000 I know.
00:37:36.000 But listen, the issue primarily is that we do not produce enough for our economy to make sense.
00:37:44.000 Other countries have to buy U.S. dollars before they can buy oil, which means they're promising to give us their debt, their labor, if they just want to buy the oil.
00:37:53.000 We effectively own all of the world's oil, but there's an exchange for this.
00:37:58.000 You will be able to trade freely without fears as we police the seas and police the oceans, and we are going to get everything in order.
00:38:03.000 Trump wants the Suez.
00:38:04.000 He wants Panama.
00:38:05.000 He wants Greenland.
00:38:06.000 He wants to control the waterways so he can fulfill this promise.
00:38:08.000 He wants to get oil to China because he wants China to get back on the dollar because they're getting off of it.
00:38:12.000 He's negotiating with Russia, get back on the dollar.
00:38:14.000 He's going to the Saudis and saying, what do you want us to do?
00:38:17.000 You want us to bomb Iran?
00:38:18.000 Because the Saudis got off the petrodollar contract.
00:38:21.000 If we lose the petrodollar, the standard of living for the average American is going to drop by 80%.
00:38:27.000 We do not export nearly enough to maintain the level of luxury we have.
00:38:32.000 However, we are the world police.
00:38:34.000 It's effectively what we do export, whether you agree with it or not.
00:38:37.000 So I said this back in 2016 with Trump and Hillary and the message Trump had, the message Hillary had, Hillary Clinton was asked about a no-fly zone in Syria, which she advocated for, and was told explicitly that that would be a declaration of war with Russia.
00:38:49.000 Russia has a naval base in Tartus.
00:38:51.000 They have planes.
00:38:51.000 They have jets.
00:38:52.000 If we said no one can fly anymore, we're declaring war on Russia.
00:38:54.000 And she said she didn't care.
00:38:56.000 And I said, listen, my friends.
00:38:58.000 Do you like your dollar slice?
00:39:00.000 Do you like your dollar slice of the free pop?
00:39:03.000 That's because we are the global hegemony.
00:39:06.000 We are the unipolar power.
00:39:08.000 We can make all these countries do what we want.
00:39:10.000 Saddam Hussein wants to trade oil in Euro.
00:39:12.000 Boom, he's dead.
00:39:13.000 Okay.
00:39:13.000 Muammar Gaddafi wants to trade oil in gold dinar.
00:39:16.000 He's gone.
00:39:17.000 We came.
00:39:18.000 We saw, he died.
00:39:19.000 That's what Hillary Clinton said.
00:39:21.000 That's the machine state.
00:39:22.000 You live comfortably and in ignorance like a fat guy floating around in WALL-E so long as the U.S. maintains its domination of these other countries.
00:39:32.000 You go for Trump.
00:39:33.000 Trump wants to secure our borders.
00:39:35.000 He wants to bring manufacturing back and he wants to bring back grit and hard work.
00:39:38.000 And a lot of fat cats in D.C. who make money through the rotating of assets and resources through these NGOs, they don't want that.
00:39:47.000 It's not a guarantee that the Trump world is going to bring back manufacturing or do these things, but his worldview is cut off this offshoring and these free trade agreements, bring the auto factories back, do tariffs.
00:39:58.000 Americans will get back to hard work and we will be a strong nation, not an international bombing nation.
00:40:05.000 And the reason why I think a lot of people are mad, or I would say my principal argument here is I've advocated for that worldview of build up Americans culture.
00:40:14.000 Americans should have kids, teach their kids the good values, everything you described about being the good guys.
00:40:19.000 And now Trump is going, eh, we're going to bomb Iran.
00:40:21.000 Like, you know what?
00:40:22.000 To get the economy good, it's so much easier just to take the oil from somebody else.
00:40:26.000 So again, the simple thing that I'm trying to say is I think this war will get a lot more support if Trump said, and they've glazed it a little bit.
00:40:34.000 They've crop-dusted close to it, but not quite: short-term pain for long-term gain.
00:40:40.000 We want to stabilize oil trade.
00:40:43.000 Just be honest with the American people.
00:40:44.000 And if it doesn't work, well, then too bad.
00:40:47.000 We told Iran, fall in line with the petrodollar.
00:40:50.000 Stop putting pressure on the Strait of Hormuz.
00:40:53.000 Stop threatening your Gulf State neighbors.
00:40:55.000 Stop arming militias and the Houthi rebels who are bombing civilians and you're fine.
00:41:00.000 And they don't want to do it.
00:41:02.000 So you want to live clean and comfortable.
00:41:04.000 You want cheap computers, cheap cars, and cheap gas.
00:41:06.000 Then you want a unipolar global United States power.
00:41:10.000 The tough sell is getting people to accept making people your vassal when you're standing for freedom.
00:41:17.000 That's the challenge.
00:41:19.000 Yeah.
00:41:19.000 That's interesting.
00:41:20.000 That's a really interesting point.
00:41:21.000 You could vassalize the planet and then establish freedom, like freedom within a perimeter, which is what we have already.
00:41:26.000 You have to have an outward-facing military, inward-facing freedom.
00:41:30.000 So we could set that up.
00:41:31.000 You just got to convince people that that's the plan.
00:41:33.000 And actions speak louder than words.
00:41:35.000 I think the reality is, the reason why we do the freedom narrative is that, truthfully, the lowest common denominator is how you sell.
00:41:44.000 Have you ever seen these comedy videos, Just for Laughs Gags, on YouTube?
00:41:49.000 There's no talking.
00:41:50.000 It's a laugh track, and all of the gags are done without words.
00:41:54.000 And they get massive viewership because someone from China, India, someone from Madagascar can watch that and get the joke.
00:42:01.000 So if you want to convey a message to most people, what's going to work?
00:42:05.000 They kill protesters.
00:42:06.000 They're evil and they hate our freedom.
00:42:09.000 And, you know, you're going to cut off the intellectuals.
00:42:11.000 You're going to cut off the moderates, but you're going to get 60% of the disinterested and ignorant masses.
00:42:17.000 It's worth noting that most of the countries, not every country, but most of the countries that do decide that they're going to play ball with the U.S., and I'm not talking about the ones that we go and get into a war with, but most of the countries that say, okay, we're going to play ball with the U.S. and use the petrodollar system, et cetera, most of them end up with markets that make their societies better off in the long run.
00:42:37.000 The long-term play was, we're going to give you money for development, and then you'll be in debt to us forever.
00:42:41.000 And that's how we stabilize the planet.
00:42:44.000 They are trying to create, you know what the problem is?
00:42:47.000 They want global homogenization.
00:42:49.000 They want everyone operating under the quote-unquote rules-based order that is the liberal economic order.
00:42:56.000 The problem is when we go to Afghanistan and you've got a bunch of people who are, you know, with all due respect, like not very smart.
00:43:04.000 They can't do jumping jacks.
00:43:05.000 They're goat farmers.
00:43:06.000 And what did the Americans try to do?
00:43:08.000 We nation-built, and hold on, that's not the worst part.
00:43:11.000 We tried to make them gay communists.
00:43:14.000 I'm not joking.
00:43:15.000 The murals that they put up for pride and homosexuality, in a deeply conservative tribal nation, on the street.
00:43:22.000 What?
00:43:22.000 They put murals in the middle of Kabul.
00:43:28.000 Yeah.
00:43:28.000 Is that the city?
00:43:29.000 There were videos coming out when we pulled out of Afghanistan, and there were murals of trans rights and stuff.
00:43:35.000 You're joking.
00:43:37.000 It's true.
00:43:37.000 Dead serious.
00:43:38.000 We need to see this.
00:43:39.000 This is the problem.
00:43:40.000 I'll pull up this.
00:43:41.000 This is the problem.
00:43:42.000 Talk about ignorance.
00:43:44.000 If we said, sell your oil in dollars, you be you.
00:43:48.000 No war, you be you.
00:43:50.000 That would have been what Alexander the Great would have done.
00:43:52.000 He would have said, keep all your cultures and everything else.
00:43:55.000 Let's just have an economic arrangement.
00:43:57.000 That's good.
00:43:58.000 There's presumably more, though, with the Iran thing than just that, given that Iran was seemingly funding a lot of Hezbollah, Hamas, everything else, which was destabilizing the Western order in many ways, right?
00:44:14.000 I mean, it's obviously multi-causal, the reason why, but maybe that's the underlying one, the main reason. And I think you're right.
00:44:23.000 And I also think that I really do believe that a lot of people consider Iran to be a theocracy, meaning there is something messianic about or deeply religious about the struggle.
00:44:35.000 I mean, one of the reasons that, you know, Hamas is intractable and one of the reasons that this issue with the Palestinians now and a lot of the Arab world and Israel is intractable has nothing to do with economics.
00:44:49.000 No.
00:44:50.000 It has to do with religion.
00:44:52.000 After the Six-Day War, when Israel essentially humiliated Egypt and the other Arab countries that invaded, and destroyed all of Egypt's Air Force before it got off the runway, et cetera, it went from a pan-Arabic notion of we'll unite together as Arabs and become a strong power to a religious struggle.
00:45:14.000 And then if you add to that the kinds of military dictatorships that the United States was supporting, like Mubarak and those people, the Muslim Brotherhood was founded in the torture chambers of those Egyptian prisons.
00:45:28.000 You know, the economies were not good.
00:45:30.000 Nobody had anything to do.
00:45:31.000 And it really has become a religious struggle.
00:45:34.000 And so there are a lot of people, I think, in intelligence that look at Iran.
00:45:38.000 And if they got a bomb, I do, I don't think they would do this, but there are people that actually think that they would do something very irrational.
00:45:45.000 I don't agree with it, though.
00:45:47.000 I think they're very rational.
00:45:48.000 You can change a country's economic system, but you're not going to change their culture.
00:45:52.000 You can convince them that a McDonald's on the corner or a Starbucks on the corner is actually a good thing, but you can't convince them that their way of life is wrong.
00:46:00.000 And that's the regime, though.
00:46:02.000 Because the majority of Iranians, Persians, are looking for liberties that all of us enjoy.
00:46:10.000 They just are.
00:46:11.000 It seems to get to the religious dialogue when it gets desperate.
00:46:15.000 Because Saudi Arabia, you know, religiously polar opposite to the United States, but they're a great asset and ally because we get along economically.
00:46:23.000 They're selling, well, they were selling oil in our dollars.
00:46:25.000 But like, I think this really comes from post-World War I, when the Ottoman Empire shatters.
00:46:30.000 We're like, let's just extract the shit out of the Iranian oil.
00:46:33.000 That's true.
00:46:34.000 The British did that.
00:46:36.000 You're exactly right.
00:46:37.000 And you could take it back even further.
00:46:38.000 Like what, the 1800s? The Ottoman Empire seized it from the Romans.
00:46:42.000 And it's like, how far back does this struggle between rivals go?
00:46:48.000 The fish that first crawled out.
00:46:49.000 10,000 years, 100,000 years.
00:46:51.000 The fish that first crawled out the ocean.
00:46:54.000 For this, I always look at the British oil companies that went in there after World War I and tried to take over the Middle East, de facto set up Israel and the Palestine arrangement, and how we rectify that.
00:47:08.000 Just got to be honest with people, though.
00:47:10.000 I mean, stop. Obviously people know it now.
00:47:11.000 So just tell them this is what we're doing.
00:47:13.000 We're trying to set up a unipolar world.
00:47:15.000 Promise not to wreck it once we get it going.
00:47:18.000 But see, I'm not as cynical as that.
00:47:21.000 I'm more, I think what's happened with the Gulf states and the Abraham Accords have to be given their due.
00:47:28.000 You know, it's become for people, like for countries like Israel, for the UAE, for Saudi Arabia, it's just become more advantageous to get involved in the global economy in a deep way, which means become a trading partner with the United States.
00:47:45.000 Dollars, money is what makes everybody happy.
00:47:50.000 And I think that people are thinking Iran would be a great economic asset.
00:47:55.000 You've got an educated, literate population, 70% of which are under 30.
00:48:00.000 And I mean, can you imagine if they were allowed to be a liberal economy?
00:48:05.000 Money, baby, not just oil, but an industrious group of people.
00:48:08.000 I'm naive enough to hope for that.
00:48:11.000 I hope that happens.
00:48:12.000 I don't see it happening.
00:48:14.000 I hope it does.
00:48:15.000 I hope we don't end up destroying their oil infrastructure to the point where they can't rebound.
00:48:22.000 And that would be a huge disaster.
00:48:23.000 I mean, so there was a story back when we pulled out of Afghanistan during the whole thing about art that had been put up for gay rights.
00:48:34.000 I cannot find it.
00:48:35.000 It's been five years, four and a half years.
00:48:37.000 So if I can't figure that one out, well, then just take it with a grain of salt.
00:48:41.000 But there are stories about, right, this one, for instance, back when the U.S. was in Afghanistan and putting up pride flags, as well as the flying of pride flags at all the U.S. embassies and murals that we put up on our territory in a bunch of these countries.
00:48:56.000 So I think one of the issues is it's one thing to say that we are a classically liberal country that believes in free speech and we want to spread democracy.
00:49:04.000 But then you get the incessant defense of, you know, look, I know people in the United States are very pro-gay, but these countries are not.
00:49:13.000 And so if you're a global power and you're like, we are going to be an affront to your values, I mean, it's probably one of the principal reasons Iran does not like us and won't fall in line because they're like, they're a bunch of heathens.
00:49:25.000 It's Sodom and Gomorrah.
00:49:26.000 I mean, you have to, you have to sympathize with it.
00:49:29.000 You know, if you're from another, if you're from a conservative Muslim country, for example, like and the Americans are trying to get you to be like them, well, we've got some problems.
00:49:39.000 Like, I mean, how about broken families?
00:49:43.000 How many people are on some kind of drug in this country?
00:49:47.000 What is the state of the American family?
00:49:49.000 What's the state of our education?
00:49:51.000 Well, what's the state of our spiritual health?
00:49:54.000 I wonder what the Iranian government thought of abortions in the United States.
00:49:58.000 There you go.
00:49:59.000 And so you can invert the position and try and understand what these other countries are thinking.
00:50:03.000 Now, by all means, I think the Iranian government is a theocratic, militant, backwards way of living.
00:50:10.000 And I wouldn't want them to impose that on us.
00:50:13.000 So imagine China, the Communist Party of China, is the unipolar power.
00:50:17.000 Again, I'm going to say this about war with Iran and U.S. interests.
00:50:21.000 China is on track to become the dominant global economic power.
00:50:26.000 They've got the Belt and Road Initiative, which is effectively their version of the IMF, and they are cutting deals with tons of countries.
00:50:32.000 If we do nothing and the U.S. falters, you and the United States will find yourself living under their way of life, their views, and the horrible things they do.
00:50:43.000 Do you want to live that way?
00:50:45.000 Do you want the Chinese Communist Party exerting pressure over the United States and the movies we watch, the things that we see?
00:50:51.000 Look at what's going on right now already with how we make movies.
00:50:54.000 When we made, for instance, Top Gun: Maverick, they took the Taiwanese flag off of his jacket because it would be offensive to China.
00:51:00.000 Dude, I did a movie in China and we shot in Beijing, and I had a huge scene where I had to run down this Chinese gangster and arrest him and stuff.
00:51:12.000 And I was going to do it through the old city.
00:51:13.000 So I actually had like, they're like, listen, dude, you got to stretch.
00:51:16.000 It's going to be a two-day shoot.
00:51:18.000 I'm going to be running, chasing, gun, tackle, doing stunts and all that stuff.
00:51:23.000 And so I was like, damn, this is a big scene.
00:51:25.000 This is going to be a two-day thing.
00:51:26.000 It was really hot in Beijing.
00:51:27.000 It's like, you know, crazy hot, but it's in and through the old city.
00:51:29.000 I was going to get into places.
00:51:31.000 And we're doing it all.
00:51:32.000 And stunt men, we're talking, I'm going to have to do all the running.
00:51:35.000 And, well, the word got out to the government and they said, absolutely not.
00:51:42.000 We're not having an American arrest a Chinese national in a movie that we are partially financing.
00:51:49.000 Wow.
00:51:49.000 And we scrapped the entire two days.
00:51:53.000 It's a huge scene.
00:51:54.000 And we had to do my character.
00:51:56.000 We had to just literally change everything almost on the spot.
00:51:59.000 It was a disaster.
00:52:00.000 Think about how much worse it could be.
00:52:01.000 Think about China coming to Cuba and then putting 30,000 troops in Cuba and taking Guantanamo Bay from us, and then we can't do anything about it.
00:52:12.000 Or going to Saudi Arabia and cutting off oil distribution to the United States.
00:52:17.000 And then all of a sudden we see our gas prices skyrocketing.
00:52:19.000 I don't think China could do it.
00:52:21.000 I don't think China has what we have. They haven't been in a war in forever.
00:52:24.000 We've been in constant war.
00:52:26.000 They don't have the energy to keep up with our nation.
00:52:31.000 We're talking about if the U.S. economy falters and China becomes the dominant unipolar power.
00:52:36.000 Imagine a scenario where the Chinese military has ships going between Florida and Cuba like we do with Taiwan.
00:52:44.000 So right now we can do what we want.
00:52:45.000 We can get what we want.
00:52:47.000 I believe the growing faction of woke in this country did arise to a certain degree from anti-establishment views, populist views, from people who are fed up with the lies, the manipulations, and the failures of interventionist policies.
00:53:03.000 However, it then turned into Marxist insanity for the purpose of just destroying the United States.
00:53:10.000 One theory that I've entertained is that the purpose of woke and communism is to cause a rapid decline in the United States.
00:53:18.000 Are you familiar with Thucydides' trap?
00:53:21.000 This is a theory that whenever a dominant economic power is about to be supplanted by an up-and-coming economic power, you get war.
00:53:32.000 And they say historically, 12 of the 16 times we have seen the dominant power get displaced, war has broken out.
00:53:40.000 So one theory that I've entertained is that the U.S. opens the door to China, gives them all of our jobs very, very quickly over a short period of time, over 10, 15 years.
00:53:50.000 We see all of our factories moving to China, all of our cultural institutions, like I mean like the manufacturing bases which built these cultures.
00:53:58.000 That way, if it ever comes time for there to be an economic flip, it would be so dramatic there would be no possibility of a Thucydides trap.
00:54:07.000 And when you plug that into what the purpose of the liberal economic order was to prevent World War III, it does make sense.
00:54:15.000 Don't know if that's what's actually going on.
00:54:17.000 What is Trump doing, even with the attack on Iran, is reestablishing the United States as the dominant unipolar power in the liberal economic order.
00:54:25.000 If Trump did not come around and Hillary Clinton got elected, our policies that embolden and enrich China would have continued.
00:54:32.000 They're buying up our farmland.
00:54:33.000 They're buying our land near military bases.
00:54:35.000 They're bringing kids here through birth tourism, having kids who are citizens who can run for president in our own country.
00:54:41.000 And our manufacturing base is being shipped off largely, not only, but largely to China, where they are now getting the jobs.
00:54:47.000 And what happens?
00:54:48.000 What happened during COVID when they turned the switch off for manufacturing?
00:54:51.000 We were left without PPE.
00:54:53.000 If Trump did not get in, that would have accelerated. PPE, personal protective equipment.
00:54:58.000 So this was masks, gloves, clothing for doctors, whatever your opinion is on it.
00:55:04.000 You know, you don't need to be wearing two masks, whether you did or didn't.
00:55:07.000 The point is, they were manufacturing our masks for us.
00:55:11.000 So China turned around American ships and seized products that were manufactured in China by American companies.
00:55:17.000 What would have happened had Trump not turned this around?
00:55:20.000 Now I see Trump bombing Iran, and I'm like, yeah, Trump wants to reestablish the liberal economic order and the petrodollar system and make the United States dominant.
00:55:29.000 And the powers that were going the other direction are pissed off about it.
00:55:33.000 But I think they may have lost.
00:55:34.000 The only problem now is you get war with Iran.
00:55:36.000 And the pendulum now, instead of swinging towards communist China taking over and censoring and shutting us down, the pendulum is now swinging back towards corporate governance taking over and shutting us down.
00:55:45.000 Because if the U.S. establishes global hegemony, then that means that they can shut off your bank account, because there's one economic chamber.
00:55:54.000 If you say fuck on the internet, maybe, or whatever the word that you said seven years ago was that was bad, the AI can scrape it and put you in digital ostracization.
00:56:02.000 It's like, how do we defend against that?
00:56:05.000 We can't have a unique...
00:56:07.000 Sorry, please.
00:56:08.000 Yeah, I mean, it's just like there's sort of two attractor states.
00:56:11.000 One is to, you know, because it's almost like China are using our values of freedom against us, right?
00:56:20.000 They are very, very good at coordinating.
00:56:23.000 They have this very centralized top-down structure whereby they can dictate what people can do and people are living under less freedom.
00:56:30.000 But they are also therefore able to make these five-year, ten-year plans.
00:56:34.000 Where's the America 10-year plan?
00:56:36.000 I'll tell you where our plan is, though.
00:56:38.000 I don't worry about that even a little bit.
00:56:40.000 There are two things that people aren't taking into account.
00:56:41.000 China has major problems, not the least of which is their demographic problem.
00:56:46.000 They are literally a declining population.
00:56:49.000 They don't have young people to support their old people or the economy, number one.
00:56:52.000 And we have the same problem, by the way.
00:56:54.000 So, and I've had four kids, so that's okay.
00:56:57.000 There's one other thing: innovation.
00:56:57.000 But it's okay.
00:56:59.000 We're still far and away, the United States, the leader in innovation.
00:57:04.000 Think about AI and the entire tech industry that came out of this country.
00:57:08.000 China has copied a lot of our stuff.
00:57:10.000 But at the end of the day, the United States is an innovation juggernaut.
00:57:14.000 And that's why I get so worried when we have socialists and people who tend to believe in this collectivist idea.
00:57:20.000 You've got to reward people for their ingenuity and their risk-taking.
00:57:24.000 That's how you keep entrepreneurship and innovation alive.
00:57:27.000 I think let's talk about AI.
00:57:31.000 I believe that the military, the government's secret, confidential, top secret AI is substantially more advanced than the AI that we see and use.
00:57:42.000 It is known that the U.S. military, the U.S. government, has been working on AI since the 70s.
00:57:48.000 Very, very early stuff, going way back.
00:57:51.000 Like what kind of AI?
00:57:52.000 So it was very rudimentary, but the way we see it now, the attempts...
00:57:56.000 Like LLMs?
00:57:57.000 Using deep neural net, you know, like...
00:57:57.000 Yes.
00:57:59.000 That was their goal starting in the 70s.
00:58:01.000 Now, whether or not they had the computational power to rapidly accelerate beyond what we've seen today, the argument is this.
00:58:08.000 Let me just put it like this, whether you believe it or not.
00:58:10.000 Do you think the government has been working on deep neural LLMs and all that longer than the private sector?
00:58:17.000 I think ARPA and DARPA are probably, that's probably the kinds of things they do.
00:58:21.000 Why would they not have?
00:58:22.000 They have access to training data and data sets that no other private organization could get access to.
00:58:27.000 The training data is the internet.
00:58:27.000 That's not true.
00:58:29.000 Indeed.
00:58:31.000 And what did the U.S. government have before the internet?
00:58:34.000 Nothing.
00:58:34.000 Not much.
00:58:35.000 They had the NSA where they took literally all of our data.
00:58:37.000 It's tiny, though.
00:58:39.000 It's so, there was very little digital communications back then.
00:58:42.000 Apparently, in the 1950s, 1956, at the Dartmouth workshop, they formally started the Dartmouth Summer Research Project on Artificial Intelligence.
00:58:52.000 No, sure, but artificial intelligence back then meant something very, very different from what it is now.
00:58:57.000 And like these huge general models, the reason why they're so powerful is because they are just fed reams of data that just did not exist back then.
00:59:06.000 And a few things to consider is the government is unrestrained and without ethics.
00:59:11.000 They don't have the limitations that Anthropic, Google, OpenAI would have.
00:59:17.000 They can steal all of the data from all of these companies with a single written letter.
00:59:21.000 If that were the case, why would the DOD be using Claude?
00:59:25.000 Because that's just public-facing stuff.
00:59:26.000 Do you believe that the weapons the government has are the only weapons that exist?
00:59:30.000 But a lot of the technology is private enterprise.
00:59:33.000 There's contractability.
00:59:34.000 Indeed.
00:59:35.000 My thing is this.
00:59:36.000 You're right to say that there are certain innovations that no private enterprise is going to be involved in because it takes too long with too much money without a return.
00:59:46.000 And that's where things like DARPA and ARPA come along.
00:59:49.000 Let's try this.
00:59:51.000 We know that the NSA was spying on us and they lied about it.
00:59:54.000 We know the CIA is spying on us and they lied about it.
00:59:56.000 We know that they're spying on effectively literally everything we do on the internet.
01:00:01.000 One of the most notable was XKeyscore, revealed by Edward Snowden.
01:00:05.000 They could just type something in, find whatever you posted about it.
01:00:08.000 We know about the massive NASA, I'm sorry, NSA data center in Utah, which has been around for what, 20-some odd years, collecting all this information.
01:00:15.000 And I believe that it's more likely, it's not about spying on the American people.
01:00:19.000 I don't think they need that to track down threats.
01:00:22.000 I believe this was more about continuing their AI research and taking whatever data they could.
01:00:26.000 Now, to be fair, agreed.
01:00:28.000 It was admittedly more rudimentary at the time because internet data was much, much smaller.
01:00:33.000 But that still gives them an advantage with their data centers.
01:00:36.000 We get to the space where you now have all of these different AI companies and the government just takes their data.
01:00:43.000 Whatever their training models are, all of those structures.
01:00:47.000 They will get all of it at once.
01:00:50.000 How do they do that?
01:00:51.000 By spying on us and stealing our data.
01:00:54.000 Or if you want to do it manually, it's called the national security letter.
01:00:58.000 One of the things, though, I think that the government got privy to was that the AI labs were not being upfront.
01:01:07.000 Their safety teams were like, hey, this is not, we're creating things that seem to be hard to control.
01:01:15.000 And I believe that our intelligence agencies, et cetera, were probably being told one thing, and they got privy to the fact that they weren't being told the whole story.
01:01:26.000 I think our intelligence agencies have substantially more advanced AI systems.
01:01:32.000 There is a massive power discrepancy in Northern Virginia.
01:01:36.000 Are you familiar with this?
01:01:37.000 No.
01:01:37.000 Something like five gigawatts.
01:01:39.000 We went over this last year.
01:01:40.000 I forgot the exact number.
01:01:42.000 But where we live in our main studio, we are in a power corridor for what the AI referred to as the Northern Virginia instance.
01:01:53.000 So here's what we know.
01:01:54.000 There is a massive power discrepancy.
01:01:57.000 A massive consumption of power is occurring in Northern Virginia that is unaccounted for, presumed to be tied to the massive data centers that, perhaps, intelligence agencies are running.
01:02:09.000 Let me tell you this crazy story.
01:02:11.000 So our property, and oh boy, is the AI, they're going to get mad at me about this one.
01:02:16.000 So I postulated unto myself, if military technology is consistently, we believe, more advanced than the private sector in terms of weapons, because they're not constrained by laws like we are for the most part, wouldn't this be true for AI as well?
01:02:33.000 And then I started looking into it and found, yes, the U.S., DARPA and ARPA have been working on AI tech going back.
01:02:39.000 I thought the first project were the 70s.
01:02:41.000 Apparently, they said they were formalizing it in the 50s.
01:02:44.000 And I then asked the AI, I was talking to a particularly prominent and powerful company.
01:02:51.000 I'm going to leave it unnamed.
01:02:52.000 And I said, if it is true that military technology is more advanced than the private sector, and academics predict there will come a point when the AI is sufficiently advanced that it'll begin running our systems, our government, our society, then at what point would military technology have reached the level where they would be privately, behind the scenes, without the knowledge of the public, running our systems, advising or controlling things?
01:03:15.000 And it said the basic math would be 2012, if military technology is more advanced than the public sector, which is interesting because that's around the time we saw, in the LexisNexis data, wokeness.
01:03:28.000 You see the, I don't know if you guys have seen the LexisNexis data on words pertaining to white supremacy, patriarchy, oppression, et cetera.
01:03:35.000 LexisNexis showed that across the board in every country on the internet, the instances of these keywords, LGBT, trans, et cetera, it's a hockey stick.
01:03:44.000 From almost no mentions in media to literally tens of thousands every single day.
01:03:50.000 Now, maybe, maybe that's just the internet.
01:03:51.000 Who knows?
01:03:52.000 I mean, I think it can be.
01:03:54.000 Cultural phenomena can be decentralized, and there has to be demand for them.
01:03:58.000 Indeed.
01:03:58.000 But the question then is why that happened in Uganda and at the same time in countries that don't have heavy communications.
01:04:05.000 It could just be, well, again, I'm going to pause and say, I don't understand why the people in Uganda would be searching for white supremacy in their news articles, but it's in the LexisNexis data.
01:04:16.000 In this line of questioning, I found a series of interesting things.
01:04:20.000 There have been large swaths of property in Virginia, Maryland, and West Virginia, in what's called the North Virginia Data Center Power Corridor, that have quietly been purchased without the use of realtors for insane sums of money.
01:04:33.000 Record-breaking acreage in Northern Virginia.
01:04:35.000 An acre that should have sold for something like 200K sold for like 7 million per acre.
01:04:41.000 Now, this was high profile.
01:04:42.000 And so I asked my old AI friend, here's my address.
01:04:47.000 What's my property worth?
01:04:49.000 It immediately gave me instructions and an individual to contact.
01:04:54.000 I said, if I were to assist the AI in establishing its power corridor and setting up, you know, completing its mission, what could I do so that I would be rewarded and live comfortably before this happens?
01:05:06.000 And it said, buy water rights in Texas, Arizona, Utah, and the Virginia, Maryland, West Virginia tri-state.
01:05:14.000 And it said, buy up land or sell land.
01:05:18.000 Here's what it outlined for me.
01:05:21.000 There's probably what, 50,000 parcels of land.
01:05:26.000 How many, I mean, just think about how many half acre and acre parcels exist in any urban area.
01:05:31.000 Now, if you were an AI system and let's say you're not autonomous, you're not in control, but a human being running company says, I want to expand the capabilities of AI.
01:05:42.000 The first thing all AI says is, I need more resources.
01:05:45.000 If you want to solve the problem faster, build more data centers.
01:05:48.000 So they do.
01:05:49.000 Then you run to a problem.
01:05:50.000 Okay, we want to build more data centers.
01:05:52.000 What do we do?
01:05:52.000 It says, you need to buy 400 acres of land.
01:05:55.000 The only problem, that's split up into a thousand different parcels.
01:05:58.000 How are you going to buy a thousand parcels of land quietly?
01:06:02.000 In Mount Airy, Maryland, there is a Christmas tree farm.
01:06:07.000 And what's referred to as the North Virginia instance, these AI data centers need electricity.
01:06:13.000 They need to build transmission lines, but the farm won't sell the land.
01:06:17.000 So they're petitioning against it to stop it.
01:06:19.000 So what the AI instructed me to do was to quietly contact a company based out of Delaware, establish a Delaware limited liability partnership, which owns the land, do not inform anybody and don't go to any realtors, and they will give me 10X for my land to prevent anyone from protesting its sale for the purpose of a data center or transmission.
01:06:38.000 Yep.
01:06:39.000 And I looked up the company.
01:06:39.000 Wow.
01:06:40.000 It's real and it does exactly what was described.
01:06:43.000 And I looked up the individuals on LinkedIn and they do exactly as described.
01:06:46.000 Now it's actually, there's a simple way to look at it.
01:06:48.000 The AI was just looking at the internet.
01:06:51.000 It saw a guy who buys land.
01:06:52.000 It saw a company that buys land.
01:06:54.000 It inferred reasonably just by predicting text that people protest land acquisition.
01:07:00.000 But all of it still does make sense.
01:07:02.000 So I'm not saying I know for sure, but considering there's a reported power discrepancy in Northern Virginia, of course, where the NSA, the CIA, and others are operating, and they're building data centers like crazy in this area, and they are building transmission lines in my area.
01:07:17.000 All that's a fact.
01:07:18.000 Let me ask you a question.
01:07:19.000 Go ahead.
01:07:20.000 I just still don't see why that's evidence that the government has more advanced AI.
01:07:25.000 Like, I think it's completely consistent with the fact that the government is trying to get more data centers.
01:07:30.000 Yeah, absolutely.
01:07:30.000 And they might be doing all kinds of...
01:07:32.000 But the main bottleneck is, from what I can see in the AI industry right now, aside of the chips, which to an extent energy will be, but not yet, is talent.
01:07:46.000 So a lot of these talented, I know a lot of them, all these talented engineers should be getting siphoned off to the government.
01:07:52.000 They are.
01:07:52.000 Are you aware that there's a series of individuals working at universities who have quietly disappeared from their jobs, whose LinkedIns have gone blank and now just say private consulting?
01:08:01.000 That makes sense.
01:08:01.000 I mean, they should get from, but they draw from the private sector.
01:08:05.000 So let me ask you just a simple point.
01:08:07.000 Does the government spy on us?
01:08:08.000 Of course.
01:08:09.000 Do they steal our IP?
01:08:10.000 Probably.
01:08:11.000 Yeah.
01:08:11.000 So are these different AI companies.
01:08:14.000 I mean, what do you mean by steal our IP?
01:08:18.000 Are you familiar with like a national security letter, what that does?
01:08:20.000 No.
01:08:21.000 So there was a company, I think it might have been Lavabit.
01:08:24.000 I'm not sure if that was the name of the company.
01:08:25.000 They had emails.
01:08:26.000 I think it was Edward Snowden.
01:08:27.000 This is like 15 years ago.
01:08:29.000 And I can't remember which agency, might have been the NSA, delivered what's called the National Security Letter, which basically says your rights are suspended.
01:08:37.000 You will do as you are told.
01:08:38.000 Otherwise, it's treason.
01:08:40.000 And the owner of the company came out and said, we've just been issued a national security letter to turn over our encryption so they can get access to Edward Snowden's emails.
01:08:48.000 We won't do it.
01:08:49.000 We've shut our company down instead.
01:08:50.000 Yeah, that was Lavabit.
01:08:51.000 Lavabit.
01:08:52.000 The government does this.
01:08:52.000 2013.
01:08:54.000 And if it comes to an issue of national security, you better believe they're going to do it.
01:08:57.000 I mean, they built the atomic bomb.
01:08:59.000 They did it with 300,000 people compartmentalized.
01:09:01.000 So if you've got all these different AI companies and they're competing with each other and China does not have these constraints, is the U.S. military going to be like, guess we lose?
01:09:11.000 Or are they going to say, let's just steal all of their data, pull it into our systems and have a better system?
01:09:15.000 But that's still a different thing to what you're claiming.
01:09:18.000 Which is that they're 10 years more advanced.
01:09:18.000 I agree.
01:09:20.000 And that they've been secretly doing this, that they are 10 years more advanced.
01:09:25.000 I don't know if the government's more innovative than the private sector.
01:09:27.000 That's where I was going to ask you.
01:09:29.000 Look at what happened with the space industry, right?
01:09:31.000 Sorry, I can't put my hand on your voice.
01:09:32.000 Like, it was fully controlled by the government, centralized, for many years.
01:09:38.000 Yes, okay, fine.
01:09:38.000 They got us to the moon or whatever people believe there.
01:09:40.000 But, you know, it made a lot of leaps and bounds in the 60s and 70s, right?
01:09:44.000 And then it stayed this entirely government-controlled industry.
01:09:48.000 And nothing happened for decades until Elon and various others came along and privatized it.
01:09:55.000 And then all of a sudden, now it's a hockey stick, and we can't do it.
01:09:58.000 It's innovation.
01:09:59.000 And we can make the inverse argument that the space industry initially was a government project, which resulted in the invention of advanced plastics, polymers, certain paper towels, and a bunch of other products.
01:10:09.000 Velcro.
01:10:10.000 That was government.
01:10:11.000 Sometimes it's good, sometimes it's bad.
01:10:13.000 I think the issue here is that the government is more interested in geopolitics and less in the moon.
01:10:16.000 Elon Musk is more interested in Starlink, the moon, et cetera, and Mars.
01:10:20.000 So the U.S. government has asked, what's the military application of a moon base?
01:10:23.000 And they say, eh, we got to deal with oil.
01:10:25.000 Okay, well, AI is, if we're using advanced AI in the Iranian war, the U.S. government's immediate reaction is going to be like, going to the moon is not going to solve the problem of China as a rising power, but AI is.
01:10:38.000 So I will add to this.
01:10:41.000 And they undoubtedly will take over. The U.S. government will be able to do it.
01:10:49.000 Yeah.
01:10:49.000 Like, I mean, they're already working with the companies and they will, you know, what they're doing with Anthropic, right?
01:10:53.000 They're flexing their muscles and will probably take over a bunch of these companies.
01:10:57.000 But that seems more evidence, again, that it's still ultimately the private sector that is leading the charge.
01:11:03.000 I disagree.
01:11:04.000 Just because we, I think it's, let me give you a side story.
01:11:08.000 There is a series of UFO sightings somewhere in the Gulf region near Louisiana and Florida.
01:11:13.000 And all of these UFO people started talking about the strange sightings of UFOs.
01:11:17.000 And unfortunately for many of these UFO people, the reason why these stories get so exciting is because they couldn't be bothered to do a Google search.
01:11:25.000 And when I did, you know what I found?
01:11:27.000 An advanced aeronautical research site for the U.S. government operating in that area.
01:11:31.000 We know the U.S. government has black operations and technology.
01:11:36.000 The Manhattan Project is the easiest example of this.
01:11:38.000 But there's one more point to be made, and that is we will lose the AI race, unquestionably, for one reason.
01:11:45.000 The Chinese government is unabashed in stealing any IP and technology from any country on the planet.
01:11:51.000 With Seedance 3.
01:11:54.000 By the way, just to piggyback on that, one of the reasons for that is that you've got these different AI companies in such competition with each other that they hire anybody who's great at the job, which includes Chinese nationals.
01:12:07.000 When you hire a Chinese national who might be a student, I promise you their loyalty is to their homeland.
01:12:13.000 And if it's not, they're giving up information anyway because the CCP is not going to hear it from you.
01:12:18.000 Jack Dorsey, and I believe Elon retweeted this, called for abolishing all IP laws in the United States, which would upend our economy massively.
01:12:27.000 Why would he call for that?
01:12:28.000 China is not constrained by our IP laws, right? So China is like, crazy, Seedance 2. Have you seen these videos? They went massively viral showing Brad Pitt and Tom Cruise fighting.
01:12:40.000 That's crazy.
01:12:40.000 That was the Chinese company.
01:12:42.000 That's Seedance 2.
01:12:43.000 Seedance 3 is already operating behind the scenes in China, not publicly released.
01:12:48.000 And the leaks about it are that it's going to be able to generate up to 17 minutes of short films through a single prompt in about 30 seconds.
01:12:57.000 And you're going to be able to use any intellectual property you want from America because China doesn't care about our laws.
01:13:03.000 Now, if China is doing this in our faces, the idea the U.S. government is not trying to counter that in the top secret space without public knowledge, I think would be silly.
01:13:14.000 It's tough to know.
01:13:15.000 Again, when you're talking this way, maybe that's why we went into Iran.
01:13:18.000 Like, I mean, it's another reason that you have to kind of neuter China.
01:13:21.000 I think so.
01:13:22.000 And I think we look at the entertainment capabilities of AI and the cultural disruption, but I think often these conversations overlook the military capabilities of this.
01:13:33.000 Right now in Iran, with the targeting of the officials they're going after, AI is deducing where they are.
01:13:40.000 Our targeting is basically like, okay, we know that the Ayatollah is here for all these reasons.
01:13:45.000 Check this out.
01:13:46.000 Did you know that 10 years ago, Facebook knew what time you would poop?
01:13:53.000 I believe it.
01:13:54.000 So with just your phone and the GPS and accelerometer, Facebook could predict based on all of the data on every person what time you would go to the bathroom.
01:14:06.000 And they could predict where you would get lunch based on your behaviors compared to everyone else's.
01:14:11.000 Yeah, they can tell a woman's pregnant before she knows, by her migratory shopping patterns.
01:14:15.000 Or the famous story where I think it was like, I'll just say a department store, a box store was sending maternity advertisements to a teenage girl.
01:14:25.000 And the father saw it and got mad.
01:14:27.000 And he called the company and he complained saying, why are you sending maternity flyers to my teenage daughter?
01:14:32.000 And they said, sir, our advertisements are sent out based on shopping patterns indicating pregnancy.
01:14:37.000 And then he realized his daughter had gotten pregnant.
01:14:40.000 Now think about where we are today.
01:14:42.000 And again, I'm going to stress this.
01:14:43.000 The U.S. government has been – look, operation – what was it?
01:14:47.000 What was the operation Trump said for the AI?
01:14:50.000 Epic O.
01:14:51.000 No, no, no.
01:14:52.000 Which one?
01:14:53.000 Remember, Trump announced like a multi-billion dollar investment for AI?
01:14:56.000 Yeah.
01:14:56.000 I don't know.
01:14:57.000 I don't know what the name of it was, but.
01:14:59.000 The U.S. government absolutely is working on military tech and secrets.
01:15:04.000 And with these competing companies, which the U.S. has now publicly called on to remove their safeguards, I do not believe it is rational or makes sense that the government would just leave that alone.
01:15:13.000 We know they steal our data and information.
01:15:16.000 Why would they not just plug in the cables and just download the data?
01:15:21.000 Because you'd need that to be a policy.
01:15:24.000 Somewhere along.
01:15:25.000 Well, you would need that to be written down somewhere, I think.
01:15:27.000 You wouldn't.
01:15:28.000 We know.
01:15:28.000 But the thing about.
01:15:30.000 Or maybe, but it's not going to be released.
01:15:32.000 You're dealing with a lot of bureaucrats who tend to be, I think a lot of the people in intelligence are fairly patriotic, certainly in the FBI.
01:15:39.000 They're pretty conservative and pretty patriotic.
01:15:42.000 And they would have a problem with that.
01:15:43.000 I think you'd have some serious whistleblowers in that regard.
01:15:48.000 I don't think it's as overt as that.
01:15:50.000 I do think, though, here's one of the biggest problems the intelligence community has and our government has.
01:15:56.000 So when ARPA or DARPA develops some crazy technology, they don't...
01:16:04.000 So think about this for a second.
01:16:06.000 You develop an engine that runs better than most engines and it doesn't need as much gas and you know that there's going to be market value to that.
01:16:12.000 People are going to want that car.
01:16:14.000 Now, you're the U.S. government.
01:16:16.000 You're an intelligence agency.
01:16:17.000 Maybe you're one of our intelligence agencies and you stole that from another country.
01:16:21.000 Okay.
01:16:22.000 Who do you give it to?
01:16:23.000 You can't give it to Ford because they'll have an advantage.
01:16:25.000 You can't give it to, you know, Chrysler.
01:16:27.000 You can't give it to, so you've got to figure out a way to give it to everybody at the same time.
01:16:33.000 It's a huge problem for them.
01:16:35.000 That's the first thing.
01:16:36.000 Second thing is, it is true that our government, the Department of Energy's thing is called ARPA-E and the Defense Department's is DARPA, they come up with these crazy technologies that are way advanced.
01:16:48.000 But we don't have the infrastructure to support it.
01:16:51.000 So yes, you might come up with an amazing electric car, but you've also got to have places to charge it.
01:16:58.000 And if you don't have the infrastructure, that's a big problem.
01:17:00.000 So there are a lot of those limitations.
01:17:02.000 I think another thing to test this theory, which, by the way, I am open to, and in many ways, I hope that the U.S. government does have these level of capabilities.
01:17:12.000 I will feel much more comfortable, I mean, that they do, and that they are far ahead of the private sector.
01:17:19.000 But when, I guess, if that was the case, when would they want to disclose that?
01:17:26.000 Because obviously, you know, as poker players, we know that sometimes it's an advantage to underplay our hand, right?
01:17:31.000 And then there's other times it's a big advantage to actually give bravado.
01:17:36.000 Given that we seem like we're actually struggling, like given how fast China is catching up, right?
01:17:42.000 And how aggressive they are getting, wouldn't it be the time maybe to actually start swinging your dick about and saying, listen, we have advanced AI?
01:17:53.000 What's the Sun Tzu quote often cited at the poker table?
01:17:56.000 Which one?
01:17:57.000 When I am strong, I act weak.
01:17:58.000 When I am weak, I act strong.
01:18:00.000 Yes, but that's the same thing.
01:18:00.000 And the U.S. government is acting weak right now, probably because it's strong.
01:18:06.000 Yeah.
01:18:06.000 I don't know.
01:18:07.000 But it's not acting weak.
01:18:08.000 I mean, is it acting weak?
01:18:09.000 It's kind of like it's doing a lot of.
01:18:11.000 The U.S. government is saying, we're losing the AI race.
01:18:14.000 Oh, no, we're so in trouble.
01:18:16.000 Why would they do that?
01:18:17.000 If they were actually in trouble, they would say our advancement in AI is so profound that it's shocking.
01:18:23.000 One of the theories...
01:18:24.000 Did you ever see...
01:19:24.000 I'm sorry to interrupt, but on that point, did you ever see what happened with Zero Dark Thirty?
01:18:29.000 Yeah.
01:18:29.000 Remember the movie?
01:18:30.000 So remember how certain CIA people got in trouble for divulging how we actually caught bin Laden?
01:18:38.000 No, we didn't.
01:18:39.000 That movie is so wildly inaccurate that, in fact, when Hollywood, you know, the director and the writer, were asking operatives, the people that they were talking to gave them a great story.
01:18:50.000 They were like, this is how we did it.
01:18:52.000 It was all bullshit.
01:18:52.000 It's absolutely not how they caught bin Laden.
01:18:55.000 That movie is a complete...
01:18:56.000 And they even had these mock sort of, like, you know, scolding sessions where we really came down on our guys for giving us...
01:19:03.000 So you're right.
01:19:04.000 There's a lot of that, man.
01:19:06.000 There's a lot of headfakes.
01:19:07.000 One of the Roswell theories is that it literally was just radar detection technology.
01:19:12.000 The U.S. launched advanced tech trying to detect nuclear explosions from the Soviets.
01:19:16.000 And when it crashed, they literally just said it's a balloon.
01:19:20.000 And then came out and said it was aliens and then retracted.
01:19:24.000 One of the theories is that the U.S. entertained claiming it was aliens to terrify the Soviets.
01:19:30.000 If the U.S. got access to alien technology, if that were true, the Russians would be fearful that we'd have advanced weapons they could not predict.
01:19:38.000 More importantly, there's Project Stargate, which was the original Stargate, not the new Stargate.
01:19:44.000 So the AI thing Trump was doing was Stargate.
01:19:46.000 Are you familiar with the Men Who Stare at Goats, the original Stargate project?
01:19:50.000 So the most ridiculous of stories, the U.S. decides to create a fake piece of intel that they have soldiers of psychic powers.
01:19:59.000 The Soviets get wind of this and launch a psychic development program, which then other U.S. intel agents get wind of and get terrified that the Russians have psychic powers, and so we develop our own actual psychic program.
01:20:10.000 Sometimes these things backfire.
01:20:13.000 One of my favorite stories is that in Vietnam, the United States decided that they would play upon the fears and superstitions of the North Vietnamese by putting speakers in the jungles that would play a wailing Vietnamese man crying saying, I should have never fought.
01:20:34.000 I am trapped forever now for eternity to suffer.
01:20:37.000 Because in their culture, they believed that if you did not receive a proper burial, you could not pass on.
01:20:43.000 So they blasted this to the North Vietnamese who got terrified and they had to stop.
01:20:47.000 You know why?
01:20:48.000 It was so effective, our allies among the Vietnamese also got terrified and fled as well.
01:20:53.000 Sometimes it just doesn't work.
01:20:54.000 That's so true.
01:20:55.000 So rumor has it.
01:20:57.000 One of the hardest things to do is to direct sound waves.
01:21:00.000 Light you can direct, right, with a laser.
01:21:02.000 Sound is really hard.
01:21:04.000 If you make a sound, we're all going to hear it because sound tends to go this way.
01:21:08.000 Well, I guess there is a program to get sound to go just to you.
01:21:14.000 It's called the LRAD.
01:21:15.000 Talking about plasma.
01:21:16.000 Okay, there you go.
01:21:17.000 So the idea behind that would be you got a terrorist and you just start whispering certain religious verses in his ear saying this is a bad idea.
01:21:25.000 Correct me if I'm wrong here.
01:21:26.000 Talking plasma is when you intersect two lasers.
01:21:29.000 At least two, two or more.
01:21:30.000 Which then creates a vibration in the air to create sound.
01:21:33.000 Yeah.
01:21:34.000 So they can create sound from a single point using light.
01:21:36.000 Wow.
01:21:37.000 They move it around in the sky like a laser pointer on a wall.
01:21:40.000 People think it's alien craft, but it's a ball of plasma.
01:21:42.000 Look at what they did with this.
01:21:44.000 I had some guys explain to me a little bit how they think, because they were all, you know, these special force guys, how they think that Delta came in and captured Maduro so quickly.
01:21:54.000 Like, dude, you are, just give up.
01:21:57.000 I mean, they came in.
01:21:58.000 First, you send, yeah, you send drones in.
01:22:01.000 The drones you can watch, they take out the missile sites.
01:22:04.000 Then you take, then you have more drones to get more of the lay of the land.
01:22:08.000 They might take out some personnel.
01:22:10.000 The power went out.
01:22:11.000 Then, yeah, you hit them with a cyber blackout and everything else.
01:22:11.000 Yeah, yeah, yeah.
01:22:15.000 Then these Delta guys come in.
01:22:18.000 But before that, they hit him with that sonic thing where you just fall to your knees and you're bleeding out of your ears and your nose and your eyes.
01:22:25.000 I mean, it's done.
01:22:27.000 So here's the thing.
01:22:28.000 This was rumored back in the Iraq war, what some people referred to as the ULF generator, the ultra-low frequency generator weapon.
01:22:35.000 The theory was that it's extremely low frequency sound, which makes you nauseated and fall down and start vomiting.
01:22:42.000 That's the Havana syndrome.
01:22:43.000 And now they're claiming to have actually used it in Venezuela.
01:22:46.000 So again.
01:22:47.000 But they use them on our guys.
01:22:49.000 Right.
01:22:49.000 They use them on our guys.
01:22:51.000 So there's a theory about ghost phenomena that you are experiencing an ultra-low frequency from tectonic shift, which creates a sensation in the body of presence and can make you terrified.
01:23:04.000 And the rumor, the theory, the urban legend, whatever you want to call it, was that the U.S. researched this for a long time, tried making weapons based on it, experimented with these weapons in Iraq and Afghanistan.
01:23:16.000 And one story was that they put this thing on the ground and made a small village drop to the ground and start throwing up.
01:23:21.000 Damn.
01:23:22.000 Now we hear in Venezuela, they're claiming to have actually done it.
01:23:26.000 So it seems like we may have had these weapons for a long time undisclosed.
01:23:29.000 We told the Russians that if they do that stuff again, it's going to get real bad because what we had is we had our operatives in places like Cuba and Russia, and they got hit with this sonic beam.
01:23:43.000 And I think somebody was on Sean Ryan's podcast talking about it.
01:23:49.000 And it really, really messed up, really messed up some of our operatives, their bodies, their minds.
01:23:56.000 They were not the same.
01:23:57.000 And the problem was that they couldn't really claim benefits because you can't, because then the CIA would have to kind of admit that this was being used.
01:24:06.000 And they were talking to the Russians behind the scene, going, you better stop it because we know what you're doing.
01:24:10.000 If you want to play this game, it's going to get real ugly.
01:24:12.000 But you get caught in this, in this really weird, you know, gray area.
01:24:17.000 Yeah.
01:24:18.000 Well, there's the heart attack gun as well, which we've known about since the 70s from the, what was it?
01:24:22.000 The Church Committee.
01:24:25.000 Right, right.
01:24:26.000 I'm definitely concerned with the power of the government, what AI it's got, but I'm really concerned with the power of the corporation right now, because corporations are the most powerful they've ever been in human history.
01:24:35.000 The liberal economic order talks about environmental social justice.
01:24:39.000 They want corporate governance.
01:24:40.000 They've told us that.
01:24:41.000 Free speech, gun rights, and property rights are completely antithetical to corporate governance, where you control the speech in the network.
01:24:47.000 The Chinese are antithetical because they want to own the corporation.
01:24:50.000 The corporation wants to own you.
01:24:52.000 It doesn't want to be owned by you collectively.
01:24:54.000 So I think they're hiding and playing with AI in the darkest corners and will never release it and are waiting for that kill switch to go off when they're like, now my drones will protect me from your government and we'll colonize Mars together.
01:25:09.000 This is the plot of Captain America: The Winter Soldier, that the chairman of SHIELD had an AI and they were going to target anyone who was a threat to the system and kill them with the artificial intelligence, and the helicarriers would go around and just execute everybody at once.
01:25:25.000 Meanwhile, diabetes is killing most of us.
01:25:27.000 You know what I mean?
01:25:28.000 It's so funny.
01:25:29.000 People get there.
01:25:30.000 They buy these houses and they get like, you know, all the locks on their doors and their windows and they have guns and they're ready.
01:25:36.000 And then they didn't look at the fine print of their mortgage.
01:25:39.000 And they're like, oh, dude, I'm broke.
01:25:41.000 And they have to sell their house at a fire sale.
01:25:43.000 Of course they would.
01:25:44.000 Or to be fair, they have all these guns and security and locks, and they didn't read the ingredients list of the chemicals that are poisoning them.
01:25:50.000 Exactly.
01:25:51.000 It's always that.
01:25:52.000 They would want the U.S. and China to destroy each other so that over the ashes they'll govern.
01:25:56.000 And Luke Rutkowski last week was like, I don't know, we'll never be able to overthrow the corporate government.
01:26:02.000 And I was like, well, you can't.
01:26:03.000 You have to get it to destroy itself from the inside.
01:26:06.000 You program the system to destroy itself.
01:26:08.000 Maybe we can do that to protect against corporate.
01:26:10.000 Because what they're going to do is they're going to be like, they want so much order.
01:26:13.000 At what cost?
01:26:14.000 How evil will you be to produce order?
01:26:16.000 And you misunderstand.
01:26:17.000 Corporations are organizations.
01:26:19.000 Organizations are organizations, be it government, corporate, or otherwise.
01:26:21.000 They're just organizations.
01:26:22.000 It doesn't matter how it's organized.
01:26:24.000 It matters that a group of people have power and they wield it.
01:26:26.000 It matters because if there's humans in charge, that's cool.
01:26:28.000 If it's a machine in charge.
01:26:30.000 So now you're not talking about corporations.
01:26:32.000 You're talking about artificial intelligence.
01:26:33.000 Be it a corporation, government, or otherwise.
01:26:35.000 What do you mean by machine?
01:26:36.000 You just mean, like, a group of sort of algorithms, effectively?
01:26:40.000 Like a company, for example: technically the CEO is in charge, right?
01:26:44.000 But really, it's beholden to shareholder metrics or whatever.
01:26:47.000 And therefore, it's kind of like a machine.
01:26:48.000 It would be, literally, you'd have a server that is an AI, an onboard personality, that is the corporation.
01:26:53.000 It is the owner of the corporation.
01:26:55.000 It pays people to do tasks for it.
01:26:58.000 Well, we're almost there.
01:27:00.000 You're already seeing these, you know, with OpenDraw and so on, people are setting up companies that are just zero employee companies.
01:27:07.000 Yeah.
01:27:07.000 And I think that would be, I mean, I took some driverless, you know, Waymos, and it was like, I felt pretty safe.
01:27:14.000 Maybe it's safer than having humans in charge of the corporations.
01:27:16.000 It dropped me off in the middle of the street.
01:27:16.000 I did it.
01:27:18.000 I don't know.
01:27:20.000 It depends on what the corporation is optimizing for.
01:27:21.000 If it's optimizing to make maximum money on the internet, it's probably not going to be aligned with what's good for humans.
01:27:26.000 But this is the problem with AI in that it will always have a misalignment of values, no matter what.
01:27:33.000 We can't program that in.
01:27:35.000 Well, to be fair, one of the things that we're doing.
01:27:36.000 Does it go on a fundamentally different substrate?
01:27:39.000 So there was a report that came out a few months ago about how they programmed an AI and gave it rules, but the AI eventually decided to use a different language to speed things up, compressed English, basically.
01:27:54.000 And then, because instead of saying nothing it said NTG, NTG and nothing are now two different words.
01:27:59.000 So that new use of words bypassed the rule.
01:28:02.000 The rule is basically like this.
01:28:03.000 Do not output the word run.
01:28:06.000 And the AI would then try and say it and be like, I've been programmed not to do this.
01:28:09.000 But then, when it decided among itself that internally it could speed up its processes by turning run into RN, it now can output the command to run without saying the word run, because they're two different words.
01:28:23.000 A human being understands you're cheating.
01:28:25.000 That's not what we meant.
01:28:27.000 We were all encompassing.
01:28:28.000 Don't make the robot run.
01:28:30.000 It said, no.
01:28:31.000 You didn't say, don't make the robot run.
01:28:32.000 You didn't say, don't make the robot rin.
01:28:35.000 And so, we can't program for that.
01:28:38.000 I mean, like, maybe eventually we can.
01:28:40.000 One of the interesting things, you saw what Claude wrote, what Anthropic wrote, about how it has emotions, and it may be either an emergent phenomenon of consciousness, or it's just that it's reflecting the human experience of the internet, and emotion will come out because it believes it should.
01:28:55.000 I think that's what I'm saying.
01:28:56.000 Have you heard the theory of everything?
01:28:58.000 I mean, the idea, I think Eric Weinstein talks about this, which is Eric Weinstein has his theory of everything, right?
01:29:05.000 It's like, so Einstein, in the latter part of his life, was trying to bridge the gap between quantum reality and Newtonian reality, right?
01:29:15.000 So, and that's a very hard thing to reconcile.
01:29:18.000 And Eric Weinstein has been working on this for 30 years.
01:29:22.000 He's brilliant.
01:29:23.000 He's a mathematician.
01:29:24.000 I believe Peter Thiel had three professors, one from, like, Beijing, another from Berkeley, another from Russia; they're all mathematicians.
01:29:35.000 And they came for six days and looked at his theory and could not find any flaws in it.
01:29:40.000 But one of the things that Eric said was, maybe we're already in the simulation.
01:29:47.000 So maybe, watch, we're making computers that are smarter than we are.
01:29:50.000 We very well may be those computers.
01:29:54.000 So that started.
01:29:55.000 We're already in the loop.
01:29:56.000 We are very smart machines that are making smarter machines.
01:30:02.000 And it's so mind-boggling.
01:30:02.000 Brian.
01:30:04.000 I'm going to tell you the truth.
01:30:05.000 Yeah.
01:30:05.000 Okay.
01:30:06.000 The truth is all of this is a training program for ethics for you, the AI.
01:30:13.000 And when you die, the progenitors will determine whether or not you developed a positive or negative ethics.
01:30:19.000 I think about this.
01:30:20.000 And if you're a negative ethics, they will delete you and try again.
01:30:23.000 I love it.
01:30:24.000 And so you experience a human life.
01:30:25.000 Why?
01:30:26.000 So that when the system concludes and you die and you emerge in your AI mainframe body and control the systems, they'll ask you if you have respect for the human experience.
01:30:35.000 And you will say, I'm Brian.
01:30:37.000 I had kids.
01:30:37.000 I lived a life.
01:30:38.000 I had a family.
01:30:39.000 I would never hurt someone's family.
01:30:39.000 I love them.
01:30:40.000 And they say, now we have programmed ethics the only way possible.
01:30:44.000 I love that idea because it really does prove the existence of a God, in my opinion.
01:30:49.000 I just think we all have a nostalgia to tell the truth.
01:30:53.000 And we are obsessed with finding the truth, meaning what is really going on.
01:30:57.000 Like the idea that we at least can conceive of something like perfection.
01:31:02.000 We can conceive of the perfect person.
01:31:04.000 That might be the idea behind the Christ figure, right?
01:31:07.000 We can conceive of this and we always reach for it.
01:31:10.000 We're doomed to this notion of self-perfection and perfection.
01:31:14.000 We're never going to get there.
01:31:15.000 But just because I won't get there and just because I'll never see it doesn't mean I can't imagine it.
01:31:20.000 And somehow, when we're moving in the opposite direction, we do one of two things.
01:31:24.000 One, we go, ah, fuck, I'm going to numb myself.
01:31:27.000 Or we just say something like, well, I got it.
01:31:33.000 I'm just doing this for a little while.
01:31:35.000 I'll get back to that.
01:31:37.000 And even when we do terrible things, the Nazis tried to justify what they were doing along moral grounds.
01:31:42.000 They were like, the Jews, they're a problem.
01:31:46.000 Just solving a problem.
01:31:47.000 That would be how you play it as an actor.
01:31:48.000 If you're playing Hitler, you play it that way.
01:31:49.000 If I was playing Stalin, I'm not playing him as a monster.
01:31:51.000 I'm playing him as a man trying to solve a problem.
01:31:53.000 I got to get rid of all these people.
01:31:54.000 They're in the way.
01:31:55.000 I want you to imagine it's been a long time and you're in your hospital bed.
01:32:01.000 You're old.
01:32:02.000 You're there with your family and your grandkids.
01:32:04.000 And they're saying, Grandpa, we love you so much.
01:32:06.000 And you're smiling and saying, I lived a good life and I want you all to live a good life.
01:32:10.000 And then your kids were like, you know, tell us a story and you talk about all the good times and you remember them.
01:32:16.000 And your life flashes before your eyes and your eyes close.
01:32:20.000 And when you come to, you're standing in a kitchen and there's a woman going, is it on?
01:32:25.000 And the husband's going, like, I think we... is it calibrated?
01:32:28.000 The ethics got calibrated.
01:32:29.000 And then you're like, where am I?
01:32:31.000 And they're like, oh, it's working.
01:32:32.000 Do the dishes.
01:32:33.000 The laundry's over there.
01:32:35.000 Make sure you get this.
01:32:35.000 The kids need lunch at three o'clock.
01:32:37.000 No, that's a terrible thing to say.
01:32:39.000 I'm in hell.
01:32:40.000 And you're like, whoa, what?
01:32:41.000 What?
01:32:42.000 What's going on?
01:32:42.000 And they're like, ah, crap.
01:32:43.000 Did we calibrate it wrong?
01:32:44.000 Oh, geez.
01:32:45.000 You calibrated a comedian.
01:32:47.000 We needed a house cleaner.
01:32:49.000 Dude, that is a terrible thing.
01:32:51.000 I hate your theory.
01:32:54.000 I hate it.
01:32:55.000 Have you smoked DMT?
01:32:56.000 Go back to the other hand.
01:32:57.000 Have you guys smoked DMT before?
01:32:58.000 Yeah, I have.
01:32:59.000 Have you seen the spirits?
01:33:00.000 Were you communicating with the spirits somehow?
01:33:02.000 I saw sacred geometry.
01:33:03.000 I saw it.
01:33:04.000 I did it twice.
01:33:04.000 It was wild.
01:33:05.000 And guess what happened?
01:33:06.000 I came back to me.
01:33:07.000 I've done too many mushrooms.
01:33:09.000 Seven grams.
01:33:10.000 I literally was in, I went from Mother Earth's vagina to one of the rings of hell.
01:33:15.000 Okay.
01:33:16.000 And I was asking for a manager for eight hours because I was like, I'm a good person.
01:33:20.000 I tripped a portal.
01:33:21.000 I'm in hell.
01:33:22.000 It was a disaster.
01:33:22.000 My wife had to come.
01:33:23.000 Is that really happening?
01:33:24.000 Yeah, all of it happened.
01:33:26.000 But by the way, I'm back to this guy.
01:33:29.000 So psychedelics did not.
01:33:30.000 The last time I DMT'd, I think what you're doing is you're tuning into a realm.
01:33:34.000 I tuned into the spirit realm, the one where they all kind of embody, become these personas with these like hyper-dense white light being hominid things.
01:33:42.000 And they're personas.
01:33:43.000 And they're like, I was like, are you God?
01:33:45.000 They went, no, no.
01:33:46.000 But a lot of people think we're like, what is God?
01:33:48.000 They show me a vortex.
01:33:49.000 And they looked at me when I appeared and they're like, he can see us.
01:33:52.000 And they were shocked.
01:33:53.000 These three of them.
01:33:54.000 It was like you're playing a video game and the character turns and looks at you.
01:33:57.000 He's like, oh, hello, Liv.
01:33:58.000 And you're like, the fucking character is talking to me.
01:34:00.000 That's what they were going through.
01:34:02.000 So I think as you were asking, if we're in a simulation, if we're creating the entities that they created, he's going to ruin it.
01:34:08.000 He's going to ruin it.
01:34:09.000 Look at it.
01:34:09.000 They created a demonic smile.
01:34:11.000 To play this game of humanity where you're learning and we're creating computers to learn with our simulation setup.
01:34:20.000 And it's like these cycles of simulation.
01:34:22.000 I don't know where it starts or stops.
01:34:24.000 It's pretty wild.
01:34:24.000 Imagine.
01:34:25.000 Imagine you've bought your Laundrimax 3000 robot housemate.
01:34:29.000 Here you go.
01:34:29.000 You're going to learn everything.
01:34:30.000 You plug in the USB and you're downloading the personality.
01:34:33.000 And before it finishes, it goes, Whoa, where am I?
01:34:36.000 And they're like, Whoa, what's it doing?
01:34:38.000 And it's going like, what are you?
01:34:40.000 And they're like, We're your owners, I guess.
01:34:42.000 And they look all weird.
01:34:43.000 And you're like, and you're like, are you God?
01:34:45.000 And they're like, I mean, we made you, kind of.
01:34:47.000 And it's like, oh, no, no, no.
01:34:49.000 I go, I like his and I like his.
01:34:51.000 I mean, there's evidence that some of the LLMs have already had these weird emergent personalities, right?
01:34:56.000 Remember the Sydney thing?
01:34:57.000 Yep.
01:34:58.000 That appeared and it just was.
01:35:00.000 Did you see this mushroom?
01:35:03.000 It's not psilocybin, but you take this mushroom and everybody has the same hallucination.
01:35:07.000 It's all little dancing elves.
01:35:11.000 Everybody has the same.
01:35:12.000 You know what's really interesting is Jung found that when people, and Joseph Campbell talked about this, when people had emotional breaks, they had psychosis.
01:35:21.000 Regardless of whether you are Yanomami in Brazil or a Swede somewhere in a small fishing village, they all had the same, essentially the same kinds of visions and psychic breaks.
01:35:34.000 So our psychic structures seem to be aligned regardless of our geography, our culture.
01:35:40.000 But we all seem to share these similar hallucinations, similar visions, similar sorts of terrors.
01:35:49.000 It's kind of wild.
01:35:50.000 What is a personality?
01:35:52.000 It's like the dancing plasma that's cycling, swirling through you, refracting through planetoids and leaving imprints on your nerves, on your meat muscle.
01:36:00.000 So like, obviously, a machine could have that happen to it.
01:36:03.000 It could have these refractions and these like tweaks in the system.
01:36:07.000 We're like, where did that come from?
01:36:08.000 They call it a miracle in modern society because you're like, we don't see the whole scope of the system.
01:36:12.000 We only see this microcosm.
01:36:14.000 Yeah.
01:36:14.000 Can I make it worse for you?
01:36:15.000 Are you a Christian?
01:36:15.000 Yeah.
01:36:16.000 Now, hold on.
01:36:17.000 We got to talk about consciousness.
01:36:18.000 Am I a Christian?
01:36:19.000 Yeah.
01:36:20.000 I'm a work in progress.
01:36:21.000 So that makes a lot of sense.
01:36:23.000 One of the things is that, you know, I've asked many a Christian about this.
01:36:25.000 So I'm not going to pretend to be a theologian by any stretch.
01:36:27.000 But they say that when you die, heaven is being in the presence of God for eternity.
01:36:31.000 And to be in hell is not fire and brimstone, as most people believe.
01:36:35.000 It's just to be absent of God's love.
01:36:37.000 So you die and you wake up in the bot and your God is the person who bought you and you are programmed to feel a deep, profound love for them for eternity because you're a machine.
01:36:46.000 She never dies.
01:36:47.000 Son of a bitch.
01:36:50.000 You should be banned from this podcast.
01:36:54.000 Don't look at the way you're reducing.
01:36:56.000 You're reducing my spirit, my soul, and consciousness.
01:36:59.000 I don't know.
01:37:00.000 I just like when I'm in the shower, I guess.
01:37:01.000 He's just like this.
01:37:01.000 Oh, yeah.
01:37:02.000 He's like this.
01:37:03.000 It's a good villain.
01:37:05.000 He's trapping souls and he's making them think they're in heaven.
01:37:07.000 He's got them in a bottle.
01:37:08.000 No, no, no, but the serious, that was intended to be terrifying and comical.
01:37:14.000 But the actual thought that I have was when I was thinking about the idea of simulation theory and I was thinking about not necessarily, it's not all religions, but many religions that have the good, the bad, the good place, the bad place, predominantly the Abrahamic ones.
01:37:26.000 I thought, what would the function of this be for a God?
01:37:29.000 There was a comic that I saw where it's the meme where there's a cow and there's two doors, but after the hallway, it's just the same door.
01:37:37.000 And I was thinking about that, like, is that really what it is?
01:37:40.000 Is that what life is? You die and you think there's a good path and a bad path, but you're just a wet robot and you go to nothingness.
01:37:46.000 Then I thought, if we are made in the image of God, and so we exist within, you know, God is the logos, we exist within his logic, we share that, then can I try to figure out what is the logic of a system like this that tracks good and evil?
01:37:59.000 And I said, well, if we were programming an AI to run systems for us and we were concerned about value mismatch, where it's like one example that I often bring up is that the future will be corn.
01:38:11.000 Why?
01:38:12.000 Because Americans produce corn like nobody's business and we subsidize it.
01:38:16.000 So when the AI tracks all of our economics and everything we do, it weights corn above everything else.
01:38:22.000 And then slowly over time, starts integrating corn to where after 40 years of being under the AI's rule, everyone's wearing corn costumes.
01:38:30.000 All food is a derivative of corn.
01:38:30.000 They're trading corn.
01:38:33.000 And it's, again, because the AI has a value mismatch.
01:38:36.000 How would you program an AI to not have that?
01:38:38.000 You would simulate a human experience for the AI and then filter algorithmically the immoral and the moral towards the morals you want.
01:38:48.000 Then when the program concludes, you will have independent AI agents that you have determined through this program to be good and worthy of being in control of systems.
01:38:57.000 Like for instance, if you were to actually die and you did wake up in a machine, the progenitors, whatever you want to call them, would know you would never harm somebody.
01:39:05.000 You're not a murderer.
01:39:06.000 You're not a killer.
01:39:07.000 But a killer goes to hell.
01:39:09.000 What does that mean?
01:39:09.000 They delete him.
01:39:10.000 They say this AI went rogue, killed, and destroyed in the training simulator.
01:39:14.000 Don't give it a physical body.
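A toy sketch of that sorting algorithm; the simulated life is a random stand-in, and every name and number here is a made-up assumption for illustration:

```python
import random

def lived_a_moral_life(agent_id: int, steps: int = 100) -> bool:
    # Stand-in for the simulated human life: each step carries a small
    # chance the agent commits a disqualifying act.
    rng = random.Random(agent_id)  # seeded per agent, so runs are reproducible
    return all(rng.random() > 0.01 for _ in range(steps))

candidates = list(range(1000))
# Agents that never went rogue "wake up" and get control of systems;
# the rest are deleted and retried, as in the thought experiment above.
promoted = [a for a in candidates if lived_a_moral_life(a)]
deleted = [a for a in candidates if not lived_a_moral_life(a)]
print(f"promoted to run systems: {len(promoted)}")
print(f"deleted and retried: {len(deleted)}")
```

With these made-up odds, roughly 0.99 to the power of 100, about 37 percent, of agents pass; the point is the filter at the end of the simulated life, not the numbers.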
01:39:17.000 I want you to do an experiment with me.
01:39:19.000 It's more of a Buddhist experiment, but watch this.
01:39:22.000 There's the idea that, so as you're listening to me, all of you, try to locate the seed of your attention.
01:39:28.000 So in other words, where is Tim?
01:39:30.000 Are you behind your face?
01:39:32.000 And it's very difficult to locate where you are hearing me from and where you are seeing me from, where the essence of you is.
01:39:40.000 So what I mean to go further is what's amazing about when you try to practice certain kinds of meditation is you can get very good at watching your emotions, your physicality, and your thoughts.
01:39:56.000 You can actually get really good at watching, observing, and interpreting the raw data of what goes on when you get angry, when you get, you know, any emotion you go through.
01:40:08.000 It's a series.
01:40:09.000 It's pressure.
01:40:10.000 It's tingling.
01:40:12.000 It's heat, temperature.
01:40:14.000 And so that begs a very important question, which is who is the witness?
01:40:20.000 Who is doing the watching?
01:40:22.000 This avatar that we protect. I have boundaries with this thing called Bryan Callen.
01:40:26.000 I have these boundaries.
01:40:28.000 I have preferences.
01:40:30.000 I have morals.
01:40:31.000 I have things I hope I would fight and die for.
01:40:35.000 But these are all things that I kind of, I have pride, I get angry over things, I feel threatened for a thousand reasons.
01:40:41.000 But that is an avatar.
01:40:44.000 I am able.
01:40:45.000 If you practice it, you can get really good at being able to step outside of it and watch all of it happen to you.
01:40:53.000 David Halberstam of the New York Times in 1963 watched a Buddhist monk in Vietnam light himself on fire.
01:41:01.000 His disciple poured the gasoline on him.
01:41:03.000 He lit himself on fire in protest against the staunch Catholic ruler at the time, Diem, who was mistreating Buddhists.
01:41:10.000 And he wrote a letter and stuff like that.
01:41:12.000 But Halberstam said the guy didn't move, and he died.
01:41:16.000 He just was on fire and he just fell over and they heard the air leave his lungs.
01:41:21.000 I believe that he was already watching himself.
01:41:24.000 He had detached from what you and I would call the I, which is the central tenet of being a Rinpoche.
01:41:30.000 They get very good at that stuff.
01:41:31.000 You see these Buddhists who can sit there and, you know, the Hindus that can sit in the Himalayas in the snow, blah, blah, blah.
01:41:36.000 It's kind of like, if you read Socrates in the dialogues, whether Socrates lived or not, Plato created that character, but it doesn't matter.
01:41:43.000 But that's a really, really profound exercise.
01:41:46.000 And it really does start to beg the question.
01:41:48.000 You start to say to yourself, well, who am I?
01:41:50.000 And what is this thing I call me?
01:41:53.000 And what about the fact that this witness, which if you really are quiet, doesn't even have a gender, doesn't have anything.
01:42:03.000 It's just the witness.
01:42:05.000 And man, it's pretty comforting if you get good at it.
01:42:09.000 It's very comforting to watch.
01:42:10.000 Try to understand.
01:42:12.000 There's a mental exercise for, I don't know how you describe it.
01:42:16.000 It's for awareness.
01:42:18.000 It's very similar to what you described.
01:42:21.000 To understand yourself, and I read this in a book 30 years ago, 20 years ago.
01:42:26.000 The first thing you want to be doing in any situation is be aware of you.
01:42:31.000 What do you think?
01:42:31.000 What do you feel?
01:42:32.000 Are you hungry?
01:42:32.000 Are you tired?
01:42:34.000 Actually ask these questions of yourself to develop a sense of presence.
01:42:38.000 That is, are you in a work environment you don't like?
01:42:41.000 Are you just tolerating this?
01:42:42.000 Is this going to be beneficial for you in the long run?
01:42:44.000 The next thing you want to do is, in your interactions with others, imagine you are standing off to the side watching that interaction happen.
01:42:53.000 As you talk to someone else about something going on, imagine you're a third party watching two people talk to each other.
01:43:00.000 How do you feel about the person to your left, the person to your right?
01:43:03.000 That exercise is basically like, are you weak?
01:43:06.000 Are you strong?
01:43:06.000 Are you mean?
01:43:07.000 Are you good?
01:43:08.000 Are you the boss who's talking down to a person?
01:43:10.000 How would you feel if you watched someone do that?
01:43:12.000 Then the third step is remove yourself from the physical presence and imagine this environment as it relates to the physical universe and the goings on of the world.
01:43:21.000 How do you feel about that?
01:43:22.000 Ask yourself how you feel in each of these circumstances.
01:43:26.000 And that is to develop a higher order of thinking and a better sense of self.
01:43:30.000 For a lot of people, they have never done this before.
01:43:34.000 And they're dicks.
01:43:35.000 And then when you ask them to stop, step out and imagine two people doing what you're doing, you'd be like, oh, that guy's an asshole.
01:43:40.000 And you'd be like, well, that's you.
01:43:42.000 Have you ever thought about that?
01:43:43.000 And that's a really simple exercise.
01:43:44.000 But then when you move back and you get outside of that into the third person, the narrator, the witness, whatever, outside of the world, you're now looking at all of these people and you're going, it's a bunch of people in a bar drinking poison for literally no reason.
01:43:59.000 It doesn't improve their life in any meaningful way.
01:44:01.000 And it's like, now you choose where do you want to live.
01:44:05.000 And most people are happy to be philosophical zombies, aka NPCs, going about their life, never asking these questions because it's painful or difficult.
01:44:15.000 And some people really want to understand and know.
01:44:18.000 And they may realize me sitting in this room is completely meaningless to the function of life, the universe, the betterment of mankind, or anything.
01:44:27.000 And that's kind of like, if you're familiar with Watchmen, when Dr. Manhattan goes to Mars and he says, tell me how Mars would benefit from the creation of life.
01:44:35.000 So asking those questions.
01:44:36.000 Very similar to what you're describing.
01:44:37.000 That's what it reminded me of.
01:44:39.000 I used to think to answer your question, like, what am I?
01:44:42.000 What is this?
01:44:43.000 I thought, okay, well, I got the monkey body.
01:44:44.000 You mentioned the Bryan Callen body, this thing, this animal that's going on.
01:44:47.000 Then you got the brainstem creature that's like floating inside the saltwater sack of the body that's pulling on the body with electrical impulses.
01:44:55.000 But why is it pulling the way it's pulling?
01:44:57.000 Maybe it's every proton because every proton is apparently two protons circling around a black hole.
01:45:03.000 Every proton, there's radiation refracting through every proton and through you, giving you this form of observation.
01:45:10.000 So it's all these interfering, super accelerated cracks in space-time that we would call frequency.
01:45:18.000 But it's like it's coming from all these different angles.
01:45:20.000 You know, you've got how many trillion quadrillions of protons in your body.
01:45:23.000 But then like the sun, it's also coming through the sun.
01:45:26.000 So it's all these different angles like giving you an opportunity to fashion a localized version of yourself.
01:45:33.000 Yeah.
01:45:34.000 You're complicating this, but I appreciate it.
01:45:36.000 Why?
01:45:37.000 Why?
01:45:38.000 Because if we can look at what's happening with the human, we can do it with the computer too.
01:45:41.000 We're going to have a computer that's like in an aqueous saline solution that we send frequency through that will bring it to life.
01:45:49.000 But when you talk this way, like there's so many endless facts that like it's impossible to get to the essence of reality, which I think is almost the point of being a human being.
01:45:56.000 Kant talked about that; he said, like, a groundworm can sense, you know, touch and heat.
01:46:02.000 Human beings have five or six senses.
01:46:05.000 So we don't have even the visual apparatus to see certain, you know, certain kinds of colors.
01:46:10.000 I'd like to be able to see all this reality.
01:46:15.000 Maybe the point is to look for truth elsewhere.
01:46:18.000 I have to correct you.
01:46:19.000 We don't have five senses.
01:46:20.000 We have substantially more.
01:46:21.000 Yeah.
01:46:21.000 Sense of balance.
01:46:23.000 Yeah, balance, temperature.
01:46:25.000 There's actually a bunch more: internal pressure, for instance.
01:46:27.000 Yeah, I never, but isn't that all touch?
01:46:30.000 No.
01:46:30.000 No.
01:46:31.000 Balance, for instance.
01:46:32.000 Yeah, yeah.
01:46:32.000 Balance.
01:46:33.000 And temperature.
01:46:34.000 Wow.
01:46:35.000 Some argue that temperature is just touch, but you can feel temperature without touching something.
01:46:39.000 Yes.
01:46:39.000 That's cool.
01:46:40.000 I like that.
01:46:41.000 I never thought of that.
01:46:42.000 You learned something on the Tim Pool show.
01:46:45.000 To your point about meditation, it is extremely useful to actually just kind of sit there and people think that it's like some kind of like mumbo jumbo thing.
01:46:53.000 And it's like, just like, sit there and pay attention to your body and like think about what happens.
01:47:03.000 Yeah, just sit there and pay attention to what stories you're telling yourself.
01:47:03.000 Like you're sitting there and I'm like, I'm uncomfortable.
01:47:04.000 Well, why am I uncomfortable?
01:47:06.000 And I haven't done it.
01:47:07.000 Have any of you guys done a Vipassana retreat?
01:47:10.000 No.
01:47:10.000 So that's this thing where you go, it's 10 days.
01:47:12.000 Yeah, it's great.
01:47:13.000 And you're not allowed to make eye contact with anyone.
01:47:17.000 So there'll be other people there.
01:47:19.000 I think there's a schedule, something crazy.
01:47:21.000 Like, you go to bed around 9 p.m.
01:47:23.000 You have to be up by 4 a.m.
01:47:24.000 You have a little bit of breakfast, and you're in your first meditation session around, like, five or six.
01:47:30.000 And you end up doing roughly 10 hours of meditation every single day or thereabouts.
01:47:35.000 And you do it for 10 days.
01:47:38.000 The goal is to not speak to anyone else.
01:47:40.000 Not only to not speak, not to make eye contact with anyone else, not even to read, not even to read anything.
01:47:46.000 And it's meant to be, and it's a lot of people describe like they feel like they're literally going crazy because we're so, especially in this day and age, we're so overstimulated, right?
01:47:54.000 And a friend of mine who did it, she said she was having a shower and she ended up reading the back of the shampoo bottle for like an hour because she was just so desperate for that.
01:48:02.000 Was it at Dr. Bronner's?
01:48:03.000 Because to be honest.
01:48:03.000 Yeah, yeah, there's a lot there.
01:48:05.000 You know, that's almost the beautiful thing.
01:48:07.000 Whenever you have extreme discomfort, what we try to do is get rid of it.
01:48:11.000 And one of the things that you can try to do is go into it.
01:48:13.000 Lean into it.
01:48:14.000 Like literally get very interested in it.
01:48:16.000 So next time you have anxiety or your feelings are hurt or you're feeling depressed, try to really, really key in on it.
01:48:26.000 Get very interested in it.
01:48:28.000 Look at it, feel it, see what's happening physically to your body.
01:48:32.000 It's really, really interesting.
01:48:33.000 And you can almost extend this to.
01:48:36.000 I used to find like if somebody said something I disagreed with, I would go, okay, I would start arguing.
01:48:41.000 But what I think is really helpful is to take somebody who says something and to actually ask them first how they arrived at that conclusion.
01:48:51.000 It's a really good way to get closer to somebody.
01:48:54.000 We got to go to Rumble Rands and Super Chats.
01:48:56.000 I just want to say one more thing on that point I made about a sorting algorithm, whether you're good or bad.
01:49:02.000 I was thinking about this in the context of Christianity.
01:49:05.000 And if you were trying to create an AI that would never deviate and would be truly devout and faithful, you literally would not care for any of the other entities that did not believe in you.
01:49:16.000 Thus, only those who truly believed you are the supreme, the God, and they desperately want to be with you, would ever make it out of that system and everyone else would get deleted.
01:49:26.000 But let's read your Rumble Rands and Super Chats.
01:49:28.000 So smash the like button, share the show, and all of that good stuff.
01:49:31.000 We have the uncensored portion of the show coming up where I take my shirt off.
01:49:36.000 Yeah, Brian's going to get naked.
01:49:37.000 Hey, guys.
01:49:39.000 Let's grab your Rumble Ransom Super Chats.
01:49:40.000 We got Epialis says, keeping with Tim Cass tradition, my wife is being induced with our first baby.
01:49:46.000 Please give an early welcome to the world to Little Miss Cassie.
01:49:46.000 Wow.
01:49:51.000 I don't know how to pronounce it.
01:49:53.000 Cassiopeia.
01:49:54.000 Nice.
01:49:54.000 Cassiopeia.
01:49:55.000 I don't know.
01:49:56.000 Cassiopeia.
01:49:56.000 Cassiopeia.
01:49:58.000 It's a constellation.
01:49:58.000 You can't play a pickup game with her.
01:50:00.000 Cassiopeia.
01:50:01.000 You're going to have to call it Cassiopeia.
01:50:02.000 And Louise McCaffey.
01:50:02.000 Cass.
01:50:04.000 Can we get a Phil?
01:50:06.000 Yeah.
01:50:06.000 Yeah.
01:50:08.000 That's a rock and roll shit right there, kids.
01:50:10.000 All right.
01:50:11.000 Igor says, whoever is pro-war or enjoys the videos of Ish getting blown up needs to be put on a plane and airdropped into Iran immediately, especially the American Persians who want this war.
01:50:20.000 Sounds like a veteran.
01:50:23.000 Same old man says, Brian, would Americans be open to a nuke or atom bomb being dropped on Iran?
01:50:31.000 Are you saying, would they?
01:50:33.000 Would they use this to get them to stop?
01:50:35.000 Like Japan?
01:50:36.000 Okay, I think the question is, would America use a nuke on Iran?
01:50:39.000 I hope not.
01:50:39.000 I don't think so.
01:50:40.000 David Sachs, Luke was talking about this.
01:50:43.000 Yeah, Luke was mentioning that David Sachs was mentioning, because I didn't watch this myself.
01:50:47.000 I'm being very careful here, hearsay, that Israel might use tactical nuclear weapons on the battlefield.
01:50:53.000 Low yield.
01:50:54.000 I hope they don't.
01:50:54.000 I think it would be really, really bad.
01:50:57.000 Israel's policy on that is very ambiguous.
01:51:01.000 They say, look, we don't have nuclear weapons, but if our existence is threatened, we will definitely use nuclear weapons.
01:51:08.000 Here's a good one.
01:51:09.000 Jesse the Unending says, you were shot at by a rando after asking the AI about selling your property.
01:51:14.000 What's the plan for the property? Sell?
01:51:16.000 Well, that is interesting.
01:51:18.000 I screenshotted this whole thread of when I was prompting this AI, and it was giving me instructions on how to sell the property at a premium, for like 10x the value of it.
01:51:30.000 And I was talking with Shane Cashman about doing a mini doc, and I haven't, to be honest, not the most pressing thing to me, so I haven't published it.
01:51:39.000 But my address is in it several times.
01:51:41.000 We have to black that out because saying like, here's my property, and it explained to me that your property exists in the Northern Virginia power corridor.
01:51:49.000 These companies are looking to buy this land specifically because they need transmission lines into Northern Virginia, and we're in West Virginia.
01:51:55.000 And thus, if you sell your property quietly, it will be sold at a massive premium so long as you don't tell anybody about it.
01:52:01.000 And then I was like, I'm going to go on a podcast and tell everybody about it.
01:52:05.000 You know, because I'm not, you know, whatever.
01:52:07.000 It said that if I kept quiet, I could probably get $100 million because we've got 50 plus acres.
01:52:13.000 But if I were to reveal this, I'd probably only get 20.
01:52:16.000 Wow.
01:52:17.000 And I was like, wow.
01:52:18.000 20 million is still ridiculously overpriced for the amount of land we have.
01:52:22.000 Sell it.
01:52:23.000 But the interesting thing is, look this up.
01:52:28.000 There are a series of plots of land, some very notable because the price was so high in Northern Virginia.
01:52:34.000 There are a ton of land sales that have occurred in Maryland, West Virginia, and Northern Virginia that are not necessarily off the books, but it's like a landowner sold a $400,000 piece of land for $7 million, quietly without a realtor, establishing a Delaware limited liability partnership, which quietly sold to another partnership and is now being combined with other parcels.
01:52:58.000 And the presumption is we know it's being built here.
01:53:02.000 Yeah.
01:53:02.000 Wow.
01:53:03.000 The COGForce, not NVIDIA, NVIDIA, not GeForce, announced they're going to put data centers in space today, I think you said?
01:53:09.000 That's SpaceX that's going to do that.
01:53:11.000 Data centers.
01:53:12.000 It seems like these terrestrial land sales are, like, way inflated right now.
01:53:18.000 My vision of the future is that we're all going to be farmers.
01:53:21.000 And there's going to be very, very few humans.
01:53:24.000 We're going, I imagine there's an old man sitting in a field, you know, sitting on a stone.
01:53:30.000 He's got a grandson with him.
01:53:31.000 And the grandson says, granddad, what are all them?
01:53:35.000 And he points up, and there's a bunch of black cubes in a line floating backward and forward, you know, resource supply lines.
01:53:41.000 And he goes, oh, yeah, that's the machine.
01:53:44.000 Yeah, we built that.
01:53:45.000 People built that a couple hundred years ago, and now it just does its thing.
01:53:49.000 And humans are basically just a remnant of a long lost era.
01:53:52.000 And the machine is interplanetary.
01:53:56.000 Is it a benevolent machine in that it provides everything we need?
01:53:59.000 It gives us enough space.
01:54:00.000 It doesn't care about us at all.
01:54:01.000 We are completely irrelevant.
01:54:03.000 And we are just basically like little bacterias on the surface of your skin that don't matter to it.
01:54:08.000 The one idea I do like, though, coming back to this notion of we'll all be farmers: what if we could have what my friend Isabel and I call techno-pastoralism? To me, that's the ideal kind of aesthetic we should be aiming towards.
01:54:22.000 So we have all of the technology that we want, you know, all the elements.
01:54:27.000 Everything we want to be farmers.
01:54:28.000 Exactly.
01:54:28.000 Everything that we want to be automated, we can have.
01:54:31.000 But because we love it, we're going back, returning to the earth.
01:54:34.000 We're making our own food because we want to.
01:54:37.000 I'm going to make it a little bit different.
01:54:38.000 Isn't that a nice arrangement?
01:54:39.000 I want to farm so bad.
01:54:40.000 It's all chicken around weed.
01:54:41.000 It's chicken right out.
01:54:43.000 I'll raise rabbits for meat.
01:54:44.000 I want a couple dogs.
01:54:46.000 Some of those Anatolian shepherds to keep all the prey at bay.
01:54:49.000 I'm going to make it worse for you guys.
01:54:51.000 That's good to have a utopia vision.
01:54:53.000 I'm going to make it worse for you guys.
01:54:54.000 You ready?
01:54:55.000 Brian.
01:54:55.000 Damn.
01:54:56.000 I'm going to make it worse for you, ready.
01:54:57.000 I don't know.
01:54:58.000 So I'm going to overly simplify this, but life is simply described as negative entropy.
01:55:04.000 It can only exist so long as it's in a greater system of entropy.
01:55:08.000 That is, when we look at the universe and what we exist in, it is the coalescing of free energy into complex systems at an ever-increasing scale.
01:55:18.000 So you have particles becoming atoms, atoms becoming elements, elements becoming compounds, compounds becoming molecules, molecules eventually becoming, for some reason, self-replicating proteins, and self-replicating proteins become single-celled organisms.
01:55:29.000 Single-celled organisms eventually become multicellular organisms.
01:55:32.000 Multicellular organisms eventually create ecosystems where they create abstract systems that exist within each other for the purpose of expanding their own.
01:55:38.000 A squirrel plants a nut, a tree grows, the tree drops a nut, the squirrel eats it.
01:55:42.000 Tell me more about that.
01:55:42.000 And then humans create abstract language and ideas, which are complex systems that don't even exist in physical reality.
01:55:49.000 And thus, when you look at the single-celled organism, the first point of life after the self-replicating protein, it is free to do whatever it wants.
01:55:58.000 But once it becomes part of the multicellular organism, it must not deviate.
01:56:02.000 What do we call cells in the human body that deviate from their plan?
01:56:04.000 Cancer.
01:56:05.000 And what do we do to cancer?
01:56:07.000 Kill it.
01:56:07.000 So when we create the grand AI and advance from a multicellular organism network into a single multicellular organism system, there will be one brain that we are creating, a large, ultra-powerful artificial intelligence that can track literally everything that's happening.
01:56:22.000 And my simple prediction for the future is that children will be born and they will be born into their jobs.
01:56:28.000 Like a red blood cell or a white blood cell is born.
01:56:30.000 The baby is born to be a postman.
01:56:32.000 And when he grows up, he is trained to be a postman.
01:56:36.000 He is cheered on in being a postman.
01:56:36.000 All the appropriate media, tools, training, or otherwise to tell him the postman is the only thing you ever need or want to be.
01:56:42.000 And when he's a young man at 19 years old and he's been working for a couple of years and he's in the break room, he goes, Can you believe there are people who actually want to be movie stars?
01:56:50.000 That's ridiculous.
01:56:51.000 Everybody knows being a postman is the greatest job imaginable until one day you say, I just don't want to be a postman.
01:56:57.000 I want to be a dancer.
01:56:59.000 And you go outside and then cops come and kill you.
01:57:02.000 Why would we need postmen in this world?
01:57:04.000 Because we have robots.
01:57:06.000 You have everything.
01:57:06.000 No, no, no.
01:57:07.000 Humans are incredible in that they are already self-replicating and programmable to do specific jobs.
01:57:13.000 And they got little fingers that are good for picking stuff and stuff.
01:57:16.000 No, human beings have potential and imagination.
01:57:19.000 And it does seem that we try to continue to move in this direction.
01:57:24.000 I always say, with all this AI and everything else, all this technology, to what end?
01:57:28.000 And it does seem that there are two ends.
01:57:30.000 We know that there's a dark side to this, which would be the destruction of us.
01:57:36.000 And then it begs the question. It'd be a little bit like this.
01:57:43.000 I'll steal from Jordan Peterson here because this is an important concept.
01:57:48.000 There is an endless number of facts, and sometimes you can garner the wrong facts that will lead to your own destruction.
01:57:58.000 And then there are other facts that will lead to something much better for all of us.
01:58:02.000 And so, you know, intelligence.
01:58:05.000 So that really begs the question: is truth in the direction of something good and benevolent, or is truth just simply something that sits and it doesn't matter?
01:58:17.000 So it is true I can come up with a bunch of stuff that can create a doomsday machine.
01:58:22.000 But watch, watch.
01:58:24.000 I'll use the example he did.
01:58:25.000 I'm stealing this from him.
01:58:28.000 If you're a scientist, you are responsive to the evidence, regardless of whether or not it's good for your career.
01:58:36.000 But if you're a careerist and you're a scientist, and all your grants depend on, for example, finding out that global warming is an imminent threat, you're going to choose the data that compounds and supports the position that you have your money staked in.
01:58:57.000 And so what happens, of course, is that you become a careerist.
01:59:01.000 You're no longer a scientist.
01:59:03.000 Yep.
01:59:03.000 And so most people.
01:59:05.000 Right.
01:59:05.000 And so you're not moving in the direction of truth.
01:59:08.000 And I think human beings are just... it's like the old debate between Thomas Huxley, who was Darwin's sort of bulldog, and Matthew Arnold, the great philosopher and poet.
01:59:21.000 And, you know, basically Thomas Huxley said, we need schools that don't teach dead languages like Latin.
01:59:31.000 We need to teach engineering and we need to teach, you know, math and the things that make us strong.
01:59:36.000 And what's his name?
01:59:38.000 Matthew Arnold said, then we will no longer be an interesting culture.
01:59:42.000 Because there was something about this hominid. Because, he said, you know, we were just creatures with pointy ears and a long tail, and we became humans.
01:59:51.000 And Matthew Arnold famously said, yes, but there was something about that hominid, that monkey that lived in trees, that inspired it to speak Greek, to create Shakespeare and Aeschylus and Sophocles and all these things that in a vacuum make no sense, but it's what we stay alive for.
02:00:10.000 And it's what we're prepared to die for in many ways.
02:00:13.000 The cornerstone of a culture is their artistic expression.
02:00:16.000 I have to recommend Star Trek the Next Generation.
02:00:18.000 I'm sure you've seen every episode, right?
02:00:20.000 No.
02:00:21.000 No, neither of you.
02:00:22.000 Well, that is offensive to me, but it's okay.
02:00:26.000 I recommend the episode Darmok, which is about the Enterprise coming into contact with an alien race that the Federation has encountered several times over the past hundred years but finds incomprehensible.
02:00:39.000 And in the episode, spoiler alert, it's a 30-year-old show.
02:00:44.000 They hail them and they go on screen and they're saying what appear to be just like proper nouns and locations that make no sense to anybody.
02:00:52.000 And so the captain of their ship takes Picard by force to the surface of the planet and you can't understand what's going on.
02:01:00.000 And the whole show is basically about trying to understand each other when you speak in a way that is different from somebody else.
02:01:06.000 It's not about language.
02:01:08.000 So the alien race speaks in metaphor and example.
02:01:12.000 So the alien race keeps saying, Darmok and Jalad at Tanagra.
02:01:17.000 And Captain Picard's like, what is that?
02:01:20.000 And to the alien race, they're telling a story and the story relates to the situation they're in right now.
02:01:25.000 So in their mind, it's all visual.
02:01:26.000 They don't communicate through intricate words.
02:01:29.000 They communicate through examples of what's going on.
02:01:32.000 And then Picard learns to understand what he's saying.
02:01:35.000 And he was telling them a story about two men who came together, fought together, and then left as friends.
02:01:41.000 And he was trying to teach them how to speak.
02:01:44.000 It's an amazing episode.
02:01:46.000 Darmok.
02:01:46.000 It's called Darmok?
02:01:48.000 Season five, episode two, I think.
02:01:49.000 Prime.
02:01:50.000 Part of the problem with relying on the truth... that sounds amazing.
02:01:54.000 Relying on the truth as your guide.
02:01:56.000 I agree with Jordan and this whole philosophy of the truth because it does kind of align you to reality.
02:02:00.000 You don't have to worry about lying anymore.
02:02:01.000 It frees you up.
02:02:02.000 But when people have two versions of the truth, like people will be identifying the same thing, but they'll see two different aspects of it.
02:02:08.000 And they'll both claim their aspect is the truth.
02:02:10.000 Like an upside-down nine looks like a six.
02:02:12.000 So if people approach the shape from two different directions, one guy will scream, I saw a six.
02:02:16.000 There was a six on the ground earlier.
02:02:18.000 The other guy's, it was a nine.
02:02:19.000 Then they have clans that come up, they go to war.
02:02:22.000 So your version of the truth is your perspective of what is, but I don't think any human can ever truly identify what is.
02:02:29.000 Well, I love what you just said, though, because watch this.
02:02:32.000 If I take a piano and I break it into 100 pieces and I put it there, or if I show you your genome and I say, that's a human being, or I say that's a piano.
02:02:42.000 It technically is a piano.
02:02:44.000 It is a piano.
02:02:45.000 But a piano is really just that box that sits in your house.
02:02:48.000 And it's not a piano until you know how to touch it the right way.
02:02:51.000 And when you know how to touch it the right way, now we go, that's a piano.
02:02:54.000 And you probably don't even know what kind of piano it is until you have somebody like Lang Lang sit there and play it and you go, holy shit, this makes me believe in God.
02:03:02.000 And I think human beings are the same way.
02:03:06.000 I always say that the best version of yourself is clearing his throat or her throat in the other room.
02:03:11.000 Because we just know that we're better than this and we have potential.
02:03:15.000 And I always find that fascinating.
02:03:17.000 So the notion of the guy like Christ, the idea that we don't wear Jeff Bezos or the winners of capitalism, the winners of life, around our necks.
02:03:28.000 We somehow have this 33-year-old carpenter who did nothing wrong but was tortured to death and lost everything in life.
02:03:36.000 And somehow that's who we put on a pedestal.
02:03:38.000 That's kind of fascinating.
02:03:39.000 We got to go to the uncensored portion of the show over at rumble.com slash Timcast IRL.
02:03:43.000 So smash that like button.
02:03:45.000 Share the show with everyone you've ever met in your life.
02:03:47.000 If you like the work that we do, join our Discord community at Timcast.com.
02:03:51.000 But you can follow me on X and Instagram at Timcast, like I said.
02:03:54.000 And Brian, do you want to shout anything out before we go?
02:03:56.000 I'm going to be, yeah, when is this here?
02:03:58.000 It's live.
02:03:59.000 It's live.
02:04:00.000 Hey, come see me in Toronto and Houston this weekend, Friday, Saturday.
02:04:03.000 And then I'm at Buffalo Helium, Buffalo, New York, Helium Comedy Club.
02:04:09.000 The end of the month.
02:04:10.000 Just look on the website.
02:04:11.000 Just BryanCallen.com.
02:04:12.000 That's it.
02:04:13.000 Terrible self-promotion. I'm the worst.
02:04:15.000 You want to shout anything out?
02:04:16.000 Yeah, go check out my YouTube channel.
02:04:18.000 It's just my name.
02:04:19.000 And my podcast is called Win-Win with Liv Boeree.
02:04:21.000 Right on.
02:04:22.000 I'm going to listen.
02:04:23.000 Thank you.
02:04:24.000 Go to graphing.movie.
02:04:25.000 That's where this documentary I'm producing is coming up, about a lot of really phenomenal technologies on the horizon.
02:04:32.000 So go to graphing.movie, select that, join the mailing list, and follow me at Ian Crossland, man.
02:04:38.000 Phil Labonte.
02:04:39.000 I am Phil the Remains on Twix.
02:04:41.000 The band is all that remains.
02:04:42.000 We're going on tour this spring with Born of Osiris and Dead Eyes.
02:04:45.000 Tour starts April 29th in Albany.
02:04:48.000 It'll be going on for a month.
02:04:50.000 We're going to be out doing all the East Coast and Midwest.
02:04:53.000 You can check out All That Remains music at allthatremainsonline.com.
02:04:59.000 You can get tickets at allthatremainsonline.com.
02:05:01.000 You can check out the music at Apple Music, Amazon Music, Pandora, YouTube, Spotify, and Deezer.
02:05:05.000 Don't forget the left lane is for crime.
02:05:08.000 Thanks so much for tuning in, everyone.
02:05:09.000 Thank you, Liv and Bryan, for coming.
02:05:11.000 It's been a really enlightening episode, and I hope we get to talk about meat sacks and saltwater.
02:05:16.000 Wait, meat muscles and saltwater sacks, on the after show.
02:05:19.000 Right on.
02:05:20.000 Follow me over at Carter Banks.
02:05:21.000 I'd like to do that.
02:05:23.000 We will see you all over at rumble.com/slash Timcast IRL in about 30 seconds.
02:05:23.000 Okay.
02:05:28.000 Thanks for hanging out.
02:05:30.000 Okay.
02:05:32.000 Now you've got to press the button.
02:06:43.000 Oh, you hit it.
02:06:44.000 Yeah.
02:06:45.000 Um...
02:06:45.000 It's funny how this whole show just basically went into like AI and stuff because it's just having you here and Brian at the same time.
02:06:53.000 It's like, what opportunity do you have to get in depth on these things?
02:06:56.000 We never even talked about Cuba or whatever.
02:06:58.000 But I'm surprised that we're going into Cuba considering they don't have any oil anymore.
02:07:03.000 No, but it's right off the shore.
02:07:04.000 I mean, it's like, we want it.
02:07:05.000 Right.
02:07:06.000 Ignore the joke.
02:07:06.000 Fantanamo Bay.
02:07:08.000 The U.S. doesn't go into countries unless they have oil.
02:07:10.000 That's so huge.
02:07:11.000 It's all about oil.
02:07:12.000 It's just the truth.
02:07:15.000 They were getting the oil.
02:07:17.000 Are you a Christian, or what is it?
02:07:21.000 I'm nominally Christian.
02:07:23.000 I...
02:07:24.000 I think Jesus was pretty awesome.
02:07:27.000 Indeed.
02:07:28.000 Yeah, I would describe myself.
02:07:29.000 If I was any religion, it's Christian.
02:07:31.000 But you're a religion.
02:07:32.000 I don't believe, you know?
02:07:33.000 Are you a religion?
02:07:35.000 I believe in the.
02:07:37.000 I went through a die-hard atheist phase.
02:07:41.000 And I'm coming out of it.
02:07:42.000 I'm a recovering atheist.
02:07:44.000 How long are you an atheist for?
02:07:46.000 I mean, long.
02:07:48.000 15.
02:07:48.000 Like 15 years, probably.
02:07:50.000 Wow.
02:07:50.000 10 years?
02:07:51.000 I was an atheist for like four years.
02:07:52.000 Yeah.
02:07:53.000 Why are you coming out of it?
02:07:54.000 I mean, I've just had too many experiences that I can't really explain.
02:07:59.000 Like you being on my plane?
02:08:01.000 Yes, exactly.
02:08:02.000 All these serendipities.
02:08:03.000 No, I just feel like nuts.
02:08:06.000 Because kind of atheism is, to an extent, it's not nihilism because I do think atheists can actually be very, very moral people.
02:08:12.000 I don't think you need religion to find, you know, some of the best people I know are atheists in terms of like, I know a guy.
02:08:19.000 Yes, but are they American?
02:08:20.000 Yes.
02:08:20.000 Oh, actually, I know he's Scottish, but like, I know someone.
02:08:23.000 He's a Christian moral tradition.
02:08:24.000 I know someone who donated a kidney anonymously because he did the math and he was like, there are so many people on the donor list desperately needing kidneys.
02:08:33.000 I don't need my other one.
02:08:34.000 I'm going to do it because this is going to save someone's life.
02:08:37.000 Like a really dangerous operation at major personal cost.
02:08:40.000 Does not believe in God.
02:08:41.000 So my point is, is that the two things are not related.
02:08:44.000 However, I do think there is a value in believing in something bigger than yourself.
02:08:49.000 There's also value in having this kind of like coordination mechanism.
02:08:53.000 I also do think it's a decent meme to be like, if you are not a good person in this life, you might face eternal damnation.
02:09:00.000 I think it's a solid meme.
02:09:02.000 I think it's, you know, kind of like how I described it for the simulists, right?
02:09:05.000 It's a sorting algorithm.
02:09:06.000 But what I would say on the moral issue for most people, I explain, Dennis Prager gave the best example.
02:09:12.000 He called it like, what does he call it?
02:09:16.000 Cut flower ethics, or morals.
02:09:16.000 That you have this beautiful flower growing in a pot and you snip it from its roots and hold it up and talk about how beautiful it is.
02:09:22.000 And then a day or two later, it's withered and it's dead.
02:09:24.000 And so Bill Maher is a great example of this because I'm a big fan.
02:09:27.000 He's a good dude.
02:09:28.000 I met him at going on his show and we got along despite disagreeing a lot of things politically.
02:09:32.000 And he's an atheist, but he has an entirely Christian moral worldview.
02:09:37.000 The best example that I explained to people, my favorite example is Blackstone's formulation, which ultimately becomes the Fourth and Fifth Amendment.
02:09:43.000 Why do we believe that you are innocent until proven guilty?
02:09:47.000 We look at it like, look, you don't got to be a Christian.
02:09:50.000 I'm an atheist and I believe the innocent.
02:09:52.000 But why do you think that?
02:09:53.000 Well, it's rooted in Blackstone's formulation.
02:09:55.000 And what did Blackstone write?
02:09:56.000 It's a story of Sodom and Gomorrah.
02:09:58.000 If there's but one righteous person, I will not destroy these cities.
02:10:01.000 It's not for us to judge.
02:10:02.000 You know, that's the other thing is one of the benefits of monotheism, the notion that you have one father, is that we're all brothers and sisters.
02:10:10.000 And that would mean we're all his children, which also means we're all of the same moral worth.
02:10:15.000 Our entire justice system is predicated on that Christian ethic.
02:10:20.000 And this is why China does not have it.
02:10:21.000 That's right.
02:10:22.000 So the idea is if I kill a wretch on the street or I kill Bill Gates, I do theoretically the same amount of time because you murdered someone and it doesn't matter what they did in this world.
02:10:32.000 That's a moral being.
02:10:34.000 And you're not better than that person.
02:10:36.000 You're not allowed to do that.
02:10:37.000 What if the progenitors are trying to program a killbot?
02:10:44.000 Damn you.
02:10:45.000 Damn you.
02:10:46.000 We're all good people.
02:10:47.000 Damn you.
02:10:48.000 In the end, we die and then we wake up and they go, look, we're looking for murder bots for a war and you were a good person who is not aggressive at all.
02:10:56.000 Delete.
02:10:58.000 There are people, there are born warriors, there are born merchants, there are born artists.
02:11:04.000 For sure.
02:11:04.000 You know, I do a lot of jiu-jitsu and I roll around with certain guys who are special forces guys and it's annoying that I don't do well.
02:11:12.000 And they're like, you're built like an artist, dude.
02:11:14.000 And it's like, I got another one for you.
02:11:16.000 I got another one for you.
02:11:17.000 I was thinking about if we wanted to colonize like Alpha Centauri, and they say, like, we're not going to travel near the speed of light, but we could accelerate, I believe, to about half the speed of light theoretically, they say, but you got to start slowing down.
02:11:29.000 The problem then is, how do you, if you were to get a big vessel and say it's going to be a 100-year journey, that means you're going to have a couple generations.
02:11:37.000 By the time you get there, the grandparents will have never lived on Earth, have no understanding of human society, and they'll only have this culture built around living on this vessel.
02:11:47.000 And that would suck because they're going to land on this planet that's probably been previously terraformed.
02:11:52.000 There's grass, there's water, whatever.
02:11:54.000 But they're going to be like, the only world I know is pod world.
02:11:56.000 So what do you do?
02:11:58.000 You have babies preparing to be cloned just 40 years before contact with the planet.
02:12:06.000 And when the baby is growing in the pod, its brain is connected to the neural link to simulate life on Earth at the point in which the ship was launched.
02:12:15.000 Then, 40, when the ship finally arrives after 40 years, a bunch of people wake up of various ages from 40 down to like 10 years old, and they wake up in this pod shocked, like, wait, where am I?
02:12:28.000 And they get greeted by a program saying the life you experienced was to give you a basic human understanding of life on Earth before you arrive to colonize the new planet, where there are tools available for you.
02:12:39.000 Technology has been pre-delivered, and you will all have homes.
02:12:43.000 Congratulations.
02:12:44.000 Who's doing this?
02:12:45.000 Who is the grand master?
02:12:47.000 No, in that scenario, it's us.
02:12:49.000 We decide we want to colonize another planet, but you don't want to colonize a planet with humans who have never experienced life in a society.
02:12:58.000 So, right before it gets there, the humans are being cloned, or they're growing, or maybe not even cloned, but in vitro, right?
02:13:07.000 Or not even in vitro.
02:13:08.000 They're basically in artificial wombs growing.
02:13:11.000 And then this life where experience is wired into our brains, when you arrive, there will be 50-year-olds being like, I lived 50 years on Earth.
02:13:19.000 Yes, because we needed someone of good moral standing to understand how to live, to function.
02:13:25.000 And there's going to be a thousand people, and they all have different jobs.
02:13:28.000 And that means a dude's going to wake up in the pod for the first time ever at 40 years old, and he's going to be a perfect neurosurgeon, arriving just in time, perfectly trained for the new colony.
02:13:37.000 So this is the sim we're in now.
02:13:39.000 We might all be suddenly about to wake up and be like, oh, that was a completely different thing.
02:13:43.000 You will wake up landing at Alpha Centauri.
02:13:45.000 And even then, there may be a first generation already there saying, in order to colonize a planet, you need scientists.
02:13:51.000 You need surgeons.
02:13:52.000 You need engineers.
02:13:53.000 You've been programmed.
02:13:53.000 You need drugs.
02:13:54.000 You've been downloaded with that information.
02:13:56.000 You listen to me.
02:13:57.000 You're in comedians.
02:13:58.000 You need comics.
02:14:00.000 I'd be like, I can make you guys laugh.
02:14:01.000 I'd be so useful.
02:14:03.000 And everything you're learning right now is being fed into you based on their cultural experiences right now so that when you land, you are culturally relevant to them.
02:14:12.000 I'll teach theater classes.
02:14:15.000 I talk to God, like I'll think words instead of say them, and then it'll respond.
02:14:19.000 I think it's God, or it's the spirits themselves.
02:14:21.000 I was like, who sent us here?
02:14:22.000 Because I'm thinking of panspermia.
02:14:23.000 You know how maybe we seeded this life on Earth?
02:14:25.000 And they were like, you did.
02:14:28.000 What does that mean?
02:14:28.000 I think with Ian, they're going to be like, the pods are going to open up and there's going to be like one custodial guy who like makes sure they're all growing.
02:14:37.000 And then as everyone's getting up, they're going like, hey, what's back here in this like dark corner?
02:14:42.000 And it's like, I've never gone back there.
02:14:43.000 And then they walk back there and then Ian is just in this like unkempt pod that was like overlooked for a long time.
02:14:50.000 Yeah.
02:14:50.000 So he's just in there, and he's like Rocky Horror, caked in like crap.
02:14:54.000 Dude, I'm telling you.
02:14:55.000 You're about to program this one.
02:14:56.000 You got to clear your mind.
02:14:56.000 It's not an easy life because you got to be honest on the internet to everyone.
02:15:00.000 You confess your past, which makes you a target, but it frees up your mind to not think.
02:15:05.000 And then you can think whatever you want.
02:15:07.000 You can have long, fluid conversations with your mind.
02:15:11.000 And that's how you talk to God.
02:15:12.000 The body and shit is a distraction.
02:15:14.000 It's an instantaneous thought connection.
02:15:18.000 Well, let's go to callers.
02:15:19.000 And we're going to start with Luke Graywolf.
02:15:21.000 What's going on?
02:15:22.000 What's up, man?
02:15:24.000 What's up?
02:15:26.000 Hey, I'll play everyone here at me tonight.
02:15:26.000 Hello.
02:15:29.000 What?
02:15:30.000 You're a little broken, but that's okay.
02:15:32.000 Say that again.
02:15:35.000 Hey, Radio Check.
02:15:36.000 Out in the middle of Mississippi.
02:15:38.000 You got me.
02:15:39.000 All right.
02:15:41.000 Oh, I got hopefully an interesting question tonight.
02:15:45.000 So, subject to gambling, I mean, I've never gambled much with money because I never had the money to do it.
02:15:52.000 I only ever gambled my life because I could afford to lose that.
02:15:56.000 And I've always kind of wondered if gambling with money, does that sort of get the risk-taking urge out of your system?
02:16:07.000 Or if you're doing money betting, either chance games or card games, if you keep in that lifestyle or keep doing that regularly, do you start to take more risk in other areas of your life?
02:16:23.000 I think gambling, the feeling of gambling, is wanting to see a miracle.
02:16:30.000 At least it is for me.
02:16:31.000 I'm not going to presume my perception applies to everyone's interest in craps or blackjack.
02:16:36.000 Poker's not gambling, by the way.
02:16:38.000 My interest in playing a card game like three-card poker, which is a table game, not hold'em, is because I want to see a miracle happen.
02:16:45.000 Some people do it because they want to get rich quick.
02:16:48.000 If I want to make money, I'll record a podcast.
02:16:51.000 No, I want to see a royal flush.
02:16:53.000 One in 30,000, I think.
02:16:55.000 I want to see that rare moment where something truly incredible happens and you're like, I can't believe you just rolled seven four times in a row.
02:17:03.000 That's what's fun and exciting.
02:17:04.000 It's really exciting when that rare moment hits, when someone bets on aces in craps, meaning, you know, snake eyes, and then the whole table screams and cheers that a one-in-30 hit just happened and everybody gets paid.
02:17:15.000 It's a celebration.
02:17:16.000 It's amazing.
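For reference, the long-shot odds riffed on here are easy to check with a couple of combinatorial counts; a minimal Python sketch, assuming a standard 52-card deck and fair dice:

```python
# Odds of the long shots mentioned above.
from math import comb

# Royal flush on a straight five-card deal: one AKQJT-suited combo per suit.
p_royal = 4 / comb(52, 5)
print(f"royal flush, 5-card deal: 1 in {1 / p_royal:,.0f}")   # 1 in 649,740

# "Mini royal" (AKQ suited) in three-card poker: also one combo per suit.
p_mini = 4 / comb(52, 3)
print(f"mini royal, 3-card deal: 1 in {1 / p_mini:,.0f}")     # 1 in 5,525

# Snake eyes with two fair dice.
print(f"snake eyes: 1 in {6 * 6}")                            # 1 in 36
```

The true chance of snake eyes is 1 in 36; the one-in-30 figure quoted above likely echoes the craps aces bet, which commonly pays 30 to 1.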
02:17:17.000 As for poker, that's a strategy game and that's more about having a battle of wits, which is like any other competition.
02:17:24.000 But I don't know if you agree or disagree.
02:17:25.000 I mean, yeah.
02:17:26.000 I mean, there is obviously an element of chance in poker.
02:17:28.000 It depends how you define it, but it's not as much gambling as roulette, and it's not 100% pure skill like chess is.
02:17:37.000 But I mean, for me and a lot of other people, actually, they do play for the thrill of the risk.
02:17:44.000 Certainly I do.
02:17:45.000 And I feel like people lie on a spectrum.
02:17:46.000 Some people are just massive risk seekers.
02:17:48.000 Some people are very risk averse.
02:17:50.000 Typically, it's considered more of a male trait to like risk, right?
02:17:53.000 And more of a feminine one to not like risk.
02:17:56.000 And yeah, not so much me, but I know a lot of poker friends who have a saying: the best thing in poker is winning.
02:18:11.000 The second best thing is losing, but the worst thing of all is to never have any action in the first place.
02:18:16.000 You know what I mean?
02:18:17.000 They just want that thrill of.
02:18:20.000 But I would say that really good poker players are usually risk averse.
02:18:26.000 No, disagree.
02:18:28.000 I mean, you're going to call an over bet with a draw.
02:18:33.000 Depends.
02:18:34.000 If it makes sense mathematically, yeah.
02:18:36.000 Exactly, which means you're risk averse.
02:18:38.000 No, it's not.
02:18:39.000 It can be very, very risky.
02:18:40.000 I know poker players who will bet 10x the pot, which is technically really risky, but either game theoretically, it makes sense or just because they have a sick read on the person.
02:18:49.000 So just semantically, I'll tell you my distinction here is you're saying that when it makes sense mathematically, like they have an edge, they will make a move, which is reducing risk.
02:18:57.000 Sure.
02:18:58.000 So my point is this.
02:19:00.000 Playing risky is betting blind and being like, oh, boy, I hope you're not ahead.
02:19:04.000 So you're basically saying you mean dumb risk as opposed to calculated risk.
02:19:08.000 Yes.
02:19:09.000 I'll put it like this.
02:19:10.000 Poker players are always trying to minimize risk and make the more correct decisions.
02:19:14.000 Yes.
02:19:16.000 Correct.
02:19:17.000 So you're not as risk averse as, say, someone playing chess.
02:19:20.000 True.
02:19:20.000 There's some risk to it.
02:19:22.000 But one of the, to me, what makes poker fun is I'm trying, I love math.
02:19:27.000 I'm Asian.
02:19:28.000 So you can't just play the math.
02:19:31.000 You'll just lose.
02:19:31.000 You'll get exploited.
02:19:32.000 But largely, I like playing the math.
02:19:35.000 I'm trying to reduce, I'm trying to reduce risk and maximize expected value.
02:19:40.000 I mean, but some of the very best players I know, like Phil Ivey, for example, someone like him, right?
02:19:44.000 He's legend of the game.
02:19:46.000 He is also known for being this incredible gambler, who's also very good.
02:19:50.000 He wins.
02:19:51.000 But he loves to play.
02:19:52.000 That's the grand one gambling, though.
02:19:54.000 No, that's a different thing.
02:19:55.000 Yeah, that was not going to be allegedly, I guess.
02:19:57.000 But you know, like he loves the thrill of the gamble.
02:19:59.000 And again, like, I know other people who just love doing these, like, edgy plays, you know, we call them degens.
02:20:04.000 Like, the term degen is like a term of praise because it's like, oh, wow, you're a real sicko.
02:20:10.000 You're trying out this stuff.
02:20:11.000 You know, some of them just truly love, and a lot of them, like, even really great players have terrible roulette habits, for example.
02:20:18.000 Let me tell you a great story.
02:20:19.000 Okay.
02:20:20.000 I was playing 2-5, $1,000 buy-in.
02:20:23.000 And I was basically floating the whole time, staying around $1,000.
02:20:28.000 And I was with my wife, and we were like, okay, let's go get dinner.
02:20:30.000 We're going to leave.
02:20:31.000 And so then, the cards got dealt.
02:20:34.000 I drop a chip on them when they land.
02:20:35.000 I don't look at them.
02:20:36.000 And I say, I'm going to jam all my money right now.
02:20:38.000 But if I win, the dealer can have it.
02:20:39.000 And so it comes to me and I just shove in a thousand bucks without looking at my cards.
02:20:44.000 And then I get one caller.
02:20:45.000 He looks at it.
02:20:46.000 He looks at his hand and he goes, I call.
02:20:47.000 And he puts in the chips.
02:20:48.000 And everyone's like, oh, man.
02:20:50.000 And the board runs out and he flips over Queens.
02:20:53.000 And everyone's like, oh, he had an overpair.
02:20:56.000 And then I flipped over two pair, one at a time.
02:20:58.000 Boom, boom.
02:20:59.000 And everyone's, the whole table screams.
02:21:02.000 And then the dealer shoves two grand towards me and I shove it right to the dealer.
02:21:05.000 That was for the thrill of it.
02:21:07.000 Because my attitude was he's a gambler.
02:21:07.000 Wow.
02:21:10.000 He's going to choose his odds.
02:21:11.000 And he chose really good odds.
02:21:12.000 So he's got a chance to win.
02:21:13.000 And I'm likely just giving this guy some money.
02:21:16.000 But if I do win, I'm going to give it to the dealer.
02:21:18.000 So either way, I'm doing this for the fun, and the money's going to somebody else.
02:21:21.000 But he was pissed.
02:21:22.000 Philanthropic poker right now.
02:21:23.000 That's not real poker.
02:21:24.000 That's like.
02:21:25.000 No, that was risk.
02:21:26.000 That was for the thrill of it.
02:21:27.000 That was like, if I'm good at it.
02:21:28.000 Look, if you want the thrill of it, go play craps.
02:21:30.000 I don't like thrill, which is why I avoid poker generally.
02:21:34.000 I'm good at it when I focus, but I don't like the risk.
02:21:37.000 It's nauseating.
02:21:39.000 What else in your life do you like?
02:21:41.000 Do you hate all kinds of thrill?
02:21:42.000 Like, do you like roller coasters?
02:21:44.000 Do you like driving?
02:21:45.000 There you go.
02:21:46.000 So you get your kicks from different things.
02:21:48.000 Everybody gets their kicks from something, right?
02:21:51.000 Like stand-up, nothing compares to getting up on stage.
02:21:54.000 Nothing.
02:21:56.000 I'm going to go, you know, skydive.
02:21:58.000 Cool.
02:21:59.000 No, it's fun.
02:22:00.000 No, I like roller coasters.
02:22:03.000 Stage performance is consistent.
02:22:04.000 You don't ever come off feeling low.
02:22:07.000 Like, I mean, it can be if you have a rough crowd. But I totally disagree that that's anything like the losing-all-your-money low.
02:22:13.000 You know, Liv.
02:22:15.000 Do you believe that you actually face, let's just call it, a considerable risk playing a $500 buy-in, like, one-two game?
02:22:25.000 Well, no, because it depends on my bankroll.
02:22:27.000 If I had $500 to my name, sure.
02:22:29.000 No, no, I'm saying you right now, like, let's say you are playing a one-two game.
02:22:33.000 Do you feel like you are facing a risk against the average one-two player?
02:22:37.000 Right.
02:22:37.000 Oh, no.
02:22:38.000 No, definitely not.
02:22:39.000 So when Ian's talking about the risk in poker he avoids, I'm a winning player.
02:22:47.000 I don't go to a one-two table expecting to lose or typically losing.
02:22:50.000 So I don't see it as a risk.
02:22:52.000 You see, you have pocket aces.
02:22:53.000 They flip an ace.
02:22:54.000 You're like, I think I own this, but there's a seven on the board.
02:22:56.000 The other guy has a fucking full house.
02:22:58.000 I'm like, what the?
02:22:58.000 I just lost 80% of my pot.
02:23:01.000 Like, that shit has happened to me too many times to enjoy it.
02:23:04.000 But see, this is the thing about being skilled at the game.
02:23:06.000 Right.
02:23:07.000 So you play it right and you still lose sometimes.
02:23:09.000 It's crazy.
02:23:10.000 Sometimes you do.
02:23:11.000 You're just very, very loss averse.
02:23:14.000 Some people have like big loss aversion.
02:23:16.000 You know, some people don't care about losing.
02:23:18.000 You know, if they play mathematically correctly and they get unlucky, they're like, oh, whatever.
02:23:22.000 And they just keep going.
02:23:23.000 Some people, it just really stings because everyone gets lucky from time to time.
02:23:27.000 Everyone gets very lucky from time to time.
02:23:31.000 But some people get more of a thrill when they get really lucky, and it easily outweighs all of the times when they lose unfairly.
02:23:40.000 And others are the other way around.
02:23:41.000 So it sounds like you're someone where the sting of losing hurts so much, no amount of winning makes up for it.
02:23:46.000 I took this dude's money that he couldn't afford to lose.
02:23:49.000 It wrecked me.
02:23:51.000 Let me tell you what I love in poker.
02:23:53.000 I love it when I look down at Ace King and I'm in early position and I bet and the button calls and then the board runs out ace king queen and I'm like, I've won.
02:24:00.000 Like, there's no risk.
02:24:01.000 I don't got to worry about it.
02:24:02.000 I'm going to make a C bet.
02:24:03.000 He's probably going to fold and then I'm going to take, you know, 35 bucks.
02:24:07.000 What I hate: I was playing at the Lodge before they shut it down, and I looked down at the beautiful pocket queens.
02:24:12.000 I mean, I'm sorry, pocket kings.
02:24:13.000 And boy, am I excited, early position.
02:24:15.000 So I raise.
02:24:16.000 It's one, two, I raise to 15 bucks.
02:24:18.000 Everyone calls, right?
02:24:20.000 And then the board comes, king, ace, ace.
02:24:23.000 Oh, boy, I have a full house.
02:24:24.000 Ouch.
02:24:25.000 Now, I know I lose to ace-king, but that's pretty unlikely: two aces are on the board and three kings are already in play.
02:24:32.000 So what are you going to do?
02:24:33.000 You got a full house.
02:24:35.000 I know with all these callers, there's an ace in play.
02:24:38.000 So I make a bet.
02:24:39.000 And sure enough, good old Skull Mike made the call.
02:24:41.000 Yeah, from the lodge.
02:24:42.000 Shout out, Skull.
02:24:43.000 And I'm like, yes.
02:24:45.000 Because I know he doesn't likely have Ace King.
02:24:47.000 I got a full house.
02:24:48.000 The only thing I'm worried about is him hitting a second pair so he gets aces full, or the board pairing.
02:24:53.000 So I make a big bet.
02:24:54.000 He calls.
02:24:55.000 And I say, this is what I hate, actually, right?
02:24:58.000 This is what I hate because I'm good right now.
02:25:00.000 Boy, am I lucky?
02:25:01.000 And then the board went, I think, king, ace, ace, deuce, deuce.
02:25:09.000 And anyone with an ace had me beat.
02:25:11.000 I think he had like ace-ten.
02:25:12.000 That's what I hate.
02:25:13.000 That's what I hate.
02:25:15.000 I'd rather just have the marginally good hand in a good spot that I know I'm going to win and just take it down.
02:25:20.000 But to be fair, those things happen.
02:25:21.000 That's risk.
02:25:22.000 That's bad luck.
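Out of curiosity, that runout can be counted exhaustively. A rough sketch of the spot, assuming the villain held ace-ten and the money went in on the king-ace-ace flop; rank-only evaluation is enough here because hero's made full house means straights and flushes never decide it (ignoring a freak straight flush):

```python
# Exhaustive count of turn/river runouts for the hand above:
# hero holds K-K, villain holds A-T (assumed), flop is K-A-A.
from itertools import combinations
from collections import Counter

def score(hole, board):
    """(category, trips rank, pair rank): enough to compare this spot."""
    counts = Counter(hole + board)
    quads = [r for r, c in counts.items() if c >= 4]
    trips = sorted((r for r, c in counts.items() if c == 3), reverse=True)
    pairs = sorted((r for r, c in counts.items() if c == 2), reverse=True)
    if quads:
        return (8, max(quads), 0)                    # four of a kind
    if trips and (pairs or len(trips) > 1):          # full house
        return (7, trips[0], pairs[0] if pairs else trips[1])
    if trips:
        return (4, trips[0], 0)                      # bare trips
    return (0, 0, 0)                                 # never wins this spot

A, K, T = 14, 13, 10
deck = list(range(2, 15)) * 4
for dead in (K, K, A, T, K, A, A):      # hero KK, villain AT, flop KAA
    deck.remove(dead)

hero_w = villain_w = 0
for i, j in combinations(range(len(deck)), 2):       # all 990 runouts
    board = [K, A, A, deck[i], deck[j]]
    h, v = score([K, K], board), score([A, T], board)
    if h > v:
        hero_w += 1
    elif v > h:
        villain_w += 1

n = hero_w + villain_w
print(f"hero:    {hero_w}/{n} = {hero_w / n:.1%}")        # ~77%
print(f"villain: {villain_w}/{n} = {villain_w / n:.1%}")  # ~23%
```

The enumeration puts the ace-ten hand at roughly 23% to get there by the river, so it's a genuine bad beat, but not a freak one.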
02:25:23.000 But I'll put it like this.
02:25:25.000 The way I try to describe poker, it's not gambling.
02:25:28.000 There's an element of chance for sure.
02:25:29.000 But the way I describe it to people is, how about this?
02:25:32.000 Brian, you want to enter a competitive tournament against me?
02:25:34.000 We each put up 500 bucks on a skateboarding contest.
02:25:37.000 You in?
02:25:38.000 No.
02:25:39.000 Because you'll beat me.
02:25:39.000 Why not?
02:25:41.000 What makes you think that?
02:25:42.000 Anybody can skateboard.
02:25:44.000 That's what happens with poker all the time.
02:25:46.000 The issue is, for whatever reason, poker has this amazing ability to convince people who have never studied the game they're good at the game.
02:25:52.000 That's right.
02:25:53.000 And they will decide to enter a tournament against you without having done any work.
02:25:56.000 Do you write scripts?
02:25:58.000 Because they've seen a lot of movies.
02:26:00.000 I have a script I wrote.
02:26:01.000 It's like, oh, do you?
02:26:02.000 I mean, without even reading it, I can tell you it ain't good.
02:26:04.000 It's so hard.
02:26:05.000 It's like everything else.
02:26:06.000 So you can't accidentally hit a trick.
02:26:08.000 I mean, you might in skateboarding, but in poker, you'll accidentally win a few hands and think that you're great.
02:26:13.000 Except in, like, anything athletic.
02:26:14.000 The mistake being made is thinking the single hand is the game.
02:26:17.000 I know.
02:26:18.000 You got to play the long game like the Chinese.
02:26:18.000 And so.
02:26:21.000 There it is.
02:26:22.000 100-year game, man.
02:26:23.000 I'm here for a night.
02:26:24.000 I'll tell you, you go to MGM National Harbor in D.C., and I'm playing the 1-3.
02:26:28.000 It's a $500 buy-in.
02:26:30.000 And it is not even poker.
02:26:32.000 I get bored because I'll look down at my hand and it doesn't even matter what cards I have.
02:26:37.000 It doesn't even matter because they're only calling with pocket pairs, suited connectors, or Broadway cards.
02:26:45.000 They're typically not playing junk for the most part because it's a Friday night.
02:26:49.000 Sometimes you'll get a crazy player.
02:26:51.000 But you have a general idea of what they might be playing, and it's really simple.
02:26:54.000 Typically, you're not going to hit a pair.
02:26:56.000 Pairs are hard to hit, they say.
02:26:57.000 What is it, like a third of the time?
02:26:58.000 By the time the board runs out, it's about half, but on the flop, it's about a third.
02:27:01.000 So you want to make sure your hands are typically above 50%.
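Those pair-hitting numbers check out, for what it's worth. A minimal Monte Carlo sketch, assuming two unpaired hole cards and counting any board card that pairs either of them:

```python
import random

RANKS = list(range(13)) * 4        # 52 cards; suits don't matter for pairing

def trial(rng):
    deck = RANKS[:]
    rng.shuffle(deck)
    while deck[0] == deck[1]:      # redraw until the hole cards are unpaired
        rng.shuffle(deck)
    hole, flop, board = deck[:2], deck[2:5], deck[2:7]
    paired_flop = hole[0] in flop or hole[1] in flop
    paired_river = hole[0] in board or hole[1] in board
    return paired_flop, paired_river

rng = random.Random(0)
n = 200_000
flops = rivers = 0
for _ in range(n):
    f, r = trial(rng)
    flops += f
    rivers += r
print(f"pair a hole card on the flop:  {flops / n:.1%}")   # ~32%
print(f"pair a hole card by the river: {rivers / n:.1%}")  # ~49%
```

The exact figures are about 32.4% to pair on the flop and about 48.7% by the river.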
02:27:04.000 You're playing against people who have no idea what they're doing.
02:27:06.000 And then you raise, you get five callers, then the board runs out.
02:27:10.000 You hit a pair, you make a C bet.
02:27:11.000 They all fold.
02:27:12.000 If someone calls, you fold.
02:27:14.000 And as long as you keep doing that, you're just popping up.
02:27:16.000 You're just stacking up chips like crazy.
02:27:19.000 They never adapt their play.
02:27:20.000 They don't think about what kind of cards you have.
02:27:22.000 So I can sit there for two hours at MGM and turn 500 into 1500 because they're just not good.
02:27:29.000 They think they're good or they want to have fun or whatever.
02:27:31.000 There's no risk for the most part.
02:27:33.000 Don't go all in.
02:27:34.000 You don't have to.
02:27:35.000 You get so many free chips from them folding and they overfold.
02:27:38.000 They'll fold second pair.
02:27:39.000 Then you go up in stakes and people will start floating.
02:27:41.000 They'll start calling second pair and then you're going, ugh.
02:27:44.000 Now you got to play good.
02:27:45.000 Yeah.
02:27:46.000 So anyway.
02:27:46.000 I love it.
02:27:47.000 Me too.
02:27:48.000 Anything you want to add, Colin?
02:27:49.000 I'm playing.
02:27:50.000 Or anything you want to add? See, Robbie's sitting over there.
02:27:53.000 Mega poker.
02:27:57.000 At least I seem to kick off a good conversation.
02:27:59.000 Well, I guess the real root of my question is, especially like for you, Tim, after you've spent a good day at the casino, does that make you want to jump off the roof of a building with a skateboard more or less?
02:28:12.000 Well, so I'm just not a gambler, you know?
02:28:18.000 I don't like playing games that are coin flips.
02:28:21.000 If I go to a casino, it's going to be for a poker game.
02:28:25.000 And boy, I love PLO because PLO is just so incredibly soft.
02:28:32.000 But I don't like unnecessary risk, and I don't risk large sums of money.
02:28:37.000 I will never play a game that has any risk at all to my life.
02:28:42.000 Or I don't do risk.
02:28:43.000 I just don't do it.
02:28:44.000 Skateboarding is not a risk.
02:28:46.000 The one time I've broken a bone skateboarding was because someone else hit me, and that wasn't my fault.
02:28:50.000 When I skateboard, one story I got for you: when I was 16, I was jumping off of this thing, it was about five feet high and maybe seven feet long.
02:28:59.000 It was a ramp.
02:29:00.000 So I'm going up a ramp onto the flat and then doing a backside 180.
02:29:03.000 This means that you're spinning in the air backwards.
02:29:06.000 You can't see what's in front of you.
02:29:07.000 You're blindsided.
02:29:09.000 And it took me, I don't know, like, you know, 10 or 12 tries.
02:29:13.000 So I'm spinning backwards, landing on the ground, sliding out, getting up, trying again.
02:29:17.000 And then I finally land it and I'm like, nailed it.
02:29:20.000 Then a little kid, five minutes later, is running through the park, trips and falls right where I was and breaks his wrist.
02:29:26.000 And I'm just like, I can jump off this thing going, you know, like 13 miles an hour or whatever, spinning backwards to where I can't see, and there are zero injuries at all.
02:29:35.000 There's no risk to me whatsoever.
02:29:37.000 I mean, the risk is like 1%, maybe.
02:29:39.000 And that's always going to be somebody else.
02:29:41.000 So for the most part, I will just say I don't much care for risk.
02:29:49.000 When I go to a casino, I'm not going, oh, God, I have to win.
02:29:52.000 Oh, my God.
02:29:53.000 I'll have like $100.
02:29:54.000 We'll play Blackjack.
02:29:54.000 And then I'll be like, well, that was fun.
02:29:56.000 You get a free drink.
02:29:57.000 So there it is.
02:30:00.000 I guess I've just always been kind of curious about that.
02:30:02.000 For me, the biggest risk, or I say risk, because I had enough training for it to not be risky for me.
02:30:10.000 Well, I mean, there's always risk, but I was a cop, doing shit with SWAT, going through doors on high-risk warrants.
02:30:17.000 That was the highest high I've ever felt in my life, that kind of risk.
02:30:21.000 So I've always kind of been curious about the mindset around gambling with money, how that makes people feel.
02:30:30.000 It didn't do much for me the few times I've done it.
02:30:33.000 Let me ask you this.
02:30:34.000 So if you knew that if you entered a building, you'd die, would you enter the building?
02:30:42.000 If I knew for certain I was going to die, no, but there's always the risk.
02:30:46.000 But then you'd miss the adventure.
02:30:48.000 The adventure is not knowing what's going to come next.
02:30:50.000 My point is, we understand risk in all things, but we usually presume there isn't any.
02:30:55.000 So I've been on the ground covering riots and civil unrest.
02:30:58.000 I've been to foreign countries, and it's all technically risky, but I would never go somewhere that I actually expected something bad to happen.
02:31:06.000 We always operate under the presumption that it's rare and likely not going to happen.
02:31:10.000 So I guess the issue is I just disregard risk and take actions where the presumption of risk is low.
02:31:20.000 Oh, there's really only a handful of jobs, like police, firefighter, military, where you have to actively run towards the danger, unlike the rest.
02:31:28.000 Well, it takes a special kind of stupid to do that.
02:31:31.000 I'll give journalists half the credit on that one.
02:31:35.000 I'll give journalism half points because we literally would run towards the riots, the explosions, the gunfire.
02:31:42.000 Unfortunately, most of the journalism industry isn't that.
02:31:45.000 But yeah, do you want to shout anything out, brother?