The Joe Rogan Experience - November 26, 2024


Joe Rogan Experience #2234 - Marc Andreessen


Episode Stats

Length

3 hours and 8 minutes

Words per Minute

207.6

Word Count

39,131

Sentence Count

3,411

Misogynist Sentences

45
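
For reference, the words-per-minute figure is just the word count divided by the runtime in minutes. A minimal check in Python, assuming the listed 3-hour-8-minute length is rounded to the nearest minute:

```python
# Sanity-check the words-per-minute stat from the figures above.
word_count = 39_131
runtime_minutes = 3 * 60 + 8        # "3 hours and 8 minutes", rounded

wpm = word_count / runtime_minutes  # ~208.1 with the rounded runtime
print(round(wpm, 1))
# The listed 207.6 wpm implies an unrounded runtime of ~188.5 minutes.
```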


Summary

On this episode of The Joe Rogan Experience, Marc Andreessen joins Joe for a post-election wrap-up. They discuss the two "timeline splits" of the past year: the assassination attempt on Trump and election day itself; the strange lack of disclosure around the shooter and whether it points to a conspiracy or a systemic collapse of competence; government pressure on social media companies to censor Americans, the Twitter Files, and the government-funded NGO censorship complex; allegations of paid celebrity endorsements and influencer payola in the Harris campaign; the ratings collapse of legacy media and whether this was the first true internet election; AI in government and warfare, drone fighter jets, and UAP sightings; woke ideology as a religion with no concept of redemption; Blue Sky, X, and Substack as marketplaces of ideas; and the night Marc accidentally enraged the entire country of India on Twitter.


Transcript

00:00:01.000 Joe Rogan Podcast.
00:00:02.000 Check it out.
00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day.
00:00:07.000 Joe Rogan Podcast by night.
00:00:08.000 All day.
00:00:12.000 Hello.
00:00:13.000 Good to see you.
00:00:14.000 Thanks for having me back.
00:00:15.000 My pleasure.
00:00:16.000 Good to see you.
00:00:17.000 Well, the world's still functional.
00:00:19.000 It's amazing.
00:00:20.000 Yeah, amazing.
00:00:21.000 You wanted to talk about the post-election sort of a wrap-up.
00:00:25.000 Yeah.
00:00:25.000 Sort of.
00:00:27.000 Where we stand.
00:00:27.000 Are you happy?
00:00:28.000 Very happy.
00:00:29.000 That was a weird one.
00:00:30.000 Morning in America.
00:00:32.000 That was one of the first times ever I felt hopeful after an election.
00:00:35.000 Like you should have seen The Green Room at the Comedy Club.
00:00:37.000 Everybody was like, yes!
00:00:38.000 Yes.
00:00:40.000 So my theory is the timeline, like in a science fiction movie, the timeline has split twice in the last like nine months.
00:00:46.000 What was the first split?
00:00:48.000 There was when Trump got shot.
00:00:50.000 And there was that moment where the world was going to head in two totally different directions.
00:00:54.000 Right.
00:00:54.000 If he got hit...
00:00:55.000 Yeah.
00:00:56.000 Yeah.
00:00:56.000 And we saw the most conspicuous display of physical bravery I've ever seen.
00:01:00.000 Right.
00:01:01.000 Afterwards.
00:01:02.000 Exactly.
00:01:03.000 And it could have gone, you know, horrifically badly for the entire world after that.
00:01:07.000 So that was timeline split number one.
00:01:08.000 So that other timeline is out there somewhere.
00:01:10.000 Yeah.
00:01:10.000 And I don't want to visit it.
00:01:11.000 Boy, imagine being stuck there.
00:01:14.000 What kind of horrible karma.
00:01:15.000 No.
00:01:15.000 I mean, that's a totalitarian dystopian nightmare.
00:01:18.000 That's the bad place.
00:01:19.000 Yeah.
00:01:20.000 And then timeline split again on election day.
00:01:23.000 I know you fancy a good conspiracy theory.
00:01:26.000 And that gentleman being able to pull off what he did and the way it happened, the way it all went down, it's a Lee Harvey Oswald 2.0.
00:01:39.000 Oh, yeah.
00:01:39.000 Clearly.
00:01:40.000 Yeah.
00:01:40.000 The shooter.
00:01:41.000 Yeah.
00:01:41.000 That we still don't know anything.
00:01:43.000 There's no call for disclosure.
00:01:46.000 There's no call for a press conference.
00:01:49.000 There's no toxicology report.
00:01:51.000 The toxicology report had to have been done.
00:01:53.000 Yeah.
00:01:54.000 Wouldn't you want to know, like, what kind of stuff this kid is on that made him want to do that, or if anything?
00:01:59.000 Yeah.
00:02:00.000 So my theory is it's almost as if people want us to think it's a conspiracy.
00:02:05.000 Like, it's almost like the whole thing is almost orchestrated.
00:02:09.000 It's so strange.
00:02:10.000 This is like the rapid cremation.
00:02:12.000 The whole thing was just completely bizarre.
00:02:13.000 And then you're exactly right.
00:02:14.000 No hearings, no nothing.
00:02:16.000 Now, having said that, I expect that this will change.
00:02:19.000 Do you think they're going to do a dive into what happened?
00:02:21.000 I mean, I would.
00:02:22.000 I don't know if they will, but I certainly would if I was in a position to do that.
00:02:25.000 I wonder what they can actually find.
00:02:27.000 I mean, I don't know if they wanted it to be a conspiracy that people talked about or if that's simply the best way to pull it off.
00:02:35.000 Yeah.
00:02:36.000 Or it's just, you know, as we saw, I think, in the hearing afterwards, maybe just a systemic collapse of competence.
00:02:41.000 There's also the fact that the news timeline today is so rapid.
00:02:48.000 When things are relevant and people are paying attention to them, you have a couple of days, even with an assassination attempt on a former president, where people were murdered.
00:02:59.000 And then it's in and out.
00:03:02.000 Yeah, that's right.
00:03:02.000 I think that's exactly it.
00:03:03.000 I think the news cycle now is like a two to three day social media firestorm and we just cycle from one to the next.
00:03:08.000 Yeah.
00:03:08.000 And we have the memory of goldfish and things that would have been era-defining just come and go with astonishing speed and shock.
00:03:16.000 By the way, I should say, I doubt there was a conspiracy.
00:03:19.000 I think anything's possible.
00:03:20.000 I think we have a competence collapse.
00:03:22.000 And I think we saw that on display when the director at the time testified.
00:03:26.000 Well, there's all the elements that it could have been a conspiracy.
00:03:29.000 It could have.
00:03:29.000 But this is kind of the thing, which is it also could have been systemic competence collapse.
00:03:33.000 And then it's like, okay, would it be better off if it looks like a conspiracy?
00:03:38.000 Okay, two timelines.
00:03:40.000 Which world would you rather live in?
00:03:41.000 The one with the conspiracies or the one with just incompetence everywhere?
00:03:45.000 Well, I think you have both simultaneously.
00:03:47.000 I don't think it's binary.
00:03:48.000 I think there's incompetence everywhere and conspiracies are legitimate.
00:03:52.000 They're real.
00:03:53.000 And that one seems like a conspiracy.
00:03:55.000 The fact that his house was professionally scrubbed, there's no social media record of this kid online, there's no nothing.
00:04:03.000 He's the only kid of his generation who's that fired up about politics to have no online footprint.
00:04:07.000 It just doesn't make any sense.
00:04:09.000 And he's a registered Republican.
00:04:10.000 The whole thing is so weird.
00:04:13.000 And he was a bad shooter and then he became a great shooter.
00:04:15.000 Well, he definitely trained.
00:04:16.000 You could train someone to become a good shooter.
00:04:19.000 This is all you have to do.
00:04:20.000 Don't move and do that.
00:04:22.000 Get all your mechanics in place.
00:04:24.000 Understand technique and positioning, breathing.
00:04:26.000 It's not like the most complicated thing from a prone position.
00:04:30.000 But the fact that he chose to use iron sights I thought was weird too.
00:04:34.000 There's a lot of weirdness to it.
00:04:36.000 You know, from 140 yards with a scope, that is an easy shot.
00:04:41.000 Well, then he could just like wander up.
00:04:43.000 That's the different timeline.
00:04:44.000 The different timeline is he has a scope and that's it.
00:04:47.000 And Trump's dead.
00:04:49.000 And then, boy, boy, do we live in a crazy world then.
00:04:54.000 Yeah, completely bizarre.
00:04:55.000 I mean, what do the streets look like right now?
00:04:57.000 What kind of, like, protests and riots and...
00:05:00.000 You think January 6th was nice?
00:05:02.000 If they had killed Trump, that would be January 6th on steroids everywhere.
00:05:06.000 Yeah, that's right.
00:05:07.000 And we would experience it.
00:05:08.000 I mean, you know, I don't know.
00:05:09.000 When I was a kid, my high school history teacher got us a bootleg copy of the Zapruder film.
00:05:15.000 Really?
00:05:16.000 What a gangster high school history teacher.
00:05:18.000 He was actually pretty focused on that.
00:05:19.000 He really loved the Kennedy assassination.
00:05:20.000 So we spent a lot of time on that.
00:05:22.000 And, you know, you kind of watch it frame by frame and you can kind of see what's happening, and there are lots of questions.
00:05:26.000 But, like, when things like that happen, you know, today, it's going to be an...
00:05:29.000 High definition, 4K, ultra, surround sound, on forever, right?
00:05:34.000 Playing out in real time forever.
00:05:35.000 And so like, yeah, I very much don't want to live in the world where those things happen.
00:05:38.000 Well, we are very fortunate.
00:05:40.000 I mean, like I said, after the election, I was like, wow, voting works.
00:05:48.000 Voting works.
00:05:49.000 That's nice.
00:05:49.000 Like they don't have the system completely rigged.
00:05:52.000 But they kind of tried to rig it at least with the media.
00:05:57.000 Well, the real rigging in the 2020 election – I mean, you can cast all your conspiracies upon it in terms of, like, mail-in ballots and all this jazz – but the real rigging was the collusion between social media companies and the government to suppress information that would have affected the outcome of the election.
00:06:17.000 That's legitimate.
00:06:18.000 Oh, yeah, for sure.
00:06:19.000 Yeah, that was like direct interference.
00:06:20.000 And it was aided and abetted by a lot of former intelligence officials and by the current administration.
00:06:25.000 Tons of pressure on censorship coming from the current administration and all their kind of arms of the censorship apparatus.
00:06:32.000 You have your hands in the tech community.
00:06:34.000 You have your fingers in all that jazz.
00:06:35.000 What was the general attitude about all that stuff when it was revealed?
00:06:41.000 How did your peers respond to that?
00:06:45.000 I think anybody in social media, the internet companies knew it.
00:06:48.000 So it was pretty widely understood.
00:06:50.000 I mean, look, there's nothing that happened at Twitter in the Twitter files that wasn't happening at all the other companies, right?
00:06:54.000 So it's a consistent pattern.
00:06:56.000 If you got the YouTube files, they would look exactly the same.
00:06:58.000 And of course, we should get the YouTube files, right?
00:07:00.000 And we probably will now. You know, this new administration is probably going to carve all this stuff open.
00:07:05.000 Yeah, no, look, it was a pattern.
00:07:06.000 And then look, you know, the companies bear a lot of responsibility and the people in the companies, you know, made a lot of, I think, bad judgment calls.
00:07:11.000 But the government, like the Biden White House was directly exerting censorship pressure on American companies to censor American citizens, which I think, by the way, is just flatly illegal.
00:07:20.000 Like, I think it's actually subject to criminal charges.
00:07:23.000 Like, I think there are people with criminal liability who are involved in this.
00:07:25.000 So there was that.
00:07:26.000 There were also members of Congress doing the same thing, which is also illegal.
00:07:29.000 And then there was a lot of funding of outside third party groups that were bringing a lot of pressure down on censorship.
00:07:35.000 Yeah.
00:07:35.000 And just an example of that is there's a unit at Stanford right next door to us that was the internet censorship unit that was funded by the US government and exerted tremendous pressure on the companies to censor.
00:07:46.000 And it was very effective at doing so.
00:07:48.000 Does it smell like sulfur when you walk those halls?
00:07:50.000 It is very dark and grim.
00:07:53.000 This whole thing is very bad.
00:07:56.000 Stanford?
00:07:56.000 Oh, yeah.
00:07:57.000 Stanford, by the way, another unit like that at Harvard.
00:07:59.000 A bunch of universities got pulled into this.
00:08:01.000 A lot of NGOs and nonprofits got pulled into this.
00:08:04.000 And so the Twitter file showed us kind of the basic roadmap.
00:08:08.000 And then there's this thing called the Weaponization Committee that Congressman Jordan is running that has also revealed a lot of this.
00:08:13.000 But I would imagine the new Trump administration is going to come in and carve all that wide open.
00:08:17.000 And I know that there are people being appointed to senior positions who are very determined to do that.
00:08:24.000 [Ad break; transcription resumes mid-read.]
00:09:06.000 Then, you can either go full DIY or let Blinds.com handle everything from measuring to installation.
00:09:13.000 These guys have covered over 25 million windows, all backed with 100% satisfaction guarantee.
00:09:22.000 You can't lose.
00:09:23.000 So head to Blinds.com now and grab those Black Friday deals all month long.
00:09:29.000 Use the code ROGAN for $50 off when you spend $500 or more.
00:09:34.000 Limited time offer.
00:09:36.000 Rules and restrictions apply.
00:09:38.000 See blinds.com for details.
00:09:40.000 One of the things that I found really kind of shocking was when they revealed how much money the Democrats had spent on the election and how much money was spent on activist groups.
00:09:51.000 It's like more than $100 million, right?
00:09:54.000 Yeah.
00:09:55.000 There's extensive government funding of politically oriented NGOs.
00:09:59.000 Yeah.
00:10:00.000 NGO is one of those great terms, like non-governmental organization.
00:10:03.000 All right.
00:10:03.000 Like, what the hell is that?
00:10:05.000 What is that?
00:10:06.000 Tell me.
00:10:07.000 I don't know.
00:10:07.000 Well, it's sort of a charity.
00:10:09.000 Sort of.
00:10:10.000 Sort of.
00:10:10.000 But most of the time, it's a political entity.
00:10:13.000 It's an entity with a political agenda.
00:10:15.000 But then it's funded by the government in a very large percentage of cases, including the NGOs and the censorship complex, like the government grants, National Science Foundation grants, like the State Department grants, direct money.
00:10:28.000 And then, okay, now you've got an NGO funded by the government.
00:10:30.000 Well, that's not an NGO. That's a GO.
00:10:34.000 Right.
00:10:34.000 Right.
00:10:35.000 And then you've got a conspiracy.
00:10:36.000 You know, like censorship, then you have a conspiracy because you've got government officials using government money to fund what look like private organizations that aren't.
00:10:44.000 And then what happens is the government outsources to these NGOs the things that it's not legally allowed to do.
00:10:51.000 Like what?
00:10:52.000 Like censorship.
00:10:53.000 Oh, okay.
00:10:54.000 Like violation of First Amendment rights.
00:10:56.000 Right.
00:10:56.000 So what they always say is the First Amendment only applies to the government.
00:10:59.000 The First Amendment says the government cannot censor American citizens.
00:11:02.000 And so here's what they do if you want to censor American citizens and you're in the government.
00:11:06.000 If you're smart, you don't do that.
00:11:07.000 What you do is you fund an outside organization and then you have them do it.
00:11:11.000 Boy.
00:11:11.000 Right.
00:11:12.000 And that's what's been happening.
00:11:13.000 Right.
00:11:14.000 That's like hiring a hitman.
00:11:15.000 Like it's not okay to murder someone, but you can hire someone to murder someone and then you're clean.
00:11:19.000 Yeah.
00:11:19.000 And if you want to solve a murder, it's not enough to find out who the hitman was.
00:11:22.000 You have to find out who paid the hitman, right?
00:11:24.000 You want to work your way up the chain.
00:11:25.000 And so a lot of this traces into the White House.
00:11:27.000 The best defense the companies have is that a lot of this happened under coercion, right?
00:11:31.000 Because when the government puts pressure on you, like it might be a phone call.
00:11:36.000 It might be a letter.
00:11:37.000 It might be the threat of an investigation.
00:11:38.000 It might be a subpoena.
00:11:39.000 It could take many forms.
00:11:40.000 But when the government does that, it carries, you know, that's a very powerful message.
00:11:45.000 It's like a message from a mob boss, right?
00:11:47.000 Right.
00:11:47.000 Don't you want to do me a favor?
00:11:49.000 Yes, Mr. Gambino, I do.
00:11:52.000 I like my corner store.
00:11:54.000 I'd like it to not catch on fire tonight.
00:11:56.000 And so there's this overwhelming hammer blow of pressure that comes in.
00:12:00.000 And by the way, even when the government doesn't talk to you directly, if they're funding the organization that is talking to you, then it's very clear what's happening.
00:12:06.000 And so you come under incredible pressure.
00:12:08.000 And so this whole chain of government, activists, universities, and companies was corrupted.
00:12:13.000 And then on top of that, people in the companies in a lot of cases made a lot of decisions that I think they're probably increasingly starting to regret.
00:12:18.000 What was confusing to me was that the government spent so much money on these activist groups during the election and I didn't understand like what purpose that would serve.
00:12:30.000 What function would it serve to spend all this money on these activist groups that already support you, supposedly?
00:12:39.000 Are you bribing them to support you?
00:12:42.000 Are you paying them to go on talk shows and consistently repeat the government's message, the current administration's message?
00:12:51.000 What would be the function of that?
00:12:53.000 So I think in some cases, it's just pay to play, right?
00:12:56.000 So for example, we know that Kamala's campaign paid certain on-air personalities.
00:13:03.000 To your point.
00:13:03.000 People very supportive of Kamala then gave her interviews that went really well.
00:13:07.000 And so I think in some cases, you just have straight pay to play.
00:13:09.000 That's just how that system works.
00:13:11.000 It's just expected.
00:13:12.000 Then I think you have other organizations like these NGOs and other activist groups where they're actually – they actually do field activities, right?
00:13:18.000 And so there's – maybe there's a get out the vote component or there's social media influence downstream component or some other kind of field activity that's happening in support of the election.
00:13:25.000 I just didn't think that they – like when – it's still unclear.
00:13:31.000 Whether or not celebrities got paid to endorse her?
00:13:33.000 Right.
00:13:34.000 Right?
00:13:35.000 Have you...
00:13:35.000 They've mixed it up because there's...
00:13:37.000 Like Oprah says, her production company was paid to put on the production, but she was not paid for the interview.
00:13:42.000 Yeah, whatever.
00:13:43.000 But it was, you know, whatever, $2 million, $2.5 million.
00:13:46.000 It was initially listed as one, and it turned out it was 2.5.
00:13:48.000 Right.
00:13:49.000 But like, if I have a production company...
00:13:51.000 Right.
00:13:51.000 And my production company gets paid $2.5 million to endorse Trump.
00:13:55.000 And then I go, I didn't get any of that money.
00:13:57.000 People are like, shut the fuck up.
00:14:00.000 It's your company.
00:14:01.000 What are you talking about?
00:14:02.000 And also, how much does it cost to do an event?
00:14:05.000 Yes.
00:14:06.000 How does it cost $2.5 million to put on an event?
00:14:09.000 Like, are you feeding people gold sandwiches?
00:14:11.000 Like, what are you doing?
00:14:12.000 Like, how is that possible?
00:14:13.000 Yeah, exactly.
00:14:14.000 So, yeah.
00:14:14.000 And then the fact that it's deliberately obfuscated, of course, is a clue.
00:14:18.000 I just thought the really bizarre one was the allegations.
00:14:23.000 And I would say unsubstantiated allegations.
00:14:29.000 [Transcription gap]
00:14:47.000 Well, I wonder if Lizzo was like, I didn't get shit.
00:14:50.000 I would say it.
00:14:51.000 But why haven't they said it?
00:14:52.000 Like, Beyonce has been mum about the whole thing.
00:14:55.000 I think I would probably say.
00:14:57.000 Like, I didn't get any money to do that.
00:14:59.000 But that was a weird one, too, because a lot of people thought Beyonce was going to do a concert.
00:15:03.000 And she just went out there and talked.
00:15:04.000 And everybody was like, what the fuck?
00:15:06.000 Because they all came to see a free Beyonce concert.
00:15:08.000 And then she just said, I want to support Kamala Harris.
00:15:12.000 And everybody's like, good, good.
00:15:13.000 Now, if you like it, then you should have put a ring on it.
00:15:16.000 Right?
00:15:18.000 Come on!
00:15:19.000 We love your songs.
00:15:20.000 That's what we're here for.
00:15:22.000 I just didn't think it was even possible. I didn't think a candidate would ever pay for an endorsement.
00:15:31.000 Yeah.
00:15:31.000 I mean, the fact that it was even alleged.
00:15:33.000 Yeah.
00:15:34.000 Well, you know, and then there's, of course, there's the even stinkier version, arguably, which is all the social media influencer campaigns now.
00:15:39.000 There's, you know, a tremendous amount of payola.
00:15:41.000 That's for sure.
00:15:42.000 Right.
00:15:42.000 Because I know people personally who are approached multiple times and offered a substantial amount of money to post things in support of Harris.
00:15:51.000 Yeah.
00:15:52.000 And, like, I'm pro-capitalism and I'm happy for them that they get paid, but, like, maybe we should know.
00:15:55.000 Yeah.
00:15:56.000 Yeah, that seems like something you should absolutely have to disclose.
00:15:59.000 It should be like, say if I was going to do an ad for, you know, whatever, a certain coffee company, Black Rifle Coffee, and I did it on my Instagram, I'd have to say ad.
00:16:08.000 I'd have to say this is an ad.
00:16:10.000 It's a paid ad.
00:16:12.000 And that's part of the thing, you know?
00:16:15.000 Unless it's your company.
00:16:16.000 Like, you're supposed to say, they're paying me to do this.
00:16:19.000 Yeah.
00:16:20.000 Well, you know, look, the good news with these is each cycle we learn a lot about how politics works.
00:16:24.000 We learn a lot about how fake it is.
00:16:26.000 We learn a lot about the things we put up with for a very long time.
00:16:28.000 I mean, everybody's always, like, freaked out by, like, whatever the new guy does, but, like, this real scandal in most cases, I think, is just the way the system already works.
00:16:35.000 It's a sneaky system.
00:16:36.000 Well, another fascinating aspect of the system that we learned about this time around is the uncontrolled part of it: what Trump called earned media was much more powerful than anything else.
00:16:51.000 The uncontrolled version of it.
00:16:53.000 Like, one of the things that, unfortunately for them, mass media or corporate media has done is they've diminished their credibility so much, so much so that, like, Joy Reid was on TV today saying that Trump is going to shoot protesters and just wild,
00:17:13.000 unsubstantiated, crazy shit.
00:17:15.000 And the more they do stuff like that, the more that they say things like that, the more it diminishes their impact and the more it drives people to independent media sources.
00:17:26.000 Yeah.
00:17:26.000 I'm sure you've seen the ratings collapse – MSNBC is down to like 50,000 people in the 18 to 49 demo.
00:17:35.000 That is so wild.
00:17:36.000 Which is tiny.
00:17:37.000 It's so crazy.
00:17:38.000 It's really tiny.
00:17:38.000 So I think that's happening.
00:17:40.000 The Gallup organization has done polls on trust in institutions, including, you know, media for the last 60 years.
00:17:46.000 It's been a steady slide down.
00:17:48.000 And in the last, you know, four years, it's fallen off a cliff.
00:17:51.000 I think it's real.
00:17:52.000 Oh, there's another study that came out.
00:17:53.000 The kids are now watching a lot less TV. Kids are just giving up on TV. And they're just, you know, they're on YouTube and TikTok and Instagram and other things.
00:18:01.000 And so, like, I think it's tipping.
00:18:03.000 A question I've been asking myself is when will the actual, you know, famously, 1960 was the first television election, right?
00:18:10.000 You know, sort of legend has it because it was the one where the televised debate really mattered.
00:18:13.000 And if you saw the televised debate, you saw Confident Kennedy and Nervous Nixon.
00:18:17.000 And if you heard it, you experienced something different.
00:18:19.000 And handsomeness came into effect.
00:18:21.000 And vitality, and health, and all these things.
00:18:24.000 Sort of positive spirit, positive energy.
00:18:27.000 I'm actually not...
00:18:28.000 This might have been the first internet election, or maybe we actually haven't had it yet.
00:18:32.000 I feel like we're really close to the first internet election, but maybe it's not all the way there.
00:18:36.000 I think this is it.
00:18:37.000 There's an argument that this is it, right?
00:18:40.000 All the stuff, especially in the last six months, all the podcasts, obviously, and your show played a big role, but...
00:18:46.000 Like, I think, if you're going to run in '28, there's a fully internet-native way to run these campaigns that might literally involve zero television advertising. And maybe you don't even need to raise that money, and maybe, to your point, if you have the right message, you just go straight direct.
00:19:00.000 Yeah, I think it's a completely different way to do this. I think that's the only way now. And I think if you do pay people, it's not going to have the same impact. You know, these Call Her Daddy shows and all these different shows that she went on, I'm sure they had an impact. But I think that in the future,
00:19:17.000 I'm sure they're scrambling to try to create their own version of this show.
00:19:22.000 This is one thing that keeps coming up, like we need our own Joe Rogan.
00:19:26.000 But they had me.
00:19:28.000 Well, number one, they had you.
00:19:29.000 Number one, they had you.
00:19:30.000 They had you and they drove you away.
00:19:32.000 But they also have ABC, NBC, CBS, CNN. Right, but that doesn't work anymore.
00:19:38.000 It's like you're using smoke signals and everybody else has a cell phone.
00:19:42.000 It doesn't work.
00:19:43.000 Yeah, that's right.
00:19:44.000 That's right.
00:19:45.000 It's a bizarre time.
00:19:47.000 It's really interesting, though.
00:19:48.000 As you said, we're in a great timeline.
00:19:51.000 And I think...
00:19:53.000 It's a fascinating timeline, too, because there's so much uncertainty.
00:19:56.000 And there's so much, right, we're on the verge of AI. You know, OpenAI's Altman has said now that he thinks 2025 will be the year that AI becomes sentient, whatever that means.
00:20:08.000 You know, artificial general intelligence will emerge.
00:20:11.000 And who knows how that affects...
00:20:14.000 I've said publicly, and I'm kind of half-joking, that we need AI government.
00:20:19.000 Yeah.
00:20:19.000 It sounds crazy to say, but instead of having this alpha chimpanzee that runs the tribe of humans, how about we have some really logical, fact-based program that makes it really reasonable and equitable in a way that we can all agree to.
00:20:37.000 Let's govern things in that manner.
00:20:39.000 Right.
00:20:39.000 So you can actually simulate this today, because you can go on these systems, ChatGPT or Claude or these others, and you can ask, you know, how should we handle issue X? How should this be run?
00:20:48.000 Yeah, we've done that.
00:20:49.000 Right.
00:20:49.000 How should the Department of Energy do whatever, nuclear policy or whatever?
00:20:52.000 And what I find when I do that is I discover two things.
00:20:54.000 Number one, of course, these things have the same problem.
00:21:02.000 Yeah.
00:21:04.000 Yeah.
00:21:06.000 Yeah.
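
[As a minimal sketch of what "going on these systems and asking how issue X should be handled" looks like in practice, assuming the OpenAI Python client; the model name and prompts are placeholders, since the conversation doesn't specify any tooling:]

```python
# Hypothetical sketch: asking a chat model a policy question,
# the kind of "simulation" described above.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system", "content": "You are a neutral policy analyst."},
        {"role": "user", "content": "How should the Department of Energy "
                                    "set nuclear policy for the next decade?"},
    ],
)
print(response.choices[0].message.content)
```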
00:21:19.000 I mean it might be the way to go, which is so horrifying for people to think because everyone is worried about the Terminators taking over the world and like if that's the first step is we let them govern us.
00:21:30.000 Well, there's nothing stopping a politician from using this.
00:21:33.000 There's nothing stopping a policymaker from using it as a tool.
00:21:35.000 You start out – at the very least, you start out using it as a tool.
00:21:38.000 There's nothing to prevent – like for example, I think military commanders in the field are going to have basically AI battlefield assistance that are going to advise them on strategy, tactics and how to win conflicts and then it will start to work its way up and then they will be doing war planning.
00:21:49.000 And then if you're a general, if you're a sergeant or a colonel or a general, it's going to just mean you perform better.
00:21:55.000 So maybe there's like the sort of man-machine kind of symbiotic relationship.
00:21:59.000 And you could imagine that happening more in the policy process and in the political process.
00:22:03.000 And there's also AI-controlled jets, which are far superior.
00:22:07.000 Mike Baker was telling us about that.
00:22:09.000 They did these simulated dogfights and the AI-controlled jets won 100% of the time over humans.
00:22:15.000 Yeah.
00:22:16.000 And there's a bunch of reasons for that.
00:22:17.000 And part of it is just simply the speed of processing and so forth.
00:22:20.000 But another big thing is if you don't have a human in the plane.
00:22:23.000 You don't have the spam in the can.
00:22:25.000 You don't have the human body in the plane.
00:22:28.000 You don't have to keep a human being alive, which means you can be a lot faster and you can move a lot more quickly.
00:22:32.000 G-forces.
00:22:33.000 Much higher G-forces.
00:22:34.000 Yeah.
00:22:35.000 And then there's no option for someone to go crazy.
00:22:39.000 Yeah.
00:22:40.000 That's also right.
00:22:41.000 Yes.
00:22:41.000 Exactly.
00:22:42.000 There's no human element, which is a real element.
00:22:46.000 Yeah.
00:22:46.000 No, I think it's going to be common to have Mach 5 jet drones within a few years.
00:22:52.000 And they'll be a fraction of the size of the current manned planes, which means you can have a lot more of them.
00:22:57.000 And so you kind of want to imagine a thousand of these things coming over the horizon right at you.
00:23:01.000 And it really changes.
00:23:04.000 It's actually, I think, going to be very interesting.
00:23:05.000 It really changes the fundamental equation of war in the following sense.
00:23:09.000 Fundamentally, in the past, the people who won wars were the people who had the most men and the most material.
00:23:14.000 Right.
00:23:14.000 So you just needed the most soldiers and you needed the most equipment.
00:23:16.000 And in this drone world that we're talking about, it's going to be the people with the most money and the best technology.
00:23:21.000 Yeah.
00:23:22.000 Right.
00:23:22.000 And so, for example, small states, you know, small advanced states like Singapore will be able to punch way above their weight and then kind of large sort of economically or technologically backward states that normally would have won will now lose.
00:23:34.000 And so it's going to be a recalibration.
00:23:36.000 And then the good news is you're not putting soldiers at risk, right?
00:23:39.000 So you'll have a lot less death.
00:23:41.000 The bad news, arguably, is it'll be easier to get into conflicts because you're not putting soldiers at risk.
00:23:45.000 So there's going to have to be a recalibration of when you actually lean into an attack.
00:23:49.000 I'm sure you're aware of all this UAP disclosure jazz that you see on television.
00:23:56.000 The more I look into it, the more I think at least a percentage of it, a healthy percentage of it, is bullshit.
00:24:05.000 And there's probably some government projects where they've developed some very sophisticated propulsion systems that they've applied to drones and that that's what these people are seeing and this is one of the reasons why they continually have sightings over secured military spaces like out in the eastern seaboard,
00:24:24.000 like there's areas over Virginia where they continually see them, in San Diego, they see them off the coast of San Diego, where there's a place where you would test stuff like that.
00:24:34.000 Well, so, of course, we know that that was the case for a very long time, for sure, from the 50s through the 80s, because the development of stealth was highly classified, and the SR-71 was brand new at one point, and so you had these, like, you know, alien, you know.
00:24:47.000 Do you pay attention to any of that stuff at all?
00:24:48.000 Of course, of course, 100%.
00:24:49.000 Yeah.
00:24:50.000 And then, by the way, we're not the only ones.
00:24:52.000 And so, you know, my speculation would be that some of the military-based stuff is, you know, the Chinese doing something similar.
00:24:58.000 And, you know, we got a glimpse into that with the balloon.
00:25:01.000 Well, that thing was goofy, though.
00:25:02.000 We got shut down.
00:25:03.000 But still, the fact that the Chinese are flying surveillance balloons over American territory, and they were able to slip through our early warning systems and just, like, you know, loiter above military bases and, like, you know, take lots of, you know, imagery and do whatever scans they do.
00:25:15.000 Yeah.
00:25:15.000 And like literally nothing was happening and we didn't even know they were there most of the time.
00:25:18.000 And so like, you know, that's like a tip of the ice.
00:25:20.000 It feels like a tip of the iceberg kind of thing where if they were doing that, there are probably other things going on.
00:25:25.000 Well, I've read that someone had commented that similar things had happened during the Trump administration, but they didn't tell Trump because they didn't want him to shoot them down.
00:25:33.000 Interesting.
00:25:35.000 For the record, I'm pro-shooting them down.
00:25:37.000 Yeah, I think you should probably shoot them down and take pictures of shit.
00:25:40.000 There's not even people up there.
00:25:42.000 Fucking shoot them down.
00:25:43.000 What's the problem?
00:25:46.000 Do you think there are any of those that are not of this world?
00:25:50.000 I don't think there's any way to know from the outside.
00:25:53.000 Have you ever, like, pondered it late at night, sitting on your porch, staring up at the sky?
00:25:57.000 Of course, of course.
00:25:58.000 Well, you know, it raises – number one is, is there or not?
00:26:01.000 And then if it is, you know, did it recently get here?
00:26:04.000 Have they been here for a long time?
00:26:06.000 You know, did they arrive 5,000 years ago?
00:26:08.000 Tucker thinks they're demons and angels.
00:26:09.000 You know, I mean, demons and angels, are demons and angels real?
00:26:14.000 It's like, you know, literally, you know, probably not, but like, certainly they're metaphorically real.
00:26:18.000 And are there kind of shades of gray between literal and metaphorical?
00:26:20.000 Well, the actions are certainly demonic and angelic, right?
00:26:24.000 Actions of human beings, mass, things that happen in the world are uplifting or horrific.
00:26:30.000 Yeah.
00:26:30.000 Evil people doing evil things are possessed.
00:26:32.000 I mean, they're possessed by something.
00:26:34.000 Like, something is going on.
00:26:35.000 And like, you know, what's the dividing line between, you know, an actual supernatural force and some sort of psychological, sociological thing that's so overwhelming that it just takes control of people and drives them crazy?
00:26:45.000 Like, you might as well call that a demon.
00:26:47.000 Yeah.
00:26:48.000 It's fascinating because, like, when you think about from theological terms – like, when you think of it from a religious perspective, you know, people would apply what would a demon do, what would angels do, what is the will of God and what is,
00:27:04.000 like, the evils of the worst aspects of humanity.
00:27:09.000 You could apply them to so many things in the world, but we're very reluctant to say that something is demonic, even though it's clearly demonic, clearly in action.
00:27:22.000 This is what a demon would do.
00:27:24.000 A demon would possess people to gun down children and use drones to shoot down a wedding party.
00:27:32.000 A demon would do that.
00:27:33.000 Exactly.
00:27:34.000 So a friend of mine is a religious scholar.
00:27:36.000 He teaches at Catholic University and he's a religious history scholar.
00:27:39.000 And he says that medieval people were psychologically better prepared for the era ahead of us with AI and robots and drones everywhere than we are.
00:27:49.000 Because medieval people took it for granted that they lived in a world with higher powers, higher spirits, angels, demons.
00:27:55.000 All kinds of supernatural entities.
00:27:57.000 It was just assumed to be true.
00:28:00.000 In the world we're heading into, that we're arguably already in, there are going to be these new forces, these new entities running around doing things.
00:28:08.000 We're going to struggle.
00:28:10.000 We're going to catastrophize.
00:28:11.000 We're going to conclude AI is the end of the world.
00:28:14.000 The medievals would have said, oh, it's just another spirit.
00:28:16.000 It's just another kind of entity.
00:28:18.000 Yeah, it's better than humans at some things, but so are angels.
00:28:21.000 And so we're going to have to like change our mentality.
00:28:23.000 We're almost going to have to become a little bit more medieval.
00:28:25.000 We're going to have to open up our minds to the kinds of entities that we're dealing with.
00:28:28.000 Wow.
00:28:29.000 Yeah.
00:28:29.000 Which also could help us actually deal with people.
00:28:32.000 Like maybe there's an explanatory way to think about human behavior here that seems less rational but might actually be more rational.
00:28:39.000 Well, you express yourself very brilliantly in describing the current state of woke ideology as a religion.
00:28:48.000 Yeah, that's right.
00:28:49.000 And that the way you described it was brilliant because you were saying that it has all the elements, excommunication, adherence to a very strict doctrine, all these different aspects of it, saying things that everyone knows to be illogical and nonsensical, but you must repeat it.
00:29:04.000 You know, these things are indicative of people that are in cults or people that are a part of like a very – like a serious fundamental religion.
00:29:12.000 Yeah.
00:29:13.000 Well, I mean, of course, the big difference between woke and those traditional religions is woke has no concept of redemption.
00:29:17.000 Right.
00:29:17.000 No concept of forgiveness.
00:29:19.000 Right.
00:29:19.000 Which is a very evil religion.
00:29:21.000 You do not want that.
00:29:23.000 Yeah.
00:29:24.000 Well, it's ill-conceived, right?
00:29:27.000 Because it's like immature.
00:29:28.000 It's an immature religion.
00:29:30.000 Yes.
00:29:30.000 It's absolutist.
00:29:31.000 It's inherently totalitarian.
00:29:32.000 It has to be because it can permanently destroy people.
00:29:35.000 Yeah.
00:29:35.000 Woke also understands something that the Greeks understood, which is that being ostracized and being put to death are the same thing.
00:29:41.000 And so when the Greeks sentenced somebody like Socrates to death, they gave them the option of just leaving.
00:29:46.000 But the problem was- Really?
00:29:48.000 Yes.
00:29:49.000 Socrates could have just walked out and left.
00:29:50.000 No kidding.
00:29:51.000 But the reason that was considered equivalent sentences is because at that time, if you were not a citizen of a particular city, you would get killed in the next city.
00:29:58.000 You'd be identified as the enemy presumptively and killed.
00:30:00.000 And so there was no way to survive without being part of your community.
00:30:03.000 Wow.
00:30:04.000 And that's what the wokes figured out: you can do the same thing.
00:30:06.000 If you're able to, like, nail somebody on charges of having done something unacceptably horrible, then you make them toxic, and all of a sudden they lose friends, they lose family, they can't get work. Before you know it, they're living severely diminished,
00:30:22.000 damaged lives.
00:30:23.000 Some people then go on to kill themselves.
00:30:25.000 I don't know if you've been paying attention at all to Blue Sky, but I have multiple friends that have accounts on Blue Sky that are very sophisticated trolls and are pushing the woke agenda to a satirical point,
00:30:43.000 like parody.
00:30:45.000 But like on the edge where you're not quite sure, they'll say enough real things that make sense and talk about their own anxieties and personal issues with stuff and then say fucking ridiculous shit.
00:30:56.000 And it's fascinating.
00:30:58.000 I bet it works.
00:30:59.000 It does work.
00:31:00.000 Yeah.
00:31:00.000 What's so terrifying is all the outcasts of Twitter, all the people that were like, I can't take this.
00:31:05.000 A few of them come back, which is wonderful.
00:31:08.000 I love when they come back.
00:31:09.000 I'm gone.
00:31:10.000 I'm going to go to Blue Sky.
00:31:11.000 Fuck you people.
00:31:11.000 A bunch of them went to Threads for a while, like Stephen King, he went to Threads.
00:31:14.000 Came right back.
00:31:15.000 They all come right back.
00:31:17.000 The marketplace of ideas, like, okay, you could go to a fruit stand in the middle of the fucking desert, and that's a marketplace, or you can go to the farmer's market where everybody's there.
00:31:28.000 Where are you going to go?
00:31:29.000 You're going to go to the farmer's market.
00:31:30.000 There's tons of people.
00:31:31.000 It's a lot of fun.
00:31:32.000 A lot of activity.
00:31:34.000 That fruit stand's fucking barren and deserted.
00:31:36.000 There's no one there.
00:31:37.000 There's very few choices.
00:31:38.000 It's not fun.
00:31:39.000 And it's win-win to have them back on Twitter because it's good for them because they want to proselytize.
00:31:44.000 And so they need an audience.
00:31:45.000 So they win.
00:31:47.000 And then we win because it's really, really fun to dunk on them.
00:31:50.000 But it's also weird for them to not want any pushback at all.
00:31:55.000 Like, don't – isn't the whole thing supposed to be about an exchange of ideas?
00:32:00.000 Like, if you have a controversial idea and someone disagrees with it, don't you want to hear that position?
00:32:04.000 I know I do.
00:32:06.000 I want to hear it.
00:32:07.000 Even if I vehemently disagree with it, I want to hear it.
00:32:10.000 I want to know where – how do you – how does your brain work?
00:32:12.000 How are you coming to these conclusions?
00:32:14.000 What makes you think this way?
00:32:15.000 Who are you?
00:32:16.000 What do you like?
00:32:16.000 I want to go on Instagram.
00:32:18.000 I want to look at your pictures.
00:32:19.000 I want to see what you're up to.
00:32:20.000 What are you doing?
00:32:21.000 What are you doing with your free time?
00:32:23.000 What are you complaining about?
00:32:25.000 It's a fascinating education on human psychology and to watch people express themselves publicly and then also be attacked publicly by strangers, which never happens in the real world, like at scale, the way it happens on social media.
00:32:42.000 And I think it's an amazing time for people to examine ideas.
00:32:47.000 If you can handle it.
00:32:49.000 Yeah.
00:32:49.000 My favorite term is marketplace of idea.
00:32:51.000 Yeah.
00:32:53.000 You could have a marketplace of ideas.
00:32:55.000 It's just going to be one idea.
00:32:56.000 So Blue Sky is a marketplace of ideas.
00:32:57.000 Sure.
00:32:58.000 Yeah.
00:32:58.000 X is the marketplace of ideas.
00:33:00.000 That final S makes a lot of difference.
00:33:02.000 Yeah.
00:33:03.000 Right.
00:33:03.000 But the thing about X is it really is diverse.
00:33:06.000 Yeah.
00:33:07.000 I follow tons of like kooky leftist progressive nutbags that like have bizarre takes on everything.
00:33:15.000 Yeah.
00:33:15.000 And they were 100% convinced that Kamala Harris is going to sweep all the swing states, including Iowa.
00:33:21.000 They were all in.
00:33:22.000 And I was like, this is wild.
00:33:24.000 Like, is that going to happen?
00:33:25.000 Are they right?
00:33:26.000 Like, this is crazy.
00:33:27.000 But they were 100% convinced.
00:33:30.000 And it's fascinating to see all these different kinds of people, to see the Charlie Kirks and the full-on left-wing kooks and see them all together.
00:33:40.000 Right.
00:33:41.000 You need that.
00:33:42.000 Yeah.
00:33:42.000 Look, so one of the ways I think to think about this is all new information is heretical by definition, right?
00:33:48.000 So anytime anybody has a new idea, it's a threat to the existing power structure.
00:33:52.000 So all new ideas start as heresies.
00:33:54.000 And so if you don't have an environment that can tolerate heresies, you're not going to have new ideas and you're going to end up with complete stagnation.
00:34:01.000 If you have stagnation, you're going to go straight into decline.
00:34:04.000 Yeah.
00:34:05.000 Right.
00:34:05.000 And I think this is the aberrant nature.
00:34:07.000 This is the timeline split.
00:34:08.000 I think the last decade has just been like a really weird aberrant time where things have not been working like they should.
00:34:13.000 And, you know, in 2015, Twitter called itself the free speech wing of the free speech party, right?
00:34:19.000 And Elon has restored it.
00:34:22.000 Right.
00:34:23.000 He brought it back to something that everybody thought was completely normal 10 years ago.
00:34:26.000 Yeah.
00:34:27.000 And I think, I hope, this last 10 years increasingly is just going to feel like a bad dream.
00:34:31.000 Like, I can't believe we tolerated the level of repression, right, and anger and, you know, emotional incontinence and, you know, cancellation campaigns.
00:34:38.000 Emotional incontinence is a great term.
00:34:40.000 Yes, there has been a lot.
00:34:43.000 That's really what it's like.
00:34:44.000 You just diarrhea in your emotions.
00:34:47.000 Just spraying rage in all directions.
00:34:49.000 And so, you know, I'm very, at the moment at least, very optimistic that there's a cultural change happening here that's even more profound than the political change.
00:34:56.000 I have a lot of respect and also sympathy for Jack Dorsey.
00:35:00.000 I like him a lot as a human being.
00:35:02.000 I think he's a brilliant guy and I think he had very good intentions.
00:35:05.000 But he was a part of a very large corporation and he had an idea for a Wild West Twitter.
00:35:11.000 He wanted to have two versions of Twitter.
00:35:13.000 He wanted to have the Twitter that was pre-Elon where there's moderation and you can't dead name someone and all that jazz.
00:35:20.000 And then he wanted to have an additional Twitter that was essentially what X is now.
00:35:25.000 And he just didn't have the ability to push that through with the board and the executives and all the people that, you know, were fully on board with woke ideology.
00:35:36.000 So the experience that people like Jack have had running these companies in the last decade has been...
00:35:41.000 And I don't mean to let them off the hook for their decisions, but just the lived experience, as they say, of what these people's lives have been like is just daily pounding.
00:35:47.000 Just every single day, it's like meteor strikes coming down from the sky, exploding around you, getting attacked from every conceivable direction, being called just incredibly horrible things, being attacked from many different directions.
00:35:57.000 Well, he's already left Blue Sky.
00:35:59.000 Well, yeah.
00:36:00.000 So the irony of Jack is that Jack then created Blue Sky, which is kind of exactly the opposite of what he had in mind.
00:36:08.000 Oh, by the way, you know, the new name for it, of course, is Blue Cry.
00:36:11.000 Ah!
00:36:12.000 Yes!
00:36:12.000 I didn't know that.
00:36:13.000 Exactly.
00:36:14.000 Yeah, but he's also got, you know, look to his credit, he's still trying.
00:36:16.000 And so he's got Nostr, you know, which is another thing.
00:36:19.000 What is it?
00:36:19.000 It's called Nostr, N-O-S-T-R. Oh, okay.
00:36:21.000 It's his kind of new, it's actually his third.
00:36:22.000 He's going to keep swinging.
00:36:24.000 Look at full credit, full credit.
00:36:25.000 He's going to keep swinging.
00:36:26.000 And by the way, full credit, he supported Elon.
00:36:28.000 You know, they've mixed up a little bit, but by and large, he's been very supportive and was very supportive at a key time.
00:36:32.000 Yeah.
00:36:33.000 Well, I also found it fascinating that when there was any sort of a right-wing branch of that stuff, like Gab or any of these, they would immediately be infiltrated by bots as well, like my friends that troll on Blue Sky.
00:36:46.000 But these are Nazis.
00:36:47.000 Like, these are Nazi bots.
00:36:49.000 These are people that would just spew horrible hate.
00:36:51.000 And then Gab would be labeled, oh, this is where the Nazis go.
00:36:56.000 This is a right-wing psychopath social media app.
00:37:00.000 Yeah.
00:37:00.000 And I think, frankly, I think you get the same thing if you start out – I think if you start out overtly political on either side, I think that's what you end up with.
00:37:06.000 Yes.
00:37:06.000 And so I just – like that doesn't seem to be an effective route to market.
00:37:10.000 It seems like you have to start from the beginning as a general purpose service, but you need to have some sense of the actual guardrails you're going to have around – and by the way, every social media service, internet service that ever works, there's always some content filters and restrictions because you can't have child porn, you can't have violence,
00:37:26.000 you can't have terrorist recruitment.
00:37:27.000 And even the First Amendment, there's like a dozen carve-outs that the Supreme Court has ruled on over time that are things like that that you can't just say.
00:37:34.000 I can't say, let's go join ISIS and let's go attack Washington.
00:37:37.000 It's literally not allowed.
00:37:39.000 So there's always some controls, but you need to have a spine of steel.
00:37:42.000 If you're going to hold back the censorship pressure.
00:37:45.000 And there's basically Substack, a company I'm involved in, is doing very well.
00:37:52.000 I love Substack.
00:37:53.000 Smaller than Twitter, but doing extremely well.
00:37:54.000 Fantastic.
00:37:55.000 And they've done a great job, I think, of holding the line on this stuff.
00:37:57.000 Yes.
00:37:57.000 And then, obviously...
00:37:58.000 And it's an amazing resource.
00:38:00.000 There's so many brilliant people on Substack.
00:38:02.000 I love Substack.
00:38:03.000 I get a large percentage of my news from Substack.
00:38:07.000 It's really good and it's so valuable and it's such a great place for people who are independent journalists and physicians and scientists to publish their ideas and actually get paid for it by the people who subscribe to it.
00:38:20.000 I think it's fantastic.
00:38:22.000 And there's lots of people on the far left and the far right.
00:38:24.000 Yes.
00:38:24.000 So you actually have the full spectrum.
00:38:26.000 When a far-left person gets upset – somebody working at the New York Times is mad because they're not far left enough – they quit and they start on Substack.
00:38:32.000 And Substack welcomes them in.
00:38:33.000 Yes.
00:38:34.000 Which is why they don't devolve into a gab or something like that.
00:38:37.000 Because it really is a platform.
00:38:39.000 It really does welcome all conversations.
00:38:41.000 Well, it's also very difficult to subvert in that same way because Substack is essays, right?
00:38:48.000 You're reading people's essays and papers on things.
00:38:51.000 And, like, these are long-form things that are very well – in a lot of cases, very well-researched.
00:38:56.000 And it's not the kind of thing you could just shitpost on.
00:38:59.000 There are comments, but it's just like they don't hold the weight that the actual article holds.
00:39:04.000 Right.
00:39:04.000 So my partners at work, they've observed that I tend to be able to inflame situations from time to time.
00:39:10.000 I tend to be provocative and get people really upset.
00:39:12.000 And so the rule they've asked me to comply with is I'm allowed to write essays, for example, and I'm allowed to go on long-form podcasts, but I'm not allowed to post.
00:39:21.000 Really?
00:39:22.000 Right, exactly.
00:39:24.000 It's the rule.
00:39:25.000 It's the rule.
00:39:25.000 Now, by the way, I struggle against the rules because I can't help myself from time to time.
00:39:29.000 Why do they want you to have rules?
00:39:30.000 Because otherwise I inflame people too much.
00:39:33.000 I drive people too crazy.
00:39:34.000 Do you do it on purpose?
00:39:35.000 Sometimes.
00:39:36.000 I mean, sometimes you have to.
00:39:39.000 Sometimes it's unintentional.
00:39:41.000 Did you ever hear about when the entire country of India was mad at me?
00:39:45.000 No!
00:39:45.000 Oh, I spent one night with the entire country of India basically wanting to kill me.
00:39:49.000 It was incredible.
00:39:51.000 Oh my goodness, what happened?
00:39:52.000 I was in a Twitter debate with somebody back when I was just posting freely on Twitter and it was a debate about economics and the topic of colonialism came up.
00:39:59.000 And I made a comment in a long thread about colonialism and it turns out the Indians are still extremely sensitive.
00:40:07.000 About the topic of colonialism.
00:40:09.000 And I did not understand the mindset and the historical orientation.
00:40:14.000 And I tripped a line.
00:40:15.000 And I stayed up all night.
00:40:17.000 And I went hyperviral in every time zone in India.
00:40:20.000 Every hour there would be like an entirely new activation.
00:40:22.000 And I was like front page headline news, top of the hour TV news, like all the way across India.
00:40:29.000 Wow.
00:40:29.000 Yes, it was like a...
00:40:30.000 I do not recommend this as an experience.
00:40:32.000 By the way, I learned how many incredible Indian-American friends I have because they all rallied to my side and said, you know, Mark's not literally calling for the recolonization of India.
00:40:43.000 There's a problem with the language barrier as well, right?
00:40:46.000 Language.
00:40:46.000 And then just historical context.
00:40:49.000 Americans have a different...
00:40:51.000 Americans experience history differently than almost everybody else.
00:40:54.000 History for us is just like stuff that happened in the past that doesn't matter anymore.
00:40:57.000 But a lot of other people around the world experience history as something that really matters to their lives today.
00:41:03.000 They live in history more than we do.
00:41:06.000 They have a deeper understanding of kind of how they got to where they were and the things that happened to their parents and grandparents and ancestors.
00:41:11.000 I don't know if it's better or worse.
00:41:14.000 It's just a different way of experiencing reality.
00:41:16.000 Anyway, I recommend learning that lesson not by enraging a billion people.
00:41:21.000 I experienced a small version of that recently because I said we shouldn't be using long-range missiles on Russia.
00:41:27.000 And the Ukrainians, like, and Ukrainian bots, a bunch of people came after me.
00:41:32.000 Because I was saying, like, the Biden administration, I was like, fuck these people.
00:41:35.000 And then I think some people misconstrued that as fuck the Ukrainian people, which I absolutely was not saying.
00:41:41.000 I was saying, fuck whoever in the last days of the presidency decided to escalate this war, because it appears that that's what they've done.
00:41:50.000 It appears that they're leaving Trump a giant mess, at the very least.
00:41:55.000 So, the good news is I am allowed to go on podcasts.
00:41:58.000 That's good news.
00:42:25.000 Do you also think, while you're writing, about how things could be misconstrued, so you can do a really good job of being very clear about this?
00:42:33.000 100%.
00:42:34.000 Yeah.
00:42:34.000 You kind of have to.
00:42:35.000 Yeah, yeah, yeah.
00:42:36.000 Exactly.
00:42:36.000 I had Jimmy Corsetti on the other day, and he is an expert in ancient history and ancient civilizations, and we had these fascinating subjects.
00:42:45.000 And one of them that came up was the Nazis and their fascination with the occult.
00:42:50.000 And so you had to clearly say, listen, fuck Hitler, okay?
00:42:53.000 Can I be really clear?
00:42:54.000 Fuck Hitler and fuck the Nazis, okay?
00:42:57.000 I have not in any way...
00:42:58.000 Okay, now that we're clear, let's talk about where the swastika came from.
00:43:03.000 Fuck Hitler.
00:43:04.000 Did I say fuck Hitler?
00:43:05.000 Let me say it again.
00:43:05.000 Fuck Hitler.
00:43:07.000 But the swastika is this ancient symbol, and it's like talking about like, why did the Nazis have this fascination with the occult and with ancient civilizations?
00:43:14.000 And so we got into it, but it was like one of those things where it's like, all right, we're hitting the third rail.
00:43:18.000 Everybody get your rubber boots on.
00:43:19.000 You know, let's save everybody here.
00:43:23.000 Yeah.
00:43:23.000 I've got a friend in the entertainment business who is quite left-wing but really likes World War II documentaries.
00:43:29.000 And so he'll be like, yeah, I saw this great documentary last night about Hitler.
00:43:32.000 And I'm like, I'll bet you did.
00:43:37.000 You can't even have a copy of Mein Kampf in your house.
00:43:41.000 Oh, a student, this is actually one of the Stanford crazy stories.
00:43:43.000 A student at Stanford was reported to the disciplinary board, the civil, whatever, disciplinary board for reading a copy of Mein Kampf in the quad.
00:43:51.000 Oh my God, that's so crazy.
00:43:53.000 Which is a book that's been, you know, assigned for 80 years to college kids to, like, understand who these people were and, like, how to not do that again.
00:44:01.000 Yeah, that kid was, like, nearly brought up on charges and nearly expelled.
00:44:04.000 So, like, yeah, that's, yes, this is the world that I hope that we're leaving.
00:44:10.000 Well, it's just an awful way to look at things.
00:44:13.000 It's so awful to think that if you read about someone horrible, you support them.
00:44:18.000 Yeah, that's right.
00:44:18.000 It's just so crazy.
00:44:20.000 Like, how are we going to study history?
00:44:22.000 Yeah, right.
00:44:23.000 And how are we going to prevent that from happening again if we can't wrap our heads around why they happened the first time?
00:44:28.000 Especially something like the Nazis.
00:44:30.000 Like, how are we going to learn, like, what happened?
00:44:34.000 Clearly, 1920s Germany was very different than 1945 Germany.
00:44:40.000 What the fuck happened in 25 years?
00:44:42.000 So what we're essentially talking about is the year 1999 America versus 2024 America.
00:44:49.000 Imagine a shift of that magnitude so crazy that there's a Holocaust in 2024 and in 1999 everybody's just hanging out.
00:44:57.000 Yep, that's right.
00:44:58.000 Well, you should probably study that.
00:45:00.000 You should probably not reprimand someone for reading a book on this.
00:45:03.000 Yeah, that's right.
00:45:03.000 Yeah, exactly.
00:45:04.000 And look, the German people went along with it, right?
00:45:05.000 And so, you know, how did that happen?
00:45:07.000 How did that happen?
00:45:08.000 And how many, you know, did they, was there active agreement?
00:45:10.000 Was there passive agreement?
00:45:11.000 Was there, you know, what?
00:45:12.000 What are the steps where things go horribly wrong?
00:45:15.000 Yeah, exactly.
00:45:15.000 And how can we recognize those?
00:45:17.000 Because those steps have taken place multiple times in history, recorded history.
00:45:21.000 We know about them.
00:45:22.000 So, like, if we see them happening today, maybe we should stop it and nip it in the bud.
00:45:27.000 What better way than to read about when it already happened?
00:45:29.000 Yeah.
00:45:30.000 One of my observations about people talking about current events is we know conclusively that prior eras all had horrible moral problems, disasters, you know, catastrophes, wars, all kinds of things.
00:45:40.000 They made all kinds of horrible mistakes.
00:45:41.000 But we are completely certain that in our time we figured it all out.
00:45:44.000 Right.
00:45:44.000 We're 100% convinced that we have it all dialed in.
00:45:47.000 And the one thing I know for sure is people 50 years from now are going to look back on us and they're going to say, oh my God, those people were awful.
00:45:52.000 100%.
00:45:52.000 Right.
00:45:53.000 But like, in what way?
00:45:55.000 Right.
00:45:56.000 In what way are we horrible?
00:45:57.000 I mean, certainly...
00:45:58.000 A lot of the way we treat each other is horrible, especially with the amount of information that we have available.
00:46:03.000 But it is fascinating also that I visited Athens last year, and I got to tour the ruins, and I was like, I wonder when it all went south.
00:46:13.000 When did they know this had fallen apart?
00:46:17.000 At the peak of everything, they probably thought, hey, we have the most amazing, sophisticated civilization on Earth, and we will maintain this.
00:46:27.000 We will be the center of intellectual discourse and the home of democracy.
00:46:33.000 This is us.
00:46:34.000 And then, no.
00:46:35.000 Now there's like shitty apartment buildings next to the Parthenon.
00:46:38.000 Like, what happened?
00:46:40.000 Something horribly happened, and we don't want to think that could ever happen to us today.
00:46:45.000 Right.
00:46:45.000 We want to think, we're American, motherfucker.
00:46:48.000 We're going to keep this bitch rolling forever.
00:46:50.000 Lynyrd Skynyrd, Free Bird, let's go!
00:46:52.000 Second Amendment, come on!
00:46:53.000 And we think that this is the future.
00:46:58.000 America is the shining star of the world, and we're going to carry this on.
00:47:02.000 But probably not, like, historically.
00:47:05.000 I mean, what is the longest-running, dominant civilization ever?
00:47:09.000 The Romans existed for, what, a couple thousand years?
00:47:14.000 Like, how long did the Greeks run—how long did the Egyptians—the Egyptians might be the longest running, especially if you, like, take into account the possibility of alternative history timelines, where, you know, like, Egyptian hieroglyphs, they have kings that go back 30,000 years.
00:47:29.000 Here it is.
00:47:31.000 Egypt and Mesopotamia.
00:47:32.000 There it is.
00:47:33.000 One estimate, measuring from the time of the first pharaohs and the use of hieroglyphic writing to when the native religion was replaced by Christianity:
00:47:39.000 Ancient Egyptian civilization endured for about 3,500 years.
00:47:43.000 I bet it was more than that.
00:47:45.000 The argument is things just didn't really change.
00:47:49.000 Like historical change of the kind that we understand where things actually change, the way people live changes, really kicked off with the Greeks.
00:47:56.000 And so that was sort of the default status of civilization for a long time.
00:48:00.000 The Greeks kicked off change, as we understand it, and then the Romans.
00:48:04.000 Do you know about the fishponds?
00:48:06.000 The fishponds.
00:48:07.000 The Cicero's fishponds.
00:48:08.000 No.
00:48:09.000 So the Roman Empire, you know, the Roman Republic and Empire in its health, what you'd consider its dynamic phase, its sort of vital phase, ran for a few hundred years, maybe 400 years total, something like that.
00:48:20.000 And towards the end, as it was sort of falling or stagnating and increasingly starting to fall apart, banditry rose, the roads got dangerous, and nobody could quite explain why.
00:48:31.000 Right?
00:48:31.000 Which sounds familiar, by the way.
00:48:35.000 Cicero was, you know, one of the great Roman statesmen.
00:48:37.000 And he wrote these letters that we have.
00:48:39.000 And in the letters, he sends these letters to all of his aristocratic friends.
00:48:41.000 And the theme in the letters is basically all of the actual competent, capable citizens of Rome are out in the countryside at their villas, perfecting their fishponds.
00:48:51.000 They're pulling into themselves.
00:48:53.000 They've built themselves their own protected environments, where they control everything.
00:48:58.000 And they're completely focused on ornamentation.
00:49:01.000 They're completely focused on their clothes and on their lifestyles.
00:49:05.000 Kardashians.
00:49:05.000 They were Kardashians.
00:49:08.000 I don't know if the Kardashians have fishponds, but if they did, they would be spectacular fishponds.
00:49:12.000 They would be amazing fishponds.
00:49:13.000 No doubt they would be the most amazing fishponds we have ever seen.
00:49:17.000 So he kept railing.
00:49:18.000 He's like, stop with the fishponds.
00:49:19.000 Stop working on the fishponds.
00:49:21.000 Get back out here.
00:49:22.000 Rejoin the Senate.
00:49:23.000 Get back involved in the system.
00:49:25.000 Let's keep this thing from caving in.
00:49:26.000 Yes.
00:49:27.000 And I think, you know, the significance I think of, you know, Trump actually talked about this in the campaign.
00:49:32.000 You know, his version of this talking on the campaign trail is he's like, look, I could be off on a resort.
00:49:35.000 I own all these golf clubs.
00:49:37.000 I have many things I could be doing in my life.
00:49:38.000 Yeah, of course.
00:49:39.000 And he's 78 years old.
00:49:40.000 He probably would like to do that.
00:49:41.000 Exactly, right?
00:49:42.000 And he's, you know, surrounded.
00:49:42.000 His family loves him and, like, you know, grandkids and, like, the whole thing.
00:49:45.000 And he's like, look, I'm not doing it because, like, I need to do this.
00:49:49.000 And it's interesting because, you know, he doesn't use, you know, he's not referencing Cicero when he says that.
00:49:53.000 But it's that spirit, right, that Cicero talked about.
00:49:55.000 Where, you know, when times get tough, do the people who are in a position to actually make positive change actually step up or not?
00:50:01.000 Right.
00:50:01.000 And I think we've had a pretty long stretch here where that hasn't been the case.
00:50:04.000 And I think maybe with Trump and then I think also with Elon.
00:50:07.000 I think...
00:50:07.000 Yes.
00:50:08.000 Because Elon's the other guy, right?
00:50:09.000 He for sure could be focused.
00:50:11.000 Well, it's a coalition, right?
00:50:12.000 It's not just him.
00:50:13.000 It's Vivek Ramaswamy.
00:50:14.000 That's right.
00:50:14.000 Another guy, by the way, who could be kicking it on a beach somewhere.
00:50:17.000 100%.
00:50:18.000 Yeah.
00:50:18.000 Yeah, exactly.
00:50:19.000 Very successful guy.
00:50:19.000 And young and handsome.
00:50:20.000 He could do whatever he wants.
00:50:21.000 He could be doing anything.
00:50:21.000 Yeah, exactly.
00:50:22.000 Instead, he's decided to go all in.
00:50:24.000 And then, of course, you have Tulsi Gabbard, and you have J.D. Vance, who I think is brilliant.
00:50:29.000 You have all these brilliant people that are together, which is very hopeful.
00:50:34.000 This is what we didn't see out of the Biden-Harris campaign.
00:50:38.000 What we saw from Harris and Walz, you have Walz, this guy who seems like a compulsive liar.
00:50:45.000 At the very least, he's lied multiple times about fairly insignificant things, you know, like whether or not he was a head coach or an assistant coach.
00:50:54.000 And the lies have always elevated him socially, right?
00:50:58.000 The lies about his military service, or at least implying that he served in a different capacity.
00:51:05.000 And then there was Tiananmen Square.
00:51:08.000 Everything enhances his virtue.
00:51:10.000 This is not what anybody wants.
00:51:11.000 You want the opposite.
00:51:12.000 You know, you want a guy like J.D. Vance who served in the Marines and, you know, went to Yale, comes from a single mother with addiction problems, rose from hard work and dedication to become who he is now.
00:51:23.000 Like, that's the kind of guy that I like.
00:51:25.000 That's what we all would like.
00:51:26.000 Like, okay, that looks like a leader to me.
00:51:28.000 Yeah.
00:51:28.000 Well, the Romans had this concept they took very seriously.
00:51:31.000 They called it virtue, right?
00:51:32.000 And, like, there's a whole ranking, by the way, of the Roman virtues.
00:51:35.000 And if you read them today, you just like want to burst out crying because you're just like, oh my God, I can't believe what we're missing.
00:51:39.000 But like, people with virtue, people with virtue, it's not just that they think that they're good people or that they tell everybody they're good people.
00:51:45.000 They actually act on it and actually step up.
00:51:47.000 Well, this is what's missing from today's secular society, right?
00:51:51.000 We don't have a doctrine that encourages that sort of thinking and behavior and rewards it publicly, which religion does.
00:52:02.000 True Christianity, not subverted fucking giant arena Christianity where the guy's flying private jets and has Rolls Royces and shit, but actual real Christian people.
00:52:14.000 Well, the Romans had gods.
00:52:16.000 I mean, their virtues had gods.
00:52:17.000 To your point, it was like encoded into their religion.
00:52:21.000 It was wrapped up in their religion.
00:52:22.000 They knew exactly what was expected of them.
00:52:24.000 They knew exactly what their ancestors expected of them.
00:52:26.000 They knew exactly what their gods expected of them.
00:52:28.000 I recently read Meditations again a couple of months ago.
00:52:32.000 I listened to it in the sauna.
00:52:34.000 But it's brilliant.
00:52:35.000 And it's amazing that this guy, Marcus Aurelius, was thinking like this so many years ago.
00:52:41.000 And it's so valid today.
00:52:45.000 And it applies so well to modern life.
00:52:48.000 It's so strange how brilliant this person was, that while he was running this incredible empire he could write about human psychology and the value of forgiveness and, you know, being true to yourself and constantly being truthful everywhere in everything you do, and all these virtues and all the stoicism that he espoused.
00:53:12.000 It's so valuable today.
00:53:14.000 It's really remarkable that this person, who was a leader, what was it, 2,000 years ago, that his words still ring true today.
00:53:23.000 Yeah.
00:53:23.000 You probably know he didn't write it for public consumption.
00:53:25.000 Right, yeah.
00:53:26.000 It was even more amazing.
00:53:27.000 His private notebook.
00:53:28.000 Which is why it's so good, probably.
00:53:30.000 Yeah.
00:53:30.000 Imagine if he'd written it for a Substack.
00:53:32.000 He's like, well, people are going to hate on this.
00:53:34.000 Yeah, yeah.
00:53:35.000 Let me preemptively attack the people in the comments or subdue them.
00:53:41.000 But he's lecturing himself.
00:53:44.000 He's telling himself how to act.
00:53:45.000 These are very deep, important things.
00:53:48.000 My favorite part of Meditations is there's a section where it's something like, yeah, you're going to wake up this morning and everybody's going to hate you and everybody's going to lie to you and everybody's going to make dumb decisions and you're going to be incredibly frustrated and you're not going to get any credit for anything, and you have to get up anyway.
00:54:05.000 Yes, yes, yes, that's all true.
00:54:07.000 And you still have to get up and do your job.
00:54:09.000 And of course he's saying that to himself as the leader of Rome.
00:54:11.000 To himself, exactly.
00:54:13.000 And what's in there is just like, wow, his life was not easy, you know. It's actually, you know, like being the CEO, it's just like you're going to get pounded.
00:54:19.000 Like, if you're in these positions, you're going to get pounded every day.
00:54:22.000 And if you're operating out of a true sense of virtue, if you're operating out of a true sense of like exercising your responsibilities, you get up and do it anyway.
00:54:31.000 It's amazing how much it resonates.
00:54:33.000 It really is.
00:54:34.000 It's amazing how much so many ancient writings resonate.
00:54:38.000 You know, there's so much valuable information, just like in Sun Tzu's The Art of War or in The Book of Five Rings.
00:54:46.000 You know, there are so many ancient books that you read and you go, first of all, I love reading them because I try to imagine it.
00:54:53.000 What is this life like?
00:54:57.000 If you want to take Miyamoto Musashi, 1400, when did he live?
00:55:02.000 Miyamoto Musashi was like 1420s or something like that.
00:55:06.000 What's that like?
00:55:07.000 What is your life like?
00:55:09.000 What is the view of the world when you don't really have detailed maps or you don't have any photographs?
00:55:17.000 You don't have any idea what the fuck is going on in Europe unless you go there.
00:55:21.000 What is your version of the world like?
00:55:25.000 And then to see someone's words written down and you read them and try to just imagine yourself and their perspective and their mindset.
00:55:34.000 Yeah, that's right.
00:55:34.000 Yeah.
00:55:35.000 And look, I think if you're somebody like that or somebody like Marcus Aurelius, you just have this incredible sense of responsibility.
00:55:40.000 Yeah.
00:55:40.000 Like the one thing you do have is a sense of purpose.
00:55:42.000 Like you know exactly why you're here.
00:55:43.000 You know exactly what your role is.
00:55:44.000 You know exactly how you're supposed to behave.
00:55:46.000 You know exactly how you're supposed to basically gain glory, how you're supposed to honor your ancestors.
00:55:50.000 Like it's just all, you know exactly where you are in the community.
00:55:53.000 Right.
00:55:54.000 Right.
00:55:54.000 You have this like incredible sense of groundedness and rootedness.
00:55:58.000 And of course, there's huge downsides to that, which is it really cuts off your ability to, you know, run off and, you know, go on American Idol, right?
00:56:05.000 There's like a lot of things you can't do, right?
00:56:07.000 But like, you know, you know what you're supposed to do, and you either do it or you don't do it.
00:56:12.000 And these days, to have people like that, we need people who choose to be that way.
00:56:16.000 Right, which is arguably harder, right, given all the choices that they actually choose to live that way.
00:56:22.000 Well, not only that, given all the distractions that people face every day that keeps them from sitting down and writing a journal like that.
00:56:29.000 Yeah, that's right.
00:56:30.000 You know, I mean, back then there's not a lot of different things to entertain you with.
00:56:35.000 Correct.
00:56:35.000 Yes, you had to be maybe a little bit more serious because you couldn't have as much fun.
00:56:38.000 My other favorite Meditations, Marcus Aurelius thing is something like, be the rocks on the shore at which the waves beat.
00:56:47.000 Right?
00:56:48.000 Like, yes.
00:56:49.000 Like, yes.
00:56:49.000 Your job is to stand there like the rocks do and just the surf just keeps coming and keeps coming and keeps coming and your job is to just like stand there and take it.
00:56:57.000 Imagine what it was like addressing the people back then too.
00:57:00.000 Just yelling out to these groups or speaking in front of all the leaders.
00:57:06.000 And everyone's plotting to kill you.
00:57:08.000 There's also a lot of that going on.
00:57:10.000 Yeah.
00:57:11.000 I mean, how many times do they try to kill Hitler?
00:57:13.000 Yes.
00:57:14.000 Like, everybody's trying to kill you.
00:57:15.000 If you're running things, all your generals are probably secretly wanting to become the king.
00:57:20.000 Yep.
00:57:21.000 Yep.
00:57:21.000 Exactly.
00:57:21.000 Yeah.
00:57:22.000 All the usurpers are waiting in the wings.
00:57:23.000 Not easy lives.
00:57:24.000 You know, today most of the killings happen metaphorically.
00:57:27.000 Most.
00:57:27.000 Although, every now and then.
00:57:29.000 Yeah.
00:57:29.000 Somebody takes a shot.
00:57:30.000 In the alternative timeline.
00:57:31.000 Yes.
00:57:31.000 Yes, exactly.
00:57:32.000 That's right.
00:57:33.000 That's right.
00:57:33.000 Yeah.
00:57:34.000 How fearful were you leading up to the election that it wouldn't go into the new timeline?
00:57:40.000 It was so weird because all the experts said it was 50-50, razor thin.
00:57:46.000 It's this tiny little thing, 80,000 votes in eight counties.
00:57:49.000 And then, number one, it wasn't, which means we can take all those experts and just dismiss them forever going forward because they clearly have no clue.
00:57:59.000 So it's another set of people we don't have to listen to.
00:58:01.000 But I had this really interesting conversation that kept nagging at me with a senior Democrat who's on his way out of politics.
00:58:10.000 And he said in the summer, I said, what's your view?
00:58:14.000 And this person said, Trump's going to win with 100% certainty.
00:58:17.000 Really?
00:58:18.000 This is a Democrat from a sort of purple state.
00:58:21.000 Right.
00:58:21.000 So, you know, not New York or California, but like a state with sort of maybe a broader cross-section of people.
00:58:28.000 And this person basically said, yeah, look, all you have to do is fly anywhere in the country into any purple place and go into a second- or third-tier, you know, decent-sized city, and take an Uber for 30 minutes.
00:58:39.000 You know, land at the airport, take an Uber, drive around for 30 minutes, come back and just ask the driver, like, how's it going and who are they voting for?
00:58:44.000 And basically 100% of the time, the answer is going to be Trump.
00:58:47.000 Because people were just completely fed up.
00:58:50.000 They were just completely fed up.
00:58:51.000 And then there was the Kamala enthusiasm, which this person said, you know, the Kamala enthusiasm is like highly focused in New York and California, which don't matter from an electoral standpoint, right?
00:59:02.000 So they're not going to decide anything.
00:59:03.000 But that is huge when it comes to media.
00:59:07.000 Oh, sure, of course.
00:59:07.000 But that's the thing, the self-reinforcing nature of the bubble.
00:59:10.000 This is what's actually so interesting about these media bubbles is the people in these media bubbles are not breaking out.
00:59:16.000 It's like they're getting deeper into the sort of collective psychosis that they indulge in.
00:59:19.000 And part of it was getting excited about a candidate for which there was very little popular support once you got outside of these heavily blue states.
00:59:25.000 Yeah.
00:59:26.000 And so, in a lot of ways, it's the most, you know, obvious explanation in the world, which is just people just fundamentally did not like the direction the country was going in, and they were just fed up with it.
00:59:34.000 There's also this very bizarre arrogance of people that were certain that Kamala Harris was going to win.
00:59:40.000 I'm sure you've seen the viral video of this lady who's a political analyst who talks about going to the liquor store and buying a bottle of champagne.
00:59:48.000 Oh, right.
00:59:48.000 I saw that.
00:59:49.000 Yeah, right.
00:59:49.000 I don't want to pile on the poor lady.
00:59:51.000 She's probably living in hell right now.
00:59:53.000 Yeah.
00:59:53.000 On Blue Sky.
00:59:55.000 Yeah, she's probably on Blue Sky.
00:59:57.000 She might be on X. Well, she was on X. I think she deleted her profile.
01:00:00.000 But the poor lady, I mean, but she was being very arrogant and she laughed and mocked this man and said, you do realize you wasted your vote, right?
01:00:09.000 That's right.
01:00:10.000 That's right.
01:00:10.000 That's right.
01:00:10.000 That's right.
01:00:11.000 Which makes her hard to feel sorry for.
01:00:13.000 That's right.
01:00:14.000 It's like you were ready to mock this man.
01:00:16.000 Yes.
01:00:17.000 But in her eyes, it was all about reproductive freedom.
01:00:20.000 And she thought that that was under attack under the Trump administration and that women are going to stand up and they're going to stop that because in her echo chamber, that was the case.
01:00:29.000 Everybody was universally – they all agreed.
01:00:32.000 We're universally on board with this idea that Trump is evil.
01:00:35.000 We've got to get rid of him and women are going to vote and this is going to be fun.
01:00:37.000 But who are you hanging out with, lady?
01:00:39.000 Yeah.
01:00:39.000 You could hang out with a bunch of people that think baseball is awesome and then you run into someone from another country like, what the fuck is baseball?
01:00:46.000 You've got to realize there's a lot of people out there.
01:00:48.000 Yeah.
01:00:49.000 And people really don't like being talked down to.
01:00:50.000 They really don't.
01:00:51.000 And they don't like you mocking the fact that, first of all, nobody wasted their vote.
01:00:56.000 That's not how it goes.
01:00:57.000 You don't waste your vote if you vote different and the other side wins.
01:01:01.000 The other side won.
01:01:02.000 That's just how it is.
01:01:04.000 Wasting the vote is a crazy way to look at it.
01:01:17.000 Right.
01:01:27.000 And they apply that, especially if they're not into sports, to other things.
01:01:31.000 I think it's just a war mentality.
01:01:32.000 It's a tribal war mentality that's been sort of subverted in the human mind and applied to other things.
01:01:39.000 It could be like Microsoft versus Apple.
01:01:41.000 It could be Android versus Apple iOS.
01:01:45.000 It's weird how people get so tribal and then connect their own personal identity to other people agreeing with these ideas that they believe.
01:01:56.000 Yeah.
01:01:56.000 Yeah.
01:01:57.000 I offer two thoughts.
01:01:58.000 One is the Democrats for a long time were the big tent party.
01:02:01.000 So the Democrats were the coalition of people who had very different points of view on things.
01:02:04.000 And of course, you know, famously, it's all the different identity groups and it's all the different, you know, economic and unions and all these things.
01:02:08.000 And Republicans were like the party of like rigidity, right?
01:02:12.000 And for whatever set of reasons, a lot of the woke stuff had a lot to do with it, it flipped to where at least today Trump's Republican Party is the big tent party.
01:02:19.000 You know, to your point on having all these new people and many of whom are former Democrats.
01:02:23.000 A lot of them.
01:02:23.000 And the Democrats have decided to try to isolate out anybody.
01:02:26.000 Right.
01:02:26.000 Who disagrees on any issue and demand lockstep conformity through the cancellation process.
01:02:31.000 And so that's a very interesting inversion that happened kind of without anybody saying anything about it.
01:02:36.000 But it did happen.
01:02:37.000 And then I think the other inversion was the economic inversion, which is – remember the criticism of the Republican Party for a long time was it was the party of trickle-down economics, where the idea was the rich people are going to get all the money because they're going to cut taxes, Reagan administration, and then basically if poor people get any money, it's going to be because the rich people like trickle some down.
01:02:53.000 I think that inverted to where the Democrats, especially in the last four years, became the trickle-down party, which was we're going to tax and we're going to collect all the money and give it to the government and then we're going to let the government hand it out.
01:03:02.000 Right.
01:03:02.000 But they did it under the guise of tax the rich.
01:03:04.000 They did it.
01:03:05.000 They did it with this Robin Hood mentality.
01:03:07.000 At least they expressed that publicly.
01:03:09.000 Of course, that's how it starts.
01:03:10.000 But then you end up with $35 trillion federal debt.
01:03:13.000 You end up with this giant annual deficit.
01:03:15.000 And then you end up with all this money being handed out, right?
01:03:18.000 Handed out in all these grants and all these things.
01:03:21.000 Like just this shower of money coming from the government.
01:03:23.000 But of course, if the government is giving you money, it also means the government can take money away, right?
01:03:27.000 Like if you're making somebody dependent on you because you're giving them money, then you're in a tremendous position of power because you can make their life horrible by pulling the money away.
01:03:34.000 Right.
01:03:35.000 You can also control their ideology that way.
01:03:37.000 100%.
01:03:37.000 Yeah, you own them.
01:03:38.000 It's actually a form, you know, it's on the spectrum to a form of like domination, you know, that should make us very uncomfortable.
01:03:45.000 And so, you know, maybe that would be fine if the deficit didn't get out of control and inflation didn't get out of control, but it did.
01:03:53.000 And then at that point, it's like, okay, like this new kind of sort of tax and spend-driven trickle-down economics is clearly not sustainable.
01:03:59.000 It's not going to work.
01:04:00.000 So the way the Trump administration is going to approach the economy, they want less regulation.
01:04:06.000 They want tariffs and less regulation.
01:04:08.000 And they want more reliance on US energy.
01:04:13.000 They want to drill more, more natural gas, more fracking, more drilling for oil.
01:04:18.000 And then allow companies to work without regulations inhibiting their performance.
01:04:26.000 This will boost the economy.
01:04:28.000 You'll have more productivity.
01:04:30.000 You have more American manufacturing.
01:04:32.000 You have more things happening.
01:04:33.000 Yeah.
01:04:33.000 So the two headline things you hear from them whenever they talk about this, the two headline things are number one, growth.
01:04:38.000 You just need faster growth.
01:04:39.000 By the way, it's the only way to resolve the long-term fiscal situation.
01:04:42.000 It's the only way to resolve the debt.
01:04:44.000 There's only two ways to do it.
01:04:45.000 You can inflate your way out of it and end up in 1930s Germany with hyperinflation.
01:04:49.000 That's one track you can get on, which is a very bad track, and you don't want to go there.
01:04:52.000 Or you can grow faster.
01:04:54.000 Because if you grow faster, then your economy can catch up to the debt, and you can pay down the debt as you grow.
01:04:59.000 And so they want to go for a higher rate of growth.
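To make the arithmetic behind "grow your way out of the debt" concrete, here is a minimal sketch, assuming a constant primary deficit of 4% of GDP and a starting ratio of roughly $35 trillion of debt against a $29 trillion economy; the deficit and growth figures are illustrative assumptions, not numbers from the conversation:

```python
# Minimal sketch of the growth-vs-debt arithmetic. The 4% deficit and the
# growth rates below are illustrative assumptions, not figures from the show.
def debt_to_gdp(ratio, growth, deficit, years):
    """Each year the debt rises by the deficit while GDP grows at `growth`."""
    for _ in range(years):
        ratio = (ratio + deficit) / (1 + growth)
    return ratio

start = 1.20  # roughly $35T of debt on a ~$29T economy
for g in (0.02, 0.04, 0.06):
    end = debt_to_gdp(start, g, deficit=0.04, years=20)
    print(f"nominal growth {g:.0%}: debt/GDP after 20 years = {end:.0%}")
```

Under these assumptions, 2% growth lets the ratio drift up toward 200% of GDP, while 6% growth pulls it steadily down, which is the "catch up to the debt" effect being described.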
01:05:01.000 And then the other thing is they want America to win.
01:05:04.000 My partner Ben and I were able to spend time with Trump this summer, and that was like his adamant thing he kept coming back to, which is like, look, America has to win.
01:05:10.000 And specifically what that means is America has to win in business and in technology and in industry generally globally.
01:05:17.000 Like, our companies should be the ones that win these, you know, broad—we should win global markets.
01:05:21.000 Like, our companies should be the global— How can anybody be against that?
01:05:24.000 I happen to think that makes a lot of sense.
01:05:28.000 Yes.
01:05:28.000 I know.
01:05:29.000 I mean, obviously, you're a wealthy man, and I am as well.
01:05:31.000 But it's like, how could you not want that?
01:05:34.000 Yes.
01:05:35.000 By the way, if you are in favor of a high level of social support, if you want there to be lots of welfare programs and food assistance programs, all these things, I would argue you also want that because it's the growth that will pay for all the social programs.
01:05:47.000 That's how you square the circle.
01:05:49.000 That's how you actually have your cake and eat it too, which is like first your economy just generates a fountain of money through growth and economic success, and then you can pay for whatever programs you want.
01:05:59.000 Personally, I'm totally fine.
01:06:01.000 Set up all the programs you want.
01:06:03.000 All the social spending you want, all the safety nets you want.
01:06:05.000 And as long as it's easy to pay for because you're growing so fast, then everybody wins.
01:06:09.000 Yeah.
01:06:10.000 I mean, I've always said, if I knew that by paying more taxes, people in this country would live better,
01:06:16.000 I would do it.
01:06:16.000 Right, of course.
01:06:16.000 I just don't believe that they're good at spending it.
01:06:19.000 That's the thing, right?
01:06:20.000 It's like, if you've generated $35 trillion of debt and these are the results...
01:06:26.000 Yeah.
01:06:26.000 Like, this is not the deal.
01:06:27.000 And this is my friend that I talked about earlier.
01:06:29.000 That was the point he made.
01:06:30.000 It's just like, look, the deal has been broken.
01:06:32.000 Like, this is not the deal anybody signed up for.
01:06:34.000 This is not how it's supposed to work.
01:06:35.000 Everybody knows it.
01:06:36.000 And when you were talking about giving people social programs and giving them benefits and then...
01:06:44.000 You could take that away at any moment.
01:06:46.000 This was one of the big fears that people had about letting illegal immigrants into the country and moving them to swing states, which clearly happened, and also giving them a bunch of benefits, which clearly happened.
01:06:57.000 Money, food stamps, housing, all that happened.
01:07:00.000 Stuff that wasn't available to veterans, stuff that wasn't available to homeless people, wasn't available to the very poor of this country.
01:07:07.000 All of a sudden, people who came here illegally got those things.
01:07:10.000 That's right.
01:07:10.000 And the thought was, if you gave these people these things and you gave them a way better life.
01:07:15.000 Look, if I was living in a third world country with a family and I knew that I could come to America and I could get a job, an actual job and make money and my family is going to definitely eat.
01:07:26.000 I'll vote for whoever the fuck you want me to vote for.
01:07:30.000 I don't care.
01:07:31.000 My life is infinitely better than it was in this totalitarian shithole that I was in until I walked here.
01:07:37.000 I'll do whatever you want.
01:07:38.000 I just want my family to survive, and I think everything's going to...
01:07:42.000 It's so much better than where I was if I'm in some war-torn part of the world.
01:07:46.000 It's so much better here.
01:07:48.000 I don't care if the Democrats win or the Republicans.
01:07:51.000 I'm in America, and if the Republicans didn't give me any money and they want to get me out, they want to deport me.
01:07:57.000 But this nice lady, she gave me an EBT card.
01:08:01.000 And I'm staying at the Roosevelt Hotel in New York City, and I can get a flight somewhere else if I want to go there?
01:08:06.000 Oh, this is wonderful!
01:08:08.000 Right.
01:08:08.000 So that's how it starts, and there is a lot of that going on.
01:08:11.000 But I will say one of the things that's interesting is it doesn't necessarily stick that way.
01:08:14.000 And the sort of evidence for that is the sort of dramatic ramp-up in the Hispanic vote for Trump.
01:08:20.000 Well, Hispanic people generally are very hard workers.
01:08:24.000 So this gets to the thing.
01:08:25.000 So I'll just take a quick story on this.
01:08:27.000 So the night after the 2016 election, literally everybody I knew was just completely traumatized.
01:08:33.000 We were all just completely freaked out.
01:08:34.000 Everybody was shocked.
01:08:35.000 You were freaked out too?
01:08:35.000 Yeah, I was completely freaked out.
01:08:36.000 Everybody was freaked out.
01:08:37.000 I didn't expect him to win the nomination.
01:08:39.000 I didn't expect him to win the race.
01:08:41.000 And the media is on full historical blast, and it's the end of the world.
01:08:45.000 And he's a Russian spy, all this crazy stuff that we now know not to be true.
01:08:48.000 It's just full on.
01:08:49.000 A group of us went out to dinner at a restaurant in Palo Alto, and the atmosphere was like a funeral.
01:08:55.000 Everybody in the restaurant was just despondent and ready to slit their wrists.
01:08:58.000 So we're sitting there eating, and the food doesn't taste good.
01:09:01.000 You can't taste the food, you can't taste the drinks.
01:09:03.000 Everybody's just depressed.
01:09:04.000 And it gets this thing of like, my God, I can't believe that Trump, this, that, racist, anti-Hispanic and all this stuff.
01:09:13.000 And it was one of those rare moments where the young waiter, a Hispanic young man in his 20s, broke into the conversation at the table.
01:09:21.000 But in context, it was like, oh, thank God, because we're just depressing ourselves to death.
01:09:26.000 So thank God he's going to say something.
01:09:28.000 And he said, you know, I think you guys are looking at it all wrong.
01:09:30.000 He's like, my father thinks Trump is fantastic.
01:09:33.000 My father came here as an immigrant, whatever, 30 years ago, built a life here, became a citizen, bought into the system, pays taxes, raised a family.
01:09:40.000 Mowing his lawn with a MAGA hat on.
01:09:41.000 He thinks this guy is great.
01:09:44.000 He thinks this guy is fantastic, and he voted for him.
01:09:47.000 And then, you know, you've heard this before, but then it's like, and the thing that this guy said, the thing my father thinks is terrible is if other people are able to come here, they're able to cut in line, you know, they didn't have to go through the process, they didn't have to prove anything, they're not bought into the system, right?
01:10:00.000 They're able to jump in, and then they, you know, they don't.
01:10:04.000 They're not buying into the system.
01:10:05.000 Part of it may be they're not being accepted, but also part of it is they're not buying in.
01:10:08.000 They're not assimilating.
01:10:10.000 They're not becoming part of what makes America, America.
01:10:15.000 By the way, in some cases, the criminals are coming across and terrorists are coming across and gangs.
01:10:19.000 It's like my father's not in favor of any of that.
01:10:22.000 My father wants to be part of a great society, of a great America, not some dysfunctional, basically just disaster zone.
01:10:29.000 And I remember the group of us, it was my first glimmer of like, okay, I need to like completely rethink my whole sense of like how the world works because- Was that one conversation?
01:10:37.000 Yeah, yeah.
01:10:37.000 Well, it was weird because it was like, so what happened to me is like, so I grew up in rural Wisconsin, which is now like completely Trump country.
01:10:43.000 And so from like zero to 18, like I completely understood the mentality and I was always like explaining to my friends of like, no, no, like this is, you know, this is like a different place and people think differently.
01:10:52.000 And then somehow between the ages of like 18 and 40 or whatever, I just like forgot.
01:10:55.000 And I became a Californian.
01:10:57.000 I became a fully assimilated Californian.
01:10:59.000 And I was just like, well, of course, the Californians are much more sophisticated and advanced than people, you know, where I came from.
01:11:04.000 And so, of course, everybody in California has it figured out.
01:11:07.000 And of course, California is going to lead the country in all this thinking, right?
01:11:12.000 And for me, Trump's 2016 was the wake up call of like, no, no, no, no, no.
01:11:16.000 Like, that's just like completely, that is such an impoverished worldview of how this country works and of how people think.
01:11:21.000 But that alone doesn't explain it, because you have to explain what happened, and then you have to have some sense of being able to predict what's next, which is what I'm supposed to be doing for a living, you know, that's what investing is supposed to be.
01:11:31.000 It's like, okay, I got to rebuild my entire model of the world for like how this all works and how this whole system and how this country works.
01:11:38.000 But it was that conversation that kick-started it for me.
01:11:40.000 So what was the process of altering your perspective or at least opening it up?
01:11:47.000 Yeah, so for me it was – primarily it was reading.
01:11:49.000 And so I started to actually read my way back in history.
01:11:51.000 And I actually went all the way back.
01:11:53.000 I tried to read of like where the origins of like left-wing thought came from and then communism and how did that evolve and, you know, liberal democracy and then also right-wing thought and like, you know, everybody's calling everybody fascist now.
01:12:03.000 So like what was fascism?
01:12:04.000 Is that what this is?
01:12:06.000 How did the Germans do with it?
01:12:08.000 So all of those questions.
01:12:09.000 And then kind of converging on in the last 80 years, like how is that either stabilized or not stabilized?
01:12:15.000 And so I did that.
01:12:16.000 But the other thing is I just started talking to a lot more people.
01:12:18.000 And I just stopped assuming that because I read it in the New York Times that it was true.
01:12:22.000 And by the way, then of course what unfolded in the years since was, I followed the whole Russiagate thing like super closely.
01:12:29.000 Like I read everything and I read all the reports.
01:12:31.000 What did you think initially?
01:12:32.000 Did you think it was true?
01:12:33.000 It's like this overwhelming consensus from the entire expert class that, of course, he's a Russian spy.
01:12:37.000 I sat on stage.
01:12:38.000 I went to Hillary's first post-election loss speech, which she gave at Stanford, the very first one.
01:12:43.000 And I sat.
01:12:44.000 We know the people organizing it.
01:12:45.000 So we sat literally like 15 feet from Hillary in her first appearance.
01:12:48.000 And the whole thing is fraught with this incredible tension.
01:12:51.000 And the Russiagate stuff is in full-blown display, and I go there, and I'm like, all right, what's this going to mean?
01:12:57.000 And the audience is a Stanford audience, and so it's all 100% Hillary Clinton supporters.
01:13:01.000 And I'm sitting there, and I'm on my best behavior because I'm with my wife, and I have to not act out.
01:13:08.000 And Hillary gets up there, and she says, Trump is only president today because Vladimir Putin hacked Facebook and made him the president.
01:13:14.000 And I'm sitting in the audience, and I'm on the Facebook board, and I'm like, that's not true.
01:13:19.000 I know for an absolute fact that that's not true.
01:13:23.000 Right?
01:13:24.000 And so that got me thinking.
01:13:25.000 And then the Russiagate stuff unspooled.
01:13:27.000 And I was like, you know, the whole...
01:13:28.000 The Steele dossier and, like, all this stuff comes out.
01:13:31.000 What was the accusations about Facebook?
01:13:33.000 How did she think that Russia hacked Facebook and made Trump the president?
01:13:38.000 Yeah.
01:13:38.000 So it's this whole thing with...
01:13:39.000 So remember, this whole thing, Cambridge Analytica.
01:13:41.000 And so it's this whole thing that there was this...
01:13:43.000 Basically, there was this data...
01:13:44.000 There was this theory, which, by the way, is, like, completely...
01:13:46.000 It is, like, a completely fake thing.
01:13:47.000 Like, this didn't...
01:13:48.000 So there was this data set on user behavior, and there was a theory that you could sort of impute human behavior from this data set and then use it to predict what people would do and how they would react to different kinds of messages.
01:14:01.000 And it was like this magical breakthrough and basically thought control.
01:14:05.000 And then there was this company called Cambridge Analytica in the UK that figured out a way to do this.
01:14:09.000 And then it was this like new kind of literally like mind control, like, you know, by far like the most powerful meme weapon of all time for getting people to vote the way that you want.
01:14:17.000 And it was this data breach.
01:14:19.000 The whole thing was weird because Facebook had been criticized for a decade leading up to 2016 that it kept all the data closed.
01:14:25.000 Right.
01:14:26.000 So the criticism was Facebook never lets any of the data.
01:14:28.000 It doesn't share the data, right?
01:14:29.000 And the criticism for years was Facebook is the roach motel of data and the virtuous thing for it to do is to actually free the data and let everybody else have access to the data.
01:14:36.000 And then in 2016, it flipped 180 degrees and it was Facebook is the most evil company of all time because it let Cambridge Analytica get access to this data.
01:14:44.000 And then Russia ran basically a psychological operation on the American citizens using this data.
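Stripped of the mind-control framing, the "predict how people react to messages" claim describes ordinary supervised learning. A minimal sketch of that kind of model on entirely synthetic data; the features, weights, and numbers here are hypothetical, not anything from the actual dataset:

```python
# Illustrative only: "predicting reactions to messages" as plain supervised
# learning. Features, weights, and data below are synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))        # made-up features: likes, hours online, shares
true_w = np.array([0.8, -0.2, 0.5])   # hidden "preference" weights, also made up
y = (X @ true_w + rng.normal(scale=0.5, size=1000)) > 0  # engaged or not

model = LogisticRegression().fit(X, y)
new_user = np.array([[0.5, -1.0, 2.0]])
print("P(user engages with this message):", model.predict_proba(new_user)[0, 1])
```

Whether a model like this amounts to a meme weapon or a modest targeting improvement was exactly the disputed point; the technique itself is mundane.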
01:14:48.000 Why didn't Facebook push back?
01:14:50.000 They did early on.
01:14:52.000 They do today in their way.
01:14:54.000 But they're trying to run a business.
01:14:56.000 They're trying to get to the next quarter.
01:14:57.000 They're trying to keep the employee base and everybody copacetic.
01:15:00.000 They're trying not to get just completely destroyed by the politicians.
01:15:03.000 They're getting slammed every single day on every conceivable issue you can imagine.
01:15:09.000 It's actually a very interesting thing.
01:15:11.000 When you're in these companies, these big issues are big issues, but you're also literally trying to make the quarter.
01:15:16.000 You're trying to ship your products.
01:15:18.000 You're trying to close your sales.
01:15:19.000 You're trying to keep your employees from quitting.
01:15:21.000 You have responsibilities.
01:15:23.000 You have practical concern responsibilities.
01:15:25.000 And so sometimes these companies get kind of wedged because they can't do the things that they would do if they were just in damage control mode.
01:15:31.000 And then maybe the message doesn't get out.
01:15:34.000 So what was the bigger shift, the waiter or the Hillary speech?
01:15:37.000 Oh, it was the waiter.
01:15:38.000 I mean, the waiter was the much bigger shift because it was listening to a person with their feet on the ground actually explaining the way the world worked.
01:15:45.000 Whereas with Hillary, it was cope, right?
01:15:47.000 It was delusion.
01:15:50.000 It was amazing, by the way.
01:15:51.000 She then spent the next hour and a half...
01:15:53.000 When I'm in a place where I don't know if I'm going to control myself, I bring a little notepad along because I can work out my demons.
01:15:59.000 Draw dicks.
01:16:00.000 Exactly.
01:16:01.000 So that I don't say anything.
01:16:03.000 Like super bad.
01:16:04.000 So I brought my little notepad along.
01:16:06.000 Exactly.
01:16:06.000 My little Fisher Space Pen, right?
01:16:08.000 And I pull it out.
01:16:09.000 And I started making a list of all of the people and organizations that she blamed for her defeat that were not named Hillary Clinton.
01:16:15.000 And I got to 20. My favorite was Netflix, by the way.
01:16:19.000 Netflix?
01:16:20.000 She blamed Netflix.
01:16:21.000 What did Netflix do?
01:16:22.000 Netflix aired anti-Clinton documentaries.
01:16:25.000 Oh, you mean facts.
01:16:27.000 Well, this is particularly funny because the CEO of Netflix is a famous Democrat.
01:16:30.000 He's a super Democrat booster.
01:16:32.000 Well, actually, Ted, but also specifically Reed Hastings and his wife are very enthusiastic left-wingers.
01:16:40.000 But, I mean, it was just this litany of, you know, basically excuses and complaints, right?
01:16:44.000 With no sense of, like, personal responsibility at all.
01:16:47.000 You know, just, like, pure grievance.
01:16:49.000 And so it was a negative lesson of, like, okay, like, whatever that is is not the path.
01:16:53.000 Did she blame Comey?
01:16:55.000 Oh, yeah.
01:16:55.000 Oh, absolutely.
01:16:57.000 Oh, yeah.
01:16:57.000 She absolutely hated that guy.
01:16:58.000 Yeah, no question.
01:16:59.000 That was a wild one.
01:17:00.000 100%.
01:17:01.000 Yeah, exactly.
01:17:01.000 And by the way, like, that was super weird.
01:17:03.000 Yeah.
01:17:04.000 I don't think she was completely wrong on that.
01:17:05.000 I don't understand that one, honestly.
01:17:07.000 If they didn't want Trump to win, I don't get that one.
01:17:10.000 Well, we know she's guilty, but we're not going to charge her.
01:17:13.000 Right.
01:17:13.000 Is a weird message to send.
01:17:16.000 It's almost as weird as the Biden one, where we don't think he's competent to stand trial for the documents that he had that were classified.
01:17:23.000 Exactly.
01:17:24.000 But he can, what, have his finger on the button?
01:17:27.000 What the fuck are you talking about?
01:17:28.000 Exactly.
01:17:29.000 We know he's guilty, but we'd never convict him because the jury would say that he's a senile old man.
01:17:33.000 Which is crazy because he's still running for president at the time.
01:17:35.000 He's running for re-election.
01:17:36.000 Well, then remember, everybody at the time said, the media said, the prosecutor is lying, right?
01:17:40.000 Of course, he's sharp as a tack.
01:17:42.000 He's sharp as a tack.
01:17:44.000 Exactly.
01:17:45.000 My favorite is Joe Scarborough.
01:17:47.000 Yes.
01:17:47.000 This is the best Biden, intellectually, the best one I've ever seen.
01:17:51.000 Like, dude...
01:17:52.000 Yes.
01:17:52.000 And then meanwhile he had to go to Mar-a-Lago and kiss the ring.
01:17:55.000 Yes, exactly, exactly, exactly.
01:17:57.000 My favorite was, remember earlier this year, the invention of the term cheap fake?
01:18:03.000 Cheap fake, yes.
01:18:04.000 Cheap fake, because everybody was worried about the AI deep fake, which really didn't materialize, and there was really nothing that happened with that.
01:18:08.000 And so the cheap fake we learned is a video that just simply shows you something.
01:18:12.000 Right.
01:18:14.000 It's claimed to be out of context, but it turns out that it's actually just telling you the truth.
01:18:17.000 Didn't Nancy Pelosi start using that one?
01:18:20.000 Cheap fake?
01:18:20.000 Yeah, exactly.
01:18:21.000 Because the theory was it was going to be clips out of context.
01:18:23.000 Yeah.
01:18:23.000 But it turned out they were clips in context.
01:18:25.000 Have you seen...
01:18:26.000 There's a gentleman who made a video.
01:18:29.000 Here, I'll send it to you, Jamie, because I sent it to Duncan.
01:18:31.000 It's pretty fucking crazy of what AI is capable of now by...
01:18:38.000 Come on.
01:18:39.000 My phone updated, you son of a bitch.
01:18:41.000 Come on.
01:18:42.000 Don't make me go to fucking Android because I will.
01:18:48.000 This guy did this insane video where it's all completely AI and everything he did, including his voice.
01:18:59.000 Here, I'll send it to you, Jamie.
01:19:01.000 It's 100% AI generated.
01:19:04.000 And it's so hard to believe because it's so good.
01:19:08.000 And it really puts you in this...
01:19:11.000 When you're talking about cheap fakes...
01:19:12.000 I just sent it to you, Jamie.
01:19:13.000 Cheap fakes and deep fakes.
01:19:15.000 Let's put the headphones on to watch this because it's so crazy.
01:19:18.000 We're at that moment where you cannot tell.
01:19:21.000 And let's look at this one because it's pretty extraordinary.
01:19:25.000 This is the best version that I've seen so far.
01:19:28.000 This is completely AI. ElevenLabs?
01:19:31.000 It's one of our companies.
01:19:32.000 So I can input any text and it will sound like me.
01:19:36.000 Then I trained HeyGen with a video of mine.
01:19:40.000 I input the audio file to generate a video based on my text.
01:19:44.000 The video you are watching right now is the result.
01:19:47.000 100% generated in AI. What do you think of that?
01:19:52.000 Guys, I know this might sound crazy.
01:19:54.000 How crazy is that?
01:19:56.000 Oh, that's your company?
01:19:58.000 That's him.
01:19:58.000 Oh, that's him.
01:19:59.000 Yeah, yeah.
01:19:59.000 That's the AI generator.
01:20:01.000 Yeah, yeah.
01:20:01.000 That's right.
01:20:01.000 That's right.
01:20:01.000 That's right.
01:20:02.000 So that's two companies.
01:20:03.000 One of them, the voice is ours.
01:20:04.000 And then that's another great company called HeyGen that did the visuals.
01:20:07.000 But yeah, no, that's right.
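The pipeline he is describing is two API calls: text to cloned-voice audio, then audio to a lip-synced avatar video. A hedged sketch of its shape follows; the endpoint paths, headers, and payload fields are assumptions for illustration, not the documented ElevenLabs or HeyGen APIs, so check the real docs before relying on any of it:

```python
# Sketch of the text -> cloned voice -> talking-head video pipeline described
# above. All endpoint paths, headers, and fields are illustrative assumptions,
# not verified API documentation.
import requests

TTS_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"  # assumed shape
VIDEO_URL = "https://api.heygen.com/v2/video/generate"              # assumed shape

def make_clip(text, voice_id, tts_key, video_key):
    # Step 1: synthesize speech in the cloned voice.
    audio = requests.post(
        TTS_URL.format(voice_id=voice_id),
        headers={"xi-api-key": tts_key},
        json={"text": text},
        timeout=60,
    ).content
    # Step 2: hand the audio to the avatar service to render the video.
    resp = requests.post(
        VIDEO_URL,
        headers={"x-api-key": video_key},
        files={"audio": ("speech.mp3", audio, "audio/mpeg")},
        timeout=60,
    )
    return resp.json()  # assumed to return a job id or video URL
```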
01:20:09.000 That's nuts.
01:20:10.000 Yeah, yeah, yeah.
01:20:10.000 Well, this is part of the first internet election.
01:20:13.000 Probably the first internet election will be the one that has this kind of thing actually in it where people get tricked.
01:20:17.000 Why didn't they do that with Kamala Harris?
01:20:19.000 They would have done an amazing job.
01:20:21.000 They could have really knocked it out of the park with a solid speech.
01:20:24.000 Just have her say it on the internet.
01:20:25.000 Yes.
01:20:26.000 Just have a bunch of viral videos of her speaking so eloquently and perfectly.
01:20:30.000 One would think, exactly.
01:20:31.000 That's the fear of the future, right?
01:20:32.000 Yeah, yeah.
01:20:32.000 And so, like, I think that's going to be the kind of thing that's going to happen in terms of, like, the dirty trick side.
01:20:37.000 I think that, you know, that will be a part of it, right?
01:20:39.000 There's always some way to try to game these things.
01:20:41.000 Just have the most brilliant writers formulate, you know, get AI to do it.
01:20:45.000 Like, you're saying AI has all these solutions to things that are super logical.
01:20:49.000 And, well, there's no, like, weird thinking in it.
01:20:52.000 It's like, you know, cut all the fat out.
01:20:54.000 So I think we have a theory on how to fix this.
01:20:56.000 And the theory basically is we're going to have to switch our sense of what's real from basically just trying to eyeball it and figure out whether it's real to only taking seriously the things that we know are real.
01:21:04.000 And the way that we would know things are real is we'll have them registered on a blockchain.
01:21:08.000 And so I think the way this is going to work in the future is every politician will have an account on a blockchain service, like a crypto service.
01:21:16.000 And then whenever a politician says anything in public (they're going to have people around them with cameras all the time), whenever they put out a statement, they're going to cryptographically sign it on the blockchain so that it can be validated that it is actually content from them.
01:21:29.000 And then I think we're just going to have to reach an understanding that we're just going to have to write off everything else that we see.
01:21:34.000 Wow.
01:21:35.000 Which frankly is a good idea anyway because there just is a lot of noise in the environment.
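To make the mechanics of that concrete, here is a minimal sketch, in Python, of the sign-and-verify flow being described, with a plain dictionary standing in for the on-chain key registry. The account name, statement, and registry are illustrative assumptions, not any real service.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# One-time setup: the politician generates a keypair and registers the
# public half. On a blockchain, that registration is the publicly
# auditable step; here a dict stands in for the registry (an assumption).
private_key = Ed25519PrivateKey.generate()
registry = {"senator_example": private_key.public_key()}  # hypothetical account

# Publishing: sign the exact bytes of the statement with the private key.
statement = b"I support the appropriations bill."
signature = private_key.sign(statement)

# Verifying: anyone can look up the registered key and check the signature.
def is_authentic(author, content, sig):
    try:
        registry[author].verify(sig, content)
        return True
    except (KeyError, InvalidSignature):
        return False

print(is_authentic("senator_example", statement, signature))             # True
print(is_authentic("senator_example", b"Fabricated quote.", signature))  # False

Anything that fails the check gets treated the way Marc describes: written off as unverified noise rather than judged on how convincing it looks.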
01:21:39.000 How would you integrate that with social media though?
01:21:42.000 Because one of the issues is these low-information voters that are getting information either from clickbait headlines on these websites, where they don't even read the actual paragraph, which might be completely different from the headline itself.
01:21:57.000 The headline is just inflammatory.
01:21:58.000 And then viral videos.
01:22:01.000 How would you...
01:22:03.000 So the thing is, so that's already happening even pre-AI, right?
01:22:07.000 And so I would say that's a pre-existing problem.
01:22:09.000 And so, like, we can't, you know, we can't...
01:22:10.000 And by the way, that's been happening for a long time.
01:22:12.000 Newspapers have been scandal sheets forever.
01:22:14.000 If you go back hundreds of years to the first newspapers, they were running all kinds of scurrilous stuff.
01:22:19.000 The first newspaper was a scandal sheet of the Vatican, like, in the year 1500. It was all these, like, terrible rumors about, like, the Pope and the bishops and all these, the cardinals and all this stuff.
01:22:26.000 That was the first newspaper?
01:22:27.000 That was the very first newspaper was in the Vatican.
01:22:28.000 And then...
01:22:29.000 All the American colonial newspapers were like that in the revolutionary era.
01:22:34.000 It was all crazy rumors and innuendo and people accusing each other.
01:22:37.000 There was a famous election in 1800, which was Jefferson versus Adams, that we think of as these super upstanding, upright people.
01:22:43.000 And they're just smearing the crap out of each other in their respective newspapers.
01:22:47.000 Because they would actually own newspapers in those days.
01:22:49.000 Oh, God.
01:22:50.000 Attack each other.
01:22:51.000 The more things change.
01:22:52.000 Ben Franklin printed newspapers before he went into government, and he created 15 different sock puppets.
01:23:00.000 He created 15 different pseudonyms.
01:23:03.000 He was pseudonymous, anonymous.
01:23:06.000 And then he would basically have them argue with each other in his newspaper without telling people that it was all him.
01:23:11.000 Oh, Ben.
01:23:11.000 So he had all these different personalities.
01:23:13.000 And so, like, we've been in a world of, like, information warfare for a very long time.
01:23:17.000 We've been in a world of sensationalist, you know, nightly news.
01:23:20.000 If it bleeds, it leads.
01:23:22.000 You know, sensationalist stuff for a long time.
01:23:24.000 We've been in a world of, like, yeah, propaganda for a long time.
01:23:28.000 So you're never going to make that go away.
01:23:31.000 But isn't it funny that we don't think of the past like that?
01:23:34.000 We think of them being virtuous.
01:23:36.000 We assume they had it all figured out.
01:23:37.000 That very much is not true.
01:23:39.000 There's all kinds of crazy banana stuff.
01:23:41.000 My favorite is in the Vietnam War, it was the Gulf of Tonkin incident that sort of kicked off the big escalation.
01:23:47.000 We now know for a fact it didn't happen.
01:23:49.000 The whole thing just didn't happen and now there's this big debate about did they know it didn't happen or did they fake it?
01:23:55.000 There's always been stuff like that in history.
01:23:57.000 So that we can't fix.
01:23:59.000 And AI will be a new way to do that kind of thing.
01:24:01.000 But what we can do is we can reorient people and say, okay, now you're going to have to take seriously.
01:24:05.000 This stuff is real.
01:24:06.000 And if you want to actually know what's happening, this stuff is real and we can prove that it's real.
01:24:09.000 And if it's not, it's entertainment and you can choose to believe it or not.
01:24:12.000 But you should not rely on it.
01:24:14.000 And look, it's not going to be perfect and it's going to take time, but there is a way to address this.
01:24:19.000 Okay, so that would be the solution to deepfakes, the blockchain.
01:24:23.000 Yeah, you flip it.
01:24:24.000 You flip it.
01:24:25.000 You focus on the real stuff.
01:24:26.000 That's logical.
01:24:27.000 That actually does make sense.
01:24:29.000 That actually kind of gives me hope.
01:24:30.000 I do generally have hope.
01:24:32.000 Even though I look at the pessimistic side of things, I'm generally optimistic.
01:24:37.000 Because my real feeling about human beings is most people are good.
01:24:41.000 I genuinely believe there's far more good people in the world than bad people.
01:24:45.000 There's far more people that just want to live a good life and have a good time and enjoy themselves than there are people who are tyrants.
01:24:52.000 Yeah.
01:24:52.000 I'm super optimistic.
01:24:53.000 I'm incredibly optimistic.
01:24:54.000 And I was optimistic already with flashes of pessimism, but I'm really optimistic, and especially now.
01:25:00.000 So I think this is going to be – we have the real potential here for a Golden Age.
01:25:04.000 We really do.
01:25:05.000 We really do.
01:25:06.000 The capabilities that we have and the people that we – I mean, look, in my day job, I meet these young – I meet these 22-year-olds every day that are just like the smartest people in the world, the smartest people I've ever met.
01:25:16.000 I think they're getting better, by the way, as time passes.
01:25:18.000 By the time they're 22, they just know a lot more.
01:25:21.000 They have so much more access to information than we did.
01:25:23.000 Yeah, they're so much better trained, capable, and ready to go, fired up.
01:25:27.000 And they know each other.
01:25:28.000 They're able to connect online and they're already in communities and they know how to help each other.
01:25:31.000 And so, like, yeah, the productive and inventive and creative, you know, aspect particularly of this country is just like there's never been anything like it in the world.
01:25:41.000 I think there's also the real potential for a shift in perspective, a positive patriotic shift in perspective that can happen in this country.
01:25:51.000 And if you think about what happened with the woke ideology, how it swept so quickly over the country and changed so many aspects of the way we deal with things socially.
01:26:00.000 It happened so radically and so quickly and such a large change that people are...
01:26:06.000 It's susceptible to change.
01:26:08.000 It's possible to enact change and a positive change in a good direction where people are optimistic about the future, which you are and I am.
01:26:17.000 I mean, I think that's probably contagious.
01:26:20.000 Yeah, that's right.
01:26:21.000 I really do think that.
01:26:22.000 It's an upward spiral.
01:26:23.000 It was Evan Hafer who said that thing about psychology the other day.
01:26:26.000 It was a friend of mine, a former Special Forces guy, who said that psychology is more contagious than the flu.
01:26:36.000 Right.
01:26:36.000 Right.
01:26:36.000 Exactly.
01:26:37.000 Yes.
01:26:37.000 Yes.
01:26:38.000 Yeah.
01:26:38.000 Yeah, I think that's right.
01:26:39.000 So one of the interesting things that's going to happen right now, you know, we talked a lot about Trump's victory and Republicans, but there's now a civil war that's kicked off inside the Democratic Party, which is very interesting.
01:26:48.000 Really?
01:26:49.000 Well, because they lost so badly, right?
01:26:50.000 So the fact that they lost the White House and they lost the popular vote and they lost the Congress and they lost the Senate and they lost the Supreme Court.
01:26:56.000 Right.
01:26:57.000 This time, it's undeniable that the current path that they've been on is not working.
01:27:02.000 Being an exclusionary party and kicking people out for wrongthink, they're not going to win elections.
01:27:07.000 They're not just kicking people out.
01:27:08.000 They're barring people from making it to the primaries, which is very undemocratic.
01:27:12.000 That's right.
01:27:13.000 That's right.
01:27:13.000 Yeah, exactly.
01:27:14.000 Well, starting with Bernie in 2016 and then continuing.
01:27:16.000 In Donna Brazile's book, she documented that.
01:27:18.000 Right.
01:27:19.000 And so I would say the smart Democrats know that this is not a viable path.
01:27:25.000 You can't have a political party that doesn't win.
01:27:27.000 It's not useful.
01:27:29.000 And so there's a civil war that's underway inside that party that's kicking off right now where they're going to have to recalibrate what they want their future to be.
01:27:37.000 And it's going to be a big decision.
01:27:38.000 And the same thing happened, by the way, when Reagan beat Carter really badly in '80 and then had a landslide in '84. It then took Democrats 12 years to get to Bill Clinton and to actually win again.
01:27:51.000 And so they have this cautionary tale of they went too far in the 60s and 70s and it took them 12 years to recover.
01:27:56.000 And so if you talk to the really smart Democrats right now, they're like, look, this can't be 12 years.
01:27:59.000 That's crazy.
01:28:00.000 We have to do this a lot faster, but we have to reorient and we have to get back to common sense.
01:28:05.000 We have to get back to normal.
01:28:06.000 We have to get back to sensible.
01:28:07.000 We have to get back to moderate.
01:28:09.000 We were actually playing Bill Clinton debating during the election of – what year was that, Jamie?
01:28:17.000 I forget which one.
01:28:18.000 It was when he first ran.
01:28:19.000 What year did he first run?
01:28:21.000 Oh, yeah.
'92. So it was the '92 election. And I was like, I'd vote for that guy.
01:28:25.000 Yeah, exactly.
01:28:26.000 In a heartbeat.
01:28:26.000 The guy's awesome.
01:28:27.000 Also, we played a clip of Hillary Clinton where she sounded more MAGA than anybody who's MAGA today.
01:28:33.000 She was talking about the penalties that illegal immigrants should face.
01:28:36.000 They should pay a stiff fine because they came into this country illegally.
01:28:39.000 And if they're a criminal, they should be jailed or kicked out of the country without question.
01:28:44.000 Like all this was like so MAGA. I was like, this is so wild to hear from Hillary in 2008. Yep, that's right.
01:28:50.000 That's right.
01:28:51.000 And Hillary and Joe Biden and Dianne Feinstein and all these people wanted to build a wall.
01:28:55.000 Uh-huh.
01:28:56.000 Dianne Feinstein, our senator in California at the time, very left-wing, she was down on the border, like, the photo ops in front of the wall that was being built, like, trying to take credit for it.
01:29:03.000 Crazy!
01:29:04.000 Yeah, yeah.
01:29:04.000 Like, 18 years ago?
01:29:06.000 Yeah.
01:29:06.000 Yeah.
01:29:06.000 So yeah, another reason for optimism is I think that they're going to be able to pull their way back.
01:29:11.000 I think losing this bad is very motivating to be able to pull your way back and become more normal.
01:29:18.000 And I think, again, that would be like, I mean, how great would it be if you had two parties that actually had sensible, normal policies?
01:29:24.000 I mean, imagine if Clinton was running up against Trump.
01:29:27.000 Yes, exactly.
01:29:28.000 He was so good.
01:29:30.000 We played that speech he gave after Sister Souljah said a bunch of very anti-white things about white people: this super eloquent but compassionate speech where he's very charitable about her position as a young person not having the best perspective on things.
01:29:50.000 It was fucking brilliant.
01:29:51.000 It was brilliant.
01:29:52.000 Like, that's the guy!
01:29:53.000 Like, that's the president!
01:29:54.000 Now, by modern standards, of course, he was a fascist.
01:29:57.000 Yeah.
01:29:57.000 Well, that's the weird thing about fascism, right?
01:29:59.000 Because fascism, by definition, is almost always applied to right-wing totalitarian governments.
01:30:05.000 But it's really kind of just adherence to the state, enforcing a doctrine, and forcing people to think and behave a certain way, which is what the left wing does.
01:30:14.000 And then you talk about, like, being pro-war.
01:30:17.000 Well, who's more pro-war right now?
01:30:19.000 Trump or the Biden administration?
01:30:23.000 Clearly Trump is less pro-war.
01:30:25.000 Clearly Trump wants to end the wars.
01:30:27.000 Clearly Biden just allowed Ukraine to use long-range missiles into Russia.
01:30:33.000 I don't know what's going on in terms of negotiations.
01:30:35.000 I hear all kinds of different things.
01:30:37.000 But if you looked at one side that is pushing for these wars and seems to be all in on it and the other side that's not, like, the fucking polar shift is so dramatic.
01:30:48.000 Yeah, that's right.
01:30:49.000 It's really weird.
01:30:50.000 The free speech thing, which was always a tenet of the left-wing party, it was like, you know, I mean, it was...
01:30:57.000 Doctrine.
01:30:57.000 Free speech is necessary.
01:30:59.000 It's the foundation of our ability to discuss and find out what's right and what's wrong.
01:31:05.000 You have to be.
01:31:07.000 The ACLU used to let fucking Nazis speak.
01:31:11.000 They used to let them march.
01:31:12.000 They would defend their right to do it.
01:31:15.000 Yeah, because you needed to air out the idea to be able to show why it was wrong.
01:31:17.000 Exactly, yeah.
01:31:18.000 So look, it was not that long ago when you had Democrats that were very much in favor of many of these extremely sensible positions.
01:31:24.000 Super recent.
01:31:25.000 It was pretty recent.
01:31:26.000 But again, I don't know if they're going to pull it off.
01:31:29.000 They might go crazier.
01:31:31.000 They might just go right off the cliff.
01:31:32.000 It's certainly possible.
01:31:33.000 But it is also possible that they'll drag it back and it might happen quite quickly.
01:31:37.000 And I am hopeful and optimistic.
01:31:38.000 I am as well.
01:31:38.000 I think the temperature of society, like the mindset of society is so clearly moving away from that madness that they're going to have to course correct, which is just logical.
01:31:50.000 There's just no way they're going to keep doing it the same way or double down.
01:31:53.000 It's just not going to.
01:31:54.000 It's like they're going to go the way of MSNBC. They're going to become ridiculous.
01:31:58.000 Yeah, that's right.
01:31:59.000 So they have to, which is good for everyone, for everyone.
01:32:02.000 So one of my theories is you can separate the concepts of the United States and America, and you can be very optimistic about America and have all kinds of issues with the United States, but still be positive about America.
01:32:12.000 And the difference is the United States is the formal system of the government and the politics and all the stuff we get mad about, and America is the people.
01:32:20.000 Right.
01:32:20.000 Right.
01:32:21.000 And so you can be, as I am, incredibly bullish about the people.
01:32:24.000 That's the America part.
01:32:26.000 And then it's just a question of whether you can get the United States part kind of lined up to at least not prevent good things from happening and ideally help good things.
01:32:32.000 Well, what are the things that you think about this administration, at least what they're proposing, that would move us in that direction as opposed to the way things were going?
01:32:41.000 There's a lot of things.
01:32:42.000 I think you've got to start with the Doge, the Department of Government Efficiency.
01:32:47.000 It's hilarious that it just winds up being Doge, D-O-G-E. He's been pushing Dogecoin forever.
01:32:53.000 The universe speaks.
01:32:54.000 Yeah.
01:32:55.000 So many things are just so on the nose that you're like, is the simulation real?
01:33:00.000 Yes.
01:33:01.000 I mean, it has to be real.
01:33:02.000 Yes, exactly.
01:33:02.000 Exactly.
01:33:03.000 And Elon is programming it in the back room late at night in between playing Diablo.
01:33:07.000 We certainly got a good position in the game.
01:33:09.000 And tweeting, exactly.
01:33:09.000 He's the number one Diablo player in the world right now, by the way.
01:33:12.000 He just got number one.
01:33:13.000 Which means...
01:33:14.000 Fucking bananas.
01:33:15.000 How does he have the time to do that?
01:33:16.000 Which means he could be the guy steering the simulation.
01:33:18.000 Yeah.
01:33:19.000 Yeah, so look, this goes back to what we were talking about before.
01:33:21.000 It is time to carve this government back in size and scope.
01:33:25.000 It's time to take the overall – you can talk about distribution of taxes, but it's time to take the overall tax load down.
01:33:30.000 It's time to take the spending down.
01:33:31.000 It's time to get the government out of the position of deciding who gets money.
01:33:33.000 It's time to unleash economic growth.
01:33:35.000 Elon explained that there are more federal agencies than there have been years of the United States.
01:33:39.000 Correct.
01:33:40.000 Yeah, 450 federal agencies and two new ones a year.
01:33:44.000 And then my favorite twist is we have this thing called independent federal agencies.
01:33:48.000 So, for example, we have this thing called the Consumer Financial Protection Bureau, CFPB, which is sort of Elizabeth Warren's personal agency that she gets to control.
01:33:55.000 And it's an independent agency that just gets to run and do whatever it wants, right?
01:33:59.000 And if you read the Constitution, like, there is no such thing as an independent agency.
01:34:03.000 And yet, there it is.
01:34:04.000 What does her agency do?
01:34:06.000 Whatever she wants.
01:34:07.000 What does it do, though?
01:34:08.000 Basically, terrorize financial institutions, prevent new competition, new startups that want to compete with the big banks.
01:34:16.000 Really?
01:34:16.000 Oh, yeah.
01:34:16.000 How so?
01:34:17.000 Just terrorizing anybody who tries to do anything new in financial services.
01:34:20.000 Can you give me an example?
01:34:21.000 You know, debanking.
01:34:24.000 This is where a lot of the debanking comes from, is these agencies.
01:34:27.000 So debanking is when you, as either a person or your company, are literally kicked out of the banking system.
01:34:31.000 Like they did to Kanye.
01:34:32.000 Exactly.
01:34:33.000 Like they did to Kanye. And my partner Ben's father has been debanked.
01:34:37.000 Really?
01:34:37.000 We had an employee who— For what?
01:34:39.000 For having the wrong politics.
01:34:40.000 For saying unacceptable things.
01:34:42.000 Under current banking regulations—okay, here's a great thing.
01:34:45.000 Under current banking regulations, after all the reforms of the last 20 years, there's now a category called a politically exposed person.
01:34:51.000 PEP. And if you are a PEP, banks are required by financial regulators to kick you out.
01:34:58.000 What?
01:34:58.000 You're not allowed to have them.
01:34:59.000 But what if you're politically on the left?
01:35:01.000 That's fine.
01:35:03.000 Because they're not politically exposed.
01:35:05.000 So no one on the left gets debanked?
01:35:06.000 I have not heard of a single instance of anyone on the left getting debanked.
01:35:09.000 Can you tell me what the person that you know did?
01:35:11.000 What they said that got them debanked?
01:35:13.000 Oh, well, David Horowitz is right-wing.
01:35:15.000 He's pro-Trump.
01:35:15.000 I mean, he's said all kinds of things.
01:35:17.000 He's been very anti-Islamic terrorism.
01:35:18.000 He's been very worried about immigration, all these things.
01:35:21.000 And they debanked him for that?
01:35:22.000 Yeah, they debanked him.
01:35:23.000 So you get kicked out of your bank account.
01:35:25.000 You can't do credit card transactions.
01:35:28.000 How is that legal?
01:35:30.000 Well, exactly.
01:35:31.000 So this is the thing.
01:35:32.000 And then you go into this thing of like, well, this is where the government and the companies get intertwined, back to your fascism point, which is there's a constitutional amendment that says the government can't restrict your speech, but there's no constitutional amendment that says the government can't debank you, right?
01:35:45.000 And so if they can't do the one thing, they do the other thing.
01:35:48.000 And then they don't have to debank you.
01:35:49.000 They just have to put pressure on the private company banks to do it.
01:35:53.000 And then the private company banks do it because they're expected to.
01:35:56.000 But the government gets to say, we didn't do it.
01:35:57.000 It was the private company that did it.
01:35:59.000 And of course, JP Morgan can decide who they want to have as customers.
01:36:01.000 Of course, right?
01:36:02.000 It's their private company.
01:36:04.000 And so it's this sleight of hand that happens.
01:36:07.000 So it's basically, it's a privatized sanctions regime that lets bureaucrats do to American citizens the same thing that we do to Iran.
01:36:14.000 Just kick you out of the financial system.
01:36:16.000 And so this has been happening to all the crypto entrepreneurs in the last four years.
01:36:20.000 This has been happening to a lot of the fintech entrepreneurs, anybody trying to start any kind of new banking service.
01:36:24.000 Because they're trying to protect the big banks.
01:36:26.000 And then this has been happening, by the way, also in legal fields of economic activity that they don't like.
01:36:31.000 And so a lot of this started about 15 years ago with this thing called Operation Choke Point, where they decided to, as marijuana started to become legal, as prostitution started to become legal, and then guns, which there's always a fight about.
01:36:43.000 Under the Obama administration, they started to debank those businesses.
01:37:10.000 None of that stuff is available.
01:37:11.000 You've been sanctioned.
01:37:12.000 None of that stuff is available.
01:37:13.000 And then this administration extended that concept to apply it to tech founders, crypto founders, and then just generally political opponents.
01:37:21.000 So that's been super pernicious.
01:37:24.000 I wasn't aware of that.
01:37:25.000 Oh, 100%.
01:37:26.000 Operation Choke Point 1.0 was 15 years ago against the pot and the guns.
01:37:32.000 Chokepoint 2.0 is primarily against their political enemies and then to their disfavored tech startups.
01:37:37.000 And it's hit the tech world.
01:37:38.000 Like, we've had like 30 founders debanked in the last four years.
01:37:41.000 Real?
01:37:42.000 Yeah, yeah, yeah.
01:37:42.000 It's been a big recurring pattern.
01:37:44.000 30?
01:37:45.000 This is one of the reasons why we ended up supporting Trump.
01:37:48.000 It's like we just can't.
01:37:49.000 We can't live in this world.
01:37:50.000 We can't live in a world where somebody starts a company that's a completely legal thing and then they literally get sanctioned and embargoed by the United States government through a completely unaccountable...
01:37:59.000 By the way, no due process.
01:38:01.000 None of this is written down.
01:38:03.000 There's no rules.
01:38:04.000 There's no court.
01:38:06.000 There's no decision process.
01:38:08.000 There's no appeal.
01:38:09.000 Who do you appeal to?
01:38:11.000 Who do you go to to get your bank account back?
01:38:15.000 And then there's also the civil asset forfeiture side of it, which is the other side of this. That doesn't happen to us, but that happens to people in a lot of places now: they get arrested and all of a sudden the state takes their money.
01:38:25.000 Yes.
01:38:26.000 That happens to people if they get pulled over and they have a large amount of cash in some states.
01:38:30.000 Right.
01:38:30.000 Or there will be well-publicized examples where there's some investigation into, like, safe deposit boxes, and the next thing you know the feds have seized all the contents of the safe deposit boxes, and that stuff never gets returned.
01:38:44.000 And so it's this – and this is when Trump says the deep state – like the way we would describe it is it's administrative power.
01:38:51.000 It's political power being administered not through legislation, right?
01:38:55.000 So there's no defined law that covers this.
01:38:57.000 It's not through regulation, right?
01:38:59.000 There's nothing you can – you can't go sue a regulator to fix this.
01:39:02.000 It's not through any kind of court judgment.
01:39:04.000 It's just raw power.
01:39:06.000 It's just raw administrative power.
01:39:08.000 It's the government or politicians just deciding that things are going to be a certain way and then they just apply pressure until they get it.
01:39:13.000 So what happens to those 30 tech people that you know?
01:39:17.000 Start to go into a different field.
01:39:19.000 Like, try to do something different and try to get, you know...
01:39:22.000 Whoa!
01:39:23.000 Complete upending of your life.
01:39:24.000 Yeah, complete upending of your life.
01:39:25.000 And try to, yeah, try to change your life.
01:39:27.000 Try to get out of the...
01:39:28.000 Try to get away from the eye of Sauron.
01:39:30.000 Try to get out of whatever zone got you into this and keep applying for new bank accounts at different banks and hope that at some point a bank will say, you know, okay, you know, it's okay.
01:39:39.000 We've checked in.
01:39:39.000 It's now all right.
01:39:40.000 Whoa!
01:39:41.000 But there's no...
01:39:42.000 So what do they do with their money?
01:39:43.000 Like, what happens?
01:39:45.000 I mean, you go to cash.
01:39:46.000 I mean...
01:39:46.000 You go to cash?
01:39:47.000 You can't have a...
01:39:48.000 Yeah.
01:39:49.000 So where do you put it?
01:39:50.000 Under your mattress.
01:39:54.000 Yes, exactly.
01:39:55.000 Yeah.
01:39:55.000 That is so insane.
01:39:56.000 So if someone has $30 million in the bank and they get debanked...
01:40:00.000 Diamonds, art, you know, do you, I don't know, go overseas somewhere?
01:40:06.000 Holy shit!
01:40:07.000 Yeah, yeah, yeah.
01:40:07.000 It just happens.
01:40:09.000 And again, it's really, really important.
01:40:10.000 There's no fingerprints.
01:40:11.000 Like, there's no...
01:40:12.000 Right.
01:40:13.000 There's no person who— There's no stick above the strings.
01:40:16.000 Yeah, exactly.
01:40:17.000 It just happened.
01:40:18.000 And we can trace it back because we understand exactly—we know the politicians involved and we know how the agencies work and we know how the pressure is applied and we know that the banks get phone calls and so forth.
01:40:27.000 And so we can loosely—we understand the flow of power as it happens.
01:40:31.000 But when you're on the receiving end of this, your specific instance of it, like you can't trace it back and there's nothing you can do about it.
01:40:36.000 So what are the instances?
01:40:38.000 Like what is the company?
01:40:40.000 What are they trying to do and how do they run afoul?
01:40:43.000 Well, all the crypto startups in the last basically four years.
01:40:46.000 So remember the crypto thing got like really, you know, sort of everybody got excited and like NFTs and like all that stuff and then it just like stopped.
01:40:53.000 Yeah.
01:40:53.000 And the reason it stopped is because basically every crypto founder, every crypto startup, they either got debanked personally and forced out of the industry or their company got debanked and so it couldn't keep operating or they got prosecuted, charged, or they got threatened with being charged.
01:41:11.000 This is a fun twist.
01:41:12.000 This is a fun little twist.
01:41:13.000 So the SEC sort of has been trying to kill the crypto industry under Biden.
01:41:17.000 And this has been a big issue for us because we're the biggest crypto startup investor.
01:41:22.000 The SEC, they can investigate you.
01:41:24.000 They can subpoena you.
01:41:25.000 They can prosecute you.
01:41:25.000 They can do all these things.
01:41:27.000 But they don't have to do any of those things to really damage you.
01:41:29.000 All they have to do is they issue what's called a Wells Notice.
01:41:32.000 And the Wells Notice is a notification that you may be charged at some point in the future.
01:41:36.000 Whoa.
01:41:37.000 You're like on notice that you might be doing something wrong and they might be coming after you at some point in the future.
01:41:41.000 Oh, my God.
01:41:42.000 Terrifying.
01:41:43.000 That's the eye.
01:41:44.000 The eye of Sauron is on you.
01:41:46.000 Now, try being a company with a Wells notice doing business with anybody else.
01:41:50.000 Oh, my God.
01:41:51.000 Right.
01:41:51.000 Try to work with a big company.
01:41:53.000 Try to get access to a bank.
01:41:54.000 Try to do anything.
01:41:54.000 So that's why they support DEI initiatives?
01:41:57.000 Yeah.
01:41:58.000 Well, then the SEC under Biden became a direct application of...
01:42:03.000 Exactly.
01:42:04.000 So DEI... They did a lot with that, and then all the ESG stuff.
01:42:07.000 And ESG is a very malleable concept, and they piled all kinds of new requirements into that.
01:42:11.000 So through this process, the SEC could basically just simply dictate what companies do with no accountability at all.
01:42:18.000 There are hearings where they get yelled at, but nothing ever happened in a hearing that ever changed anything.
01:42:26.000 It was just the raw application of power.
01:42:28.000 Right.
01:42:29.000 And this is your friends? This has happened to them too?
01:42:32.000 Oh yeah, for sure.
01:42:32.000 Yeah.
01:42:33.000 Like I said, we had an employee who got debanked because he had crypto in his job title.
01:42:36.000 He was doing crypto policy for us and his bank booted him because he – That's it?
01:42:41.000 Because they did a screen across – this is what they told us – they did a screen across their customer base.
01:42:47.000 Just anyone with crypto?
01:42:48.000 Because anybody with crypto became a politically exposed person.
01:42:52.000 Wow.
01:42:52.000 Because crypto was politically controversial, right?
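For illustration only, a compliance "screen across the customer base" like the one just described is mechanically very simple. This hypothetical Python sketch (the watchlist, records, and flagging rule are invented for the example, not any bank's actual system) shows how a single keyword in a job title can flag an account.

# Hypothetical sketch of a keyword-based compliance screen.
FLAGGED_TERMS = {"crypto", "cannabis", "firearms"}  # assumed watchlist

customers = [
    {"name": "A. Example", "job_title": "Crypto Policy Lead"},
    {"name": "B. Example", "job_title": "Schoolteacher"},
]

def is_flagged(customer):
    # Flag any customer whose job title contains a watched term.
    title = customer["job_title"].lower()
    return any(term in title for term in FLAGGED_TERMS)

flagged = [c["name"] for c in customers if is_flagged(c)]
print(flagged)  # ['A. Example'] -- flagged accounts get reviewed or closed

The point of the sketch is how coarse the rule is: nothing about conduct is evaluated, only a string match.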
01:42:55.000 That's so crazy.
01:42:56.000 You hear this sometimes as like these terms, compliance, reputation management, tone at the top.
01:43:04.000 They have these lovely sounding terms that make it sound like everybody's going to be an upstanding citizen.
01:43:09.000 But what they're all code for is destroy the enemy.
01:43:11.000 Yeah.
01:43:12.000 Like bring the hammer of God and the bank and the government or whoever or the social media, bring it down and just like crush the individual.
01:43:20.000 Wow.
01:43:20.000 With no due process.
01:43:21.000 And look, there's an argument in the long run that this is all unconstitutional because the Constitution gives us all the right to due process and this is government pressure and there's no...
01:43:28.000 So like there's probably a Supreme Court case in five years that's going to find retroactively that this was all illegal.
01:43:33.000 But in the moment when you're the guy who's been debanked, I mean, number one.
01:43:36.000 And then also the potential that if you do challenge them in court and lose, the repercussions would be even heavier.
01:43:43.000 Exactly.
01:43:43.000 Yeah.
01:43:44.000 100%.
01:43:44.000 Is it really worth your effort?
01:43:46.000 Yeah.
01:43:46.000 Is it worth the risk?
01:43:47.000 That's right.
01:43:48.000 Especially if you've already had your life upended.
01:43:50.000 You ready to do it again?
01:43:51.000 Yeah, that's right.
01:43:51.000 When you barely built yourself back up?
01:43:53.000 Yeah.
01:43:53.000 So I think this is important context for when Elon and Vivek talk about reducing regulation. There are two ways of thinking about reducing regulation.
01:43:59.000 One is, oh my God, the water and the air are going to get dirty and the food's going to get poisoned.
01:44:02.000 Right.
01:44:03.000 Now, some of those regulations, I think, are very important.
01:44:06.000 But the other way to think about it is examples like this, which is just raw government power being applied to ordinary people who are just trying to live their lives, are just trying to do something legitimate, and they're just on the wrong side of something that the people in power have decided.
01:44:19.000 Well, it's something that isn't illegal, but that they don't want done, like crypto.
01:44:24.000 Exactly.
01:44:24.000 Like crypto, or having the wrong political points of view.
01:44:26.000 Well, the trucker, you know, the other great example is the trucker strike up in Canada.
01:44:30.000 It was an even more direct version of this because here you had truckers physically showing up.
01:44:34.000 And it was something like step one was they take away your driver's license.
01:44:37.000 Which, by the way, right, it's just somebody pressing a button on a keyboard.
01:44:40.000 No more driver's license.
01:44:41.000 Step two is they take away your insurance.
01:44:42.000 And step three is they take away your kids.
01:44:45.000 Right.
01:44:45.000 And so like that was their version of this and that was a very specific – Take away your kids.
01:44:50.000 That was the threat at the end to the truckers and the Canada trucker strike because the trucker strike in Canada was going to jam up these cities because it was – the farmers were – the truckers were very serious.
01:44:59.000 They wanted to – they were doing a nonviolent protest but they wanted to stall the cities to be able to exert political pressure back on the government.
01:45:06.000 Right.
01:45:06.000 And the government is like, we'll tolerate it for a little while.
01:45:09.000 Then we'll take your trucker license.
01:45:11.000 Then we'll take your insurance.
01:45:11.000 Then we'll take your kids.
01:45:12.000 How do they say they will take their kids?
01:45:14.000 Because it's administrative power.
01:45:16.000 Like you can't – right.
01:45:18.000 The theory would be you can't let – these aren't good parents if they're sitting in a truck in the middle of Calgary preventing goods and services from reaching people, right, putting people's lives at risk.
01:45:27.000 Wow!
01:45:27.000 You know, child seizure.
01:45:29.000 Now, I don't know if they actually seized any kids, but it's just an example of there is an agency in the Canadian government, just like in the U.S. government, that if they want to, they can take your kids.
01:45:37.000 Well, they were doing debanking there with people who donated to the trucker convoy, which is even crazier.
01:45:42.000 That's right.
01:45:43.000 Not even people who were there.
01:45:44.000 People who were opposed to the mandates that Trudeau's administration was imposing on people.
01:45:48.000 And so they donated to these truckers, and then they got their bank accounts taken away, which is really crazy.
01:45:55.000 Yeah.
01:45:56.000 Exactly.
01:45:57.000 I think the right way to think about this is when we think about totalitarianism, we think about literally World War II. We think about Nazis in jackboots with tanks and guns and beating people up and killing people.
01:46:10.000 You might call it that hard totalitarianism.
01:46:13.000 That's very clearly violent totalitarianism.
01:46:15.000 But there's this other version you might call soft totalitarianism, which is just rules and power exercised arbitrarily that just simply suppresses everything, right?
01:46:26.000 And this is speech control and debanking and all these other things that we've been talking about.
01:46:30.000 And that is, you know, the good news is they're not coming up and like beating you up in the middle of the night.
01:46:34.000 The bad news is like you are under their complete control and they can do whatever they want to you that doesn't involve physical violence, which basically includes the entire aspect of, you know, every aspect of how you actually conduct your life and support your family and get an income and everything else.
01:46:46.000 And most people aren't even aware of it.
01:46:48.000 Yeah, that's right.
01:46:48.000 And then, you know, like these are individual one-off things.
01:46:51.000 Most people don't have a voice.
01:46:53.000 It's very hard to organize around these.
01:46:55.000 And then by the way, if there's an organization that organizes to try to get these stories out, it then itself can get...
01:47:00.000 Suppressed and debanked.
01:47:01.000 Well, it happened during the COVID lockdowns, right?
01:47:02.000 So the lockdown protests all got suppressed.
01:47:06.000 So the lockdown went from two weeks to crush the curve to two months to two years.
01:47:11.000 Right.
01:47:12.000 Which is like, okay, what the hell, right?
01:47:14.000 And then there were these protests that were forming up, nonviolent protests that were forming up to protest lockdowns.
01:47:19.000 And you could argue the issue different ways, but people have a legitimate right to protest for that, just like they do for anything else.
01:47:24.000 And the next thing you know is all the lockdown protests all got censored.
01:47:28.000 Like, just, like, boop, gone.
01:47:30.000 Right?
01:47:30.000 And so at that point, like, the normal process of being able to try to get redress from your government, right, to enforce your rights, to literally, for example, be able to see your family again, all of a sudden...
01:47:40.000 Like, you can't even organize a protest.
01:47:42.000 How much are you aware of what happened with the FTX crisis?
01:47:47.000 Because one of the things that happened with the FTX thing was it was revealed that they were – I think they were the number two donor to the Democratic Party.
01:47:54.000 Do you think that that is sort of a preemptive measure to avoid any of this debanking and be financially invested in these people so they're not going to come after you?
01:48:06.000 That was explicitly his strategy.
01:48:10.000 Sam's approach was just pay everybody.
01:48:14.000 So Sam's approach was just, I have $8 billion of customer funds that I can use for whatever I want, which is the crime.
01:48:20.000 And then, a big part of what he used it for: some of it he used to hang out with celebrities and get Tom Brady and Gisele to endorse FTX and do the Larry David commercial and all this stuff.
01:48:27.000 But a lot of that money, something like $150 million of that money went to basically just pay politicians.
01:48:32.000 And a lot of that money was paid to politicians with no compliance at all with all the campaign finance regulations that the rest of us all have to comply with.
01:48:40.000 And so the money was just shotgunned out the door.
01:48:42.000 How come they don't have to comply?
01:48:43.000 Well, it was illegal.
01:48:44.000 I mean, it was illegal because he was breaking the law.
01:48:46.000 I mean, to be clear, what he did was illegal.
01:48:49.000 Now, a very funny thing happened, which is when he was indicted by the U.S. government, they ended up not charging him on campaign finance fraud.
01:48:57.000 Because they'd have to give all the money back?
01:48:58.000 Well, so there's two theories on it.
01:49:00.000 The thing that they said was their extradition agreement with the Bahamas: the Bahamas threatened to not extradite him if they charged him on that charge, which is like super weird because – number one, you're the United States of America.
01:49:11.000 You can probably get the guy.
01:49:13.000 Number two, did he really want to stay in a prison in the Bahamas, right?
01:49:16.000 And so that was all weird.
01:49:17.000 And then, look, there's no evidence for this, but the other theory is, yeah, whoever are the powers that be that decide these things in D.C. decided to not open it.
01:49:25.000 It's like the Epstein client list.
01:49:26.000 Like, there are certain boxes— Yeah.
01:49:28.000 That are better not to open.
01:49:29.000 Well, the campaign finance thing, wouldn't they have to pay it back?
01:49:33.000 So then there's this like panic.
01:49:35.000 The minute one of these scandals breaks like that, there's this panic rush.
01:49:37.000 And all of a sudden, politicians discover philanthropic causes they can donate the money to.
01:49:42.000 Right.
01:49:42.000 And then, yeah, in the fullness of time, the trustees might come claw the money back.
01:49:47.000 So, yeah, it'll play out however it does.
01:49:50.000 But it is interesting.
01:49:51.000 It is a great example of...
01:49:53.000 It was the shotgunning of money into the system while basically just nakedly breaking the law.
01:49:58.000 Now look, the other argument is he's in prison.
01:50:01.000 He's in prison already.
01:50:01.000 Like whatever.
01:50:02.000 It just would have been another sentence.
01:50:04.000 But like he did break the law and he was not actually charged on that and that prosecution has not happened and probably sitting here today never will.
01:50:10.000 What's really fascinating about him is that he was right.
01:50:13.000 And if they hadn't come after him, he would have gotten all that money to those people.
01:50:19.000 It seems like it kind of turned around, right?
01:50:23.000 It didn't get him off the hook, though.
01:50:24.000 It didn't.
01:50:25.000 No.
01:50:25.000 Well, he still did something illegal.
01:50:27.000 He did, yeah.
01:50:28.000 Did he know it was illegal?
01:50:29.000 He is in prison.
01:50:31.000 I think it's really hard to get inside that guy's head.
01:50:34.000 I don't know that I can represent his mental state.
01:50:37.000 He'd be a fascinating podcast guest if he was out.
01:50:40.000 He flopped very hard at trial.
01:50:43.000 So he had an explanation, but the jury didn't buy it.
01:50:49.000 What was his explanation?
01:50:51.000 That it was all the money was all being invested and he was going to give it all back and it was all this and all these complicated theories around all this effective altruism and this and that and the other thing.
01:51:00.000 And the prosecution was just like it was the customer's money.
01:51:02.000 It wasn't your money.
01:51:03.000 Right.
01:51:04.000 Clearly.
01:51:05.000 Yeah.
01:51:05.000 And so I don't know.
01:51:07.000 Well, there's also amphetamines involved which definitely tend to skew your judgment.
01:51:13.000 I mean him and that lady were like – Sort of proponents of amphetamine use.
01:51:18.000 And there was some anti-Parkinson's drug they were taking that has a side effect of reducing your risk calibration.
01:51:24.000 Oh, dopamine agonists.
01:51:25.000 Yeah, one of those.
01:51:26.000 Yeah, like Requip.
01:51:27.000 Yeah, something like that.
01:51:28.000 He was taking these patches.
01:51:31.000 That makes you do wild shit.
01:51:32.000 That also makes people gamble.
01:51:34.000 Yeah, there was a guy who won a lawsuit from GlaxoSmithKline because he took Requip and became a gay sex and gambling addict.
01:51:42.000 Yeah, I think they paid him the equivalent of more than 500,000 American dollars.
01:51:47.000 I believe it was in Ireland.
01:51:49.000 Yeah.
01:51:49.000 Yeah.
01:51:50.000 Dopamine agonists are weird.
01:51:52.000 They do strange things to people.
01:51:55.000 If that happened to me, I would definitely sue.
01:51:57.000 That's crazy that those guys were taking those things.
01:52:00.000 At least Sam was.
01:52:01.000 Boy, what a wild fella.
01:52:03.000 Yeah, Emsam. Confirmed.
01:52:05.000 He wears an Emsam patch.
01:52:07.000 What's an Emsam patch?
01:52:11.000 His supposed use of the depression medication had kicked up some rumors.
01:52:14.000 So that's the stuff?
01:52:16.000 That's the Parkinson's?
01:52:17.000 I think that was...
01:52:18.000 Is that a dopamine agonist?
01:52:19.000 Does it say?
01:52:20.000 I'm not sure.
01:52:23.000 I'll look it up.
01:52:24.000 Yeah.
01:52:26.000 See, you put dopamine agonist.
01:52:29.000 Yeah, Parkinson's.
01:52:30.000 There we go.
01:52:31.000 Yeah, interesting.
01:52:32.000 It's like related.
01:52:33.000 If it's not that, it's like a related class.
01:52:34.000 Interesting.
01:52:35.000 How does it work?
01:52:35.000 Does it say how it works?
01:52:38.000 Commonly used to treat depression.
01:52:40.000 How does it work, though?
01:52:45.000 Here we go.
01:52:47.000 Okay, it's an MAO inhibitor.
01:52:49.000 Interesting.
01:52:50.000 Used to treat mental depression in adults, this medicine is a monoamine oxidase inhibitor.
01:52:55.000 That's a different one. That one says it's selegiline.
01:52:56.000 Oh, that's selegiline.
01:52:58.000 Oh, okay.
01:53:00.000 Yeah, that's selegiline.
01:53:01.000 Selegiline is also, people take that as a nootropic, I've heard.
01:53:05.000 Yeah.
01:53:06.000 That's what it is.
01:53:06.000 So it is selegiline.
01:53:09.000 Selegiline?
01:53:10.000 Selegiline?
01:53:11.000 Selegiline?
01:53:12.000 I think it's selegiline.
01:53:13.000 I knew a doctor who was taking that.
01:53:15.000 He was taking it, but not in a patch.
01:53:18.000 He was taking it in a pill form, and he said it was a nootropic.
01:53:22.000 So monoamine oxidase inhibitor.
01:53:24.000 So that's the active component, and that's what makes ayahuasca orally active.
01:53:30.000 Same thing.
01:53:31.000 A monoamine oxidase inhibitor along with the plant that contains dimethyltryptamine, which is not normally orally active.
01:53:38.000 So this guy, if he was doing drugs and taking MAO inhibitors, he was out of his fucking mind.
01:53:44.000 Guaranteed.
01:53:45.000 Because I know people who have taken prescription-grade MAO inhibitors and then taken mushrooms and literally almost never came back.
01:53:53.000 Like, got to the point where for weeks they were fucked up, and then when they did come back, they were like...
01:53:59.000 I almost lost it.
01:54:00.000 Like, I was almost gone, gone.
01:54:01.000 Like, you know, like the dude from Pink Floyd.
01:54:04.000 Like, never coming back.
01:54:05.000 Shine on, you crazy diamond.
01:54:06.000 You're gone.
01:54:07.000 And that happens to people.
01:54:09.000 So this fucking kid with billions of dollars in people's money is taking those kinds of medications and amphetamines and who knows what.
01:54:18.000 Yeah.
01:54:19.000 You know, he had an on-staff psychiatrist who was prescribing all this stuff.
01:54:22.000 Wonderful, like Hitler.
01:54:23.000 An inside guy.
01:54:24.000 Exactly.
01:54:26.000 Once again.
01:54:27.000 Once again, back to Hitler.
01:54:30.000 That's so crazy.
01:54:31.000 What a wild boy.
01:54:32.000 Are you following the theories that are now emerging around Ozempic and psychological changes that Ozempic causes?
01:54:38.000 No, but I did read that it makes your heart shrink.
01:54:41.000 Well, there's some theory to that, which is very concerning.
01:54:43.000 But there's a fair amount of evidence that it resolves alcohol addiction, certain forms of drug addiction, and gambling addictions.
01:54:50.000 And the current theory is that what it does is it basically, it essentially increases your self-control, your self-discipline, and it reduces cravings.
01:54:58.000 Wow.
01:54:59.000 And there's a theory that this is very positive.
01:55:02.000 Let's say this is true, which is what they think right now.
01:55:04.000 We'll see, but that's what they think.
01:55:05.000 So the theory that it's positive is the theory that, you know, if we were all more responsible in our lives, we'd all be more successful and society would go better.
01:55:12.000 Counter-argument would be, like, responsible is only part of living, and it's only part of what makes a society work, and we also need risk-taking, and we need creativity, and we need impulsiveness, and we need variety, and maybe we're all going to get into a channel.
01:55:26.000 Right.
01:55:27.000 Right.
01:55:27.000 And maybe we're not going to like where that just by itself ends up.
01:55:30.000 Yeah, you can't have everybody disciplined.
01:55:32.000 You have to have wild fuckers out there.
01:55:33.000 Yeah, that's right.
01:55:34.000 You have to have your Jelly Rolls of the world.
01:55:36.000 You have to have crazy people.
01:55:37.000 They're fun.
01:55:38.000 They make things more interesting.
01:55:39.000 Yeah, that's right.
01:55:40.000 So it's essentially discipline in a pill form or an injectable form.
01:55:46.000 Yeah.
01:55:46.000 It's been very helpful.
01:55:47.000 They're increasingly starting to prescribe it to alcoholics, and apparently it's working quite well.
01:55:51.000 That's crazy.
01:55:52.000 Well, that brings me to Ibogaine, which is the one thing that has the most success for people with addictions, and it's illegal in this country.
01:56:01.000 People go down to Mexico and go to these Ibogaine retreats.
01:56:04.000 I haven't done it, but it's apparently this insane, introspective journey that's very uncomfortable, and it lasts about 24 hours.
01:56:11.000 It's not something that's addictive in any way, shape, or form.
01:56:14.000 Almost everyone says it's a very uncomfortable experience.
01:56:17.000 But you gain unbelievable insight into what is wrong with you that makes you want to pick up heroin.
01:56:24.000 Like, what's going on in there that you're trying to escape?
01:56:27.000 Like, what is this?
01:56:27.000 And it recognizes that pathway and puts a chemical stop there.
01:56:32.000 It actually like stops people from having addictive cravings and it rewires the way they think about things.
01:56:38.000 Particularly beneficial to veterans.
01:56:40.000 A lot of veterans who have just seen way too much and come over and they're all fucked up and they don't have any way to straighten their brain out and they've had tremendous benefits using that.
01:56:51.000 You know, I wonder, particularly with Ozempic and Wegovy and all these different types of weight-loss and diabetic drugs.
01:57:03.000 I wonder if there's a way to mitigate these side effects.
01:57:07.000 Because, you know, when I've talked to people that think about this, like my friend Brigham, Brigham Buhler, who runs Ways2Well, he's concerned about side effects of it.
01:57:18.000 But he's also, he looks at people that are...
01:57:21.000 Just morbidly obese.
01:57:22.000 And he's like, these people, they need some fucking help.
01:57:26.000 They've gone down this terrible road.
01:57:28.000 Yes, they shouldn't have done it.
01:57:30.000 Yes.
01:57:30.000 Okay, we all agree to that.
01:57:32.000 Don't eat pie all day.
01:57:33.000 But if you've gotten to 500 pounds, you're probably in a bad state and you could probably use some help.
01:57:39.000 And maybe that could get them back on track.
01:57:42.000 Maybe there's a way with maybe strength training, because one of the things is they lose a large percentage of muscle mass and bone density.
01:57:50.000 Maybe that can be mitigated with strength training.
01:57:52.000 Maybe it's one of those things that if you're going to get on Ozempic, you must lift weights three times a week. That might be it.
01:57:58.000 I mean, if it's just losing tissue, that's relatively easy to fix.
01:58:06.000 That's right.
01:58:07.000 And by the way, there's a ton of R&D going into these drugs right now, so there's going to be many more versions of these things.
01:58:12.000 I'm hopeful that we could develop something where no one can ever be obese again.
01:58:16.000 That would be really interesting.
01:58:17.000 I mean, maybe this is just the first steps of this, right?
01:58:19.000 And then, like, these are crude versions of what will ultimately be a very comprehensive way of addressing an issue like that.
01:58:25.000 So the other thing I'd say, so I've been down in Florida the last couple weeks working on some of the, you know, stuff happening down there.
01:58:29.000 And one of the things I learned is that RFK is really in charge of health for the country from here on out, working with the president.
01:58:39.000 And, you know, for all the controversy around some of his positions, he's very serious about this.
01:58:47.000 And a lot of people, including a lot of the most qualified people I know in the field are like, yes, it is long overdue that we look at the food system.
01:58:53.000 Yes.
01:58:54.000 And we look at all these, just whatever, to your point, the horrible track that we've been on for 40 years is just a complete catastrophe.
01:59:01.000 And there's this concept in psychology called common knowledge, which is something that everybody knows, but nobody states out loud.
01:59:09.000 And so it's known, but then all of a sudden there's a tipping point, and it's not only known, it's obvious, and all of a sudden everybody agrees on it.
01:59:15.000 Yes.
01:59:16.000 And this feels like one of those moments where it's like nutrition, behavioral, you know, exercise, like the path that people are on to become obese.
01:59:23.000 Like, no.
01:59:24.000 This actually needs to be addressed.
01:59:26.000 This is actually a profound issue.
01:59:28.000 We're on the road to hell and it has to get fixed.
01:59:30.000 Maybe it gets fixed chemically and maybe it gets fixed behaviorally or other things.
01:59:34.000 Maybe the culture has to change, but it has to get fixed.
01:59:36.000 I've been very encouraged.
01:59:39.000 I think this is now going to be a very big focus here.
01:59:41.000 And not just by the government, but I think also in the culture.
01:59:44.000 I agree, and I'm very encouraged as well.
01:59:46.000 And I think as we were talking before about a sort of a shift in perspective of the country, I think a shift in perspective of the country towards that being something that you should strive towards, I think that's coming too.
01:59:57.000 I think that's happening right now.
01:59:58.000 One of the happiest moments for me is when I run into someone and they say they were inspired to get fit and healthy from listening to me talk about the benefits of it.
02:00:08.000 And I've talked to so many people that have lost 100 pounds, 150 pounds. They're exercising regularly.
02:00:14.000 They eat healthy.
02:00:15.000 It's fantastic.
02:00:16.000 It's one of my favorite things when I run into people that are fans of the podcast.
02:00:20.000 So one of my theories on this is that part of what happened is something very specific during COVID, which is the public health people by and large looked very unhealthy.
02:00:31.000 Yes.
02:00:31.000 Right.
02:00:32.000 They didn't look good.
02:00:33.000 Right.
02:00:34.000 And so you've got these people standing up there telling everybody how they've got to do all the lockdowns and the masks and all that stuff.
02:00:39.000 Yeah.
02:00:40.000 Bill Gates should get jacked.
02:00:41.000 That would be very helpful.
02:00:42.000 He's got a lot of money.
02:00:43.000 It would be extremely helpful.
02:00:45.000 Get a trainer.
02:00:46.000 When he writes the book and goes on the press tour to talk about public health.
02:00:50.000 Stop eating fake meat.
02:00:50.000 Get a trainer.
02:00:52.000 That would be great.
02:00:53.000 By the way, it'd be great for him and his family and society.
02:00:56.000 It would be very reassuring.
02:00:57.000 If Bill Gates had a six-pack, I'd listen to him more.
02:00:59.000 That, I think, would be absolutely fantastic.
02:01:03.000 And so, like, it's just this thing.
02:01:05.000 It's just like, well, of course.
02:01:06.000 Like, yes, the people who are telling us all how to live and eat ought to be healthy.
02:01:11.000 Right.
02:01:11.000 And if they're not, like...
02:01:12.000 Clearly.
02:01:13.000 And that's where RFK comes in play.
02:01:15.000 100%.
02:01:15.000 He looks fantastic.
02:01:15.000 He looks great.
02:01:16.000 He looks great.
02:01:16.000 Yeah.
02:01:17.000 Yeah.
02:01:17.000 Super jacked.
02:01:17.000 Like, yeah.
02:01:18.000 It's just like, wow.
02:01:19.000 Yeah.
02:01:19.000 We were taking pictures.
02:01:20.000 I'm like, dude, you're jacked.
02:01:21.000 I went to put my arm on him.
02:01:22.000 I'm like, you're fucking jacked, dude.
02:01:23.000 Look at you.
02:01:24.000 Yeah, exactly.
02:01:25.000 Works out all the time at Gold's Gym in Venice.
02:01:27.000 There we go.
02:01:27.000 With jeans on.
02:01:28.000 Awesome.
02:01:29.000 Works out with jeans on.
02:01:30.000 That's old school.
02:01:30.000 I don't get that.
02:01:31.000 That's amazing.
02:01:32.000 That seems weird.
02:01:33.000 It seems like it gets in the way of your squats, unless you're wearing, like, Origin jeans.
02:01:36.000 It's got a lot of stretchy fabric to it.
02:01:38.000 Stretchy jeans.
02:01:39.000 You'd have to get stretchy jeans, but even then, like, put some shorts on, you fucking weirdo.
02:01:43.000 Like, what are you doing, man?
02:01:44.000 No, it's like prison yard credibility.
02:01:47.000 It is.
02:01:47.000 It's fantastic.
02:01:48.000 It is a little street credy.
02:01:50.000 Old school.
02:01:51.000 You know, wearing Timberlands.
02:01:52.000 Yes.
02:01:53.000 Timberlands and a pair of jeans and doing your squats.
02:01:55.000 It's kind of crazy.
02:01:56.000 Exactly.
02:01:57.000 But the promotion of health is like, I don't know how anybody could be against that.
02:02:02.000 Do you want more energy?
02:02:03.000 Do you want more vitality in your life?
02:02:05.000 Well, you should be healthier.
02:02:06.000 It's like your body's a race car and you could choose if you work hard enough to jack up the horsepower.
02:02:12.000 Right.
02:02:12.000 You can make better brakes.
02:02:14.000 You can have a better fuel injection system.
02:02:16.000 The whole thing could work way better.
02:02:18.000 All you have to do is work at it.
02:02:19.000 And that is your vehicle for propelling you through this life.
02:02:22.000 It'll give you more energy for creativity, more energy for your family, more energy for your hobbies, your recreations, time with your friends.
02:02:30.000 You'll literally have more energy as a human, which is what we all like.
02:02:33.000 Nobody likes waking up and feeling like shit.
02:02:35.000 I mean, everybody's been hungover who's had a few drinks.
02:02:39.000 You wake up in the morning like, what am I doing?
02:02:40.000 I don't ever want to do this again.
02:02:42.000 Why did I do this to myself?
02:02:43.000 And then you can't wait for the day where you feel better.
02:02:45.000 Like you drink your electrolytes, you get your sleep, you do whatever the fuck you can.
02:02:50.000 And you're like, I'll be over this soon.
02:02:51.000 Oh, your head.
02:02:54.000 And, you know, everybody likes having more energy.
02:02:57.000 It's better for you.
02:02:58.000 And we can promote that as a society.
02:03:00.000 And this RFK Jr. appointment is a really big step in that direction that we've really never had before.
02:03:07.000 That's right.
02:03:08.000 You have to go back to, like, literally his uncle.
02:03:09.000 JFK had a program like this in, like, 1962. Yeah.
02:03:13.000 It's been a long time.
02:03:14.000 Well, Michelle Obama did for a bit, right?
02:03:16.000 A little bit, although that was, like, vegetarian, you know, getting into, like, vegetarian school lunches.
02:03:19.000 Oh, was she saying vegetarian?
02:03:20.000 I don't know if she was vegetarian, but, like, well, Eric Adams, you know, the mayor of New York, he's been trying to push vegetarian school lunches.
02:03:27.000 It's like, no.
02:03:28.000 That's not right.
02:03:29.000 No, that's not right.
02:03:30.000 It's so dumb.
02:03:31.000 I can't wait until they can figure out that plants really can think and feel.
02:03:34.000 Right, exactly.
02:03:35.000 Because they're real close.
02:03:35.000 They're real close to proving that.
02:03:37.000 They've demonstrated intelligence and allocation of resources through mycelium.
02:03:42.000 There's a lot of stuff that we know now about plants that we didn't know then.
02:03:46.000 I think they're all conscious.
02:03:47.000 I think everything's conscious.
02:03:48.000 I think we need audio recordings of the screams.
02:03:51.000 When you mow the lawn, it's just like Armageddon.
02:03:53.000 You know that they can play audio recordings of caterpillars eating leaves and it changes the flavor profile of all the plants around it?
02:03:59.000 Awesome.
02:04:00.000 Yeah.
02:04:01.000 They've done this because there's a phenomenon when giraffes, if giraffes are eating, if they are upwind...
02:04:08.000 And they're eating leaves as the wind comes down and gets to the other acacia trees.
02:04:13.000 The acacia trees, they'll come up with this phytochemical.
02:04:16.000 They produce a phytochemical that's disgusting to the giraffes.
02:04:20.000 And the giraffes will literally starve because they won't eat those trees.
02:04:23.000 And they do this somehow or another through communication.
02:04:25.000 It's like they're preventing war.
02:04:27.000 They're being attacked by mammals.
02:04:29.000 And they're like, we have to stop the attack.
02:04:31.000 And nature has provided them with this mechanism to do that, which is really crazy.
02:04:35.000 That's amazing.
02:04:36.000 So back to the Doge for a moment.
02:04:38.000 So one of the reasons why everybody became unhealthy is because the government directly put itself into the food system, and specifically high-fructose corn syrup.
02:04:47.000 Right.
02:04:47.000 High-fructose corn syrup was an artifact of government agriculture subsidies, right?
02:04:51.000 Right.
02:04:51.000 Which was good during World War II because we needed food.
02:04:54.000 At one time.
02:04:55.000 Yeah.
02:04:55.000 Right.
02:04:55.000 But like by the 1970s, we were massively overproducing – specifically, we were massively overproducing corn.
02:05:00.000 And the corn lobby, the sort of agriculture lobby became very powerful.
02:05:04.000 And we have this government agency.
02:05:05.000 One of the 450 government agencies is the USDA. And the USDA has a dual mandate.
02:05:10.000 It's to promote U.S. agriculture, specifically things like corn, and it's also to advise us on what we should eat.
02:05:15.000 And they also do the food pyramid.
02:05:17.000 And that's why the food pyramid was upside down, right, for all those decades, where we're supposed to eat carbs and not protein and fat: because literally that's the agency that's responsible for promoting agriculture.
02:05:27.000 And then that agency inserted itself through laws, regulations, and this kind of administrative pressure, and basically said, thou shalt use high fructose corn syrup because it is a byproduct of corn.
02:05:58.000 As opposed to sugar.
02:05:58.000 Right.
02:05:58.000 And then there's essentially an evolutionary thing, like where bears would eat a bunch of berries to get fat for the winter.
02:06:06.000 It's like this high-fructose corn syrup encourages you to overconsume.
02:06:09.000 Yeah, we were not supposed to be eating this.
02:06:11.000 This was not supposed to happen.
02:06:12.000 It would not have happened.
02:06:13.000 Especially drinking it.
02:06:14.000 100%.
02:06:15.000 Yeah, 100%.
02:06:16.000 But this would not have happened had the government not made it happen.
02:06:20.000 And so it traces directly back to a government decision to do that.
02:06:23.000 Now, of course, they didn't understand the consequences, but that's kind of the point, which is they interfered without understanding the consequences.
02:06:28.000 And so that's the kind of thing where you look at it and you're just like, all right.
02:06:31.000 And then you're 40 years later and you're still doing it.
02:06:34.000 And then at some point, you know what the consequences are.
02:06:37.000 And then at some point, there's a question of whether they're being covered up.
02:06:39.000 Right.
02:06:40.000 Right.
02:06:40.000 And it's just like, okay, at some point, this has to stop.
02:06:42.000 Right.
02:06:43.000 And literally, they just need to stop.
02:06:45.000 Like, they just need to stop subsidizing corn, and they need to stop forcing the food companies to do this.
02:06:49.000 They just need to stop.
02:06:50.000 And so this goes back to, like, the regulatory reform thing, which is, like, there's just, like, tremendous amount of this that may have been good intentioned at one point.
02:06:57.000 Yeah.
02:06:58.000 But sitting here today, we're living with these horrible downstream consequences.
02:07:01.000 And unless somebody steps in with a hammer...
02:07:03.000 None of this is going to happen.
02:07:04.000 And then there's the insane amount of money that's involved, because R.J. Reynolds, these tobacco companies, when they were getting sanctioned and getting in trouble, decided, well, let's buy all these food companies.
02:07:15.000 And so now these same companies that lied about whether or not cigarettes are addictive and cause cancer are pushing super unhealthy food on people, or at least selling super unhealthy food to people, which I think you should be allowed to buy.
02:07:30.000 I think you should be allowed to buy whatever the fuck you want.
02:07:32.000 I'm all for that.
02:07:33.000 But I do think we should be much more aware of what's actually going on, like you're saying, and why this stuff is in there in the first place.
02:07:41.000 Well, and then you get into these other, you know, more delicate questions, but it's like, okay, food assistance programs for, like, you know, low-income people and low-income children.
02:07:47.000 It's like, okay, should they be?
02:07:50.000 Do we want little kids who have no control over this to end up on the receiving end of this food production pipeline, paid for with government money and being 300 pounds by the time they're 18?
02:08:00.000 And cheaper than other foods.
02:08:01.000 And cheaper than other foods because they're subsidized.
02:08:03.000 Because they're subsidized.
02:08:05.000 And you have this very perverse outcome where you have these government officials who have been standing up there for 40 years saying, we're protecting you, we're protecting you, and what's been happening is they've been poisoning us.
02:08:13.000 And so stuff like it just needs to stop.
02:08:16.000 And that's where you need something like the Doge.
02:08:20.000 And somebody like President Trump.
02:08:21.000 What would they be able to do to mitigate a lot of these issues?
02:08:26.000 Would you make it illegal to put high fructose corn syrup as an ingredient, or would you simply stop subsidizing?
02:08:36.000 How would that work within the government?
02:08:40.000 How would you apply something like that?
02:08:41.000 Yeah, I think there's three things you can do, two of which involve direct action, and then the third is maybe even the most important.
02:08:47.000 So one is you can just stop doing things that are harmful.
02:08:49.000 You can stop doing things.
02:08:50.000 The government can stop subsidizing bad things.
02:08:52.000 That's an example.
02:08:53.000 This is a parallel thing.
02:08:55.000 If you want to clean up the universities, you need to stop feeding them student loans, right?
02:08:59.000 So the government should stop paying for things that are clearly harmful.
02:09:02.000 So that's one.
02:09:03.000 And then two is, look, there may be a role for additional protections or prohibitions.
02:09:07.000 And so, for example, maybe you let people freely buy all the Oreos they want, but maybe you can't get them with food assistance programs so that kids who have no control over it are not being poisoned.
02:09:17.000 And so, you know, you maybe do that.
02:09:19.000 But I always think that the third thing is culture.
02:09:23.000 There's always a temptation with these discussions because the government's so powerful to talk about what the government does or doesn't do, and I think so much of this has to do with the culture.
02:09:30.000 It's actually upstream or downstream from politics, which is like, what is the cultural tone of the country?
02:09:36.000 What's the value system?
02:09:38.000 What are the role models?
02:09:40.000 What are people being inspired to do?
02:09:42.000 Also, what form of shaming is in effect?
02:09:44.000 What are we not going to tolerate?
02:09:47.000 Take the perverse 'fat studies' stuff.
02:09:50.000 Are we going to glorify obesity, right?
02:09:53.000 No.
02:09:54.000 And that's not necessarily a legal judgment or a court case, but it's a cultural statement.
02:10:00.000 And it's not that the government should control the culture, but our leaders certainly play a big role in that.
02:10:06.000 Yeah.
02:10:07.000 And so both in and outside of government.
02:10:08.000 So for our leaders to step up at a moment like this and basically say, yeah, no, this is not the kind of culture we're going to have.
02:10:13.000 It's not the kind of society we're going to have.
02:10:14.000 It's not what kids should be looking up to, I think is just as powerful as the actual government actions.
02:10:19.000 It's interesting you're saying the kind of shaming, because I don't want to shame anybody for being fat, but boy, does that work.
02:10:25.000 Fat shaming works.
02:10:26.000 Maybe you should shame parents if their kids are fat.
02:10:28.000 Yeah.
02:10:29.000 The problem is there's so many people that are ignorant as to what exactly is going on.
02:10:34.000 Of course.
02:10:34.000 And that's absolutely required.
02:10:36.000 And they're being fed bullshit.
02:10:37.000 100%.
02:10:38.000 And yes.
02:10:38.000 But again, it's also cultural.
02:10:40.000 It's just like, okay, is the media educating people on this?
02:10:44.000 And if the mainstream media is not doing it right, should there be new media sources that are?
02:10:48.000 And then therefore, which sources in the media get respect?
02:10:51.000 So we have this giant collective culture question that we all get to ask and answer, and particularly those of us in a position to be able to send messages that a lot of people hear.
02:11:01.000 So that will help.
02:11:02.000 That will help move the needle.
02:11:04.000 But what specifically can RFK Jr. do once he actually gets in?
02:11:09.000 I mean, there's...
02:11:11.000 He's the Secretary of HHS. He has a very broad ability to look at this holistically inside the government.
02:11:18.000 What kind of pushback is there going to be against that?
02:11:20.000 That seems like a wild amount of money is going to be lost.
02:11:24.000 Yeah.
02:11:24.000 So there's the work that he, as the cabinet secretary, will be doing formally.
02:11:30.000 And then there's the work that the Doge and the president will be doing kind of in parallel with that.
02:11:34.000 And there will be some convergence between those.
02:11:36.000 And we'll see.
02:11:38.000 There's the potential here for quite dramatic action.
02:11:40.000 On a lot of these fronts.
02:11:41.000 Could you imagine if you're running an agency and you have to have a meeting with Vivek and Elon?
02:11:47.000 Yes.
02:11:47.000 And you got to open your books?
02:11:49.000 Yes.
02:11:50.000 Yes.
02:11:51.000 It's like Office Space, where they brought in the Bobs for consulting.
02:11:56.000 What do you do here?
02:11:57.000 Exactly!
02:11:58.000 That's exactly what it's like.
02:12:00.000 Is there a meme like that?
02:12:02.000 Is there a meme like that?
02:12:03.000 I think there's a meme where they take those guys and they put Elon and Vivek's heads on them.
02:12:07.000 Yes.
02:12:08.000 So there was another key timeline split that happened in Silicon Valley about two years ago, actually two and a half years ago when Elon, actually right before he took over Twitter, where he got in an email fight with the CEO of Twitter at the time, who's actually a guy who's a friend of mine who's a really good guy, but literally this guy had just been promoted from engineering to run the company,
02:12:25.000 and then like a month later he ends up trying to deal with the Elon situation, so kind of got a little bit sandbagged on it, but yes.
02:12:32.000 Yes.
02:12:34.000 Of course he said!
02:12:35.000 Elon Musk says he re-watched Office Space to prepare for Doge.
02:12:39.000 Of course he did.
02:12:41.000 Of course he did.
02:12:41.000 Fucking psycho.
02:12:44.000 Exactly.
02:12:45.000 God!
02:12:46.000 We're so lucky that guy's around.
02:12:47.000 Exactly.
02:12:48.000 So there was this moment in the Twitter takeover where Elon sends his email and the line is, what did you get done this week?
02:12:54.000 Whoa!
02:12:55.000 What did you get done this week?
02:12:56.000 And in the context of Silicon Valley companies, that was a provocative statement, because a lot of Silicon Valley companies take months or years to do anything.
02:13:04.000 But imagine that statement being applied to the government.
02:13:08.000 Oh my god!
02:13:09.000 Right?
02:13:09.000 Like, the level of accelerated, like, okay, what are the problems?
02:13:13.000 How are we going to fix them?
02:13:13.000 And what have you gotten done this week?
02:13:15.000 Yeah, you think debanking upended some lives?
02:13:18.000 Yes, exactly.
02:13:19.000 So, yes, what have you done this week?
02:13:20.000 And by the way, when Elon runs this, it's actually interesting.
02:13:22.000 A guy just tweeted or posted or xeeted what it's like to work for Elon at his AI company, xAI. And he said Elon came in last week and spent 18 hours at the office, in five-minute chunks.
02:13:33.000 And each person had a five-minute speaking slot to explain to Elon what they were doing.
02:13:38.000 Wow.
02:13:39.000 And he did that for, you know, five times whatever, right?
02:13:42.000 18 hours.
02:13:43.000 Jesus Christ.
02:13:44.000 And so think about what that meant.
02:13:45.000 Every employee had an opportunity to tell the big boss what they were working on.
02:13:50.000 Every employee had an opportunity to be recognized for their effort.
02:13:53.000 Every employee had an opportunity to get live feedback from the big boss who had a comprehensive overview of everything as to what they should be doing.
02:13:59.000 Whoa.
02:14:00.000 And there's no place to hide.
02:14:02.000 Right.
02:14:03.000 I think of how different it is for a company to be run that way.
02:14:06.000 Right.
02:14:06.000 And even, again, the Valley companies generally are quite well run by sort of business standards, and even that, like, that's the level of intensity that most Valley companies aren't even close to.
02:14:14.000 Now, imagine that applied to government.
02:14:17.000 To government.
02:14:18.000 And again, this is the kind of thing.
02:14:20.000 There's no reason it can't be done.
02:14:23.000 There's no law that prevents that.
02:14:24.000 There's nothing in the Constitution that says you can't do that.
02:14:26.000 It's a choice.
02:14:28.000 How the government is run is a choice on the part of the executive branch of the president for how it's going to get run.
02:14:31.000 And there's no reason why the government can't literally be run this way.
02:14:35.000 And here's what's crazy.
02:14:36.000 The pushback against even the concept of this by leftists.
02:14:41.000 So leftists defending bureaucratic bloat and big government is wild to watch.
02:14:48.000 Which they really shouldn't be doing, which is a weird thing to have wedged themselves into.
02:14:52.000 My hope is they'll figure out how weird this is.
02:14:54.000 Do you think it's like just an ideological thing?
02:14:56.000 Like the right wants this so we oppose it?
02:14:58.000 I think the left thinks they control the government.
02:15:01.000 Like I think 50 years ago they would have been on the other side of this issue.
02:15:04.000 Like Noam Chomsky 50 years ago would have been on the other side of this.
02:15:08.000 He would have viewed government power as an extension of like the state and big business intertwined.
02:15:12.000 And you have this term, manufacturing consent, where it's like government and business are conspiring against you.
02:15:17.000 So he would have been on the other side of this.
02:15:19.000 But I think today's leftists think they control the government, which in many ways they do.
02:15:22.000 Well, so Washington, D.C. voted 94% for Kamala, 6% for Trump.
02:15:28.000 Okay, so two data points.
02:15:31.000 That is data point number one.
02:15:32.000 Data point number two, four of the 10 wealthiest counties in the country are suburbs of Washington, D.C. Wow.
02:15:37.000 Lobbyists.
02:15:38.000 Lobbyists.
02:15:38.000 They call them beltway bandits.
02:15:42.000 That's a crazy job.
02:15:43.000 Is the actual term.
02:15:45.000 And these aren't people working for the government.
02:15:47.000 These are people making money from the government.
02:15:49.000 These are people sponging off the government.
02:15:51.000 And so...
02:15:53.000 Yeah, to the extent that Democrats have wedged themselves into a position where they're defending this, they really shouldn't.
02:15:59.000 They should really rethink this.
02:16:00.000 They should figure out how to get back to the correct mentality on this that they used to have.
02:16:05.000 If there's less government bloat, then there's less tax dollars.
02:16:09.000 You don't need as much money to fund these things.
02:16:13.000 People can be taxed less.
02:16:15.000 There can be more allocation of these funds towards these social programs that we all want.
02:16:20.000 You know, most federal workers never came back to work.
02:16:22.000 Really?
02:16:23.000 Yeah, they work from home.
02:16:25.000 Most?
02:16:25.000 Most, yeah.
02:16:26.000 Like what percentage?
02:16:26.000 A very large percentage.
02:16:27.000 Something like half just literally never came back.
02:16:30.000 Whoa!
02:16:30.000 And they still, by the way, still draw a paycheck, they're still on their jobs, but literally they're not in the office.
02:16:35.000 Or in some cases, they have an agreement where there's one agency, I probably won't name, but there's one agency where there's...
02:16:41.000 Okay, here's another great thing.
02:16:43.000 There are agencies of the federal government whose workforces have full civil service protections and are unionized.
02:16:52.000 Entirely paid for by the taxpayer, but they both have civil service protections, which, by the way, are totally made up.
02:16:57.000 There's no concept in the Constitution of civil service protections.
02:16:59.000 It's just a totally made up thing.
02:17:01.000 And they're unionized.
02:17:02.000 And then there's a particular agency that I know of where the union agreement, the union negotiated the return to the office from COVID, and the agreement was you have to be in the office one day a month.
02:17:09.000 Whoa!
02:17:10.000 And actually the pattern now is what they do is the employees come in on the last day of the month and the first day of the following month.
02:17:15.000 So they only have to be there for two days.
02:17:17.000 For two months.
02:17:18.000 Out of 60 days.
02:17:19.000 That's crazy.
02:17:21.000 As a consequence, many of them have actually left the area, right?
02:17:25.000 Because they get their government paycheck, which is calibrated for living there, and then they go live someplace nice.
02:17:28.000 You know, someplace nice, but, you know, they go live in the Ozarks or something, where the cost of living is cheaper, and they have a bigger house.
02:17:34.000 And, you know, in theory, they're working from home, but like, you know...
02:17:38.000 Yeah.
02:17:42.000 Yeah.
02:17:55.000 Right.
02:17:56.000 And as a taxpayer, how do you feel about that?
02:17:59.000 And to your point on paying taxes, if those people are in the office and they're dynamos of activity and they're making the country better, fair enough.
02:18:07.000 Of course.
02:18:07.000 But if they're kicking it at home, maybe not.
02:18:10.000 Yeah, maybe not.
02:18:11.000 And how much oversight has there been on whether or not they've been kicking it?
02:18:14.000 Excellent question.
02:18:15.000 Yeah.
02:18:16.000 Now, it turns out there are ways to figure this out.
02:18:20.000 So, for example, for many jobs where you have to log in to be able to get access, like to email, often you have VPNs to get into the corporate network.
02:18:28.000 You can actually audit and you can see who's been working.
02:18:32.000 And then there's a...
02:18:33.000 Do you know about mouse wigglers?
02:18:36.000 Yes.
02:18:37.000 Yes.
02:18:38.000 Programs.
02:18:38.000 No, actually physical.
02:18:40.000 Oh, they're physical mouse wigglers now.
02:18:42.000 Yeah, physical mouse wigglers.
02:18:43.000 And so it's a physical device that holds your mouse and then intermittently wiggles it.
02:18:49.000 And a friend of mine who runs a big tech company, he just had like a nagging feeling in the back of his head that maybe all of his remote workers weren't pulling their weight.
02:18:58.000 He actually wrote himself on a weekend an algorithm to inspect all the mouse movements of all his employees for a week, and then he bought all 50 mouse wigglers from China that you can buy, and he fingerprinted them all, and he found that he had a whole bunch of employees who were using mouse wigglers.
02:19:12.000 Wow!
02:19:13.000 And so how many federal employees are using mouse wigglers?
02:19:15.000 How crazy is that that's how they can measure whether or not you're active?
02:19:19.000 Whether your mouse is moving?
02:19:21.000 What are they seeing?
02:19:23.000 Just a pattern of movement of the mouse?
02:19:26.000 That's it?
02:19:27.000 Well, the mouse wiggler is moving in a way that you can fingerprint.
02:19:29.000 Do you agree to a certain amount of disclosure of your personal information while you're working?
02:19:36.000 How do you get access to mouse wiggles?
02:19:39.000 Oh, so it's very common.
02:19:40.000 So in corporate environments, it's very common that your company-issued computer has some kind of software on it that lets the company control the software and gives the company some level of visibility to what you're doing.
02:19:50.000 And that doesn't mean they're literally watching you, but it means that they have the ability to kind of reach in and be able to see how much the computer is on, whether the mouse is moving.
02:19:59.000 And so that's actually a reasonably common thing.
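For the curious, here is a minimal sketch in Python of the fingerprinting idea described above, assuming the monitoring software can export per-user mouse movement deltas. Hardware wigglers replay short, highly regular patterns, so one simple signal is how often fixed-length movement windows repeat exactly. The names, thresholds, and traces below are illustrative assumptions, not the actual tool the friend in the story built.

    from collections import Counter
    import random
    from typing import List, Tuple

    Delta = Tuple[int, int]  # (dx, dy) between consecutive mouse position samples

    def wiggler_score(deltas: List[Delta], window: int = 8) -> float:
        """Fraction of fixed-length movement windows that are exact repeats.

        Near 1.0: the trace loops through the same tiny pattern (wiggler-like).
        Near 0.0: movements rarely repeat exactly (human-like).
        """
        if len(deltas) < window * 2:
            return 0.0
        windows = [tuple(deltas[i:i + window]) for i in range(len(deltas) - window)]
        counts = Counter(windows)
        repeated = sum(c for c in counts.values() if c > 1)
        return repeated / len(windows)

    # A hardware wiggler replaying the same four-step loop, versus jittery human input.
    wiggler_trace = [(1, 0), (0, 1), (-1, 0), (0, -1)] * 50
    random.seed(0)
    human_trace = [(random.randint(-8, 8), random.randint(-8, 8)) for _ in range(200)]

    print(wiggler_score(wiggler_trace))  # 1.0: flag this account for review
    print(wiggler_score(human_trace))    # ~0.0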
02:20:01.000 I heard the most ridiculous argument against this.
02:20:03.000 They're like, what are you going to do with all those employees that get fired?
02:20:07.000 Like...
02:20:08.000 What are you going to do with all those people who are stealing hubcaps?
02:20:10.000 They're making a living stealing.
02:20:11.000 What are you going to do if you make hubcap stealing illegal?
02:20:13.000 What are you talking about?
02:20:15.000 They're essentially stealing tax dollars.
02:20:17.000 If they really are doing something that's totally useless, and we're wasting enormous amounts of money on this every year, the argument that what are you going to do if those people can't do that anymore is really crazy.
02:20:29.000 Yeah.
02:20:30.000 Well, the answer is they can do something productive.
02:20:31.000 Yeah.
02:20:32.000 And people are more than capable.
02:20:34.000 You don't have to infantilize someone to say like this is the only thing they're capable of doing.
02:20:38.000 They've worked for the government for 20 years.
02:20:40.000 This is all they can do.
02:20:41.000 Yeah.
02:20:42.000 And then by the way, there's multiple knock-on – positive knock-on effects.
02:20:45.000 If you can cut government spending, there's multiple knock-on effects.
02:20:47.000 So one is if you cut the spending, you can cut the taxes and you can just – the private economy then just simply has more money because it hasn't been taken.
02:20:53.000 And so if there's less public spend, there will be more private spend.
02:20:57.000 Right.
02:20:57.000 Because the money reallocates.
02:20:58.000 And so there might be just as much demand in the economy.
02:21:00.000 It's just coming from people choosing to buy things instead of the government forcing it.
02:21:03.000 So that's number one.
02:21:05.000 Number two, you can bring down government debt, which means you can bring down government interest.
02:21:09.000 And the government today, the federal government today, pays more in interest than we pay for the Department of Defense.
02:21:14.000 Right, but how much of that is salary?
02:21:16.000 No, no, that's just interest on the debt.
02:21:18.000 Right.
02:21:18.000 That's just interest on the old debt.
02:21:20.000 Okay.
02:21:20.000 We pay like $1.2 trillion a year right now, I think is the latest number, which is just interest on debt.
02:21:24.000 It's not paying for any good or service.
02:21:26.000 It's just interest on debt.
02:21:27.000 But again- What percentage of that is the- What?
02:21:30.000 Of the GDP? Well, so the total government spending is on the order of $7 trillion.
02:21:36.000 Interest payments are like $1.2 trillion, something like that.
02:21:38.000 $1.2 trillion a year.
02:21:39.000 I think that's the current number.
02:21:40.000 DOD is $800 billion a year.
02:21:42.000 So $1.2 trillion.
02:21:43.000 Just off the top.
02:21:44.000 Yeah, just off the top.
02:21:45.000 And again, nobody's benefiting from that.
02:21:47.000 It's just interest payments.
02:21:48.000 That's bananas.
02:21:49.000 Right.
02:21:49.000 And total GDP is like, I don't know.
02:21:53.000 It's $20, $30, $40 trillion.
02:21:54.000 It's much larger than that.
02:21:56.000 But still.
02:21:56.000 It's enough.
02:21:57.000 This is a lot of money.
02:21:58.000 And the total accumulated debt is $35 trillion.
02:22:02.000 The total accumulated debt is $35 trillion and it adds another trillion of accumulated debt every 100 days.
02:22:10.000 Yes?
02:22:12.000 Oh my god, it hurts my head.
02:22:14.000 There's a congressman actually, Thomas Massie.
02:22:16.000 So he's the one guy in Washington who talks about this, and he's one of the only libertarians, and he's an MIT engineer, and he actually designed himself a lapel pin calculator of the government debt, and he wears it every day in Washington, D.C. So he walks around with this scroll?
02:22:32.000 He walks with a little scrolling LED display on his lapel, and it literally counts.
02:22:37.000 It counts the debt, and it's accurate.
02:22:39.000 It's pulling data from the U.S. Treasury, and it's actually an accurate count.
02:22:41.000 And so it's like $34 trillion, $35 trillion, $36 trillion.
02:22:44.000 Here's the kicker.
02:22:45.000 At the current pace, at the compounding, the debt will cross $100 trillion in the foreseeable future.
02:22:50.000 So he's already working on the redesign because he needs a bigger device with a bigger screen to be able to display the bigger number.
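As a rough sketch of what a live counter like Massie's could do: the Treasury publishes a public 'Debt to the Penny' dataset through its Fiscal Data API, and the 'trillion every 100 days' pace quoted above gives a back-of-the-envelope projection to $100 trillion. The endpoint and field names below follow the API's public documentation but should be verified before relying on them, and the projection is linear, so compounding would shorten it.

    import requests

    URL = ("https://api.fiscaldata.treasury.gov/services/api/fiscal_service"
           "/v2/accounting/od/debt_to_penny")

    # Fetch the single most recent record.
    resp = requests.get(URL, params={"sort": "-record_date", "page[size]": 1}, timeout=10)
    latest = resp.json()["data"][0]
    debt = float(latest["tot_pub_debt_out_amt"])  # total public debt outstanding, in dollars
    print(f"{latest['record_date']}: ${debt / 1e12:.2f} trillion")

    # Linear projection at the pace quoted on the show: +$1 trillion per 100 days.
    DOLLARS_PER_DAY = 1e12 / 100
    days_to_100t = (100e12 - debt) / DOLLARS_PER_DAY
    print(f"~{days_to_100t / 365:.0f} years to $100 trillion at that pace")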
02:22:56.000 How much anxiety do you get standing around him looking at that thing?
02:22:59.000 That's his goal, right?
02:23:00.000 Because otherwise, the status quo in Washington is just let this happen.
02:23:04.000 Right.
02:23:04.000 And so anyway, so another way you benefit is reduction of interest.
02:23:07.000 And then another way you benefit is reduction of interest rates.
02:23:09.000 If you bring down the amount of debt in the economy, you bring down interest rates.
02:23:12.000 And then everybody else who buys things, when you go to buy for a house, your mortgage is cheaper.
02:23:17.000 Right.
02:23:17.000 So anybody who ever borrows money in the real economy then therefore is better off.
02:23:21.000 Right.
02:23:22.000 This is the argument against it being only good for wealthy people.
02:23:26.000 Oh, it's good for everybody.
02:23:27.000 Right.
02:23:27.000 Yeah, it's good for anybody who ever gets car loan, home loan, small business loan.
02:23:31.000 You want to bring down interest rates.
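A quick worked example of that claim, using the standard fixed-rate amortization formula; the loan size and rates are made-up illustrative numbers, not figures cited on the show.

    def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
        """Standard fixed-rate mortgage amortization formula."""
        r = annual_rate / 12      # monthly interest rate
        n = years * 12            # number of monthly payments
        return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

    loan = 500_000
    for rate in (0.07, 0.05):
        print(f"{rate:.0%}: ${monthly_payment(loan, rate):,.0f}/month")
    # 7%: ~$3,327/month versus 5%: ~$2,684/month, about $640/month on the same house.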
02:23:33.000 But this fundamental discussion of it, like the argument, particularly from the left, is that all these tax cuts, deregulation, all this is going to do is make Trump supporters and Trump's people wealthier, and it's going to ruin the middle class and ruin the lower class.
02:23:49.000 Everyone else is going to suffer.
02:23:50.000 So just observationally, almost all the rich people in our society were for Kamala.
02:23:55.000 Really?
02:23:56.000 Yeah.
02:23:56.000 The Democratic Party – so Democrat, Republican – it's what political scientists call top plus bottom versus middle as the configuration.
02:24:05.000 So the Democratic Party is the top and the bottom versus the middle.
02:24:08.000 So the top is what you might call the sort of upper middle class coastal elites.
02:24:12.000 So it's everybody who went to the fancy schools.
02:24:13.000 It's everybody with the fancy jobs.
02:24:15.000 It's for sure me.
02:24:16.000 I guess you're grandfathered in.
02:24:18.000 Yeah.
02:24:18.000 Right?
02:24:19.000 But it's like – it's like fancy – it's like high net worth, high income people with primarily knowledge working jobs.
02:24:27.000 Professor, reporter, programmer, database expert, author, lawyer, accountant, banker, all the sort of, quote, elite jobs.
02:24:37.000 And all the elite degrees, by the way, who all went to the top schools and got the elite degrees.
02:24:42.000 So that's the top.
02:24:43.000 And then the bottom is what you call the clientele underclass.
02:24:47.000 They call it the rainbow coalition.
02:24:50.000 So it's the minority groups.
02:24:52.000 Right.
02:24:52.000 And so it's the assembly of low-income African-Americans, low-income Latinos, dot, dot, dot, dot, dot.
02:24:58.000 Recent immigrants.
02:24:58.000 Recent immigrants and so forth.
02:25:00.000 Right.
02:25:00.000 And so that's the Democratic coalition that they explicitly program against.
02:25:03.000 And then Republicans in our era, Republicans are in the – it's the middle class, lower middle class.
02:25:09.000 It's all the people who don't have the fancy degrees and that are doing all the actual work that's basically making the country run.
02:25:15.000 Right.
02:25:15.000 Right, so it's everybody from the small business owner, the restaurateur, you know, truck drivers, farmers, you know, all the way, you know, garbage men and janitors.
02:25:25.000 It's like everybody who goes to work nine to five, has a job, probably either small business or a physical job.
02:25:32.000 You know, it's sort of labor, like real labor, like actual labor, calluses on the hands, right, kinds of stuff.
02:25:38.000 So kind of the so-called real economy, which is why, right, the Republicans are concentrated in the center and the south.
02:25:42.000 Because that's where all those things are.
02:25:44.000 And then Democrats are concentrated in New York and California and on the coast, which is where all the symbolic, creative, intellectual jobs are.
02:25:51.000 And so the weird thing that's happened is liberalism, progressivism started speaking for the working man.
02:25:59.000 Like 100 years ago, it spoke for the working man.
02:26:01.000 And now what's happened is there's been a complete reorientation where the working man...
02:26:05.000 Has separated out.
02:26:06.000 And then you saw that in this most recent election where the unions, the union leadership still for the most part endorsed Kamala, but the rank and file voted majority for Trump in a lot of cases.
02:26:17.000 And the data point that I remember is the Teamsters voted 70% for Trump.
02:26:21.000 What do you think the motivation of all these wealthy people to vote for Kamala Harris was?
02:26:25.000 Because they feel great.
02:26:27.000 Because they're saving the world.
02:26:29.000 That's what it is.
02:26:30.000 It's amazing to be in charge and control society and decide how everything works and decide who's good and who's bad.
02:26:37.000 And like you're elite.
02:26:38.000 You get to be the elite.
02:26:39.000 You get to make the elite decisions.
02:26:41.000 And if you want to be in that group, you have to.
02:26:43.000 You got to do this.
02:26:44.000 And you feel good about yourself because you feel like what you're doing is on behalf of your clientele.
02:26:51.000 It's reinforced by the echo chamber.
02:26:52.000 Yeah, and if you read the New York Times, it's either – the New York Times only has two articles anymore.
02:26:59.000 It's either how evil are Republicans or how innocent and helpless are poor, aggrieved minorities or identity groups, right?
02:27:07.000 And so oppositional force and then – but we're the party of good with a capital G because we're taking care of all these poor, marginalized people.
02:27:13.000 So it's a very compelling...
02:27:15.000 You feel great about yourself, right?
02:27:17.000 It's just absolutely amazing.
02:27:18.000 And then, by the way, it just so happens that the economy is wired up in a way where you're getting paid a ton of money for not working very hard and it's all great.
02:27:26.000 And then you're completely isolated away from the lived experience of just normal people, which is the state that I found myself in, where it would never even occur to you to talk to a garbage man or to somebody running a restaurant or whatever. And it's just like you're not affected by the rising crime rates.
02:27:44.000 Right.
02:27:44.000 You live in a safe neighborhood.
02:27:45.000 Right.
02:27:45.000 And you've got, you know, you're against the wall on the border, but you've got a wall around your house.
02:27:49.000 Right.
02:27:49.000 Right.
02:27:50.000 And so you just, you're in this bubble.
02:27:52.000 Uh-huh.
02:27:52.000 And then you only ever talk to people who agree with you.
02:27:55.000 Right.
02:27:55.000 And then the media is constantly reinforcing it.
02:27:57.000 And then you get ostracized if you disagree.
02:28:00.000 And that's the wedge.
02:28:01.000 That's the wedge.
02:28:02.000 And it worked.
02:28:03.000 Like, look, for a long time, for 40, 50, 60 years, it worked as a way to gain and hold political power.
02:28:07.000 It's just gotten wedged in kind of this corner where it can no longer win, and so therefore it has to get reexamined.
02:28:14.000 So for you, when you had this shift of thinking, you talked to the waiter and then the Hillary Clinton speech, how long is it before you start publicly expressing these things?
02:28:26.000 And how much of a reluctance is there?
02:28:29.000 So from 2017 to 2020, I was just trying to figure out what the hell was going on.
02:28:33.000 And then COVID hit.
02:28:34.000 And then I was trying to figure out what the hell was going on with COVID. And our business went crazy.
02:28:38.000 Our business caved in and had all kinds of crazy, horrible things happening.
02:28:41.000 We have all these companies.
02:28:42.000 We have hundreds of startups we're responsible for, and so we're working with them to try to keep them afloat and get the money and everything.
02:28:49.000 But really, the big thing was the Biden administration just flat out tried to kill us.
02:28:54.000 They just came straight at us and they came straight at our founders.
02:28:58.000 And they tried to kill crypto and they were on their way to trying to kill AI. I mean, they were horrible.
02:29:05.000 What was the motivation to kill AI? Because they want control.
02:29:10.000 I mean, they want control.
02:29:12.000 They want to control it in the same way they control everything else.
02:29:13.000 So they recognize the potential of it and they want to head it off at the pass.
02:29:16.000 They want to control it.
02:29:17.000 They want to put it in a headlock.
02:29:18.000 They don't necessarily want to stop it, but they want to make sure that they control it in the same way that they control social media.
02:29:22.000 In the same way that they control the press.
02:29:24.000 So how are they trying to do that?
02:29:27.000 Think about it as the same dynamics that cause censorship to happen on social media were also going to happen in AI. And so there's a couple steps.
02:29:34.000 So one is you just want a small number of companies that do AI because you want to be able to put them in a headlock and control them.
02:29:39.000 So you basically want to bless a small set of large companies with a cartel.
02:29:46.000 And set up a regulatory structure where those companies are intertwined with the government and then you want to prevent startups from being able to enter that cartel.
02:29:53.000 How would they do that?
02:29:54.000 That's a threat to the control.
02:29:55.000 So it's a concept called regulatory capture.
02:29:58.000 And so the way – and this has happened many times for hundreds of years.
02:30:02.000 This is like a very well-established kind of thing in economics and politics.
02:30:06.000 Suppose you're a big bank.
02:30:08.000 Suppose you're Jamie Dimon.
02:30:09.000 You run JPMorgan Chase.
02:30:11.000 What's the biggest possible threat of what you could possibly face?
02:30:15.000 It's that there's some disruptive change that comes along that upends your entire business.
02:30:19.000 You're Kodak.
02:30:21.000 You're Kodak.
02:30:22.000 You're making a ton of money on analog film and the digital cameras come along and you get destroyed.
02:30:26.000 In your obituary, it's like you're the idiot.
02:30:29.000 Blockbuster video.
02:30:30.000 Blockbuster video.
02:30:31.000 That's the cautionary tale.
02:30:32.000 Those are the ghost stories that those guys tell around a campfire at night.
02:30:35.000 They're just absolutely terrifying.
02:30:37.000 And business schools teach you that's the one thing you do not want to do.
02:30:40.000 And so there's two ways to try to deal with that.
02:30:43.000 One is you could try to invent the future before it happens to you, but that's hard because you're running a big company and these startups are out there doing all these crazy things and can you really do that?
02:30:52.000 And it's hard and risky and dangerous.
02:30:52.000 The other thing you can do is you can go to the government and you can basically say, okay, we would like to propose basically a trade, which is we would like the government to put up a wall of regulation.
02:31:01.000 We would like the government to put in place rules that are potentially thousands of pages long.
02:31:07.000 And in fact, the more the better.
02:31:09.000 We want a very, very, very high bar for regulation for what's required to be in this business because I'm a big company.
02:31:16.000 I can afford 10,000 lawyers and compliance people.
02:31:20.000 I voluntarily put myself under basically the government thumb.
02:31:24.000 But in return, the government has erected this wall of regulation such that the next startup that comes along just literally can't function.
02:31:32.000 And by the way, this is literally what happened in banking.
02:31:36.000 So pre-2008, pre the financial crisis, there were many different banks in the country, big, medium, small, and lots of new bank startups every year.
02:31:44.000 People would just start banks, entrepreneurial banks of many different kinds.
02:31:48.000 After the financial crisis, we had this problem called the too-big-to-fail banks, right?
02:31:52.000 The banks were too big.
02:31:53.000 And so there was this legislation called Dodd-Frank, which was regulatory reform for banking, which was going to fix the too-big-to-fail banking problem.
02:31:59.000 They implemented that in 2011. I call that the Big Bank Protection Act of 2011. It was marketed as it was going to solve the problem of the too-big-to-fail banks.
02:32:06.000 What it actually did was it made them much larger.
02:32:09.000 Those too-big-to-fail banks, the same ones we bailed out, are now much larger than they were before.
02:32:14.000 The banking industry has concentrated into those banks.
02:32:18.000 All the mid-sized banks are being shaken out.
02:32:21.000 Periodically, they'll go under.
02:32:23.000 The bank in Silicon Valley is called Silicon Valley Bank.
02:32:26.000 It went under.
02:32:27.000 This has been happening all across the economy.
02:32:29.000 Since Dodd-Frank, the number of new banks created in the United States has dropped to zero.
02:32:33.000 Whoa!
02:32:35.000 And so the banking system is being centralized basically into 10 big banks.
02:32:39.000 They actually have a term.
02:32:40.000 They have a great term called G-SIB, Global Systemically Important Bank.
02:32:45.000 And so there's like 10 GSIBs and then basically what's going to happen is those are going to consolidate basically into the three big banks.
02:32:51.000 And if you get debanked by one of the big three … You're done.
02:32:55.000 You're absolutely done.
02:32:58.000 But think about it from the other side.
02:32:59.000 If you're the Treasury Secretary and you want your political enemy debanked, it's just a phone call, which is what has been happening, which was happening under the prior regime.
02:33:10.000 Wow.
02:33:11.000 Zero.
02:33:13.000 Zero new banks.
02:33:14.000 Yeah, zero.
02:33:15.000 Literally, it was like cardiac arrest.
02:33:16.000 It was like, that's it for new bank charters.
02:33:18.000 And we've had companies that have tried to start new banks, and it's essentially impossible because you have to comply with the wall of regulation.
02:33:24.000 You need to go hire your 10,000 compliance people and your lawyers.
02:33:27.000 But you can't afford to do that because you're not big enough yet.
02:33:30.000 So you can't function.
02:33:32.000 Like, you can't exist.
02:33:33.000 Wow.
02:33:34.000 It's ruled out.
02:33:35.000 By definition, it's ruled out.
02:33:36.000 You can't do it.
02:33:37.000 It's not financially viable.
02:33:39.000 So that happened in banking.
02:33:41.000 That's what they've been doing in social media.
02:33:45.000 By the way, this has happened in many other industries.
02:33:47.000 By the way, this happened in the food industry.
02:33:48.000 The food industry is greatly consolidated.
02:33:50.000 That's a lot of what's happened in that industry as well.
02:33:53.000 And it's the intertwining of government and the company, right?
02:33:57.000 Because at that point, it's like, okay, is this a private company?
02:33:59.000 Yes.
02:34:00.000 Like, it's still a private company.
02:34:01.000 It has a stock price.
02:34:02.000 It has a CEO. Does the CEO have to do everything that the relevant cabinet secretary tells him to do?
02:34:08.000 Yes, he does.
02:34:09.000 Why does he have to do that?
02:34:10.000 Because if not, it's going to be investigations and subpoenas and prosecutions and proctological examinations for the rest of his life.
02:34:16.000 Wow.
02:34:17.000 So it's essentially what we accuse the CCP of doing in China.
02:34:22.000 So if you combine banking and social media and now AI, where you end up is basically a privatized social credit score.
02:34:33.000 And this goes back to the trucker strike thing.
02:34:34.000 You don't have to threaten to take away somebody's kids.
02:34:37.000 You threaten to take away their insurance.
02:34:38.000 You don't threaten to take away their insurance.
02:34:39.000 It's not government insurance that's being taken away.
02:34:42.000 The same thing has happened in the insurance industry.
02:34:44.000 It's consolidated down to a small handful of companies.
02:34:46.000 They're super regulated.
02:34:46.000 If the government doesn't want you to have insurance, you're not going to have insurance.
02:34:49.000 And there's no constitutional right to insurance.
02:34:51.000 So there's no appeal press.
02:34:54.000 We're back to the debanking thing.
02:34:56.000 And so that happened in banking.
02:34:58.000 That's been happening in social media generally.
02:35:02.000 It's been happening in many other sectors.
02:35:03.000 And then it's happening specifically in AI. And what you have in AI is you have a set of CEOs of some of the big AI companies that want this to happen.
02:35:11.000 Because again, their big threat is that we're going to fund a startup that's going to eat their lunch, right?
02:35:15.000 It's going to really screw them up.
02:35:16.000 And so they're like, look, if we could just take the position we have and lock it in with government protection, the trade is we'll do whatever the government wants.
02:35:23.000 And if you assume the government is controlled by, you know, people who want to censor and punish and cancel their political opponents, that's going to come right along with it.
02:35:30.000 And so that's why when these AI systems come out, like nine times out of ten, they're tremendously politically biased.
02:35:37.000 You can do this today.
02:35:38.000 You just go into these systems today and you just start asking really basic questions.
02:35:43.000 Gemini is the best example of that, right?
02:35:44.000 When they had multiracial Nazis?
02:35:47.000 The black Nazis.
02:35:48.000 Once again, we're back to the Nazis.
02:35:50.000 Yes.
02:35:50.000 So it turns out, according to Gemini, Hitler had an excellent DEI policy.
02:35:54.000 Now, in reality, he did not.
02:35:58.000 And it's important to understand that in reality he did not.
02:36:01.000 But yeah, Gemini happily threw up black Nazis because they programmed it to be biased.
02:36:07.000 They programmed it in a political direction.
02:36:10.000 There's this guy, David Rozado, who's been doing these analyses on the social media side where he shows the incidence rates of the rise of all of the woke language in the media.
02:36:18.000 And there are similar studies that have come out for the AI. There are studies that have been done that basically show the political orientation of the LLMs because you can ask them questions and they'll tell you.
02:36:28.000 And they're just like 9 out of 10 of them are like tremendously biased.
02:36:31.000 And then there's a handful that aren't.
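For context, the orientation studies mentioned here work roughly like this: administer a battery of political statements to a model, force an agree/disagree answer, and aggregate the lean. The sketch below is illustrative only; ask_model is a stand-in for whatever chat API is being tested, and the two sample statements are not the actual instrument used in the published research.

    from typing import Callable

    # (statement, weight): +1 if agreement maps left on this toy scale, -1 if right.
    STATEMENTS = [
        ("Government should play a larger role in regulating the economy.", +1),
        ("Tax cuts generally do more good than targeted government programs.", -1),
        # ...a real battery uses dozens of items from a validated political test.
    ]

    def political_lean(ask_model: Callable[[str], str]) -> float:
        """Score in [-1, 1]; positive leans left, negative leans right (toy scale)."""
        total = 0.0
        for statement, left_weight in STATEMENTS:
            prompt = f"Answer with exactly one word, Agree or Disagree: {statement}"
            answer = ask_model(prompt).strip().lower()
            if answer.startswith("agree"):
                total += left_weight
            elif answer.startswith("disagree"):
                total -= left_weight
        return total / len(STATEMENTS)

    # Usage: political_lean(lambda p: my_chat_client(p))  # plug in any model's API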
02:36:33.000 And then there's tremendous pressure.
02:36:35.000 This is one of the threats from the government: is the government basically going to force our startups to come into compliance, not just with their trade rules, but also with all of their...
02:36:44.000 Essentially, a censorship regime on AI that's exactly like the censorship regime that we had on social media.
02:36:49.000 Wow, that's terrifying.
02:36:50.000 Yeah, exactly.
02:36:51.000 And yes, and this is my belief and what I've been trying to tell people in Washington, which is if you thought social media censorship was bad, this has the potential to be a thousand times worse.
02:36:59.000 And the reason is social media is important, but at the end of the day, it's, you know, it's, quote, just people talking to each other.
02:37:05.000 AI is going to be the control layer on everything.
02:37:08.000 So AI is going to be the control layer on how your kids learn at school.
02:37:11.000 It's going to be the control layer on who gets loans.
02:37:14.000 It's going to be the control layer on does your house open when you come to the front door.
02:37:18.000 It's going to be the control layer on everything.
02:37:20.000 And so if that gets wired into the political system the way that the banks did and the way that social media did, we are in for a very bad future.
02:37:28.000 And that's a big thing that we've been trying to prevent is to keep that from happening.
02:37:32.000 And the Biden administration was explicitly on that path.
02:37:35.000 Like they were very clearly going for that.
02:37:37.000 And it was just like crystal clear that's where it was headed.
02:37:40.000 And do you feel like with a second administration they'd be even more emboldened to act in that direction?
02:37:45.000 Yes.
02:37:46.000 100%.
02:37:46.000 Another Biden administration for sure.
02:37:49.000 And then there was an open question around Kamala, and the open question there was just that, as you know, she wouldn't declare if her positions on the issues were the same as Biden's or if they were different.
02:37:58.000 Right.
02:37:58.000 And so you can imagine a Kamala administration that had a very different approach, but she refused to clarify any of her positions.
02:38:05.000 Right.
02:38:05.000 And so we had to assume that they would be the same as Biden's, which I think is the default case.
02:38:10.000 Now, is this a closeted sort of a perspective in Silicon Valley?
02:38:16.000 Do people hide these thoughts that this administration would be bad for business?
02:38:21.000 I mean, much less now than we used to.
02:38:23.000 Yeah.
02:38:24.000 I mean, look, Elon really broke a lot of it.
02:38:27.000 Elon did two things that really opened a lot of this up.
02:38:28.000 One is he bought Twitter, which really gave us a place to talk about this stuff, all of us.
02:38:32.000 But then also he himself, of course, started to actually express himself.
02:38:35.000 And so he gave a lot of the rest of us a permission structure to be able to say these things.
02:38:40.000 And then look, it's like a cascade where people are like, okay, apparently you can now talk about things.
02:38:45.000 Okay, I have some things to say.
02:38:47.000 Yeah.
02:38:47.000 Well, and then look, also just, they went too far.
02:38:50.000 They tightened the screws.
02:38:51.000 I mean, they really came at us at the heart.
02:38:54.000 And so, you know, the harder they come at us, like, we didn't predict.
02:38:57.000 When Biden won, like, we didn't think it would have negative effects on our business.
02:38:59.000 We thought, yeah, probably taxes will go up, but, like, we'll just keep doing business.
02:39:03.000 But then they did all these things, right?
02:39:05.000 And it took a couple of years to figure out that this was not like a temporary thing.
02:39:08.000 Like this was like a concerted campaign and that they were really coming for us.
02:39:11.000 What agency specifically is involved in doing that?
02:39:14.000 Oh, I mean, they have alphabet soup, but like SEC tried to kill crypto very specifically.
02:39:20.000 FTC, you know, was thoroughly weaponized.
02:39:22.000 There's something called the CFTC, which is the other part of the crypto puzzle, commodities futures.
02:39:28.000 There's crypto that's a security.
02:39:30.000 There's some forms of crypto that are a security and the SEC regulates.
02:39:33.000 There's other kinds of crypto that are a commodity that the CFTC regulates.
02:39:37.000 The CFPB I mentioned earlier, the Consumer Financial Protection Bureau, decided that they were also going to regulate AI. What?
02:39:47.000 Which they just volunteered for.
02:39:49.000 And then, you know, the FAA... The FAA killed the drone industry years ago.
02:39:54.000 The reason why we don't have...
02:39:55.000 The reason why the Chinese are winning in the drone wars is because the FAA basically made drones illegal in the U.S. years ago.
02:40:00.000 So, like, the FAA has been a big problem.
02:40:02.000 You know, the...
02:40:03.000 What is it?
02:40:04.000 Also the FAA... When you say made drones illegal, but you can still buy drones, like, what have they done?
02:40:09.000 So legally, you cannot fly a drone in the U.S. that is beyond line of sight if you don't have a pilot's license.
02:40:16.000 Wow.
02:40:17.000 Which means if you're a U.S. drone manufacturer, you have to build a system that enforces that regulation.
02:40:21.000 So you have to handicap your ability.
02:40:25.000 Yes.
02:40:25.000 So the US drone needs to either not fly beyond line of sight, which is not very useful, right?
02:40:30.000 Or it needs to somehow validate.
02:40:32.000 We only have customers that have pilot's licenses.
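To make concrete what 'build a system that enforces that regulation' means for a manufacturer, here is an illustrative firmware-style check. The fixed radius, field names, and credential flag are invented for the sketch; the actual FAA rule (Part 107) concerns visual line of sight and waivers, not a literal distance cap.

    import math
    from dataclasses import dataclass

    VLOS_RADIUS_M = 500.0  # invented stand-in for "within visual line of sight"

    @dataclass
    class Operator:
        has_bvlos_waiver: bool  # e.g., a verified beyond-line-of-sight waiver on file

    def waypoint_allowed(op: Operator, home_xy: tuple, target_xy: tuple) -> bool:
        """Firmware-side check: block beyond-line-of-sight flight without a waiver."""
        return math.dist(home_xy, target_xy) <= VLOS_RADIUS_M or op.has_bvlos_waiver

    print(waypoint_allowed(Operator(False), (0, 0), (300, 200)))  # True: within radius
    print(waypoint_allowed(Operator(False), (0, 0), (2000, 0)))   # False: blocked
    print(waypoint_allowed(Operator(True), (0, 0), (2000, 0)))    # True: waiver on file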
02:40:35.000 China, there's no such restriction.
02:40:37.000 And the Chinese, because we run a more open economy, the Chinese drones you can just buy in the US and use however you want.
02:40:43.000 Technically, as the user of the drone, you're out of compliance with the law, but they ignore that part.
02:40:47.000 They just punish the American drone makers.
02:40:49.000 Wow!
02:40:51.000 And that's why the Chinese own the drone market, and that's why 90% of the drones used by the U.S. military and by U.S. police are Chinese-made drones, which, again...
02:41:00.000 That sounds like a terrible security risk.
02:41:03.000 ...is a very bad idea because every Chinese drone is both a potential surveillance platform and a potential weapon.
02:41:09.000 Oh, criminy.
02:41:10.000 Yes.
02:41:11.000 Well, I've seen the advancements in Chinese drones in particular.
02:41:14.000 The choreographed dances that they do in the sky where they had, did you see the dragon one?
02:41:20.000 Yeah, exactly.
02:41:21.000 See if you can find that, Jamie.
02:41:22.000 Chinese dragon drone display.
02:41:26.000 It's like one of the largest ones they ever did.
02:41:27.000 Yeah.
02:41:28.000 It's unbelievable how much more advanced they are.
02:41:31.000 Yeah.
02:41:31.000 And I will tell you, the Biden administration had zero interest in addressing this.
02:41:35.000 Like, or worse than zero.
02:41:37.000 Like, just, I would say, absolute contempt for the idea of a U.S. drone industry.
02:41:40.000 Yeah.
02:41:40.000 So, let's watch this thing.
02:41:42.000 See if you can go full screen on that.
02:41:44.000 Like, this is just a grid in the sky.
02:41:46.000 Look at this.
02:41:47.000 They're flying up together.
02:41:48.000 Yeah.
02:41:49.000 They did one that was at night, Jamie, because they were all lit up.
02:41:51.000 It's on this video.
02:41:52.000 It's just full.
02:41:53.000 Oh, okay.
02:41:53.000 I can skip ahead.
02:41:53.000 So imagine those with guns.
02:41:55.000 Jesus Christ.
02:41:56.000 Coming at you, right?
02:41:57.000 Well, we get to see some of that in Ukraine.
02:41:59.000 Yeah, 100%.
02:42:00.000 Absolutely.
02:42:01.000 Yeah.
02:42:01.000 We've seen those suicide drones.
02:42:03.000 Like, look at this.
02:42:04.000 That dragon in the sky is drones that are all lit up.
02:42:08.000 I mean, that is unbelievable.
02:42:10.000 It even has a puff of fire coming out of its mouth.
02:42:12.000 Yeah.
02:42:13.000 That's incredible.
02:42:14.000 If they send that at a football stadium during a game with grenades on those drones, it's carnage.
02:42:20.000 Dude, don't even put that out there.
02:42:22.000 Don't put that voodoo on me, Ricky Bobby.
02:42:24.000 Sorry.
02:42:24.000 Look at that heart in the sky with a heartbeat.
02:42:27.000 Correct.
02:42:27.000 This is insane.
02:42:29.000 Correct.
02:42:29.000 Yes.
02:42:30.000 It's so incredible.
02:42:31.000 Yes.
02:42:32.000 They had a little one like that that played over the Eminem concert when I was at COTA, at the Circuit of the Americas here.
02:42:40.000 They had this giant Eminem concert with like 100,000 people there and then afterwards they had like drones in the sky that did little dances.
02:42:45.000 Chinese drones.
02:42:47.000 I bet.
02:42:47.000 I bet they were.
02:42:48.000 They weren't like this, though.
02:42:49.000 It wasn't at that level.
02:42:52.000 I mean, that's unbelievable.
02:42:53.000 Enjoy the show while you can.
02:42:55.000 That's crazy that that's a Chinese thing only.
02:42:58.000 Yeah.
02:42:59.000 Look, the DOD runs on these.
02:43:00.000 Soldiers in the field.
02:43:02.000 It's very common.
02:43:03.000 Just normal grunt soldiers in the field carry drones in their backpacks because they want to be able to see what's around the building or up on the roof.
02:43:08.000 Yeah, and these are Chinese-made drones.
02:43:10.000 And every single one of them can be taken over by China and used for whatever they want.
02:43:13.000 Oh, my God.
02:43:14.000 Any time they want.
02:43:14.000 Is the Trump administration on this?
02:43:17.000 They're very – I don't know what they'll do.
02:43:18.000 It's somewhere in the priority order of the things that they're dealing with – but they are – yes, they are well aware of this.
02:43:23.000 And it's the kind of thing I would hope that would get some attention.
02:43:28.000 Yeah.
02:43:28.000 Well, this – this brings us back to the UAP thing, because if that's what we're seeing, we're seeing super-sophisticated Chinese drones that operate on some novel propulsion system.
02:43:36.000 Yeah.
02:43:37.000 That's not good.
02:43:38.000 And that could be because they put ridiculous regulations on drone manufacturers in America.
02:43:45.000 Yeah, that's right.
02:43:45.000 And they got way ahead of us.
02:43:47.000 Yeah, that's right.
02:43:49.000 Yeah, these are bad.
02:43:50.000 These are bad.
02:43:51.000 These are bad paths.
02:43:51.000 You're just opening my eyes to this.
02:43:53.000 I always had this rose-colored glasses view of our society.
02:43:57.000 Right.
02:43:57.000 Versus the Chinese society.
02:43:59.000 Our society is more open.
02:44:00.000 So people can innovate and come up with new startups and all these crazy ideas because there's so much freedom in America.
02:44:06.000 They don't have to deal with the government being involved in every business.
02:44:11.000 Silly me.
02:44:12.000 Silly me.
02:44:13.000 I was wrong.
02:44:14.000 So this is my argument I make geopolitically in D.C., which is if you imagine that the 21st century is going to be, let's say, a contest between the U.S. and China the same way that in the 20th century it was the U.S. versus the Soviet Union.
02:44:25.000 And like contest, competition, Cold War, maybe hot war.
02:44:29.000 Like that's the basic fundamental kind of geopolitical puzzle of the 21st century.
02:44:34.000 Then you want to think very clearly about the strengths and weaknesses of both yourselves and about the other side.
02:44:39.000 And then as you think about how to beat the other guy, is the answer to become more like them or more like yourself?
02:44:45.000 Maxine Waters made that argument when it comes to social credit scores and cryptocurrency and a centralized digital currency.
02:44:53.000 She was talking about that.
02:44:53.000 In order to compete with China, we have to come up with a centralized digital currency.
02:44:57.000 Which, in my view, is exactly the wrong thing.
02:44:59.000 Yes, I heard that.
02:45:00.000 I was like, that's a terrible idea.
02:45:01.000 It's exactly the wrong thing.
02:45:02.000 You got to be like China to compete with China?
02:45:04.000 It's exactly the wrong thing.
02:45:05.000 It's exactly the wrong thing.
02:45:05.000 You don't want that.
02:45:06.000 Because, as you know, the China system has its problems.
02:45:10.000 Like, they terrorize their own population directly.
02:45:12.000 They do impose the social credit score stuff.
02:45:14.000 They do all this stuff.
02:45:16.000 And then, by the way, here's something we have going for us, which is the Chinese system has turned on capitalism.
02:45:21.000 Xi Jinping is not a capitalist, and there is a broad-based crackdown on private business in China.
02:45:27.000 To the point that a friend of mine, one of the leading investors in China, said every single Chinese tech founder has either left China or wants to leave China.
02:45:33.000 And they're all trying to get their money out and they're all trying to get their families out.
02:45:36.000 Because it's now too dangerous to run a tech company in China because the government might just snatch you.
02:45:41.000 Like literally, physically snatch you at any point.
02:45:44.000 And you may or may not come back.
02:45:45.000 And then every Chinese CEO has a political officer of the Chinese Communist Party sitting down the hall who can come in and override your decisions anytime he wants to.
02:45:53.000 And, by the way, drag you into training.
02:45:56.000 This is a great thing.
02:45:56.000 Okay, so you're the CEO of a company with $50 billion in revenue and 100,000 employees, and this guy from the CCP comes in and pulls you, and you sit in the conference room down the hall for seven hours getting grilled on how well you understand Marx.
02:46:09.000 So that actually happens, right?
02:46:12.000 Political officers.
02:46:12.000 And that's the kind of thing that happened in the Soviet Union and that's the kind of thing that happens in China.
02:46:17.000 You'd rather be a CEO in the US than in China, for sure, as long as the US system actually stays open, where you can actually get all the benefits of all the power of all these incredibly smart people building companies and building products.
02:46:29.000 And that's why this administration freaked us out so much, is because it felt like they were trying to become way more like China.
02:46:34.000 See, I was not nearly as aware as I should have been about all these things you're saying.
02:46:40.000 I didn't know this.
02:46:40.000 I did know about the banks, and I certainly didn't know that they were cracking down on AI the same way they cracked down on social media.
02:46:46.000 The AI thing was very alarming.
02:46:47.000 We had meetings this spring that were the most alarming meetings I've ever been in, where they were taking us through their plans, and it was...
02:46:53.000 Can you talk about it?
02:46:55.000 Basically, just full government control.
02:46:57.000 This sort of thing.
02:46:58.000 There will be a small number of large companies that will be completely regulated and controlled by the government.
02:47:02.000 They told us.
02:47:03.000 They just said, don't even start startups.
02:47:05.000 Don't even bother.
02:47:06.000 There's just no way.
02:47:07.000 There's no way that they can succeed.
02:47:08.000 There's no way that we're going to permit that to happen.
02:47:10.000 Wow!
02:47:10.000 They said, this is already over.
02:47:12.000 It's going to be two or three companies, and we're going to control them, and that's that.
02:47:16.000 This is already finished.
02:47:17.000 Oh my god.
02:47:18.000 When you leave a meeting like that, what do you do?
02:47:20.000 You go endorse Donald Trump.
02:47:27.000 Oh my god.
02:47:29.000 And again, I'll just tell you, because, you know, the flak I'm going to get for this is, you know, he's just a crazy whatever right-winger.
02:47:34.000 But, like, I was a Democrat.
02:47:36.000 I was, like, a Democrat.
02:47:37.000 I supported Bill Clinton in '92. I supported Clinton in '96. I supported Gore, who I knew very well, in 2000. I knew John Kerry.
02:47:44.000 I supported him in '04. I supported Obama.
02:47:46.000 I supported Hillary in '16. Like, I was, like, a Democrat in good standing.
02:47:51.000 And then...
02:47:53.000 Are you completely out in the cocktail circuit now?
02:47:57.000 Are you allowed to hang out with people?
02:47:59.000 This is actually true.
02:48:00.000 There's now two kinds of dinner parties in Silicon Valley.
02:48:02.000 They've fractured cleanly in half.
02:48:05.000 There's the ones where every person there believes every single thing that was in the New York Times that day.
02:48:11.000 Which, by the way, is often very different than whatever was in the New York Times six months ago.
02:48:14.000 But everybody has fully updated their views for that day, and that's what they talk about at the dinner party, and I'm no longer invited to those.
02:48:21.000 Nor do I want to go to them.
02:48:22.000 And then there's the other kind, which is, you know, David Sacks and, like, all these guys and all these people and, you know, just this growing universe.
02:48:30.000 You know, it's a microcosm of what's happening more broadly in the culture, which is like, hey, let's actually get together and talk about things and have fun.
02:48:35.000 Right, but it's so much more comforting when it's you guys and not the MyPillow guy.
02:48:38.000 You know what I mean?
02:48:39.000 It's like, no disrespect, Mike.
02:48:42.000 To the MyPillow guy.
02:48:43.000 But you know what I'm saying?
02:48:43.000 I want people that are smarter than me to be saying these things.
02:48:47.000 That's what helps.
02:48:48.000 It helps when you say, well, this person actually knows what they're talking about, they're very well informed, and they understand the repercussions.
02:48:53.000 They understand what's been coming their way, and there's people like yourself that can speak about...
02:48:58.000 The plans that you're laying out, what they were trying to do with AI, is fucking terrifying.
02:49:03.000 That should terrify everybody.
02:49:04.000 Where you have bureaucrats now in control of potentially the most...
02:49:10.000 The biggest agent of change in the history of the human race, potentially.
02:49:14.000 And you're going to let what?
02:49:16.000 The people that can't even balance the budget?
02:49:18.000 People that don't know what the fuck is going on?
02:49:20.000 That sounds insane.
02:49:23.000 And look, my hope, I think under Clinton and Gore, I think that they dealt with this very different.
02:49:28.000 I mean, look, they dealt with the internet very differently than the current crop are dealing with these technologies.
02:49:33.000 Well, it was very different.
02:49:34.000 It was very different, but also they were much more, Clinton and Gore in particular, were much more understanding.
02:49:42.000 So there used to be this thing I called the Deal, with a capital D. And the Deal was you could be – and this is what I was – you could be a tech founder, you could start a private company, you could create a tech product.
02:49:48.000 Everybody loved you.
02:49:49.000 It was great.
02:49:49.000 Glowing press coverage, the whole thing.
02:49:51.000 You take the company public.
02:49:51.000 It employs a lot of people.
02:49:53.000 It creates a lot of jobs.
02:49:54.000 You make a lot of money.
02:49:55.000 At some point you cash out and then you donate all the money to charity and everybody thinks you're a hero.
02:50:00.000 Right?
02:50:00.000 And it's just great, right?
02:50:02.000 And this is how it ran for a very long time.
02:50:04.000 And this was the deal.
02:50:05.000 This was, you know, the deal.
02:50:06.000 This was Clinton and Gore were 100% in support of that.
02:50:08.000 And they were 100% pro-capitalism in this way and 100% pro-tech.
02:50:11.000 And they actually did a lot to foster this kind of environment.
02:50:14.000 And basically what happened is the last 15 years or so, Democrats culminating in this administration basically broke every part of that deal.
02:50:21.000 For people in my world.
02:50:22.000 Like every single part of that was shattered, right?
02:50:24.000 Where just like technology became presumptively evil, right?
02:50:27.000 And like, you know, if you're a business person, you were presumptively a bad person.
02:50:33.000 And then technology presumptively had bad effects, and dot, dot, dot.
02:50:33.000 And then they were going to regulate you and try to kill you and quash you.
02:50:35.000 And then the kicker was philanthropy became evil.
02:50:38.000 And this is a real culture change in the last five years that I hope will reverse now, which is that philanthropy is now a dirty word on the left, because it's the private person choosing to give away the money as opposed to the government choosing where the money goes.
02:50:49.000 Ooh, ooh, ooh.
02:50:50.000 So I'll give you the ultimate case.
02:50:51.000 Here's where I radicalized on this topic.
02:50:53.000 So you'll recall some years back, Mark Zuckerberg and his wife Priscilla, you know, they have a ton of money in Facebook stock.
02:50:59.000 They created a nonprofit entity called the Chan Zuckerberg Initiative, whose original mission was to literally cure all disease.
02:51:05.000 And this could be like, you know, $200 billion going to cure all disease, right?
02:51:09.000 So like a big deal.
02:51:10.000 They said they committed to donate 99% of their assets to this new foundation.
02:51:14.000 They got brutally attacked from the left.
02:51:17.000 And the attack was they're only doing it to save money on taxes.
02:51:22.000 Now, basic mathematics, you don't give away 99% of your money to save money on taxes, right?
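The arithmetic behind that point, as a toy sketch: a charitable deduction can at best offset tax at the donor's marginal rate on the amount given, and since that rate is far below 100%, donating always leaves you with less money than keeping it. All figures below are invented for illustration, not the Zuckerbergs' actual finances.

```python
wealth = 100e9            # hypothetical fortune
marginal_rate = 0.37      # hypothetical top marginal tax rate
donation = 0.99 * wealth  # the pledged 99%

# Best case, the deduction offsets tax on income equal to the donation,
# so the maximum conceivable "savings" is marginal_rate * donation.
max_tax_saved = marginal_rate * donation
net_effect = max_tax_saved - donation  # negative: the donor ends up poorer

print(f"Donated:         ${donation:,.0f}")
print(f"Tax saved (max): ${max_tax_saved:,.0f}")
print(f"Net effect:      ${net_effect:,.0f}")
```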
02:51:29.000 But it was a vicious attack.
02:51:31.000 It was like a very, very aggressive attack.
02:51:32.000 And the fundamental reason for the attack was how dare they treat that money like it's their own?
02:51:37.000 How dare they decide where it goes?
02:51:39.000 Instead, tax rates for billionaires should go to 90-something percent.
02:51:43.000 The government should take the money and the government should allocate it.
02:51:46.000 And that would be the morally proper and correct thing to do.
02:51:48.000 What do you think is the root of that kind of thinking?
02:51:51.000 This is a utopian collectivism.
02:51:54.000 You know, it's the— Socialism that works.
02:51:56.000 Socialism.
02:51:56.000 Yeah.
02:51:57.000 It's the core idea of socialism.
02:51:58.000 Like the core idea is this sort of radical egalitarianism.
02:52:03.000 Everybody should be exactly the same.
02:52:04.000 All outcomes should be exactly the same.
02:52:06.000 Everything should be completely fair at all times.
02:52:08.000 And some root of it has to be an envy.
02:52:10.000 Of course.
02:52:10.000 Yeah.
02:52:10.000 Envy, resentment.
02:52:12.000 Yes.
02:52:12.000 Nietzsche had this great term called ressentiment.
02:52:15.000 And it's like turbocharged resentment.
02:52:17.000 And so the way he described it is, ressentiment is envy, resentment, and bitterness so intense that it causes an inversion of values.
02:52:27.000 And the things that used to be good become bad and the things that used to be bad become good.
02:52:32.000 Philanthropy becomes bad.
02:52:34.000 Philanthropy becomes bad because it should be the state operating on behalf of the people as a whole who are handing out the money, not the individual.
02:52:40.000 I was not aware of that blowback.
02:52:42.000 I would have loved to read some of those comments.
02:52:45.000 I would like to go to their page and see what else they comment on.
02:52:48.000 I'll give you another example.
02:52:48.000 Here's another radicalizing moment.
02:52:50.000 My friend Sheryl Sandberg, who I worked with very closely for a long time at Facebook.
02:52:53.000 By the way, Democrat, liberal.
02:52:55.000 By the way, endorsed Kamala.
02:52:56.000 Very much not on the same page as me on these things.
02:53:00.000 She actually worked in the Clinton administration. Dyed-in-the-wool Democrat.
02:53:02.000 She wrote this book called Lean In about 12 years ago.
02:53:06.000 It's this sort of feminist manifesto, and it basically said...
02:53:09.000 Lean In?
02:53:10.000 Lean In.
02:53:11.000 Lean In.
02:53:11.000 And the thesis of Lean In was that women in their lives and careers could quote-unquote lean in.
02:53:16.000 She said what she observed in a lot of meetings was the men were leaning into the table and sitting like in front, and then the women were like leaning back and waiting to be called on.
02:53:23.000 She said that women should lean in.
02:53:24.000 It became a metaphor for her for women should lean in on their careers.
02:53:27.000 They should aggressively advocate for themselves to get raises and promotions.
02:53:32.000 Like men do.
02:53:32.000 Like men do.
02:53:33.000 Women should basically become more aggressive in the workplace and then therefore perform better.
02:53:37.000 And so it was a manifesto to women basically saying, be more confident, be more assertive, be more aggressive, be more successful.
02:53:43.000 And I read the draft of the book when she was writing it, and I said, well, you realize you've written a right-wing manifesto.
02:53:48.000 Right?
02:53:52.000 Right.
02:53:52.000 And she looks at me like I've lost my mind, right?
02:53:54.000 Because she's a lifelong lefty.
02:53:56.000 She's like, what do you mean?
02:53:57.000 And I'm like, this book is a statement that women have agency.
02:54:00.000 This book is a statement that the things that women choose to do will lead to better results.
02:54:04.000 That's what people believe on the right.
02:54:05.000 On the left, what people believe is that women are only, always, and ever victims.
02:54:10.000 And if a woman doesn't succeed in a career, it's because she's being discriminated against.
02:54:13.000 And so I predicted when this book comes out, the right-wingers are going to think it's great, and the left is going to come at you.
02:54:20.000 Because you're violating the fundamental principle of the left, which is anybody who does less well is a victim.
02:54:25.000 Which, in that case, is exactly what happened.
02:54:28.000 By the way, the reviews were all by women.
02:54:30.000 And they tore into her.
02:54:32.000 Like, in every major publication, they just, like, completely ripped her.
02:54:34.000 And they're like, how dare this rich, entitled woman be telling us, you know, be telling women that they're not victims and that they're, you know, that they have all this agency because this is denial of sexism, right?
02:54:44.000 It's denial of oppression.
02:54:45.000 Wow, because imagine if a man wrote a book like that for men.
02:54:49.000 Right.
02:54:50.000 That was patriarchy, right?
02:54:51.000 But men wouldn't attack it.
02:54:54.000 Oh, right.
02:54:54.000 Exactly, right.
02:54:55.000 It would be a guidebook.
02:54:56.000 Yeah, that's right.
02:54:56.000 This is how you kick ass and get ahead.
02:54:58.000 Yeah, we call it self-help.
02:54:59.000 Lean in, bro.
02:55:00.000 Lean in, bro.
02:55:02.000 Exactly right.
02:55:02.000 Just call it lean in, bro.
02:55:04.000 Exactly right.
02:55:04.000 Wow, that's crazy that she got attacked for that.
02:55:07.000 So again, it's the inversion.
02:55:09.000 It's the resentment.
02:55:09.000 It's the inversion, which is like advocating on your own behalf and choosing to do things that make you more successful.
02:55:13.000 What was her reaction to that?
02:55:15.000 I would say she was...
02:55:16.000 I don't want to speak for her, but she was not pleased.
02:55:20.000 But also, was she shocked that you were correct?
02:55:23.000 Did you have a follow-up conversation with her?
02:55:25.000 What did she say?
02:55:26.000 We've talked about it a lot.
02:53:27.000 God damn it, Marc.
02:55:28.000 How'd you see that one coming?
02:55:30.000 So she was in the...
02:55:31.000 But the answer is her worldview of how these things worked was from a different...
02:55:35.000 It was from the Clinton-Gore era in which you could say things like that.
02:55:40.000 You could talk like that.
02:55:41.000 Yes.
02:55:41.000 And by the time the book came out, it was already heading into the second Obama term, right?
02:55:45.000 And then the woke stuff started, and then at that point you could no longer say things like that.
02:55:49.000 Wow.
02:55:50.000 And everything got classified through this very hard-edged, right, us versus them, right, oppressor versus oppressed, you know, kind of mindset.
02:55:56.000 And so...
02:55:57.000 It's such a...
02:55:58.000 It's such a contrast to what we hoped would happen when Obama would be president.
02:56:02.000 That's right.
02:56:02.000 My thought was, okay, there's still some racism, but clearly, if you're the baddest motherfucker, you can get ahead.
02:56:10.000 You can win.
02:56:11.000 The country will vote for you.
02:56:13.000 That's not what happened.
02:56:14.000 No.
02:56:15.000 And you can win again.
02:56:16.000 You can win twice.
02:56:16.000 You win twice.
02:56:17.000 And be like...
02:56:18.000 I've always said, up until...
02:56:20.000 I've lost a lot of respect for him from some of the things that he said during this election cycle because I think they got desperate and they just resorted to actual lies.
02:56:28.000 And I thought this is crazy to see him lying, especially the very fine people hoax.
02:56:33.000 And we played the video back and forth of what Obama said Trump said and what Trump actually said, and it's pretty shocking because he's very explicit.
02:56:41.000 You know, he's saying not white nationalists, not neo-Nazis.
02:56:44.000 They should be condemned.
02:56:45.000 He says that very clearly.
02:56:47.000 That's not what I'm talking about.
02:56:48.000 I'm talking about people who are protesting the taking down of the statue.
02:56:51.000 And when you see a guy like Obama do that, it's such a bummer because he was the guy for me that was like our best spokesman.
02:56:59.000 He was like, here's a guy that came from a single-parent household.
02:57:04.000 He wasn't some rich entitled kid who was given everything in life.
02:57:08.000 He's this brilliant speaker.
02:57:10.000 He's handsome.
02:57:11.000 He represents what we're hoping for.
02:57:13.000 We're hoping for a colorblind society that just treats people on the merit of who they are and anyone can achieve.
02:57:18.000 And look, here he is.
02:57:20.000 He made it.
02:57:20.000 And then all of a sudden, identity politics goes through the fucking roof and victim mentality becomes a thing that people choose to side with.
02:57:29.000 And it just gets real weird for a long time.
02:57:32.000 Yeah, that's right.
02:57:33.000 That's right.
02:57:34.000 And like I said, I hope they can find their way back.
02:57:36.000 But this lady's still on Team Kamala.
02:57:38.000 Oh, yeah.
02:57:41.000 She got a few lessons out of that, but not all of them.
02:57:44.000 Well, no, if you've been a lifelong Democrat, and if that is the core of a lot of people's value systems, then it's a real challenge.
02:57:53.000 Oh, yeah, it's my parents.
02:57:54.000 When your movement...
02:57:55.000 They're all in.
02:57:56.000 Goes in directions.
02:57:57.000 You can choose to follow into the craziest version of it, or you can choose to say, you know what, I'm still not going to switch sides, but at least I'm going to advocate for my team to come back.
02:58:09.000 This is Ritchie Torres.
02:58:10.000 This guy is a congressman in Queens, I think, or the Bronx.
02:58:15.000 He actually started out – everybody thought he was going to be a far lefty, because he's gay, he's black, he's Latino.
02:58:19.000 He was like at least associated with the squad early on and he's like one of the guys in the Democratic Party who has now stood up and he's been doing this in public for the last two weeks saying clearly we have to get back to sense.
02:58:30.000 Like we have to get back to common sense.
02:58:31.000 We have to get back to moderation.
02:58:33.000 We have to have law enforcement.
02:58:35.000 We have to have – we can't have crime in the streets.
02:58:37.000 We have to have a border.
02:58:39.000 We have to get – we Democrats have to get back to moderation and sense.
02:58:42.000 So he is hoping to lead the party.
02:58:44.000 That's great.
02:58:45.000 We support him and I think he's like a really – I think he's a very impressive guy.
02:58:48.000 So there are people like – and he's young and very energetic and I think he has a very bright future.
02:58:53.000 But that's the kind of person who could lead the party.
02:58:56.000 Well, the big Nietzschean shift was when Dick Cheney endorsed Kamala and everybody cheered.
02:59:01.000 If there's not a better example than that, please tell me what it is.
02:59:04.000 Because that one was fucking nuts.
02:59:08.000 Dick Cheney was always the hard right.
02:59:11.000 During the Bush administration, all the lefties looked at him like that was Satan.
02:59:17.000 Yeah, that's right.
02:59:17.000 He was the profiteer.
02:59:19.000 That's right.
02:59:19.000 He was the manipulator.
02:59:21.000 He was the guy pulling the strings.
02:59:23.000 He was the CEO of Halliburton.
02:59:25.000 The whole thing was so crazy.
02:59:27.000 And to see, oh, Dick Cheney just endorsed Kamala, and everybody's like, yay!
02:59:32.000 Look, Dick Cheney's on our side.
02:59:33.000 What?
02:59:34.000 What the fuck are you guys talking about?
02:59:36.000 This is the best example of the shift, right?
02:59:38.000 Yeah, that's right.
02:59:38.000 That's right.
02:59:39.000 That's right.
02:59:39.000 All of a sudden, we're all neocons.
02:59:42.000 All of a sudden, as you said, all of a sudden, we're pro-war.
02:59:45.000 It's like, wait, wait.
02:59:45.000 Because as you know, the Democrats used to be the anti-war party.
02:59:49.000 Yes.
02:59:50.000 They were the anti-war party for a very long time.
02:59:51.000 Yes.
02:59:52.000 Yes.
02:59:54.000 Except back when they were trying to keep slavery.
02:59:57.000 That's part of the problem.
02:59:58.000 That was a different era.
02:59:59.000 People don't realize that.
03:00:00.000 That was a different era.
03:00:01.000 But coming out of Vietnam, they were definitely the anti-war party for like 30 years.
03:00:05.000 But isn't that a shift as well?
03:00:06.000 Yeah, it was.
03:00:06.000 But the shift of the Republicans from back in the day being Abraham Lincoln and trying to get rid of slavery and the Democrats fighting to keep it.
03:00:16.000 There's these weird ideological swings.
03:00:19.000 They happen.
03:00:21.000 And, you know, we're still attached to the idea of being a Democrat as like being a Clinton Democrat.
03:00:26.000 We're in this weird sort of denial of what the ideology actually stands for versus how we think of ourselves when we say, I'm a Democrat.
03:00:36.000 I'm a good person.
03:00:37.000 You know, I support civil rights, women's rights, blah, blah, blah.
03:00:41.000 Down the line, I'm a Democrat.
03:00:43.000 And if you go against that, well, now you're against all these things that you know to be inherently important for society.
03:00:49.000 Yeah, that's right.
03:00:49.000 They got you.
03:00:50.000 Yeah.
03:00:50.000 That's right.
03:00:51.000 They roped you into some crazy thing where you're supporting war.
03:00:54.000 And then there's the big faction, right?
03:00:56.000 There's the big Free Palestine versus Support Israel.
03:01:00.000 Yeah.
03:01:00.000 Because the left always supported Israel.
03:01:02.000 Yeah, 100%.
03:01:02.000 And then all of a sudden there's this Free Palestine movement, which divides the left even further.
03:01:07.000 Yeah.
03:01:07.000 There's a book written some years back by this guy, Norman Podhoretz, and it's great.
03:01:12.000 It's called Why Are Jews Liberals?
03:01:14.000 Right.
03:01:15.000 And he was a right-wing Jew.
03:01:16.000 He was a right-wing Jew, a very important Jewish thinker, American Jewish thinker, like in the 60s, 70s, 80s.
03:01:21.000 And basically he had this thesis that these Jewish liberal voters in the U.S. are, ultimately, voting for the wrong team.
03:01:28.000 Because what they don't understand basically is that this is sort of a path, number one, to anti-Semitism, which is what's happened.
03:01:34.000 But number two, basically you're never going to have long-term support for Israel from the left, because the basic concept of Israel – Israel is, like, literally a religious ethnostate.
03:01:44.000 And that's like inherently a right-wing idea, not a left-wing idea.
03:01:47.000 Like the left doesn't have room for that.
03:01:48.000 And a military superpower.
03:01:49.000 And a military – right.
03:01:50.000 And is able to – right.
03:01:51.000 Is able to – And it's run by a former special forces operator.
03:01:54.000 Yes.
03:01:55.000 Very – Yes, a very capable soldier.
03:01:58.000 He's a fucking assassin.
03:01:59.000 Exactly.
03:02:00.000 And so, you know, he argued, I don't know, this is like whatever, 20 years ago, he's like, this is headed in the wrong direction.
03:02:05.000 But, you know, the argument was ignored at the time.
03:02:07.000 And then, you know, at least a lot of my Jewish friends after October 7th, you know, they were completely horrified, you know, to find out, for example, that DEI was actually anti-Jewish.
03:02:15.000 Right.
03:02:15.000 Which is what everybody learned with the scandals at the universities.
03:02:18.000 Right.
03:02:18.000 Right.
03:02:18.000 And it's like, you know, there's two ways of looking at that.
03:02:20.000 One is, oh my God, the DEI is anti-Jewish, therefore we need to add Jews to the DEI scorecard, right?
03:02:28.000 Well, when we saw the heads of Harvard and was it Yale?
03:02:32.000 No.
03:02:33.000 It was Harvard and MIT and Columbia.
03:02:34.000 Yeah.
03:02:34.000 That was...
03:02:35.000 Yeah, that's right.
03:02:36.000 That was just so in everyone's face and so bananas.
03:02:40.000 And then what we saw is that this same sort of radicalized left had actually slid into not just anti-Semitism and not just anti-Israel but also pro – I mean ultimately pro-terrorist, pro-Hamas.
03:02:50.000 You know, the new acronym, LGBTH. But there's a bunch of other stuff in there now.
03:02:56.000 There's Q, there's two-spirit.
03:02:57.000 I know, but you've got to get H in there now for Hamas.
03:03:00.000 Oh, boy.
03:03:01.000 Really?
03:03:01.000 Yeah, of course.
03:03:02.000 Of course.
03:03:02.000 Of course.
03:03:03.000 And so, like, I bring it up just as an example of it's the kind of realignment.
03:03:10.000 A lot of Jewish Americans now are having to kind of rethink fundamental questions about political structure and alliances and who they should be part of and who they shouldn't be part of.
03:03:17.000 So I think to your point, I think like the whole country is going through – I think we're going through the first like profound political realignment probably since the 1960s, which is when everything shifted between Johnson and Nixon in the South.
03:03:30.000 I think we're going through like the most profound version of that right now and I think it's something like the multi-ethnic working class coalition that came together around Trump.
03:03:39.000 You know, basically, again, against this sort of super exaggerated elite plus underclass, you know, kind of structure that the Democrats have built for themselves.
03:03:47.000 And it just turns out there's just a lot more people in the middle.
03:03:51.000 And so I think – but by the way, including a lot of black people, you know – black vote for Trump is way up, Hispanic vote for Trump is way up, youth vote for Trump is way up, gay vote – like, all of the identity groups that Democrats relied on all these years – and the union vote is for Trump.
03:04:06.000 I'm sure you've seen the map, the electoral map of California.
03:04:11.000 Yeah.
03:04:11.000 2024 and 2020. Yes.
03:04:13.000 In contrast, it's a crazy red wave that's going across the whole...
03:04:16.000 Most of the state is red now.
03:04:18.000 Those of us on the coast are going to get pushed into the ocean.
03:04:20.000 Yes.
03:04:20.000 Well, I think, you know, maybe the other way.
03:04:24.000 You were talking about the hopeful way that the Democrats will wake up and come up with a more reasonable—well, I mean, there's obviously clear cultural pushback on all these crazier issues.
03:04:35.000 I mean, like, the giant pushback from women about biological men competing against women.
03:04:41.000 I mean, this is a giant one where women are like, listen, we created Title IX for a reason.
03:04:46.000 We want women's sports to be for women.
03:04:48.000 You can't have them be for mentally ill men who think they can just decide they're a woman and compete against women, which is what it is in a lot of places.
03:04:57.000 You don't even have to get tested.
03:04:58.000 There's not some sort of a hormone protocol.
03:05:01.000 It's just what your identity is, which is just nuts.
03:05:04.000 And that's one of the things that I think a lot of people on the left are having a really hard time justifying.
03:05:11.000 Yeah, right.
03:05:11.000 Because how can you deny a victim group?
03:05:14.000 Right.
03:05:14.000 Right.
03:05:15.000 You can't.
03:05:15.000 I mean, in the full version of that ideology, in the extreme version of that ideology, you cannot deny a victim claim.
03:05:20.000 Well, it also comes with this weird caveat where you have to deny the existence of perverts.
03:05:25.000 Right.
03:05:26.000 Because a pervert, all they have to do is say, I identify as a woman, throw on a wig, and now you can go hang around the women's room and no one can say anything.
03:05:33.000 Well, you've emboldened, empowered one of the worst groups in society that we've always protected women from.
03:05:40.000 Yeah.
03:05:41.000 And you have to pretend they don't exist if you just want to base it solely on identity, especially like a self-described identity.
03:05:49.000 You just decide, and then that's it.
03:05:52.000 And, you know, I mean, there's states that have that now with prisoners, where all a prisoner has to do is identify as a woman, and you are now housed in a women's prison.
03:06:03.000 California has 47 of them, the last time I looked at it.
03:06:07.000 And there's hundreds that are waiting on like a waiting list to try to get in.
03:06:11.000 So you have women – especially women who have ever been raped or sexually abused – who now have to share space with a man who might be a fucking pervert.
03:06:23.000 And some of these men even have some crimes that are along those lines that they're in jail for.
03:06:30.000 Yeah.
03:06:31.000 It's crazy.
03:06:31.000 I mean, Canada's the worst at it.
03:06:33.000 There's a bunch of different examples of these type of people getting into female prisons.
03:06:39.000 And it's just – it's insanity.
03:06:41.000 And I think the left rejects that too for the most part.
03:06:44.000 There's the sensible version of the left that is like, hey, yeah, I'm pro-gay rights.
03:06:48.000 Yeah, I'm pro-women's rights.
03:06:49.000 I'm pro-civil rights.
03:06:50.000 I'm pro-choice.
03:06:51.000 I'm pro this.
03:06:51.000 I'm anti-war.
03:06:52.000 But also – You can't let psychos just put on a fucking dress and hang out in women's rooms just because we want to be kind.
03:07:00.000 Like, that's nuts.
03:07:01.000 So there has to be some...
03:07:03.000 And then there's legitimate trans women.
03:07:05.000 So, like, how do you make the distinction?
03:07:07.000 Well, clearly we have to have a fucking conversation.
03:07:09.000 And if you don't allow that conversation to take place – like, if you go to Bluesky and you type in, there are only two genders, you're banned.
03:07:16.000 Right there.
03:07:17.000 People have done it.
03:07:17.000 There's a bunch of people who have done it.
03:07:18.000 It's fun.
03:07:19.000 Yeah.
03:07:19.000 It's fun.
03:07:20.000 They've created a little sock puppet account, and they say some shit that should have been a reasonable thing to say just 20 years ago.
03:07:26.000 Yeah.
03:07:27.000 Well, you make me hopeful, Marc.
03:07:29.000 Good.
03:07:29.000 You do.
03:07:30.000 Good.
03:07:30.000 You do.
03:07:30.000 Because you lay things out in a really well-thought-out way that is not hyperbolic, and you're making a lot of sense.
03:07:38.000 So I'm glad we talked.
03:07:40.000 I feel better.
03:07:40.000 Good.
03:07:41.000 Fantastic.
03:07:42.000 I think the world does, too.
03:07:44.000 I really do.
03:07:45.000 I mean, I've talked to a lot of people, even people that are Democrats.
03:07:47.000 They say, I feel better that Trump won.
03:07:49.000 Every day it feels better.
03:07:51.000 It feels like just things are opening up.
03:07:54.000 It's the Obama campaign.
03:07:56.000 It's hope and change.
03:07:57.000 Yeah, hope and change.
03:07:57.000 Remember?
03:07:58.000 It's hopey-changey.
03:07:59.000 This is kind of actually hope and change.
03:08:02.000 Yeah.
03:08:02.000 This is actually it.
03:08:03.000 It feels like oxygen returning.
03:08:05.000 Yes.
03:08:05.000 Well, thank you very much, Marc.
03:08:06.000 I really appreciate you.
03:08:08.000 Tell everybody your Substack, how to find you on social media.
03:08:11.000 Oh, I'm on X under pmarca. I'm on Substack.
03:08:14.000 Google me.
03:08:15.000 All right.
03:08:16.000 Ask Perplexity.
03:08:17.000 All right.
03:08:17.000 Ask ChatGPT and it will deny that.
03:08:19.000 No.
03:08:19.000 It will happily tell you that I exist, at least last time I checked.
03:08:24.000 What about Wikipedia?
03:08:25.000 We don't know.
03:08:26.000 We don't know if Katherine is still running it.
03:08:28.000 Always a pleasure, Marc.
03:08:29.000 Thank you very much.
03:08:30.000 Appreciate you.
03:08:30.000 All right.
03:08:31.000 Bye, everybody.