Timcast IRL - Tim Pool - August 01, 2025


New DOCS PROVE Obama Hillary CONSPIRACY To SABOTAGE Trump Admin | Timcast IRL


Episode Stats

Length

2 hours and 20 minutes

Words per Minute

194.0

Word Count

27,341

Sentence Count

2,365

Misogynist Sentences

45

Hate Speech Sentences

45


Summary

New documents have been released from the Durham Annex, and oh boy, this one's a doozy. It goes on to explain how Hillary Clinton approved of a plan to smear Donald Trump as being supported by the Russians, and it was Obama's intel agencies that were helping.


Transcript

00:02:43.000 New documents have been released, and oh boy, this one's a doozy.
00:02:46.000 In the documents released from the Durham Annex, it goes on to explain how Hillary Clinton approved of a plan to smear Donald Trump as being supported by the Russians, and it was Obama's intel agencies that were helping.
00:03:00.000 When we put these stories together, what do you get?
00:03:02.000 They knew it was false.
00:03:03.000 They knew it was exaggerated.
00:03:05.000 They were going to smear Trump anyway, specifically to cover up the Hillary Clinton email scandal and shift the view of the public towards Trump instead of her because she had broken the law and Comey refused to prosecute her.
00:03:18.000 You combine this with the other documents that Tulsi Gabbard released, which show Obama ordered this directly, and things are getting a bit interesting.
00:03:26.000 Now, on top of this, we've got the Pelosi story, the Pelosi Act, as it were.
00:03:31.000 And Trump was initially mad at Senator Hawley, but Hawley says he talked to Trump, cleared it up, and Trump's actually on board.
00:03:37.000 They may actually ban stock trading.
00:03:40.000 So we'll talk about that.
00:03:42.000 And then on top of that first story, we have a whistleblower.
00:03:45.000 Apparently, there was a guy or an intel analyst who was threatened by the higher-ups that he had to sign off on bad intelligence to smear Trump.
00:03:53.000 Hence, this looks like a conspiracy against Trump and, once he was president, against his administration.
00:04:00.000 So we're going to talk about that and more.
00:04:02.000 But before we get started, my friends, we've got a great sponsor.
00:04:03.000 It is Beam Dream, my friends.
00:04:05.000 Go to shopbeam.com slash, I believe it's slash Tim Pool.
00:04:12.000 Is that what it is?
00:04:12.000 Or it could be Tim.
00:04:13.000 Slash Tim Pool.
00:04:14.000 Use promo code Timpool.
00:04:15.000 It's the easy way to do it.
00:04:15.000 Beam Dream is a delicious cup of hot cocoa.
00:04:19.000 No sugar.
00:04:19.000 You take it before bed.
00:04:20.000 It helps you sleep.
00:04:21.000 It's got magnesium.
00:04:22.000 It's got melatonin.
00:04:23.000 It's got L-theanine in it.
00:04:25.000 I'm going to tell you, I've been drinking this non-stop for weeks now, and I have had the best sleep of my life.
00:04:33.000 No question.
00:04:34.000 And I've been dreaming a lot more, more vivid dreams.
00:04:36.000 I don't know if that has anything to do with it, but I'll tell you this.
00:04:38.000 I drink this right before bed after the show.
00:04:40.000 And then within 15 minutes, I am comfortably out like a light.
00:04:44.000 I wake up in the morning and I check my app that measures my sleep.
00:04:47.000 And I've been hitting like 97, 98, 99 every day.
00:04:50.000 I am deeply impressed.
00:04:52.000 Not only that, but it tastes amazing.
00:04:54.000 It's low calorie.
00:04:55.000 And you just make it right before bed.
00:04:57.000 It's fantastic.
00:04:58.000 If this was nothing but hot cocoa, it would still be amazing, but it helps you sleep.
00:05:01.000 So I recommend it.
00:05:02.000 My friends, check out shopbeam.com slash Tim.
00:05:06.000 Use code Tim Pool and you'll get 35% off.
00:05:10.000 We actually had to tell Beam to send more because the other people who work here were like, can I try?
00:05:14.000 I said, no, I've taken all of the samples.
00:05:16.000 They're mine.
00:05:18.000 I'm not even kidding.
00:05:18.000 I absolutely love this stuff.
00:05:19.000 James O'Keefe was on the show and he was like, wait, what is this?
00:05:22.000 This helps me sleep.
00:05:22.000 I buy this stuff.
00:05:24.000 I didn't think I needed it as much.
00:05:26.000 And then I was like, I'm going to make sure, I'm just going to, I'm going to get into this routine.
00:05:29.000 And holy crap.
00:05:31.000 I go to bed.
00:05:32.000 I'm warm, sunken in the bed.
00:05:34.000 I wake up feeling like a million bucks.
00:05:35.000 I am loving it.
00:05:36.000 So check that out.
00:05:37.000 Shout out to Beam.
00:05:37.000 Thanks for sponsoring the show.
00:05:39.000 And for everybody else, if you're trying to stay awake, go to CastBrew.com.
00:05:43.000 In the morning, after I wake up feeling great, I like to have a little bit of Appalachian Nights.
00:05:47.000 What I do is I actually make a double shot of Appalachian Nights.
00:05:50.000 I put it in my protein shake.
00:05:51.000 And it's like a mocha or a latte or whatever.
00:05:54.000 I don't know.
00:05:54.000 Depends on if I put chocolate in it.
00:05:55.000 But go to castbrew.com, pick up some coffee.
00:05:58.000 We got a bunch of amazing blends and support the show with great coffee.
00:06:02.000 For everybody else, smash that like button.
00:06:05.000 Share the show with everyone you know if you do like the work that we do.
00:06:08.000 We'd really appreciate it.
00:06:08.000 I appreciate it if you would share.
00:06:10.000 And we'll see you all on Saturday.
00:06:11.000 Joining us tonight to talk about this and so much more.
00:06:13.000 We've got Joseph Gradante.
00:06:15.000 Thanks for having me, Tim.
00:06:16.000 Who are you?
00:06:17.000 What do you do?
00:06:17.000 I'm the CEO of Allio Capital, A-L-L-I-O, Allio.
00:06:21.000 And what does this Allio Capital do?
00:06:24.000 It's a macro investing platform.
00:06:27.000 Yeah.
00:06:27.000 Interesting.
00:06:28.000 So it's actually fortuitous that we have you, because the Pelosi Act is a big story, and there's discussion about insider trading, how the ultra-wealthy are playing these games, how the politicians are playing this game.
00:06:37.000 So it'll be interesting to get your insights on how this whole infrastructure and investing works.
00:06:41.000 So thanks for hanging out.
00:06:42.000 Yeah.
00:06:42.000 Should be fun.
00:06:43.000 We got Mary hanging out.
00:06:44.000 Hello, everyone.
00:06:45.000 I'm Mary Morgan.
00:06:47.000 You can usually find me on Pop Culture Crisis here at Timcast alongside my lovely co-host.
00:06:53.000 I finally get to say that to you.
00:06:54.000 Perfect.
00:06:55.000 Yes.
00:06:56.000 Guys, I am filling in today for Phil.
00:06:59.000 It's Brett.
00:07:00.000 Normally also Pop Culture Crisis, Monday through Friday, 3 p.m. Eastern Standard Time.
00:07:04.000 Let's get into it.
00:07:05.000 Here's a story from the New York Post.
00:07:08.000 Read the documents that prove Hillary Clinton okayed plan to smear Trump with Russia collusion.
00:07:15.000 This is from the New York Post, published today.
00:07:18.000 They break down this annex.
00:07:20.000 This is crazy stuff.
00:07:21.000 Hillary Clinton and Obama were in on this scheme to smear Trump as a Russian asset, claiming that Russians hacked it or that Trump was colluding with them or something like that, which didn't just affect the campaign, but went into his presidency, resulting in a multi-year-long investigation that cost tens of millions of dollars.
00:07:39.000 Take a look.
00:07:40.000 Hillary Clinton signed off on a plan hatched by a top campaign advisor to smear then-candidate Trump with false claims of Russian collusion and distract from her own mounting email scandal during the 2016 campaign.
00:07:53.000 According to explosive intelligence files declassified Thursday, the 24-page intelligence annex was compiled from memos and emails obtained by the Obama administration in the lead up to Election Day that laid out confidential conversations between leaders of the Democratic National Committee, including then-chair Debbie Wasserman Schultz and liberal billionaire George Soros's Open Society Foundations.
00:08:16.000 Can you believe this?
00:08:17.000 The plot, the brainchild of the Clinton campaign's then-foreign policy advisor, Julianne Smith, included, quote, raising the theme of Putin's support for Trump and subsequently steering public opinion toward the notion that it needs to equate the Russian leader's political influence campaign with actual hacking of election infrastructure.
00:08:39.000 They say Smith would go on to serve as former President Joe Biden's ambassador to NATO.
00:08:44.000 Quote, I don't have any comment, she told the Post when they phoned her on Thursday.
00:08:49.000 Now, I want to scroll down.
00:08:50.000 There's a little bit more.
00:08:51.000 They show some documents.
00:08:52.000 In one document, it says Obama has no intention to darken the final part of his presidency and legacy by the scandal surrounding the main contender for the DP.
00:09:01.000 To solve the problem, the president puts pressure on FBI Director James Comey through Attorney General Lynch.
00:09:08.000 However, so far, without concrete results.
00:09:11.000 That is to say, Obama knew Hillary Clinton was scandal-ridden.
00:09:15.000 She had a private server.
00:09:16.000 This was illegal.
00:09:17.000 They were refusing to prosecute.
00:09:19.000 She had the server destroyed, phone smashed with hammers.
00:09:22.000 And Obama was like, this is going to make me look bad if I do anything to intervene.
00:09:26.000 So he leans on the FBI and says, you take care of it.
00:09:29.000 And what happens?
00:09:30.000 The FBI says, we will not bring charges against Hillary Clinton for the crimes she has committed.
00:09:36.000 Shockingly insane.
00:09:38.000 They go on to say Durham consulted the FBI and CIA, both of which assessed the information was likely authentic, but couldn't corroborate exact copies of the Bernardo emails with the Open Society Foundations.
00:09:48.000 The CIA also determined that the intelligence was not the product of Russian fabrications.
00:09:53.000 Smith was at minimum playing a role in the Clinton campaign's effort to tie Trump to Russia, Durham concluded.
00:10:00.000 Now I'm going to scroll down here.
00:10:02.000 Page seven.
00:10:03.000 This is where it gets interesting.
00:10:04.000 Let me read you this paragraph.
00:10:06.000 This is from the Durham Annex.
00:10:08.000 According to data from the election campaign headquarters of Hillary Clinton, obtained via the U.S. Soros Foundation, on the 26th of July, 2016, Clinton approved a plan of her policy advisor, Juliana Smith, from the TS NAB (that's an unknown acronym), to smear Donald Trump by magnifying the scandal tied to the intrusion by the Russian special services in the pre-election process to benefit the Republican candidate.
00:10:34.000 As envisioned by Smith, raising the theme of Putin's support for Trump to the level of the Olympic scandal would divert the constituents' attention from the investigation of Clinton's compromised electronic correspondence.
00:10:47.000 In addition, by subsequently steering public opinion towards the notion that it needs to equate Putin's efforts to influence political process in the U.S. via cyberspace to acts against a crucially important infrastructure, it would force the White House to use more confrontational scenarios vis-a-vis Moscow that as a whole suits Clinton's line of conduct.
00:11:07.000 A relatively sluggish reaction by the administration to the events surrounding the DNC that led to the resignation of Chairman Deborah Wasserman Schultz provoked exasperation within the PC, possibly political convention, and the entire deep state, which may also be used by Clinton to reinforce her position among the security service agents.
00:11:28.000 To simplify, Clinton's campaign wanted to get the bad press off of her and shift the focus to Trump.
00:11:35.000 And they knew that making this scandal bigger would force the Obama administration to target Trump with actual law enforcement capabilities.
00:11:43.000 And then we got years of Trump being accused of being a traitor to this country, secretly working with the Russians the whole time because Obama didn't want to get his hands dirty, ordering the release of information they knew to be false.
00:11:58.000 Holy crap, ladies and gentlemen, the more that comes out, the more shockingly insane we learn this story to be.
00:12:05.000 And I guess the question then is for everybody watching: will anybody get arrested?
00:12:11.000 No, nothing ever changes, gang, over here.
00:12:14.000 That's, like, why it's not shocking: we just keep finding out that we were right in our hunches and our conspiracy theories, and the truth eventually does come out.
00:12:23.000 And I guess that's a happening, but nothing changes as a result.
00:12:28.000 Do you see this leading to some type of prosecution?
00:12:31.000 You know, or even grand jury.
00:12:34.000 What, you know, maybe, and I'll say this: Trump got arrested.
00:12:39.000 You know, they arrested Donald Trump several times.
00:12:41.000 They arrested his lawyers.
00:12:43.000 So maybe they might actually go after Obama or Comey.
00:12:48.000 I mean, they say they're investigating him.
00:12:49.000 And the question is, does Trump have the willpower?
00:12:51.000 And does Trump want revenge?
00:12:52.000 Or is this just one big puppet show to keep us distracted as they do a bunch of other weird stuff around the world?
00:12:58.000 And, you know.
00:12:59.000 And who ends up with the better mugshot, Trump or Obama?
00:13:02.000 It depends on what you just mean by better.
00:13:05.000 Well, Trump, Trump got to hang his on the White House wall.
00:13:10.000 That is crazy.
00:13:12.000 Like, is that in the Oval Office?
00:13:15.000 But that's not it.
00:13:15.000 Oh, he did.
00:13:16.000 Did he hang the actual mugshot?
00:13:17.000 Yeah.
00:13:18.000 That's right.
00:13:18.000 Yeah.
00:13:19.000 I went to the White House not that long ago, and the picture that Trump got, his presidential photograph, just looks identical to the mugshot.
00:13:26.000 And I'm like, I think he did that on purpose.
00:13:28.000 They gave him trace paper and just had him trace over.
00:13:30.000 The dramatic lighting and that like raised eyebrow stern.
00:13:35.000 He also looks thinner in that photo than he has.
00:13:38.000 I think he's 220 pounds.
00:13:41.000 What did you say?
00:13:42.000 Did you call it a svelte?
00:13:43.000 220?
00:13:44.000 Svelte.
00:13:44.000 220 pounds, mind you.
00:13:46.000 Firm.
00:13:49.000 I want to show you guys this.
00:13:50.000 We've got a video.
00:13:52.000 This is from 2016.
00:13:53.000 Listen to this.
00:13:54.000 According to the Washington Post, the CIA has concluded that Russia intervened in the election to help you win the presidency.
00:14:03.000 Your reaction.
00:14:04.000 I think it's ridiculous.
00:14:05.000 I think it's just another excuse.
00:14:07.000 I don't believe it.
00:14:09.000 I don't know why.
00:14:11.000 And I think it's just, you know, they talked about all sorts of things.
00:14:15.000 Every week it's another excuse.
00:14:17.000 We had a massive landslide victory, as you know, in the Electoral College.
00:14:22.000 I guess the final numbers are now at 306 and she's, you know, down to a very low number.
00:14:27.000 No, I don't believe that at all.
00:14:29.000 You said you don't know why.
00:14:30.000 Do you think that the CIA is trying to overturn the results?
00:14:33.000 No, I don't think so. Or trying to weaken you in office?
00:14:36.000 Well, if you look at the story and you take a look at what they said, there's great confusion.
00:14:41.000 Nobody really knows.
00:14:42.000 And hacking is very interesting.
00:14:44.000 Once they hack, if you don't catch them in the act, you're not going to catch them.
00:14:47.000 They have no idea if it's Russia or China or somebody.
00:14:51.000 It could be somebody sitting in a bed someplace.
00:14:54.000 I mean, they have no idea.
00:14:55.000 So why would the CIA put out the story that the Russians wanted you to win?
00:14:58.000 Well, I'm not sure they put it out.
00:15:00.000 I think the Democrats are putting it out because they suffered one of the greatest defeats in the history of politics in this country.
00:15:06.000 And frankly, I think they're putting it out.
00:15:09.000 And it's ridiculous.
00:15:10.000 We're going to get back to making America great again, which is what we're going to do.
00:15:14.000 And we've already started the process.
00:15:17.000 I love how he says, like, to Trump, the CIA doing something wrong.
00:15:22.000 Like, that's a completely incredulous idea.
00:15:25.000 Why acted in the middle of the day?
00:15:25.000 Well, at the time, it was a new idea.
00:15:28.000 And I don't know where this falls on the timeline, this, like, news hit, but I'm just reminded of the big mic drop moment in that second presidential debate in 2016 between Trump and Hillary, where he says, maybe somewhat tongue-in-cheek, like, you'd be in jail if I were in power, if I were, you know, in office, you would face criminal consequences for your private email servers while Secretary of State.
00:15:56.000 And it was like this big, like, oh, like, he really just said that.
00:16:00.000 But then he got into office and like, that didn't happen.
00:16:04.000 And she didn't get arrested the first time for the same reason that she's not going to get arrested now.
00:16:09.000 Now that there's evidence, too. For the same reason that there is a little handshake between Trump and the deep state about the Epstein files now.
00:16:18.000 So what's the point of all of this declassification and the release if they don't intend to actually go after anybody?
00:16:24.000 I mean, just declassify all the other shit so that people stop asking about Epstein, I guess.
00:16:28.000 Distraction.
00:16:29.000 Yeah.
00:16:30.000 Keep the base happy.
00:16:32.000 This is a woman who did her dissertation on Rules for Radicals, and yet there are people at my company who still will refuse to believe that she had any part in this, no matter what you show them.
00:16:43.000 They would swear to you that Santa Claus was real before they would possibly believe that Hillary Clinton could do something wrong.
00:16:50.000 So I don't think there's anything that's going to come of this.
00:16:54.000 Well, I still don't think that the Sega is debunked.
00:16:56.000 But look at that video with, who was that?
00:17:00.000 Stephanopoulos?
00:17:01.000 I can't remember who this guy was.
00:17:02.000 Is he still on TV?
00:17:03.000 On Fox?
00:17:04.000 Is that Matthews there?
00:17:04.000 Matthews?
00:17:05.000 No, this guy, George Stephanopoulos.
00:17:07.000 That's Matthews in this video.
00:17:10.000 Chris Matthews.
00:17:10.000 That's Chris Matthews.
00:17:12.000 Man, I don't know who these guys are.
00:17:12.000 Oh, you're right.
00:17:14.000 They do look alike, though.
00:17:14.000 They're ancient.
00:17:15.000 Right.
00:17:16.000 Okay.
00:17:16.000 Chris Matthews.
00:17:19.000 As Brett brought up, the shock.
00:17:23.000 But what do you mean the CIA did something wrong?
00:17:28.000 Not them.
00:17:29.000 I mean, never.
00:17:30.000 I mean, even if we want to go back more recently, NSA spying scandal was like three years before then.
00:17:36.000 Tweeting that.
00:17:37.000 The Gulf of Tonkin.
00:17:38.000 Yeah.
00:17:39.000 Oh, my goodness.
00:17:41.000 As if Hollywood hasn't been making movies about all the awful things the CIA has been doing for the last 20 years.
00:17:46.000 The CIA is like, who nigga me?
00:17:49.000 Yeah, they're like this.
00:17:51.000 And I think to your point, I think you're right.
00:17:52.000 Like, the issue here is that it doesn't actually matter what the truth is, because 10 gazillion people ran with Russiagate.
00:18:00.000 Chris Wallace.
00:18:01.000 I was going to say, yeah, I don't know where Matthews is.
00:18:03.000 Nobody knows.
00:18:04.000 Chris Wallace.
00:18:05.000 Okay, guys, Chris Wallace.
00:18:06.000 Chris Wallace.
00:18:07.000 Fox host. Was a Fox host.
00:18:09.000 This was like ages ago.
00:18:10.000 One of those guys.
00:18:11.000 But the point is, is like the damage is done.
00:18:14.000 They did the reporting.
00:18:15.000 They claimed that he was colluding with Russia.
00:18:18.000 And the worst person you know and your highly liberal aunt all believe it.
00:18:23.000 And they ruined a lot of friendships and a lot of relationships running something that they knew was a lie.
00:18:27.000 And like you said, there are plenty of people, even if you're not talking just about what they think about Trump.
00:18:32.000 They could not imagine a world where Hillary Clinton was evil, despite the fact that the rest of the world knows that she's pretty evil.
00:18:40.000 And I want to just make sure I include this other document.
00:18:43.000 This is the 2020 ICA, which says, acting on President Obama's orders, DCIA Brennan directed a full review and publication of raw human intelligence information that had been collected before the election.
00:18:56.000 CIA officers said that some of this information had been held on the orders of the DCIA, while other reporting had been judged by experienced CIA officers to have not met long-standing publication standards.
00:19:06.000 Some of the latter was unclear or from unknown subsources, but would nonetheless be published after the election over the objections of veteran officers on the orders of DCIA and cited in the ICA to support the claims that Putin aspired to help Trump win.
00:19:22.000 It was all one big scam, and it was Obama and Hillary.
00:19:25.000 And I'm going to say it again, as I said before: I think likely what was going on is the Clinton Foundation was taking hundreds of millions of dollars.
00:19:33.000 I wonder where that money was coming from.
00:19:35.000 When Hillary Clinton lost the election, the money stopped coming in.
00:19:38.000 So the presumption many people make is that her private server was how they communicated for this illicit transfer of funds, U.S. government policy for private bribery, you can call it.
00:19:52.000 And then when they said we want to see the server, Hillary Clinton had it destroyed.
00:19:56.000 Yeah.
00:19:57.000 Irretrievable.
00:19:58.000 And phones smashed with hammers.
00:20:01.000 I think they went after Trump because they were like, we're in trouble.
00:20:04.000 He's going to start going after this stuff.
00:20:06.000 This is what they do.
00:20:07.000 They have to go after Trump first so that if Trump responds in any way, they'll say, see, Trump is only doing this because we did this.
00:20:14.000 Now take a look at the Epstein story.
00:20:17.000 People keep saying that Donald Trump is distracting from Epstein, despite the fact the Obama Russiagate releases have been coming out well before the Epstein story came to prominence.
00:20:29.000 Don't get me wrong, Pam Bondi, the DOJ, they've been screwing this one up royally, and Trump is acting real weird about it.
00:20:35.000 I'll give you that.
00:20:36.000 But he's not distracting from it because these documents were getting released, and the conversation was happening well before the Epstein stuff happened.
00:20:43.000 Obama has one thing going for him.
00:20:45.000 There's no photos of him with Epstein.
00:20:47.000 Yeah, that's good for him.
00:20:48.000 Yeah.
00:20:48.000 He's got that going for him.
00:20:49.000 Yeah, there's a lot of other people like Howard Stern.
00:20:52.000 Bill Gates.
00:20:52.000 Yeah.
00:20:54.000 Uh-oh.
00:20:54.000 Bill Clinton.
00:20:56.000 You think Hillary went, too, with Bill?
00:20:59.000 Wait, was that?
00:21:00.000 Was there ever actually that picture of Bill in the dress at Little St. James?
00:21:03.000 There you go.
00:21:04.000 That's real.
00:21:04.000 Maybe, maybe Hillary bought that and brought it home.
00:21:07.000 Wait, a photograph or an illustration?
00:21:08.000 Painting.
00:21:09.000 Oh, I think.
00:21:09.000 Painting.
00:21:10.000 Painting of Bill Clinton in a blue dress.
00:21:12.000 Yeah.
00:21:13.000 It's weird.
00:21:14.000 But there were just all of these cameras there for no reason.
00:21:19.000 The one that Epstein had in his apartment, famously, correct?
00:21:21.000 It was on his island.
00:21:22.000 On his island.
00:21:23.000 Yeah, he also had George W. Bush with two little towers.
00:21:26.000 He was throwing paper airplanes at him.
00:21:28.000 So the 9-11 truth movement went like, that proves it!
00:21:31.000 Epstein and the two towers or whatever.
00:21:33.000 Can I pull these up?
00:21:34.000 Oh, I didn't hear about that one.
00:21:35.000 Yeah.
00:21:36.000 George.
00:21:39.000 It's such a disgrace, and it's so obvious.
00:21:41.000 Like, it doesn't even need to be said.
00:21:43.000 But people are still denying reality.
00:21:45.000 Wait, which part?
00:21:47.000 The whole - it's not even a mishandling of the Epstein files situation.
00:21:53.000 It's very intentional what they're doing, and I feel malicious.
00:21:58.000 George W. Bush with two paper airplanes and two toppled Jenga towers.
00:22:02.000 Wow.
00:22:03.000 Yeah, that was in Epstein's Island.
00:22:05.000 That's creepy.
00:22:06.000 That is really creepy.
00:22:07.000 Yup.
00:22:08.000 Oh, here's the, look at this.
00:22:09.000 I got the Bill Clinton in the blue dress.
00:22:11.000 Let's pull this one up.
00:22:12.000 Let's disturb the audience.
00:22:13.000 That is so wild.
00:22:14.000 This is my first time seeing it and the other one.
00:22:16.000 Do we think Monica Lewinsky worked for Mossad?
00:22:19.000 No.
00:22:20.000 Why?
00:22:21.000 I've heard, I've heard that.
00:22:23.000 I mean, she's not. Chuck E. Cheese works for Mossad.
00:22:26.000 Or at the very least, that she worked for some intelligence agency and purposefully was gathering blackmail on him.
00:22:35.000 Why?
00:22:36.000 But it came out, right?
00:22:37.000 So it doesn't really matter.
00:22:38.000 Like, wouldn't the whole point be that it doesn't come out?
00:22:40.000 That she gathers the intel or uses her feminine wiles to blackmail him.
00:22:47.000 And then the whole point is to keep it out of the public, not have it leak to the public.
00:22:51.000 Yeah, I don't think she's smart enough.
00:22:53.000 Well, then Mossad needs to change their hiring practices if that's what they're doing.
00:22:58.000 It does kind of show you this Orwellian society that we're living in, though, when you think of people like Saul Alinsky, George Soros, Hillary Clinton, and just the lack of general information that people have about who these individuals are and this power apparatus that exists behind them. The Soros Foundation was involved.
00:23:17.000 They were communicating over this stuff, which is weird.
00:23:21.000 And everybody who brought up Soros was called a conspiracy theorist.
00:23:25.000 I'll tell you guys, you want something interesting?
00:23:27.000 When we first booked Mike Benz, who's been calling out USAID and all this stuff for a long time, I got spam blasted by weird liberals telling me that I shouldn't have him on the show.
00:23:38.000 And I'm like, what?
00:23:39.000 Why are you DMing me, bro?
00:23:40.000 This is weird.
00:23:41.000 They were panicked and desperate that we would not bring Mike on the show.
00:23:44.000 He comes with endless receipts.
00:23:46.000 Perhaps.
00:23:47.000 Perhaps.
00:23:48.000 It is funny that you mentioned it, too, that you said that Hillary Clinton, she did her dissertation on rules for radicals.
00:23:54.000 And what he mentioned was that they started accusing Trump of all of this stuff because it dirties him up so that they can't have accusations thrown back at them, which is right out of the playbook from the book.
00:24:03.000 Right, but yeah, that's why I brought it up.
00:24:05.000 Yeah.
00:24:07.000 That's very depressing.
00:24:08.000 And it is true also that we live in an age now as the internet becomes more prevalent that it's going to be, it's your parents and your grandparents that have this kind of a higher definition of what a politician might be because they were fed years of propaganda from mainstream media outlets,
00:24:24.000 depending on whether they were left or right, whether it was Fox telling you that Democrats were evil or CNN telling you that Republicans were evil and the rest of us who have moved on to greener pastures, getting their information from other places or just have any level of common sense, understand that pretty much all politicians are awful on some level.
00:24:42.000 Yeah, it's basically Democrats and Republicans for me is tantamount to Santa Claus and Mr. Bunny at this point.
00:24:48.000 Yeah.
00:24:49.000 Let's jump to this next part of the story.
00:24:51.000 We have a post from DNI Tulsi Gabbard.
00:24:54.000 New whistleblower reveals how they were threatened by a supervisor to go along with the Obama-directed Russia hoax intelligence assessment, even though they knew it was not credible or accurate.
00:25:06.000 The whistleblower refused.
00:25:08.000 Yesterday, we released the whistleblower's firsthand account of what happened in the crafting of the January 2017 ICA.
00:25:15.000 Their years-long effort to expose the egregious manipulation and manufacturing of intelligence carried out at the highest levels of government and the intelligence community detailed in our previous releases and how they were repeatedly ignored.
00:25:28.000 Thank you to this courageous whistleblower and others who are coming forward now, putting their own well-being on the line to defend our Democratic Republic, ensure the American people know the truth and hold those responsible accountable.
00:25:41.000 So we have the whistleblower's explosive story and evidence, Tulsi Gabbard says here.
00:25:48.000 And so there's a lot to break down.
00:25:50.000 It's 19 pages.
00:25:51.000 We're not going to go through the full thing, but we do have this.
00:25:53.000 This is from the Federalist.
00:25:55.000 Clapper crew threatened whistleblower who refused to sign off on fabricated intel assessment.
00:26:01.000 A crony of then-DNI James Clapper threatened to withhold a promotion from a senior intelligence official unless he concurred in the fake intelligence community assessment on Russia's meddling in the 2016 election.
00:26:12.000 Notes obtained by the Federalist show.
00:26:14.000 The notes made public for the first time today recount a conversation the top analyst in the office of the director of national intelligence had with an unnamed superior who worked closely with then director James Clapper.
00:26:25.000 The release of the notes represents the latest cache of documents declassified by Trump administration officials concerning the ICA, this we understand.
00:26:32.000 According to a person familiar with the notes, the analyst documented his recollection of the conversation on March 31st, 2023, more than six years after the conversation occurred.
00:26:42.000 The delay, the Federalist source explained, occurred because the analyst's efforts to share his concerns, first with the Inspector General of the IC and then later with special counsel John Durham and Virginia Senator Mark Warner, proved unsuccessful.
00:26:55.000 Only later did the analyst receive an inquiry for more information about his claims, leading to the drafting of the summary of his recollections.
00:27:04.000 Well, there you go.
00:27:05.000 I saw up there it mentioned the Steele dossier.
00:27:07.000 So this is connected to everything that was in that Intel packet.
00:27:10.000 The Steele dossier was just oppo research, fake garbage, and they passed it off as real to go after Trump.
00:27:16.000 I mean, I think this might be the biggest political conspiracy or scandal in the history of this country.
00:27:22.000 It won't matter because in two days the internet will have moved on from it and that's the sad part.
00:27:26.000 I mean, maybe.
00:27:26.000 We've been on the story for a few weeks now.
00:27:28.000 We've been on the story actually for months.
00:27:30.000 If you're talking about the statements made by Cash Patel and further made by Dan Bongino later on.
00:27:36.000 I feel like, but with the amount of information that comes out every day, people are going to start kind of judging these things on who actually is punished as opposed to all of the details coming out because you could have all the information in the world.
00:27:47.000 If nobody's held accountable, it doesn't really matter.
00:27:51.000 Yeah, and there are a lot of people that don't think anything's going to happen, but I'm not convinced nothing's going to happen.
00:27:58.000 And I'll put it this way: if something does happen, it just will feel like nothing.
00:28:02.000 Like, when we say nothing ever happens or nothing ever changes, Trump did get arrested several times.
00:28:09.000 He did lose in court several times.
00:28:12.000 And so he did get it right.
00:28:14.000 He got arrested.
00:28:15.000 I think people think nothing ever happens the other way.
00:28:19.000 Nothing ever changes the other way.
00:28:21.000 Democrats are never held responsible.
00:28:23.000 Indeed, but that sounds like a cop-out.
00:28:25.000 Maybe, maybe on the state level, like the ones who are attacking ICE agents might actually get arrested.
00:28:30.000 But at the federal level, it doesn't feel like.
00:28:33.000 And I think when people think of that idea, they're thinking of the Clintons, they're thinking of the Obamas, they're thinking of the Bidens.
00:28:40.000 Well, we have a poll up, and the question is: will Obama be charged?
00:28:46.000 The options are "Obama will be charged" or "nothing will happen," and "nothing ever happens" is at 63%.
00:28:52.000 Defeating "Obama will be charged."
00:28:54.000 So even with all of this, the expectation for most people is that nothing will change.
00:28:59.000 If that's the case, what are we doing here?
00:29:02.000 Like, honest question.
00:29:05.000 If the reality is most people are blackpilled on this issue, why don't we all talk about the football?
00:29:15.000 You know, go.
00:29:16.000 Well, I think there's this underlying hope that the system could change.
00:29:19.000 I know deep down inside, I mean, I have this naive hope that the system can change if enough people are made aware.
00:29:25.000 Do I think people like Clapper and guys like John Brennan are going to be held accountable?
00:29:30.000 No, I think they might be held out there, but I question how accountable they could be held given that they know, I mean, where the bodies have been buried for how many years, you know, going back all the way to the Bush administration and really even before that.
00:29:45.000 And so, you know, they have a lot of ammunition on their side as well.
00:29:50.000 But the more people become aware of what's going on, then the more people can take an active role in potentially changing the system, right?
00:29:56.000 It starts with information.
00:29:58.000 Mary, do you care about this?
00:30:01.000 On the broad scale, yes, because I think that the right decision isn't just to disengage from politics altogether, obviously.
00:30:11.000 Although I think that Trump's victory and the last six months have actually made people on the right more complacent, and it's made them disengage because they haven't been delivered what they were promised.
00:30:24.000 And I think the path forward is actually to just start imagining what this ideological side, if you want to call it that, should look like past the point of Trump.
00:30:35.000 I mean, do you care about Hillary Clinton in 2016 accusing Trump of colluding with the Russians?
00:30:42.000 No, and I don't think Trump does either, to be honest.
00:30:44.000 So, you know, he was talking about getting retribution, and he has the option.
00:30:50.000 He has the ability to do that, and he doesn't care to.
00:30:53.000 Yeah, do you care?
00:30:55.000 I care more about this information coming out in its entirety to people that don't know it than I do about them actually being prosecuted.
00:31:04.000 I would rather see people who have kind of lived under the lie of these people of this being some ridiculous movie of good and evil where one side is good and one side is evil and the side they happen to support just happen to be the good guys.
00:31:17.000 I would rather see people who aren't currently awake to the evils of both sides.
00:31:24.000 The reason why I ask is because my theory as to why nothing ever changes is because you need only stall a development for a year or two before there's no longer any will behind it.
00:31:37.000 So when this stuff first happens, everybody's like, whoa, this is BS.
00:31:42.000 The Trump campaign's furious.
00:31:43.000 You saw Trump on TV saying the Democrats are putting this out.
00:31:46.000 It's not real.
00:31:47.000 And the conservatives supported Trump were angry they were being smeared.
00:31:50.000 They got called white supremacists.
00:31:51.000 It's been almost 10 years.
00:31:54.000 It's been just about nine years, just shy of nine years.
00:31:57.000 And so the people who lived it and were angry are focused on other things now.
00:32:03.000 There's no coalition anymore.
00:32:05.000 Young people who are in their 20s, I mean, Mary, how old were you in 2016?
00:32:12.000 I was 15, 16.
00:32:14.000 Yeah.
00:32:15.000 So right now, Trump is putting out information asking a bunch of, let's just say, 15 to 20 year olds to care about this fight as they're entering the political arena.
00:32:25.000 And people who are older are probably like, dude, this is 10 years ago.
00:32:29.000 Okay.
00:32:30.000 What's going on right now with jobs and the economy and the interest rates?
00:32:34.000 And so that's why it's so hard to get accountability because the criminals are like, we only need to stall for a couple of years and then there will be no political will to go over this.
00:32:42.000 And what are the Democrats going to say?
00:32:44.000 You're focused on the past.
00:32:45.000 I'm focused on the future.
00:32:47.000 I mean, I've kind of come to that sentiment with a lot of things recently, not just with this story, but with most things that involve the culture war, which is like, I'm focused on my life, like getting married, starting a family, and all the things that I can control.
00:33:01.000 And the rest of it just kind of feels like, look, this is out of my control.
00:33:05.000 And putting an extreme amount of focus on it just ends up hurting me.
00:33:09.000 And there was a time, perhaps, when phones were new to your pocket and everybody started taking politics very seriously as some type of team sport mentality where it really like resonated with people.
00:33:20.000 And now people are like, well, I can't afford a home.
00:33:23.000 The interest, you know, everything is impossible.
00:33:26.000 Jobs are scarce for a lot of people.
00:33:28.000 Buying a home is impossible for the next generation.
00:33:30.000 I got a lot of black pilling when it comes to that part of society.
00:33:34.000 Maybe the focus needs to go back there rather than this stuff.
00:33:37.000 I do have to admit, I am rather, what's the right word?
00:33:40.000 Entertained, when I go on Instagram and I'm looking at the stories from, like, friends of mine I've known for decades.
00:33:47.000 And these are people who just have never been involved in politics.
00:33:50.000 And now it's like, I'll click their story and it's just Gaza.
00:33:53.000 Like 700 stories about Gaza.
00:33:56.000 I'm like, well, I can see what current trend is, you know, what current issue is in the current year, because I know it's going to happen.
00:34:05.000 Three or four months, it's going to change.
00:34:07.000 And the posts are all going to be about some other trending issue.
00:34:11.000 Meaning that those people, like a couple of years ago, they were posting Ukraine stories.
00:34:15.000 Oh, yeah.
00:34:16.000 And then before that, it was like Spice Girls.
00:34:20.000 I think you skipped a decade or two there.
00:34:22.000 Yeah.
00:34:22.000 Nope.
00:34:24.000 Nope.
00:34:25.000 No.
00:34:26.000 Bro, those are millennials.
00:34:27.000 Yeah.
00:34:27.000 Yeah.
00:34:28.000 They're all posting weird drama stupid stuff.
00:34:30.000 You're saying there was nothing before it was Gaza, Ukraine, and San Francisco.
00:34:35.000 Didn't the Spice Girls do like a tour recently?
00:34:38.000 I have no idea.
00:34:39.000 I'm not kidding.
00:34:40.000 They're posting about Spice Girls.
00:34:41.000 I feel like we would have known that.
00:34:43.000 What, were we not doing our job?
00:34:46.000 Yeah, they're planning a tour in 2026.
00:34:48.000 Oh, okay.
00:34:49.000 The 30th year anniversary.
00:34:51.000 Oh, that's going to be an embarrassment, just like in Sankinbax reports.
00:34:55.000 See, we need to get back to that type of corporate feminism.
00:34:58.000 That's what we really need.
00:35:00.000 Corporate-backed, like where only the big 2019, I was right.
00:35:04.000 Okay, so Spice World 2019.
00:35:06.000 It's exactly what I was talking about.
00:35:08.000 During COVID?
00:35:09.000 This was just before COVID.
00:35:10.000 That's why nobody, nobody noticed.
00:35:12.000 It's COVID.
00:35:13.000 Wow, look at that.
00:35:14.000 But posh wasn't there.
00:35:15.000 Well, then it's not a Spice Girls story.
00:35:17.000 She was the best.
00:35:17.000 I tried out of all of them.
00:35:19.000 Yeah, it's like millennials being like, can you believe the Spice Girls were back?
00:35:22.000 And they're posting about on Instagram.
00:35:24.000 And then COVID happened, everyone's brains turned to jello.
00:35:26.000 Yeah.
00:35:28.000 No, we've never recovered from that.
00:35:28.000 That's weird.
00:35:29.000 That weird cultural shock to the system.
00:35:32.000 And everybody just being awful to each other on Facebook during COVID for a myriad of reasons.
00:35:36.000 Yeah.
00:35:37.000 What are the expectations?
00:35:39.000 I mean, when you say like that the one side has gone complacent, what are the expectations that didn't get met?
00:35:45.000 I'm just curious where you're going.
00:35:46.000 So many.
00:35:47.000 I mean, maybe I'm just speaking for myself, and I shouldn't say that I represent this huge swath of people, but I like could not be more disappointed with Trump.
00:35:58.000 I mean, it's just cruel joke after cruel joke at this point.
00:36:02.000 He's thinking of pardoning Diddy, more money for Ukraine, more money for Israel.
00:36:10.000 We're bombing Iran, expanding a new visa program for foreign illegal aliens.
00:36:18.000 And it just keeps getting worse and worse.
00:36:20.000 And then you add the Epstein scandal on top of all that, and it's just wrapped in a little bow.
00:36:25.000 And I'm not disappointed.
00:36:28.000 He said mean things about Hillary Clinton.
00:36:30.000 Yeah.
00:36:31.000 Well, I wouldn't be bothered by it if he kept Hillary Clinton and Obama like criminally accountable for their actions.
00:36:37.000 I don't think he will.
00:36:38.000 It wouldn't bother me, but it would feel a little self-indulgent and it would feel like a huge distraction from what people really care about.
00:36:46.000 But I mean, kind of to your point, I mean, but you know, in terms of people being focused on, you know, what they can control, wouldn't you say that things are a lot better than where they were, say, just two years ago under Joe Biden in terms of ability to have, you know, that kind of impact over your own personal domain in terms of your own civil liberties and ability to make choices and engage in the economy and, you know, and have your free speech restored and things of that nature.
00:37:11.000 I think you could give Elon Musk more credit for restoring free speech and big tech platforms than Trump.
00:37:17.000 But can you be specific about that?
00:37:21.000 I just think the culture has shifted back more to the middle where you're not walking on eggshells over everything you say.
00:37:28.000 You know, we're not seeing as many men playing women's sports and kind of this.
00:37:33.000 I mean, that's, I just feel like the right is like desperate for something that appears to be a huge W. And it's a lot of people in the deregulation.
00:37:43.000 And, you know, with what you said about woke, basically, I don't actually agree that woke is dead, as everyone is saying.
00:37:50.000 I think that it actually disguised itself more cleverly and it's just covert woke now.
00:37:55.000 And we don't even know that we're woke.
00:37:58.000 It's just sort of a software update that everyone went through.
00:38:03.000 I agree.
00:38:04.000 Like we are woke.
00:38:06.000 Well, I don't know.
00:38:06.000 Maybe not us in this room, but society has been dragged over to a level of wokeness that we're not going to return back from.
00:38:16.000 And I agree with you.
00:38:18.000 I want to jump to the story to get into the core of this.
00:38:21.000 This is from MSN.
00:38:23.000 Spotify introduces face scanning age checks for UK users as some furious fans threaten to return to piracy.
00:38:32.000 So here's what I think is happening.
00:38:34.000 I think that woke has been routed and we've pushed it back.
00:38:37.000 And you're seeing now like the Sydney Sweeney ad plus that Dunkin' Donut.
00:38:40.000 Was it Dunkin' Donuts?
00:38:41.000 Dunkin' Donuts.
00:38:42.000 Where he's like, I got good genes.
00:38:43.000 And everyone's like, it's racist.
00:38:45.000 But here's what I think is happening.
00:38:47.000 The powers that be, whether you have the international interests of the corporations, I imagine that they got together and said, guys, this force, this cultural force using racism and stuff didn't work in getting people controlled, so let's give baby their bottle, but we are then going to go the conservative woke route, which is, won't you think about the children?
00:39:11.000 We have to protect the children.
00:39:12.000 We have no choice.
00:39:13.000 So then they pass these bills saying, if you're going to buy porn, you need an ID.
00:39:17.000 And we all agree with it.
00:39:18.000 Like, yeah, of course, no one should be like, I agree with that.
00:39:20.000 Then they say, oh, okay.
00:39:22.000 Oh, and by the way, it means anything that is ever considered explicit now requires a face scan and ID.
00:39:27.000 So now Spotify is face scanning people.
00:39:30.000 What happens next?
00:39:31.000 This is worse than woke.
00:39:32.000 It's one thing if you said a naughty word on the internet and got banned from that platform, but you weren't banned from the other platforms.
00:39:37.000 Sometimes they didn't collude.
00:39:38.000 What's happening now is social credit scores are starting to pop up through these kinds of systems.
00:39:42.000 And we're seeing it happening in the UK.
00:39:44.000 And I will tell you, they're the canary in the coal mine.
00:39:46.000 That's the country where they were arresting people and are arresting people for speech.
00:39:50.000 We can claim cultural victories, but we've had these laws passed at state level that we've largely agreed with.
00:39:57.000 Like, hey, it's a good thing that Pornhub is being banned.
00:40:00.000 Exactly.
00:40:00.000 And now what's happening is they're going to go, Visa and MasterCard have banned something around, what is it, like 20,000 games.
00:40:07.000 And the argument is, oh, but it's because they're porn.
00:40:10.000 Apparently, most, many of them, thousands maybe weren't even porn.
00:40:13.000 It was just adult-themed games, perhaps like GTA, which was the target of this.
00:40:18.000 They're going to go through payment processors.
00:40:20.000 They've always tried debanking.
00:40:21.000 We may have won on some cultural grounds where what we describe as woke has been pushed back, but the censorship industrial complex is just trying to find new ways to control what we can think, what we can see, and what we can purchase.
00:40:34.000 And now YouTube's going the age verification route with 18-plus, right after they suddenly started pushing Shorts heavily, meaning that kids are the ones who spend hours a day glued to their phone looking at this. It's going to destroy most independent creators.
00:40:50.000 When they age-gate your content, your views drop by something like 60, 70%.
00:40:54.000 And it's not because the video is being served to children and they're saying, no, you kids, you can't watch this.
00:41:00.000 It's because people don't sign up.
00:41:02.000 And if you're not signed up, you can't watch content they deem to be age inappropriate.
00:41:05.000 That means the front page of YouTube is going to be Mr. Beast and nothing else.
00:41:10.000 There's going to be five big shows that are approved by YouTube.
00:41:14.000 That means news and politics will be gated.
00:41:17.000 And that means the average person will not be able to watch shows about politics.
00:41:20.000 And that's exactly what they want.
00:41:22.000 They want you to go back to sleep, America.
00:41:24.000 Your government is in control once again.
00:41:26.000 Here's American Gladiators.
00:41:27.000 Here's 40 channels of it.
00:41:28.000 No, I think these are all great points.
00:41:30.000 I'm just making the point that our political freedom only goes as far as our economic freedom.
00:41:35.000 And so we are still really, you know, early on in Trump's second term.
00:41:40.000 And so if home ownership rates start to go up and MA activity has already picked up and IPO activity is starting to uptick and things are on the uptrend, then I would say that that's definitely a move in the right direction from where we are.
00:41:55.000 Because if you don't have your economy, you have nothing.
00:41:57.000 You don't have opportunity.
00:41:58.000 I mean, that's what makes us Americans: the lifeblood of our society is our innovation.
00:42:03.000 And so, and I think it's too early to really make a call there on Trump's second term.
00:42:10.000 How do you feel things have been done with the tariffs?
00:42:12.000 I know that that's been a conversation that's come back up the last couple of days.
00:42:16.000 Do you think the tariffs were a good idea?
00:42:18.000 I do.
00:42:18.000 I think we're bringing in a lot of revenue so far from the tariffs.
00:42:21.000 And I think, you know, he's winning there.
00:42:23.000 And I think it's kind of allowed us to take control back on the global stage.
00:42:30.000 I think it's too early to say because we have to see if the GOP is really going to follow up the big, beautiful bill with some, you know, more rescission packages to try to get the debt under control.
00:42:39.000 And if that's going to be our starting point or an end point.
00:42:41.000 I'm just making the case that I think it's too early into a second term to call it a W or an L yet.
00:42:48.000 I think that also has to do with the life cycle of the news, being if you're on the internet a lot and you're just blasted with news every single day.
00:42:56.000 Like not as much time has passed as people feel because what is actually six months feels like 10 years sometimes because you're reading nonstop news every day.
00:43:05.000 But the tariffs are one of the funniest examples to me of, like, "I support the current thing." I saw signs in local businesses talking about tariffs, and I was like, you just know the people who read those had to look up what a tariff was just to make sure they were on the right side of whatever the issue was with the people they're fighting with.
00:43:26.000 Like explain in pop terms.
00:43:27.000 Yeah.
00:43:28.000 I want to add to what we were talking about with YouTube.
00:43:30.000 Users on YouTube who believe that the AI age verification system is incorrect are supposed to verify their age by uploading a government ID or something.
00:43:42.000 Which they'll then have permanently.
00:43:44.000 And then you're going to start getting weird emails for services and stuff.
00:43:47.000 And you're going to be like, oh, why are they messaging me?
00:43:49.000 How do they get that email?
00:43:50.000 How do they know who I am?
00:43:51.000 And Google's going to use it to train their AI.
00:43:53.000 Yep.
00:43:55.000 And then the machine will know your face.
00:43:58.000 See, the thing is, I've said this, in the real world, if you go to an adult bookstore, you show your ID, right?
00:44:03.000 So why would we let kids on the internet do whatever they want, like on X?
00:44:06.000 The difference with the internet is they're telling you to upload your personal information permanently.
00:44:10.000 And so they're going to have your stuff on file.
00:44:13.000 And then, you know what this is?
00:44:15.000 It's problem, reaction, solution.
00:44:16.000 They're going to come back and say, okay, you're right.
00:44:17.000 You're right.
00:44:17.000 That is bad.
00:44:18.000 We're going to have a third-party company that will assure us that you're verified.
00:44:22.000 So we will never hold your data.
00:44:24.000 The third-party company can then ban you for naughty words.
00:44:29.000 And then when you go to the grocery store and you're like, yeah, here's my groceries, they'll be like, yeah, just, you know, scan your credit card right then.
00:44:35.000 You'll tap it and it'll say, bam, banned.
00:44:37.000 And they'll be like, we use age verification, third-party app, and they're saying that you're a banned user, so we can't verify with them.
00:44:43.000 And that's what we use.
00:44:44.000 Sorry, you can't shop here.
00:44:45.000 And the third party is Palantir.
00:44:47.000 Yep.
00:44:47.000 Oh, snap.
00:44:49.000 Just consolidated all of the data about American citizens that existed in each individual federal department, which is just yet another disappointment that I've heard.
00:45:01.000 So you're saying I got to buy some Palantir stuff.
00:45:03.000 First thing I did after that first Palantir story was to buy Palantir stuff.
00:45:08.000 I told you, I told you guys this story that Ian busted into my, so when I had the studio in the front of the castle, like when we first, this like five years ago, yeah, it was like five years ago, Ian slams it open like Kramer.
00:45:19.000 He's like, dude, you got to invest in Palantir right now.
00:45:22.000 I think it was at like $14.
00:45:23.000 Oh, $150 now.
00:45:26.000 And I was like, why?
00:45:27.000 I've heard of it.
00:45:28.000 He's like, dude, it's like this government database tracking, like prediction stuff, and it's going to be huge.
00:45:28.000 I don't know.
00:45:34.000 And I was like, I don't know, Ian, this is crazy.
00:45:36.000 What are you talking about?
00:45:37.000 Buying Palantir.
00:45:39.000 And so I didn't.
00:45:40.000 But then I remember one day, Ian was like, dude, graphene.
00:45:43.000 And I'm like, okay, you know what?
00:45:44.000 I'm buying a bunch of graphene stock.
00:45:46.000 And so I looked up companies that make graphene products and I bought stock and I made like 100 grand.
00:45:51.000 Yeah, I'm not kidding.
00:45:52.000 He's a seer.
00:45:54.000 I don't know, man.
00:45:55.000 Like, but he sounds crazy, so nobody wants to believe him.
00:45:58.000 He looks crazy.
00:45:59.000 That's his struggle.
00:46:00.000 You know, the problem is if it looks crazy and walks crazy, the chances are it's crazy, but then Ian's actually right.
00:46:06.000 So, you know, also, this is like Spotify introducing age checks.
00:46:09.000 Why?
00:46:10.000 Because of like parental advisory music.
00:46:13.000 They could.
00:46:14.000 These companies could literally just say, we have the principal service.
00:46:18.000 And if you want anything deemed explicit, you can then opt for that.
00:46:23.000 Right.
00:46:24.000 So if you go on Spotify and you want to listen to, like, Eazy-E as he raps about injuring LGBTQ people with pistols in their genitals, which he did, maybe you just need to say, I want to listen to explicit content.
00:46:34.000 And they say, okay, you got to prove you are.
00:46:36.000 The problem then is they're creating databases.
00:46:38.000 But the point I'm making is they're not doing that.
00:46:41.000 They're just saying everybody, no matter what, needs to face scan and verify because they want your data.
00:46:46.000 These big tech companies are like, oh, no, I guess we have no choice.
00:46:50.000 They collect your data and now they can sell it.
00:46:52.000 But Tim, there was a vibe shift and Mark Zuckerberg got a haircut.
00:46:56.000 Wait, did he get a haircut?
00:46:58.000 He's got the broccoli haircut.
00:47:00.000 I'm not mad anymore.
00:47:02.000 Yeah.
00:47:02.000 Here's another trick, Mark.
00:47:05.000 My phone number is 85.
00:47:06.000 I'm kidding.
00:47:07.000 We were two votes away from losing our sovereignty.
00:47:10.000 I mean, if not for Kyrsten Sinema and Joe Manchin, right, like we would have lost our rights essentially as U.S. citizens, because it would have been endless: open borders, no requirement to show a voter ID when you vote.
00:47:23.000 Wait, wait, wait, they voted.
00:47:25.000 They voted against it.
00:47:26.000 They did not go along when the Democrats were trying to pass that bill.
00:47:29.000 But if they were to be able to do that, that's how close we were, you know, to the end.
00:47:32.000 And so this seems like very minor in comparison to where we were.
00:47:37.000 And I feel like, you know, our memories are very short. That's a good point.
00:47:42.000 We were two votes away from Democrats.
00:47:45.000 It was banning voter ID, right?
00:47:46.000 It was making it so that anybody could just, exactly.
00:47:49.000 Anybody can just vote, can just walk in there and say who they are, and largely vote Democrat.
00:47:53.000 I'm Tim Poole and I'm going to vote.
00:47:55.000 And that would have been the end of our sovereign system as we know it.
00:47:58.000 Trump apparently wants a new census early.
00:48:01.000 And they're talking about trying to push that through so they can get illegal immigrants off of the count.
00:48:04.000 So they can take some seats away from California.
00:48:06.000 Yeah, Texas is trying to take away five Democrat seats.
00:48:08.000 You see this?
00:48:09.000 Yeah.
00:48:10.000 That's crazy.
00:48:11.000 So those are big victories that further empowered a U.S. citizen.
00:48:14.000 I think closing the borders, I think there's a lot of wins that we're kind of taking for granted.
00:48:19.000 I think technology is just kind of going where it was going, regardless of who the, you know, which political party was in power.
00:48:26.000 So that's just my take on it.
00:48:28.000 I think it's too early to declare it a loss or a win yet at this point in Trump's second term.
00:48:32.000 I do want to jump to this story.
00:48:33.000 This is from Tom's Guide.
00:48:35.000 YouTube's new AI age verification is coming soon.
00:48:39.000 Here's what's going to change.
00:48:40.000 AI will assess whether an account belongs to an adult or teen.
00:48:44.000 YouTube's going to start relying on AI to determine whether or not an account belongs to a teen or an adult and take action as a result.
00:48:50.000 In a recent blog post, YouTube announced machine learning would interpret a variety of signals that help us to determine whether a user is over or under 18.
00:48:58.000 My advice to 17-year-olds is just watch as much news as you can in between whatever it is you actually want to watch.
00:49:03.000 If the AI believes the account is being operated by a teen, it will automatically apply age-appropriate protections.
00:49:09.000 Disabling personalized advertising, turning on digital well-being tools.
00:49:13.000 Oh, that sounds creepy.
00:49:14.000 Adding safeguards to recommendations, including limiting repetitive views of some kinds of content.
00:49:20.000 Meaning that it'll be like, you've watched too much of this, this guy's content.
00:49:24.000 Time to show you something else.
00:49:25.000 It's actually YouTube saying, hey there, kid.
00:49:27.000 I think you've had a bit too much to think.
00:49:29.000 Yeah.
00:49:30.000 So how does it work?
00:49:30.000 Move on.
00:49:32.000 If they suspect a user is underage, they will apply restrictions like disabling personalized ads and activating digital well-being tools.
00:49:43.000 Yeah.
00:49:44.000 Mary needs those.
00:49:45.000 So what is a digital well-being tool?
00:49:47.000 It's like a paperclip pops up and asks you how you're feeling.
00:49:50.000 How is this?
00:49:51.000 I have no idea.
00:49:52.000 It's not like parental advisory stickers, or needing to be carded.
00:49:54.000 It's worse.
00:49:55.000 Buy a Playboy, you know, when I was a kid, something like that.
00:49:57.000 Well, like on YouTube?
00:49:58.000 No, just like to go into a convenience store, not to age myself, and buy a Playboy magazine.
00:50:02.000 You show your ID.
00:50:03.000 You show your ID for one second.
00:50:05.000 They don't take a copy of it, put it in their binder, and then say, I'm going to hold on to this forever.
00:50:08.000 Because they didn't have the ability to do so.
00:50:10.000 They could have put your ID on a fax machine or a copying machine.
00:50:10.000 They did.
00:50:15.000 But they didn't do it.
00:50:16.000 People got mad when they do that.
00:50:17.000 There's that scene from Atlanta, you know, talking about the show Atlanta.
00:50:21.000 Where he goes to the movie theater and he's like, you know, I want to buy a ticket.
00:50:21.000 Yeah.
00:50:25.000 And they're like, okay, here's how much it costs.
00:50:27.000 And he hands the credit card and they go, we need ID.
00:50:29.000 And he's like, okay.
00:50:30.000 And he shows the ID and they go, we're going to have to copy this.
00:50:34.000 He's like, what?
00:50:35.000 You're not getting a copy of my ID.
00:50:36.000 And then he walks away and the white guy walks up and then does the same thing.
00:50:39.000 And they're like, wait, what's going on?
00:50:40.000 He's racist or whatever.
00:50:41.000 But, you know, aside from the weird narrative of Atlanta, it is rare that someone at a store would take your ID and copy it and put it in a binder that they're going to keep forever and say, we might lose it, but that's your problem.
00:50:51.000 They do have the 16-plus requirements to go to the movies out by us out here.
00:50:56.000 Like after a certain time, you have to be over a certain age to get in.
00:50:58.000 That's probably because the kids are throwing popcorn.
00:51:00.000 Yeah.
00:51:01.000 Minecraft did it.
00:51:03.000 What were you saying, Mary?
00:51:04.000 You said you saw something.
00:51:05.000 Yeah, I tried to see which digital well-being tools they have.
00:51:09.000 They have reminders to take a break and bedtime reminders.
00:51:13.000 Instagram's had that for like a while.
00:51:16.000 Like you have to activate it yourself.
00:51:19.000 And also, if we're talking about Instagram and Meta, they have been proven time and time again to purposefully target underage accounts with more sexual content than the rest of their user base.
00:51:32.000 I think it was the Wall Street Journal that has released multiple reports about that.
00:51:36.000 And they have tested it extensively.
00:51:38.000 If you're an account on Instagram that is 13 years old, identified as 13 years old, you're immediately going to be fed more sexually suggestive content, usually pages that funnel into OnlyFans accounts.
00:51:52.000 Yeah, I think the ID thing's a Trojan horse as we're seeing it applied now, the age verification thing.
00:51:59.000 Again, they tried to go the woke route of don't be racist.
00:52:02.000 You're not racist, are you?
00:52:03.000 And people resisted and screw you, I'm not racist.
00:52:05.000 You're not banning me.
00:52:06.000 So now they're going the other route of, oh, the children, the children at risk and conservatives are on board with that one.
00:52:12.000 And that's the path towards creating systems of control and social credit systems.
00:52:15.000 I feel like the future is in making both sides angry.
00:52:18.000 That's how I felt about the Sydney Sweeney one because there's people on the right that were mad because of boobs.
00:52:24.000 Well, yeah, that and they were like, children, don't look.
00:52:24.000 Yeah.
00:52:27.000 And then the people on the left are mad because it's white supremacy.
00:52:30.000 So really, you want to shoot for making everybody mad.
00:52:32.000 So it's a Hegelian dialectic, basically.
00:52:34.000 Yeah.
00:52:35.000 Make everybody angry.
00:52:36.000 Yeah.
00:52:37.000 Well, another trick is that suddenly no one in TSA is worried you're going to hide a bomb in your shoe anymore.
00:52:43.000 So you don't have to take your shoes off as long as you get the real ID.
00:52:47.000 Yep.
00:52:47.000 Yeah.
00:52:48.000 Or TSA PreCheck. Well, you can't fly without it anyway.
00:52:52.000 No, you can, you can fly domestically without it up until, what was it, May this year?
00:52:57.000 Maybe.
00:52:58.000 It was an extension.
00:52:58.000 Now you have to have it.
00:52:59.000 Somewhere in May.
00:53:00.000 Yeah.
00:53:00.000 Oh, also, I had the story I told you earlier.
00:53:02.000 So my passport expired a long time ago.
00:53:04.000 I had to get a new passport.
00:53:05.000 When I went in to do that at the post office, I brought a certified copy of my passport and a copy of my passport.
00:53:12.000 One that blatantly says like copy on the top of it.
00:53:15.000 And they said they needed a certified copy, but the certified copy looks vastly different from a regular passport.
00:53:21.000 It doesn't have all the same information on there.
00:53:23.000 And the guy looks at the one that says copy, looks it up and down, uses that one, takes it with him.
00:53:29.000 And I'm just like, I'm like, I know I'm screwed.
00:53:31.000 Like this guy, like it went through.
00:53:33.000 Like I got my passport and I'm like, so on one hand, I'm happy because I didn't have to like file for like an extension and like an expedited passport.
00:53:43.000 On the other hand, they looked at this document and were like, yep, that's fine.
00:53:47.000 Rubber stamped it and sent it through.
00:53:49.000 Like that's bad either way.
00:53:52.000 That's just incompetence.
00:53:54.000 Yeah.
00:53:55.000 Well, yeah, it's the government.
00:53:56.000 Yeah.
00:53:57.000 With the UK, it's the Online Safety Act.
00:54:00.000 One of the funniest things about this story is that to get past the face scanning, people are using Norman Reedus from Death Stranding.
00:54:07.000 That's right.
00:54:07.000 Yep, I saw that.
00:54:08.000 Because it's so lifelike.
00:54:09.000 Just because you can control the face and make him open his mouth and move.
00:54:12.000 And so it's like, it doesn't work anyway.
00:54:14.000 But they're creating a database.
00:54:16.000 And I think the play is to get conservatives on board with it by saying it's for the kids, and then create this ID database where everybody going online has to submit their ID.
00:54:26.000 X did it.
00:54:27.000 And the right smiled as they submitted their IDs.
00:54:31.000 I did it.
00:54:32.000 Everyone with a blue check did it.
00:54:34.000 That's right.
00:54:35.000 Norman Reedus, they're about to finish the Daryl Dixon Walking Dead.
00:54:39.000 He's going to be able to cut his hair for like the first time in 20 years.
00:54:41.000 There you go.
00:54:42.000 But yeah, on X, Elon takes it over.
00:54:46.000 I believe this is a timeline.
00:54:48.000 Brings people back, then says, we're going to roll out monetization.
00:54:53.000 Everybody starts getting these big payouts.
00:54:54.000 And they're like, this is crazy.
00:54:56.000 I'm getting thousands of dollars.
00:54:58.000 You know, every other, what the heck?
00:54:59.000 Some people were getting like 30 grand in two weeks.
00:55:02.000 It was nuts.
00:55:03.000 I think there's a little bit of payola involved there.
00:55:05.000 Perhaps.
00:55:06.000 Or women he was willing to have a kid with.
00:55:07.000 And they were posting those screenshots saying, in the interest of transparency, literally all of them were scripted.
00:55:14.000 It wasn't just women.
00:55:16.000 Well, so perhaps, perhaps it was payola because then what happens next is everybody's all excited about monetization.
00:55:22.000 Then we abruptly get this notification saying, if you don't submit your ID, you'll lose your blue check.
00:55:26.000 And then everyone went, whoa, whoa, whoa, hold on.
00:55:28.000 What?
00:55:28.000 I'm getting money here.
00:55:30.000 So then people, so this is incredible.
00:55:33.000 Elon brought the right onto the platform, offered them blue check marks and money along with it.
00:55:40.000 Once they all agreed, he says, now we're going to take it away unless you give us your IDs.
00:55:43.000 And they all said, yes.
00:55:45.000 He's been working with George Soros this entire time.
00:55:45.000 That's insane.
00:55:48.000 Oh, I don't know about that, but like the idea that Elon was like, they took away our jokes and it was wrong.
00:55:53.000 So I'm going to buy the platform and then restore free speech is silly.
00:55:56.000 Elon Musk was like, I want training data for my AI bot.
00:55:59.000 And I don't.
00:56:00.000 I said this much.
00:56:01.000 For sure.
00:56:02.000 But like people say, oh man, the Babylon B, you know, I bet Twitter regrets making that ban because Elon wanted to buy this well before Babylon B. So the whole Milton Friedman, you know, or I can, that was all an act, getting, you know, paying people a million dollars to read the Constitution and all that stuff.
00:56:19.000 What do you mean?
00:56:20.000 Well, he became a big disciple of Milton Friedman and the Constitution, apparently.
00:56:25.000 I mean, put on this act.
00:56:27.000 I'm just saying that data is valuable.
00:56:30.000 He wants to train his AI and he wants confirmed data.
00:56:34.000 So what I imagine he's doing with X, the reason for verification is that he doesn't want unverified profiles feeding the X machine.
00:56:40.000 So X AI is being trained on data and he's making sure that only people that they have verified as real humans with IDs are having their data fed into the training model.
00:56:52.000 That's probably why he did it.
00:56:53.000 So many of those verified accounts are botted anyways.
00:56:56.000 Yep.
00:56:57.000 X is just such slop now.
00:56:59.000 Everything is like this.
00:57:02.000 Every tweet that has high interaction, if you look at the replies, it's just more engagement bait in the replies, completely unrelated to the posts that you're looking at.
00:57:11.000 There's no actual discourse happening.
00:57:13.000 Anywhere, though.
00:57:14.000 It's not just X. Like, I'm going to tell you, I think the comments on every app are fake.
00:57:20.000 Oh, for sure.
00:57:21.000 They tailor them specifically to you.
00:57:23.000 If you even look at a TikTok and then you send it to your friend who's sitting next to you on the couch and they open the same link and look at the comments, you guys are going to see totally different top-liked comments based on what the algorithm assumes you'll agree with and will interact with.
00:57:41.000 Those are totally fake.
00:57:43.000 I believe Fortune just published a report that said over half of online traffic is bots.
00:57:51.000 Yeah.
00:57:51.000 It's more than that.
00:57:52.000 Way more than that.
00:57:53.000 It's dead internet theory.
00:57:54.000 It was trending today.
00:57:56.000 I think I kind of feel like everybody died a long time ago.
00:58:00.000 And I'm only half kidding.
00:58:02.000 Because I go outside and I'm like, where is everybody?
00:58:05.000 Honest question.
00:58:06.000 I go outside all the time and there's nobody.
00:58:08.000 I mean, like, where?
00:58:10.000 Well, do you mean that they're all staying inside because they're plugged into the internet?
00:58:14.000 That might be.
00:58:16.000 There are some instances where it doesn't feel that way.
00:58:18.000 Like I went to the Christmas market in Chicago and it was shoulder to shoulder with people who aren't American.
00:58:24.000 And so that was kind of like, what's the word I'm thinking?
00:58:27.000 Disconcerting.
00:58:29.000 That I'm like, I'm here in Chicago at the Chicago Christkindlmarket, and everybody here, they were from Asia.
00:58:29.000 Yeah.
00:58:38.000 It was like the people, they were all tourists, and it was shoulder to shoulder.
00:58:42.000 I went two years now because we used to go all the time when I lived in Chicago.
00:58:42.000 You could barely move.
00:58:45.000 And I'm like, it's all migrant tourists.
00:58:48.000 And I'm like, where are the Chicagoans that used to come out and say the bears?
00:58:52.000 They never actually did.
00:58:53.000 That wasn't true, but it's funny to say anyway.
00:58:55.000 And then, like I mentioned, I went for the 4th of July and nobody was out doing anything.
00:58:58.000 Nobody in the parks, nobody in the fields.
00:58:59.000 And I was like, what happened?
00:59:02.000 Where is everybody?
00:59:03.000 I mentioned this.
00:59:04.000 Local restaurant went out of business because they couldn't find anybody to work.
00:59:08.000 Trying to make food, couldn't do it.
00:59:09.000 Went out of business.
00:59:10.000 Then, you know what?
00:59:11.000 You know what I think a big component of this is?
00:59:14.000 People didn't have kids.
00:59:15.000 And if you don't have kids, you got nothing to do.
00:59:17.000 So these, these, I'll speak to the skateboard community because they're a bunch of degenerates.
00:59:23.000 30-year-old skateboard guy, he goes, I don't know, man.
00:59:26.000 I just, as long as I make enough to pay the rent. What's your rent?
00:59:29.000 It's like 200 bucks.
00:59:30.000 How?
00:59:30.000 What?
00:59:30.000 Well, I live with like six guys in a one-bedroom, all skateboarders.
00:59:34.000 And they're unmarried, single guys.
00:59:36.000 And so I'm like, okay, well, we need labor done.
00:59:39.000 Would you want to do a job?
00:59:40.000 No, I don't need to.
00:59:41.000 Why?
00:59:41.000 I work like 10 hours in the week and then I go skate.
00:59:44.000 And then I just beg or just eat scraps.
00:59:46.000 And it's just like, this is weird.
00:59:48.000 People didn't have kids.
00:59:50.000 So they don't have to fight to get resources anymore.
00:59:53.000 We're overly wealthy, lazy, and childless.
00:59:55.000 I mean, even before I came out here, I was like, I had to work a lot because I wanted to live on my own.
01:00:00.000 I didn't want to live with anybody.
01:00:02.000 So I had to work full-time.
01:00:04.000 But the whole point was like, I worked eight hours a day and then spent the rest of the time skating.
01:00:08.000 But that's pretty, pretty rare these days because most people, if they're in that community, they're going to want to go and, if they're dedicated to doing it like all the time, they're going to go live with people.
01:00:18.000 Living this bohemian lifestyle.
01:00:20.000 Yeah.
01:00:21.000 How is that desirable, though?
01:00:23.000 I just don't get it.
01:00:24.000 At that time, I mean, during COVID, it was interesting because when you go on later in life, like, how is that still desirable to those people?
01:00:32.000 I don't understand it.
01:00:33.000 Don't you want to lay some roots somewhere?
01:00:35.000 They're deeply connected to what they're doing.
01:00:37.000 They really love it.
01:00:38.000 It's like if there was an activity that you really, really loved, you know, more so even than somebody, say, who plays an instrument, who can make time for that anytime, right?
01:00:47.000 Like you go to work, you come home, you can do that.
01:00:49.000 Skating is a little bit different for a lot of people.
01:00:52.000 It takes up a lot more of your time because you literally travel to go do it all the time.
01:00:56.000 And they're willing to sacrifice a normal life to go out and do that.
01:01:00.000 Let's jump to this next story from the Daily Mail.
01:01:03.000 Elon Musk makes bold play for an unlikely marriage with $3 trillion icon.
01:01:10.000 Elon Musk has been openly hinting at a historic merger in the business world, suggesting that his company, XAI, should partner with Apple.
01:01:18.000 Musk's company is the corporate face of his popular AI chatbot, Grok, which functions similarly to competitors like GPT, Claude, Gemini, Copilot.
01:01:26.000 Meanwhile, Apple has struggled to bring its own AI programs to consumers, notably delaying improvements to the Siri voice assistant.
01:01:33.000 Venture capitalists started openly speculating this month that Musk and Apple make the perfect power couple in the AI world with XAI bringing Grok to even more people using iPhones through this proposed partnership.
01:01:43.000 On the All-In podcast, investor Gavin Baker called xAI's Grok 4 the best product in terms of AI chatbots right now, but added the best product doesn't always win in technology.
01:01:54.000 I think there's a solid industrial logic for a partnership.
01:01:57.000 You could have Apple Grok, Safe Grok, whatever you want to call it, said Baker.
01:02:01.000 Musk quickly replied to the comment saying, interesting idea.
01:02:04.000 The billionaire added, I hope so.
01:02:06.000 You want to know why I believe this is possible?
01:02:08.000 Why there's a good possibility?
01:02:09.000 Because when you pull up Nancy Pelosi's stock trades in May, she put $25 to $50 million in Apple, indicating Apple would be doing something.
01:02:18.000 And I'm going to stress this.
01:02:19.000 Apple's got nothing going for it.
01:02:22.000 They've made the same iPhone every year, non-stop for a decade, and people are tired of it.
01:02:26.000 They're not innovating.
01:02:27.000 They're offering up no real new products.
01:02:29.000 So why would Nancy Pelosi decide in May that we're going to do this big purchase?
01:02:36.000 To clarify, this is from a year ago.
01:02:38.000 So the report year is 24, she filed it a few months ago.
01:02:41.000 What has Apple done recently that has any play?
01:02:46.000 I feel like Phil needs to be here to defend Apple.
01:02:50.000 Unfortunately, I'm a Galaxy user.
01:02:52.000 It's a sticky product, and it could just be that the stock is oversold.
01:02:55.000 And I'm not saying that this is the case.
01:02:57.000 I'm just playing devil's advocate there and saying that there are people who are Apple users like myself who will always be Apple users because it's the Apple universe that you really buy into.
01:03:07.000 I think Apple's cooked.
01:03:08.000 But I don't think that's why she bought it.
01:03:10.000 I'm just making the case that she.
01:03:11.000 She knows something.
01:03:12.000 Oh, of course.
01:03:13.000 She knows something's going on.
01:03:14.000 Perhaps some intelligence came across her desk where there were murmurings of a potential merger with, say, X, or a partnership that would require some congressional oversight or something like this.
01:03:24.000 And she was like, quick, buy Apple.
01:03:27.000 Buy it now.
01:03:29.000 A year out, maybe.
01:03:31.000 It's a stretch.
01:03:31.000 I don't know.
01:03:32.000 I'm just saying she bought NVIDIA.
01:03:32.000 Maybe not.
01:03:35.000 She knows what's going on.
01:03:37.000 She's got insider information.
01:03:38.000 That's what Trump said.
01:03:39.000 She goes, I'm not into that.
01:03:40.000 My husband is.
01:03:41.000 She tells her husband.
01:03:41.000 Oh, yeah.
01:03:43.000 So her husband goes, Anything interesting happened at work today, honey?
01:03:46.000 And she goes, Oh, there's a new AI thing.
01:03:48.000 Apparently, they want to deregulate AI.
01:03:50.000 And there's a company called In Nvidia or something.
01:03:54.000 And he's like, Really?
01:03:54.000 And he's writing.
01:03:55.000 He's got a notepad full of notes.
01:03:58.000 He has a bug on her lapel.
01:03:59.000 He's like, talk more, honey.
01:04:01.000 And then when she gets caught, she's like, I have no idea what you're talking about.
01:04:04.000 So I don't know.
01:04:05.000 Last year, she bought between $25 and $50 million worth of Apple.
01:04:08.000 But I don't know.
01:04:10.000 It's possible that she's, I mean, her net worth is $260 million.
01:04:15.000 So $25 to $50 is a good chunk of her net worth.
01:04:19.000 Maybe it's the low end.
01:04:20.000 That's still a lot, 10% into one company.
01:04:23.000 Is that abnormal?
01:04:24.000 Big purchase like that?
01:04:25.000 You would know better than I would.
01:04:26.000 Of course.
01:04:27.000 It is abnormal.
01:04:28.000 Yeah.
01:04:28.000 100%.
01:04:29.000 Now, okay, that proves it.
01:04:29.000 Okay.
01:04:31.000 That's it.
01:04:32.000 She's doing it.
01:04:33.000 Elon's buying Apple.
01:04:36.000 I'm going to check Apple stock right now.
01:04:37.000 Well, you can go on, what's that app?
01:04:39.000 And you can just follow the Pelosi stock tracker, which, that's what I'm on right now, QuiverQuant.
01:04:44.000 It's called the Pelosi strategy.
01:04:46.000 Yep.
01:04:46.000 You know, you follow her, you would have made 700% over 10 years.
01:04:50.000 It's NVIDIA.
01:04:52.000 It's Broadcom, Google, Vista, Palo Alto Networks.
01:04:58.000 Hold on, real quick.
01:04:58.000 Tesla.
01:04:59.000 Hold on.
01:05:00.000 She's got a lot of stuff.
01:05:00.000 Is she like Elon?
01:05:01.000 Year to date, Apple is down 15%.
01:05:06.000 Do we, honest question?
01:05:08.000 I mean, maybe it's just silly and it's memeing, but does Nancy Pelosi make bad trades like that?
01:05:13.000 When you look at her record, she doesn't.
01:05:16.000 She's up 730%, man.
01:05:18.000 She's a prophet.
01:05:20.000 So she buys into Apple and she's down 15%.
01:05:22.000 I don't know.
01:05:23.000 It's girl math.
01:05:23.000 I think she knows something.
01:05:25.000 She's like really bad at trading.
01:05:26.000 Well, it's her, to be fair, it's her husband.
01:05:29.000 And because it's their joint net worth, she files this.
01:05:31.000 But I'm just wondering, you know, that's a big chunk of her net worth, her and her husband's net worth.
01:05:38.000 Yeah.
01:05:38.000 I don't know, man.
01:05:40.000 Does Apple have government contracts?
01:05:42.000 I'm assuming they do have government contracts of some sort.
01:05:46.000 I don't know.
01:05:47.000 Looks like you can look at what she bought recently.
01:05:50.000 She bought, what is this, Matthews International?
01:05:52.000 What is this?
01:05:54.000 Oh, she sold.
01:05:55.000 Sorry, she sold Matthews International between $15,000 and $50,000.
01:05:59.000 Was that on a Friday?
01:06:02.000 I don't know.
01:06:03.000 July 9th.
01:06:04.000 Probably taking some money out to go party over the weekend.
01:06:07.000 Yes.
01:06:07.000 AVGO Broadcom.
01:06:09.000 She bought between $1 and $5 million on the 9th.
01:06:11.000 It's a Wednesday.
01:06:12.000 It's a Wednesday.
01:06:13.000 Ain't nobody partying on Wednesday.
01:06:14.000 Maybe she was getting ready for the weekend, you know.
01:06:16.000 Wild night.
01:06:16.000 It's DC.
01:06:17.000 They can party any day they want.
01:06:18.000 No, no, it's California, bro.
01:06:20.000 Well, I mean, Pelosi.
01:06:21.000 I guess they're in session.
01:06:22.000 She did buy another between a quarter and a half mil of NVIDIA, and the same for Google and Tempus AI.
01:06:30.000 Oh, Tempus AI.
01:06:32.000 Really?
01:06:33.000 She bought Tempus AI, huh?
01:06:36.000 Huh.
01:06:37.000 She goes in the heart of Silicon Valley.
01:06:39.000 I will say that.
01:06:40.000 But the reality is, and I don't know anything about this new bill that got passed, but unless they can prove that they had insider information, it's just been pretty much standard operating procedure for insider politicians, who have all this ancillary knowledge as to what's going on with these companies, to then add them to their portfolios.
01:06:58.000 And that's why so many of them are so wealthy.
01:07:00.000 Should be illegal.
01:07:00.000 Absolutely.
01:07:01.000 100% agree.
01:07:02.000 Basically, what happens is, okay.
01:07:06.000 There's a couple ways they can do it.
01:07:07.000 One is they can introduce legislation they know will damage a company.
01:07:11.000 They'll say, if I announce publicly that we are going to regulate this industry, these companies are going to go down.
01:07:17.000 Short them now.
01:07:19.000 You know, get those shorts ready.
01:07:22.000 Then we file this bill or we get some co-sponsors.
01:07:25.000 We announce it.
01:07:25.000 Their stock drops, sell it.
01:07:27.000 We'll drop the bill.
01:07:28.000 You never have to actually even pass any legislation.
01:07:32.000 Or the inverse is possible that they come to him and they say, look, we've got this new bill.
01:07:36.000 Lobbyists from these companies have been saying that this regulation is a problem for them.
01:07:40.000 We want to win the AI race.
01:07:41.000 So we're going to be deregulating and it's going to benefit a lot of these companies that make GPUs.
01:07:46.000 And then Pelosi goes, oh, oh, oh, she calls her husband.
01:07:48.000 She's like, buy NVIDIA now.
01:07:49.000 Buy NVIDIA.
01:07:50.000 And then she does, and she gets like several hundred percent off her investment.
01:07:54.000 And then they go, that's not insider trading.
01:07:56.000 Yeah, they just get to make decisions whether or not a company succeeds or fails.
01:07:59.000 And that's not insider trading.
01:08:00.000 That's how it works.
01:08:02.000 It's depressing.
01:08:04.000 That's exactly how it works.
01:08:05.000 Well, that's why a lot of people were mad about this new Pelosi Act.
01:08:10.000 Preventing elected leaders.
01:08:12.000 What is it?
01:08:13.000 Preventing investing.
01:08:17.000 What is this stupid thing called?
01:08:20.000 What does that stand for?
01:08:20.000 Pelosi Act.
01:08:21.000 Let me see.
01:08:22.000 So, you know what S.H.I.E.L.D. from Agents of S.H.I.E.L.D. stood for, but not the Pelosi Act?
01:08:25.000 Preventing elected leaders from owning securities and investments.
01:08:28.000 There was an F in there, and I was like, well, then it's not Pelosi.
01:08:31.000 It's Pelfosi.
01:08:33.000 George Carlin said it best.
01:08:34.000 It's a big club and you ain't in it.
01:08:36.000 And it's the same club they used to beat you over the head with.
01:08:38.000 That's what he said.
01:08:40.000 Indeed, it is.
01:08:41.000 So the funny thing is, Democrats and Hawley were on board with this, but Republicans weren't.
01:08:48.000 And I was like, what's the problem?
01:08:51.000 This is weird.
01:08:52.000 Why are the why are the Democrats saying stop the stock trading and the Republicans are saying no?
01:08:57.000 Which Republicans were saying it?
01:08:58.000 Rand Paul, for instance.
01:09:01.000 Like these guys are like, could you imagine being a freshman member of Congress and you're like, I finally made it.
01:09:01.000 You know what it is?
01:09:08.000 And then they go, oh, by the way, you're the last one in.
01:09:10.000 We're going to ban you.
01:09:11.000 No, that's the whole point.
01:09:12.000 Do you think I want to work for $175,000 a year?
01:09:15.000 It was hard to get here.
01:09:19.000 Yep.
01:09:20.000 But now.
01:09:21.000 They get pensions, though, right?
01:09:23.000 I guess.
01:09:24.000 They do.
01:09:25.000 It's not as nice as being worth whatever she's worth, right?
01:09:28.000 $260 million.
01:09:31.000 That's crazy.
01:09:32.000 Just pure luck.
01:09:33.000 Pure luck.
01:09:34.000 She just guesses.
01:09:34.000 She's a prophet.
01:09:35.000 Her husband's a prophet.
01:09:36.000 Whatever.
01:09:37.000 You see her freak out when she was asked about it by Jake Tapper?
01:09:40.000 No.
01:09:41.000 On CNN.
01:09:42.000 She was like, why are you asking me about this?
01:09:44.000 He tried playing the clip of Trump saying she's insider trading and all that.
01:09:48.000 She got mad and she's like, I want to talk about Medicaid.
01:09:50.000 And I was just like, I want to talk about how you're 84 and you should quit.
01:09:54.000 Yeah.
01:09:55.000 Yeah.
01:09:56.000 It's like, you've made your money.
01:09:57.000 Like, you've bled us dry.
01:09:59.000 This is the creepiest thing about these octogenarians.
01:10:03.000 A gerontocracy?
01:10:03.000 What do they call it?
01:10:05.000 Rule by the old?
01:10:06.000 Well, yeah.
01:10:08.000 What's crazy to me about all of it is just what is wrong with Nancy Pelosi's brain that she won't leave.
01:10:16.000 Like, just leave.
01:10:17.000 Go away.
01:10:18.000 You're 84.
01:10:18.000 Go away, lady.
01:10:19.000 Bye.
01:10:20.000 What's wrong with these people?
01:10:21.000 Just get out.
01:10:23.000 Why not?
01:10:24.000 What's going on there?
01:10:25.000 Be happy.
01:10:25.000 She won.
01:10:26.000 Enjoy retirement, right?
01:10:27.000 Why do they keep coming back?
01:10:29.000 I just, it's insane to me that something's wrong with these people.
01:10:33.000 Why do you think that is?
01:10:34.000 Why do you think they don't leave?
01:10:35.000 Do you think they just enjoy exerting power over other people?
01:10:37.000 I do think it has to be some power dynamic because I've been wondering this for a long time.
01:10:41.000 People like George Soros, even people like Trump, like what, right?
01:10:43.000 Like, they're like all the adrenochrome.
01:10:46.000 Yeah, there's got to be some chemical going on in the brain there that they keep going back to that makes them feel better.
01:10:55.000 Well, also, it's like they don't really work that much.
01:10:57.000 It feels like they're never there.
01:11:00.000 Maybe this, maybe she should, I don't get it.
01:11:03.000 Like, you're worth $260 million, lady.
01:11:05.000 You can eat all the Jenny's ice cream in the world right now.
01:11:08.000 Just leave.
01:11:09.000 And not just her, but a bunch of these other politicians.
01:11:11.000 Okay, if you're Democrat or Republican, what is wrong with these people?
01:11:15.000 I don't know.
01:11:16.000 I just think society is cooked.
01:11:19.000 Kids' brains are cooked, right?
01:11:21.000 People aren't working anymore.
01:11:23.000 There's no babies.
01:11:25.000 Our culture is fractured a million ways.
01:11:28.000 And you've got these sociopaths like Pelosi who won't just get out and leave.
01:11:35.000 And then, with all due respect to the boomers, it's not all boomers, I get it, but boomers hold a disproportionate amount of wealth and they won't give it away.
01:11:43.000 I don't expect them to give it away.
01:11:45.000 They want to spend it.
01:11:46.000 They're living longer.
01:11:47.000 They want to spend it.
01:11:48.000 That's right.
01:11:48.000 And yes, the Great Wealth Transfer is not unfolding the way they predicted.
01:11:52.000 So Gen Z's got nothing.
01:11:54.000 They're going to live in pods, eat the bugs.
01:11:56.000 And you know what's going to happen?
01:11:57.000 Communist revolution.
01:11:58.000 Yep.
01:12:00.000 It was always weird to me when people would say, like, people will understand, people will fall in love with free markets once it gets bad enough.
01:12:05.000 Like, no, no, no, they won't.
01:12:07.000 They will, somebody at the government's going to tell them this is how we fix it, and they're going to fall in line with that.
01:12:12.000 And I think what's going to happen is, or I said there's a probability of this.
01:12:16.000 Young people are skewing to the right quite a bit.
01:12:19.000 So I would call it cultural revolution.
01:12:23.000 You know, I know that it's got a negative.
01:12:24.000 It's young men that are skewing right.
01:12:26.000 That's right.
01:12:26.000 And those are the ones that are going to go nuts.
01:12:28.000 The government could never be a solution to the problems.
01:12:32.000 I mean, I understand that young people, they feel like they don't have a stake in the system.
01:12:35.000 And so the natural impetus is to turn toward the government.
01:12:38.000 But if you just look at history, government has never been a solution to the problem.
01:12:42.000 It never will be a solution to the problem.
01:12:44.000 Freer markets and less government involvement is the only viable option.
01:12:49.000 So that's kind of what I was alluding to before.
01:12:53.000 I disagree.
01:12:54.000 I mean, the market is overly regulated today.
01:12:57.000 But if we were to loosen up regulations, you are not going to remove the multiple homes from the boomer generation.
01:13:04.000 They're not going to give them away.
01:13:05.000 And so what's likely to happen is I don't blame boomers for being like, look, I worked hard.
01:13:09.000 I've got three houses.
01:13:11.000 Screw you.
01:13:11.000 They're mine.
01:13:12.000 These are my investments.
01:13:13.000 I own stock and corporate securities.
01:13:14.000 They're mine.
01:13:15.000 I paid for them.
01:13:16.000 I worked.
01:13:17.000 Screw you.
01:13:17.000 Well, there's going to be Gen Z guys.
01:13:20.000 I think it's going to be, it's looking like a rightward cultural revolution, but they are going to take your stuff.
01:13:25.000 They don't care what you think.
01:13:26.000 And they're going to appropriate it for their cultural revolution.
01:13:30.000 And they're going to say they're going to be in their late 20s, these Gen Z guys.
01:13:34.000 And so this could be five, 10 years, depending, if nothing changes, right?
01:13:38.000 And they're going to say, we're expected to have families.
01:13:42.000 We're expected to work jobs.
01:13:44.000 We're expected to live, but we can't own property anywhere.
01:13:48.000 We've got foreign landlords and we've got an older generation that is living too long and they've got multiple homes and they use them when they see fit.
01:13:57.000 You are going to get, it starts with the DSA, but the problem with the DSA is that they're woke and ineffective.
01:14:05.000 Everybody saw that convention they had where they were like, point of personal privilege, my pronouns are actually, stop.
01:14:11.000 Can you stop clapping?
01:14:13.000 They were actually doing this when they were saying it, too.
01:14:15.000 Yeah, and people were clapping and they're like, I have anxiety.
01:14:18.000 You have to do what I want.
01:14:19.000 So young people who are actually just angry, because you can't get anything done, are like, we're going to the right.
01:14:25.000 Because the left is crazy.
01:14:26.000 It doesn't work.
01:14:27.000 But those are still the same grounds on which the Bolshevik Revolution happens.
01:14:30.000 For them to do that successfully, they have to have some type of government authority.
01:14:34.000 So it would be, no matter what, even if the basis was right, it would be inherently left.
01:14:39.000 Otherwise, it's not successful and it gets quelled by those that are in power.
01:14:43.000 I would not be surprised if in like 10 years there was government enforcement that seized assets from boomers and reallocated it.
01:14:52.000 Yep.
01:14:53.000 Or put it on the market at lower rates or something.
01:14:58.000 But that's the left, though.
01:14:59.000 It's a left-wing approach to things is to take somebody else's property and redistribute it.
01:15:04.000 It's just authoritarian, authoritarian.
01:15:06.000 Depending on the cultural slant of the individuals as they do it, if they're like white Christian, you know, traditionalists who are like, we're going to restore the American dream and the white picket fence, but we're going to need corrective measures, then we would call that right.
01:15:20.000 But you're not allowing free markets to distribute those resources.
01:15:23.000 You are relying on force or the government or some other means to redistribute those resources.
01:15:28.000 So that's in that sense.
01:15:30.000 That's an economic scale.
01:15:31.000 So when we're talking about like the political compass, for instance, right and left don't necessarily mean free market.
01:15:37.000 It just right and left means traditional or progressive.
01:15:41.000 So some people use right and left to mean free market versus socialism, economic.
01:15:47.000 That's the economic scale.
01:15:49.000 The political scale is just like the fascists, the Nazis are ultra, they're authoritarian traditionalists, and the communists were authoritarian progressives.
01:15:58.000 So it doesn't matter what strain of authoritarianism you get.
01:16:00.000 Some might say, well, that's still left.
01:16:03.000 Maybe left-leaning.
01:16:05.000 You know, like Hitler wasn't economically right-wing, of course.
01:16:07.000 They had a centralized, they had a command economy of sorts.
01:16:10.000 They used cultural force to enforce what they wanted in their production.
01:16:13.000 But I digress.
01:16:15.000 Whether it's left or right is immaterial.
01:16:16.000 I think with young people, the Gen Z of today, there's no way a bunch of people in their 30s are going to be like, I am content with living five people in a single-unit apartment in New York as the older generation sold us out to illegal immigrants.
01:16:30.000 They're going to be like, nah, the power is ours now.
01:16:33.000 We inherited this country and we're going to take what we want.
01:16:37.000 It doesn't mean that they're going to seize and redistribute like commies.
01:16:39.000 I wouldn't be surprised, however, if they say, we're going to take your homes and then put them on a market at a rate per square foot or something like that.
01:16:47.000 Could it be that productivity just becomes so great as a result of post-labor economics and AI that there's no need for any redistribution?
01:16:55.000 That's just productivity is so high that it automatically creates a system where people just aren't working because they lose the right to labor anyway because AI is so productive.
01:17:05.000 AI can't build specialty projects.
01:17:06.000 AI can't open businesses.
01:17:08.000 AI can't handle regulation.
01:17:10.000 You can, I mean, if the entirety of government went AI and regulation was just handled by machines, theoretically, it could be awesome because you don't got to worry about committees, meetings, fines, personal beefs.
01:17:22.000 Unfortunately, in the short term, that's not going to happen.
01:17:24.000 And so the issue I'm facing right now is I ask this question all the time, why is it so hard to get any job done?
01:17:31.000 And we've assessed this over and over again.
01:17:34.000 We need something built.
01:17:36.000 Well, it took what, like, three years to build this building.
01:17:40.000 And that's insane.
01:17:41.000 It's a field.
01:17:42.000 We owned it.
01:17:43.000 Minimal permit requirements took years.
01:17:45.000 And the issue is no one wants to work.
01:17:48.000 And there's no amount of money you can offer them to work.
01:17:51.000 That is true.
01:17:52.000 I go to somebody and I say, how much do you want to do this job?
01:17:55.000 And they go, I'm busy.
01:17:57.000 And I say, sir, surely there's an amount of money that you would take to do this job, and they go, I'm busy.
01:18:01.000 And I'm like, what?
01:18:03.000 I tell people all the time, we try to get an exterminator.
01:18:09.000 We have a rotating cycle of warring bugs in this building.
01:18:12.000 It's frogs now.
01:18:13.000 There's like the spring peeper frogs are just dang, dang it.
01:18:16.000 It's crazy.
01:18:16.000 We had crickets, we had ladybugs, we had stink bugs, we had wasps.
01:18:19.000 Now we got mosquitoes.
01:18:20.000 And we call the exterminator and they go three weeks.
01:18:23.000 And I'm like, why?
01:18:24.000 And he goes, because that's when we want to do it.
01:18:25.000 We don't want to come in.
01:18:26.000 We don't need it.
01:18:27.000 We don't need the money.
01:18:28.000 And I'm like, I'm sitting here wondering why we reach out to so many contractors, we reach out to so many people, and they all just say, we don't need it.
01:18:36.000 And I'm like, well, is there no workers?
01:18:39.000 Well, that supports my point, though, because a homeless person today has a higher standard of living than, say, the Pharaoh of Egypt.
01:18:46.000 People are so spoiled, kind of what you were saying before.
01:18:49.000 And they don't have kids.
01:18:50.000 And so if a smaller percentage of the labor force has an exponentially higher level of productivity, there is this potential – I don't know about in the short term but in the intermediate term where we're living in this era of UBI and post-labor economics where the basic bare means of subsistence are provided for people.
01:19:08.000 Then anything beyond that is just the – You're going to be eating mashed bug paste in a pod because they're not going to give you luxury.
01:19:18.000 So I was hanging out and we were at this club where they have a fake beach at a lake.
01:19:23.000 And I'm frustrated because we can't get people to do jobs.
01:19:27.000 It's months out, it's weeks out, and they want insane amounts of money.
01:19:31.000 And I'm watching all these, I see all these guys, young men, 20-year-old guys, and they're just sitting there on the beach not working.
01:19:37.000 And I'm thinking to myself, we've offered double rate.
01:19:41.000 We've said, like, we'll pay you extra.
01:19:42.000 We'll pay you double.
01:19:43.000 And they go, no, we just don't, I don't need it.
01:19:45.000 And I'm thinking to myself, why?
01:19:47.000 What has happened socially where people are like, meh?
01:19:50.000 Yeah, they're childless, single young men.
01:19:54.000 That's their culture.
01:19:55.000 That's a society.
01:19:56.000 They don't strive for this.
01:19:57.000 And they have no reason to say, I have to make money.
01:20:00.000 Like my dad had to work two jobs because he had three kids.
01:20:03.000 And so he was like, got to do a double.
01:20:04.000 It has to get done.
01:20:06.000 And then I see a ton of people who are like, I don't need to work, so I won't.
01:20:09.000 So my concern now, you've got a lot of people who live in their parents' house still.
01:20:14.000 It's not necessarily a bad thing to live with your parents, by the way.
01:20:16.000 I actually think culturally and socially, it's probably good, but not when people aren't working or trying to create their own families.
01:20:24.000 With no kids, there's no future market.
01:20:27.000 That's just a fact.
01:20:28.000 So you're not going to sell anything to anybody.
01:20:30.000 And as the market begins to shrink, we're going to get, I guess we're going to get deflationary pressures and we're going to get a strain on the economy where I'll put it like this.
01:20:41.000 In any ecosystem that reaches equilibrium between its principal organisms and their food supply, the organisms are half starving, suffering, and covered in sores and lesions, because they're getting just the bare minimum of energy required to survive. That's what equilibrium is.
01:21:02.000 We need to constantly be slightly below that point.
01:21:05.000 We need an excess of resources so that we're not constantly strained and starving.
01:21:10.000 But with a shrinking population size, labor is going to decrease.
01:21:14.000 And that means there's going to be a massive older population that doesn't want to work but now has no choice but to work.
01:21:18.000 And it's going to get real bad because nobody wants to and young people don't have to.
01:21:22.000 So I've talked about this in terms of the social security problem.
01:21:26.000 Right now, I believe it's 2.8 workers pay for one social security recipient.
01:21:32.000 It used to be like five.
01:21:33.000 But with population decline, we're going to get to the point where it's going to be one for one.
01:21:38.000 How are you going to sustain social security recipients off of a younger generation that doesn't work at all?
01:21:47.000 You won't.
01:21:48.000 And then boomers are going to be aging and they're going to be like, I paid into it.
01:21:51.000 I deserve it.
01:21:52.000 And they're going to be like, well, there's nothing there anymore.
01:21:54.000 And they're going to get angry.
01:21:55.000 Young people are going to be like, I'm not working because even if I do, houses cost $800,000.
01:22:01.000 I can only make $25 an hour.
01:22:02.000 So I'll never get a down payment.
01:22:04.000 This is the structure of how Ukraine effectively operates.
01:22:07.000 So young people are going to say, what's the point of working?
01:22:09.000 You're going to tax me.
01:22:10.000 You're going to give it to the older people, whose generation already owns the properties.
01:22:14.000 So something's going to break. Either, before this happens, the government intervenes and seizes properties from people and we go communist or something, or Gen Z goes right wing and seizes properties from people to force the social transition of wealth.
01:22:33.000 What about starting with seizing properties from corporations, well, China and also corporations?
01:22:40.000 The argument is that these corporations, many of them are actually publicly traded corporations.
01:22:46.000 And so it's actually, once again, the boomers that own the corporate securities in those holding companies.
01:22:50.000 That's true.
01:22:51.000 But it would come off as less commie to at least seize residential properties from corporations instead of individuals.
01:23:01.000 You know, it's going to be real weird when AI takes over and eliminates a lot of jobs.
01:23:07.000 Powerful, wealthy people will own the AI, the rights to it.
01:23:11.000 You know what I was thinking?
01:23:12.000 Did you guys see that?
01:23:14.000 What is it called?
01:23:14.000 Light of Motiram?
01:23:17.000 Have you heard of this?
01:23:18.000 Come on, this is Pop Culture.
01:23:20.000 Crisis, what are you guys doing, huh?
01:23:22.000 You've not seen this?
01:23:24.000 It's a video game that looks identical to Horizon Zero Dawn.
01:23:27.000 And Sony, I think, just announced they're suing Tencent because they basically ripped off Horizon Zero Dawn.
01:23:33.000 Oh, I did hear of that.
01:23:34.000 Yeah.
01:23:35.000 And it was crazy to me because I was like, does Sony own the idea of tribal people with robot animals?
01:23:42.000 I mean, just throwing into litigation, the companies are just going to litigate each other to death anyways.
01:23:47.000 But here's the point.
01:23:48.000 The idea is, for those that are unfamiliar, Horizon Zero Dawn is, you know, I'll just, I'll show you.
01:23:53.000 And this matters for the AI future.
01:23:55.000 I love Horizon Zero Dawn and Forbidden West.
01:23:58.000 It's a fantastic game.
01:23:59.000 Let me just show you.
01:24:01.000 Oh, this is hilarious.
01:24:03.000 I searched for it and people are already.
01:24:04.000 It's Motiram.
01:24:05.000 There you go.
01:24:06.000 Okay, that's too small.
01:24:07.000 Let me just do this.
01:24:09.000 Light of Motiram Horizon.
01:24:12.000 And they're side by sides.
01:24:14.000 This is wild.
01:24:16.000 Let's pull this up.
01:24:18.000 Check this image out.
01:24:19.000 Actually, let's just talk about this.
01:24:20.000 Let's roll.
01:24:21.000 We've got this from all key shop.
01:24:25.000 Sony lawsuit bombshell.
01:24:26.000 Tencent wanted Horizon deal before allegedly copying it.
01:24:30.000 So for those that are familiar, Horizon Zero Dawn, the Horizon series, there's multiple video games and expansions.
01:24:35.000 It's a video game where you play a female tribal human.
01:24:38.000 There's a bunch of tribes.
01:24:39.000 They're rather primitive, but there's weird advanced technology, gigantic animals and monsters that are made of machines.
01:24:46.000 For those that don't know, it's a decade-old game.
01:24:47.000 The story is: Earth was wiped out by AI bots that were consuming biomass until they destroyed the planet and turned it into a barren rock.
01:24:54.000 The solution launched by scientists was to build a bunch of underground terraforming bases so that after the biomass was completely consumed and the AI bots were destroyed because they had no energy, they would rebuild society with, you know, I guess incubation pods that would recreate humans.
01:25:11.000 Something went wrong, humans are tribal.
01:25:13.000 Tencent launched effectively a clone of the game with robot animals.
01:25:17.000 It looks identical.
01:25:18.000 Here's what I started thinking about this lawsuit: in the future, the wealthy people will be the people with imagination.
01:25:27.000 If you can come up with an idea that's interesting, you instantly own that idea and nobody can ever use that idea.
01:25:33.000 And if people like the idea, they have to pay you for it.
01:25:37.000 So we're right now in what's called the attention economy.
01:25:41.000 We had a manufacturing-based economy, we had a service sector economy.
01:25:44.000 There were questions about whether that could work.
01:25:45.000 Then we went to the information economy.
01:25:47.000 We're now past the information economy into what's called the attention economy.
01:25:51.000 Right now, the money you receive is largely determined by your ability to make people stare at you.
01:25:57.000 And that's the easiest way to explain it.
01:26:00.000 Podcasts are getting massive.
01:26:01.000 YouTube, TikTok.
01:26:02.000 The question is: can I make you look at me longer?
01:26:06.000 And then your view of the world will be based upon those who have the ability to hold your attention the most, which creates really weird things like ElsaGate.
01:26:16.000 I'm looking at this lawsuit and I'm thinking to myself, I mean, what's what was copied?
01:26:21.000 Tribal people with robot animals?
01:26:24.000 Can they own the rights to that idea?
01:26:27.000 It's not like they directly rip the story off.
01:26:29.000 It's just similar.
01:26:30.000 So, this is what I imagine.
01:26:32.000 In the future, your burger restaurant, you're going to be poor.
01:26:36.000 You're going to live in the pod.
01:26:37.000 You're going to get your UBI, whatever it might be.
01:26:37.000 You're going to eat the bugs.
01:26:39.000 And there's going to be some ultra-wealthy guy.
01:26:41.000 Why?
01:26:42.000 Because he owns the idea of a certain kind of food.
01:26:46.000 And when you go to your chicken store and there's robots making your food in kiosks where you order it, and you scan your palm or your retina to pay, and it's based on your government account, no one's working there.
01:26:59.000 And the question is: AI organized it.
01:27:01.000 AI filed the paperwork.
01:27:03.000 Machines came and built it.
01:27:05.000 The guy who owns it is ultra-wealthy and flies around on private jets and does whatever he wants.
01:27:09.000 But why?
01:27:10.000 Because the idea of the burger restaurant and the kind of food that it served was good, so it's his.
01:27:15.000 And you can't copy it.
01:27:16.000 That would be illegal.
01:27:18.000 The future is going to be, quite literally, people on UBI, if that, and the people of imagination who are smart enough to conceptualize things.
01:27:28.000 That seems to, that's one possibility of where we're going.
01:27:36.000 It's totally plausible.
01:27:37.000 That's it.
01:27:39.000 No disagreements.
01:27:42.000 I mean, how far in the future are you thinking about the 50 years?
01:27:45.000 50 years.
01:27:45.000 Maybe not, maybe not even 50.
01:27:47.000 You made a good case.
01:27:48.000 I have to give it to you.
01:27:49.000 Perhaps.
01:27:51.000 Right now, we've been talking about, you know, there's an Amazon investment into what's it called?
01:27:55.000 Showrunner?
01:27:56.000 Showrunner, and then what's the other one?
01:28:00.000 I don't know what the other one is.
01:28:01.000 What's Showrunner's website?
01:28:04.000 I'm trying to do enough research to repudiate your point here, but.
01:28:08.000 Here we go.
01:28:09.000 Showrunner.
01:28:10.000 AI-generated sitcoms.
01:28:12.000 Dude.
01:28:13.000 Yeah.
01:28:14.000 Look, I'm telling you.
01:28:15.000 I'm telling you guys.
01:28:17.000 As soon as this AI is powerful enough, I am remaking Revenge of the Sith so that when Anakin walks in and Mace Windu has the saber to the Chancellor and he goes, Don't let him kill me.
01:28:30.000 And then Mace is like, He controls the courts, you know.
01:28:32.000 And then what I'm going to change with the AI is that Mace is going to go, Anakin, you're right.
01:28:37.000 This isn't the Jedi way.
01:28:39.000 Call more Jedi in and we'll have him tried.
01:28:42.000 And then they come in and they arrest him, and Anakin never becomes Darth Vader, and that's the end of it, because Mace Windu just didn't have to be a dick.
01:28:46.000 And the rest of the movies don't even need to happen.
01:28:48.000 Yep, that's just over.
01:28:49.000 The funny thing about that is episode one, I mean, should have never happened, right?
01:28:53.000 Episode two should have been episode one.
01:28:54.000 Yeah, The Clone Wars should have been episode two.
01:28:58.000 Amazon is dumping a bunch of money into AI, into AI-based streaming services, which is funny because they're closing up Freevee, which is their free streaming platform.
01:29:08.000 I'm guessing they're probably going to end up using the existing infrastructure from Freevee to build out whatever that ends up being afterwards, because they're moving all the stuff that's on Freevee over to just Amazon proper.
01:29:19.000 So they'll probably use the infrastructure from what is now the Freevee app for that AI program once it comes out.
01:29:26.000 Dude, that's going to be down the line.
01:29:29.000 They're putting investment into it.
01:29:30.000 Doesn't mean it's going to come out.
01:29:31.000 This showrunner website, they make TV shows, AI-generated.
01:29:36.000 And they used to have the videos on the site.
01:29:38.000 I guess they don't.
01:29:39.000 I think.
01:29:39.000 Oh, is there something?
01:29:40.000 I don't know any pencils that stand up and write by themselves.
01:29:44.000 Creativity is at a Deep Blue and Kasparov moment.
01:29:48.000 We were trying to prove that it was possible to build a chess machine that could be the best player in the world.
01:29:55.000 Because we didn't build Deep Blue to make chess players better.
01:29:58.000 And we didn't build AlphaGo to make Go players better.
01:30:01.000 We built it to win.
01:30:02.000 It's a completely new medium.
01:30:04.000 And it's going to push you to be more creative.
01:30:08.000 I'm not interested in AI as recreating a process that we already.
01:30:13.000 Whatever.
01:30:13.000 Anyway, they were awful.
01:30:15.000 Fable Studios is the other one that I was thinking of.
01:30:17.000 Oh, right.
01:30:18.000 We played some of these shows, and they're just not good in any way.
01:30:22.000 Well, the idea is like the next generation of iPad babies are going to be the ones who are going to go into the Showrunner app and just make their own shows without even really thinking about it.
01:30:35.000 Oh, Fable is Showrunner.
01:30:36.000 Yeah.
01:30:37.000 Yeah.
01:30:38.000 Fable Studios and Showrunner are like subsidiaries or something.
01:30:40.000 Same thing.
01:30:41.000 When you go to Fable, it just brings you there.
01:30:43.000 Yeah.
01:30:43.000 So here, I'll give you this one.
01:30:45.000 You know, we don't got to do a full segment on this one, but have you guys seen this app, Gauge?
01:30:49.000 Yeah, I have not.
01:30:50.000 You know, we here at Timcast have decided that, you know, maybe we should have this.
01:30:54.000 See, here's how it works.
01:30:56.000 Your employees are required to sign into it.
01:30:58.000 And then you can see there that they'll get a score between zero and 1,000.
01:31:02.000 And then if they're naughty and they don't do their jobs, I, as the boss, can reduce their score.
01:31:07.000 Here's the best thing.
01:31:08.000 It follows them everywhere they go for the rest of their lives.
01:31:12.000 So when they try to apply to another job, that new company can look and say, Mary, you've got a 403.
01:31:18.000 Can you explain to us why your work score is so low?
01:31:21.000 Isn't that what references were for?
01:31:23.000 Well, now we don't need it because we have this thing called Gauge.
01:31:26.000 It's a credit score but for society.
01:31:29.000 Yeah, I think it's a slippery slope there.
01:31:33.000 It's going to happen.
01:31:34.000 Because what if a person between the age of 35 and 45 is completely different than who they were from 20 to 25, and yet they're stuck with that being typecast based on who they were while they were in college and they were part of some bohemian frat or something.
01:31:50.000 What if your boss is this lecherous movie producer who says, you know, if you want to move up in this company, you got to give me a little sugar.
01:31:58.000 And then the woman goes, I'm not doing anything.
01:32:01.000 I'm going to have to turn your score down.
01:32:04.000 And then what if the inverse, your boss is some like purple haired feminist.
01:32:08.000 And then you're like, I, you know, I'm here to do my job.
01:32:12.000 Well, you're a man.
01:32:13.000 And what you said was racist.
01:32:15.000 And I'm, and then you're like, why am I getting a bad score?
01:32:17.000 And then what's going to happen?
01:32:19.000 This is like the Black Mirror episode.
01:32:20.000 You go, you apply for a new job.
01:32:22.000 And the boss goes, so I see that your Gauge score is 615.
01:32:28.000 It's a little low.
01:32:29.000 Can you explain to us why it's so low?
01:32:31.000 And you say, you know, to be honest, it's unfortunate, but sometimes it happens.
01:32:35.000 I don't think my boss and I got along.
01:32:36.000 And so I tried to leave amicably and they go, right, right.
01:32:41.000 Well, look, I have another applicant with a 783 and I don't think you're really selling yourself, and I'm not interested in taking the risk on a 615, but I appreciate you coming in.
01:32:50.000 Goodbye.
01:32:50.000 I don't care about you.
01:32:52.000 You think McDonald's is going to tell their regional managers?
01:32:55.000 Like, what happened to privacy?
01:32:57.000 I mean, when did that stop being, you know, the hip thing?
01:33:01.000 Like, just like, Hey, what's mine is mine.
01:33:03.000 And I have a right to privacy and I don't want people to know my business and my past.
01:33:08.000 Then people want it.
01:33:10.000 I agree, but I don't know if they understand where, where it leads to.
01:33:13.000 No, what I mean is there's going to be a regional manager at, like, a McDonald's franchise corporation that owns 50 locations.
01:33:21.000 And the boss is going to be like, what's our turnover rate?
01:33:25.000 And they're going to be like, it's high.
01:33:26.000 And he's gonna say, why?
01:33:27.000 And it's like, well, because you know, it goes, you hire somebody, the references are fake.
01:33:31.000 It doesn't matter.
01:33:31.000 And then they're bad and you fire them or they quit.
01:33:34.000 And he's going to say, what's this, what's this app everybody's using where you give a score?
01:33:38.000 Just hire people who have at least a 700.
01:33:40.000 And they go, okay, whatever you say.
01:33:42.000 Starting now, require all employees to have it so we can track their scores, and it's going to make it easier and safer and give us recourse for termination without HR lawsuits or, I'm sorry, civil workplace lawsuits.
01:33:54.000 And then you're going to, you're going to be 18 or 17 or 16, whatever.
01:33:58.000 You're going to go to McDonald's.
01:33:58.000 You're going to say, I want to apply for a job.
01:34:00.000 And they say, we require all employees to use Gauge.
01:34:03.000 Here's the best part: they're going to post your schedule on Gauge and say, this is where you get information on your schedule, make sure you check it every day because it can change. Then your boss is going to send you a message on Gauge, and you're expected to answer, and they're going to say, I need you to come in on Sunday, I know you have off, but it's a rush day, so we're asking you to come in. And no longer can you say, I didn't have my phone on me. This is the future millennials want, too, because Gen Z is all about work-life balance and millennials are the ones that are answering their Slack messages at 10 p.m. I'm anti-background check.
01:34:33.000 I am anti-Freedom of Information Act.
01:34:35.000 I am against all these things.
01:34:37.000 I don't apply them at my own company.
01:34:39.000 I don't do background checks, and I don't think I ever will.
01:34:42.000 I am just opposed to this on principle and I might be the wrong person to ask.
01:34:45.000 This is – People already use Glassdoor as a rating system for employers.
01:34:49.000 Why can't it be two-sided?
01:34:51.000 Gauge could be a two-sided platform where employees can rate their employers, too.
01:34:57.000 You know, I think the issue is the direction.
01:35:02.000 If you have a company with 50 employees and you get a bad score, something is going on at that company for all of these people to be mad at you.
01:35:10.000 And maybe it's not necessarily your fault.
01:35:12.000 Sometimes the reviews are BS.
01:35:13.000 But if you're an employer, one person who is bad, you can destroy 50 people's lives by giving them bad scores so they can never work again.
01:35:20.000 Not to mention it is creepy for a boss to be like your score went down five points because you refused to mop the bathrooms.
01:35:31.000 I just – I think this is a creepy thing to do.
01:35:33.000 Stars, being like, I worked at this company.
01:35:36.000 I give them three out of five stars.
01:35:38.000 It's like, okay, I guess.
01:35:39.000 This is a literal social credit score from zero to 1,000 that follows you everywhere you go no matter what.
01:35:44.000 If I own a company and the employees give it a bad score and it fails, I just shut the company down and open a new company.
01:35:51.000 If you have one account on this app and you can't have two accounts and it's going to follow you everywhere you go.
01:35:56.000 And so if you get one boss that hates you or like let's just go the feminist route.
01:36:01.000 You get one boss that hits on you and then you're like I'm not interested and he gets really angry and just says fuck you and then he nukes your score.
01:36:08.000 What are you going to do about it?
01:36:09.000 Well, I don't think that something like this would reach mass adoption because employers will know that there is plenty of room for situations like that and human error.
01:36:18.000 I think – They wouldn't just assume that the score is objective just because it was provided by someone's previous employer.
01:36:26.000 You, at a corporation, as a regional manager or a mid-level manager, are confronted with 10 applicants for three available positions, and you've got 800, 800, 800, 800, 400, 400, 400.
01:36:42.000 You're going to throw the 400s in the garbage.
01:36:44.000 They already do this now.
01:36:46.000 No, I have to disagree because I will want to actually speak to the 400s to understand why they're 400s and I would – I've learned through trial and error that I think I would throw the 800s maybe in the garbage before I throw the 400s in the garbage.
01:36:59.000 I want to know why – I think that makes no sense.
01:37:01.000 It's illogical.
01:37:01.000 Well, I just think of Rate My Professor.
01:37:03.000 It does seem illogical, but sometimes that's how it goes. Rate My Professor used to be a big thing when I was in college, and I started going on there trying to game the system, looking for the easiest professors.
01:37:14.000 And then I found that I actually got my best grades with the hardest professors.
01:37:18.000 So I find that the candidates that have some type of unique story or whatever, that they tend to have more to prove.
01:37:25.000 They have more to prove and so it's just – you got to get them at the right place in their career.
01:37:29.000 80-20.
01:37:30.000 You're going to be talking about a manager who makes $50,000 a year at – overseeing like three McDonald's locations.
01:37:37.000 And he's going to say, don't know, don't care.
01:37:40.000 It's hard to get a good score.
01:37:43.000 It is easy to get a bad score.
01:37:45.000 So I know someone with a good score is at least able to pretend or hide whatever bad things they might be doing.
01:37:52.000 And I'll take it because I don't got to deal with it.
01:37:54.000 I'm not going to get sued over it.
01:37:55.000 And I've got legal protection when I say we only hire above a certain threshold.
01:37:58.000 It means I can't be sued when I say no to a bad applicant.
01:38:02.000 In fact, I just actually did this in practice not too long ago.
01:38:05.000 I can't disclose the specifics, but I will say that I, you know, instead of the perfect candidate, I went with a candidate who really had nothing on paper and looked much more flawed and ended up.
01:38:15.000 I think I made the right choice.
01:38:17.000 It's only been about three months, but we'll see.
01:38:20.000 And that's entirely true that that will happen.
01:38:22.000 The point is, at the macro scale, when companies have to hire 300,000 employees, or they have 5,000 hourly wage employees at a series of chains, they're going to tell their managers, I don't know or care.
01:38:35.000 We only hire 700 plus.
01:38:37.000 So you think this would happen mostly at larger scale companies and smaller ones wouldn't adopt something like this?
01:38:42.000 Smaller ones probably will, sporadically.
01:38:46.000 But I feel like this app is an inevitability because we're a small company and I've dealt with stupid government regulation and employment complaints.
01:38:55.000 But like if you were using something like this, like me or Mary never would have been hired because we didn't have any history and like it's for hourly only anyway.
01:39:02.000 Oh, okay.
01:39:02.000 Yeah, it's hourly shift worker base.
01:39:05.000 It's not for salaries.
01:39:07.000 So we would never have this app in the first place.
01:39:09.000 But if you are a burger restaurant, look, it's really simple.
01:39:14.000 The $30,000 a year manager of your burger shop doesn't care and is sitting there with all the applications in front of him.
01:39:20.000 And he's going, what time is the fight?
01:39:22.000 Dude, I don't care.
01:39:23.000 With this app, he's going to be like, delete anything under 800.
01:39:26.000 He's like, filter out anything below 800.
01:39:29.000 He's going to see three 800s and he's going to be like, we only got one spot to fill.
01:39:32.000 I got three 800s.
01:39:33.000 Everyone else is in the garbage.
01:39:34.000 Well, yeah, that's scale.
01:39:34.000 Yeah.
01:39:36.000 That's good.
01:39:38.000 AI already does this.
01:39:40.000 Yep.
01:39:40.000 AI already does resume searches.
01:39:42.000 And so this is really fascinating.
01:39:44.000 You know, people do.
01:39:45.000 They'll take keywords the algorithms look for, like work late, overtime, double shift.
01:39:54.000 They'll make them size-one font and white, and they'll put them at the very bottom of the page where the AI can see them.
01:40:03.000 So when people submit their resumes, the AI filters are specifically looking for certain keywords and their resume gets jumped to the top, despite the fact that they might already write, I'm hardworking, willing to work overtime, and I really want this job.
01:40:17.000 By doubling up the words, the algorithms are weighing them more heavily.
01:40:20.000 And then it's shut.
01:40:22.000 People are making weird AI manipulating resumes.
01:40:25.000 We are.
01:40:26.000 Welcome to the nightmare scenario.
01:40:27.000 It's getting crazy.
01:40:28.000 It's going to get worse.
01:40:28.000 It is one of those things.
01:40:29.000 Like, there's a meme going around on X right now of the guy drinking the bottle of vodka in his car.
01:40:34.000 It's like what it's like applying for a job right now.
01:40:36.000 It's like, welcome to your third round of interviews.
01:40:38.000 He's like, sir, this is a burger restaurant.
01:40:41.000 A third round of interviews.
01:40:43.000 Yeah, for all of the talk about people not wanting to work, it's almost like employers don't want to hire either.
01:40:48.000 Yeah.
01:40:49.000 Here's the other big thing, too, because someone's mentioned this in Super Chat.
01:40:51.000 Dan Vicious says employers can't do references, risk of lawsuit.
01:40:55.000 They tell you this.
01:40:57.000 One of the first bits of advice you get when you're opening a business is anytime anyone calls your company to ask about an employee, you always just be absolutely neutral.
01:41:07.000 You say, ah, yes, so-and-so did work here.
01:41:10.000 They no longer work here.
01:41:12.000 I would not hire them again.
01:41:13.000 Thank you and have a nice day.
01:41:14.000 That's all you can say.
01:41:16.000 Because if you say they did X, they did Y, it's defamation.
01:41:19.000 And that's true.
01:41:20.000 Yeah, they'll come after you.
01:41:20.000 Yes.
01:41:21.000 So think about this app.
01:41:22.000 You're a company and you're like, how can we adequately share that this was a bad employee?
01:41:31.000 Companies are going to be happy to do this.
01:41:34.000 And there's nothing the worker bee can do about it.
01:41:36.000 If you get locked in a low score, good luck getting out of that hole.
01:41:40.000 You know, here's the other fun thing about it.
01:41:42.000 There's going to be some dudes who are going to be like, hey, you own a restaurant, right?
01:41:47.000 Hey, hire me on your app.
01:41:48.000 Just give me good reviews.
01:41:49.000 I'll give you $100.
01:41:51.000 And they'll be like, okay, I want to get a 900.
01:41:53.000 And then I can go apply.
01:41:54.000 And that's what they'll do.
01:41:55.000 Like, there will be a dude at a Burger King who's a shift manager for $16 an hour.
01:41:59.000 And he'll be like, yeah, I'll hire you.
01:42:01.000 And it's like, just come hang out.
01:42:03.000 I'll just give you good scores.
01:42:04.000 You're good.
01:42:05.000 And it'll be a lot of that.
01:42:06.000 It's going to be wonky and busted.
01:42:10.000 All right.
01:42:11.000 We're going to go to your chats, my friend.
01:42:12.000 So smash the like button.
01:42:13.000 Share the show with everyone.
01:42:14.000 You know, we're going to grab your super chats and Rumble Rants, but that uncensored members-only show is coming up at rumble.com slash Timcast IRL at 10 p.m.
01:42:23.000 You don't want to miss it.
01:42:24.000 In the meantime, let's grab your chats.
01:42:27.000 All right.
01:42:27.000 Shane H. Wilder says, I'm glad Trump is calling out Pelosi's insider trading.
01:42:32.000 She gets all these gains and can't send a dollar to the Gongo.
01:42:35.000 She claims to be a Catholic, and she hasn't heard about the Christian act of charity.
01:42:40.000 It's a travesty.
01:42:42.000 That's right.
01:42:44.000 Alva 2 Omega says, Howdy, people.
01:42:46.000 I tried sharing your Australia no minors on social media video being in favor of it.
01:42:51.000 Facebook flagged a single post a dozen times as spam and removed it.
01:42:55.000 Censorship.
01:42:55.000 I'm telling you.
01:42:57.000 So I made a video about Australia banning YouTube for under-16s, and it got almost no views.
01:43:04.000 And so I'm like, okay, this is probably one of the most important stories for people on YouTube, that YouTube will be banned for teenagers, regardless of your opinion on it.
01:43:13.000 And for some reason, it's not appearing in recommendations.
01:43:16.000 But Tim, Mark Zuckerberg, got a haircut and does jujitsu.
01:43:20.000 Oh, he's cool now.
01:43:22.000 Pinochet says, this isn't just against Trump, but an affront to every American.
01:43:25.000 A violation of the Constitution.
01:43:27.000 These are the domestic enemies mentioned in the oath I took and was never relieved of.
01:43:33.000 No quarter.
01:43:35.000 Oh, yeah.
01:43:36.000 Let's see what we got.
01:43:38.000 Vic the Fix Shaw says about referrals.
01:43:40.000 When I was down in Harpers Ferry two years ago, three bros and I stopped at HF Brewing, and it was 1:00 till they shut the place down.
01:43:48.000 It was 1:00 till they shut the place down at 9:30.
01:43:51.000 Management told us 11 and then tossed everyone out at 9:30.
01:43:57.000 About.
01:43:58.000 Oh, they showed up at 1.
01:44:00.000 It was 1... it says 1:00 till they shut the place down.
01:44:03.000 Maybe like 1 p.m. till 11.
01:44:05.000 They showed up at 1, and then the business claimed to be open until 11 and shut down early.
01:44:12.000 Oh, yeah.
01:44:14.000 You know, we used to love the Harpers Ferry Brew, but I guess they sold it.
01:44:16.000 Did they?
01:44:18.000 I guess.
01:44:19.000 We used to go there and we were like, it's really cool that we have this local beer.
01:44:24.000 And so we would buy like 600 tall boys from them.
01:44:28.000 And then something happened. Every few months, we'd go in, we'd order like two months' worth of beer and have it stocked in the fridge.
01:44:34.000 And then when guests would come, people who drank, people don't really drink that much.
01:44:36.000 We don't do it anymore.
01:44:37.000 And then, you know, two months went by and we were like, oh, we should go restock.
01:44:40.000 We showed up and they were like, you can't buy those.
01:44:42.000 And we were like, we buy them every few months.
01:44:44.000 And they were like, no, we can't sell those to you.
01:44:46.000 And I was like, okay.
01:44:48.000 He's like, we talked to the manager because the manager, like, we do this.
01:44:51.000 And then they told us they weren't allowed to do it.
01:44:52.000 And I guess they sold.
01:44:54.000 And then we went back there for the first time a few months ago.
01:44:56.000 And Brandon got super pissed because he's worked in a bar.
01:45:00.000 He's worked in bars and he's played shows and stuff.
01:45:03.000 And we got there like a half an hour before close.
01:45:06.000 And there's, you know, three of us in line.
01:45:08.000 And, you know, Andy walks up and gets a beer.
01:45:10.000 He walks up and they go, bar's closed.
01:45:12.000 And he was like, you just served.
01:45:13.000 And they're like, I know, but we just closed.
01:45:14.000 And he was like, you haven't done anything.
01:45:16.000 I don't know what it's called, but he's like, I know what you do when you shut down the taps.
01:45:20.000 You can pour me a beer right now.
01:45:22.000 And they're like, no, we're closed.
01:45:23.000 And he was like, this is the worst place I've ever been to.
01:45:25.000 I'm never coming back.
01:45:26.000 See, if they have the Gauge app, then we could just make sure that these people have zero stars.
01:45:29.000 Oh, you can't.
01:45:30.000 No, that's Yelp.
01:45:31.000 You're the customer one.
01:45:32.000 Yeah.
01:45:33.000 Yeah, but I guess because he worked at a bar, he knows when the taps are shut down and when you can't serve beer.
01:45:37.000 And they were still completely able to do it.
01:45:39.000 And Yelp doesn't always even work anyways.
01:45:41.000 A lot of the employees, because they don't own the company.
01:45:44.000 There's no stake in the company.
01:45:45.000 They don't really care if they get bad reviews unless their manager comes and says something to them.
01:45:49.000 Well, businesses give themselves fake positive reviews on Yelp, too.
01:45:53.000 Yep.
01:45:55.000 It was like, oh, there was an article for the upcoming Green Lantern television show on Warner Brothers.
01:46:02.000 And the dude was... A television show?
01:46:04.000 Yeah, it's called Lanterns.
01:46:05.000 And it says, a Warner Brothers executive said that the show is fantastic.
01:46:09.000 And I'm like, well, yeah, because Warner Brothers made it.
01:46:10.000 That's like me.
01:46:11.000 I was like, me, a PCC insider, says that PCC is fantastic and you should go watch the show.
01:46:16.000 That doesn't really mean anything if you own the company or have stake in the company saying that's awesome.
01:46:21.000 Max Reddick says, Tim, a while back, you were working on getting David Pacman on the culture war.
01:46:25.000 Whatever happened with that?
01:46:27.000 With all due respect to David Pacman, he was very polite and said that he was busy with family.
01:46:30.000 I think he recently had children.
01:46:32.000 And I respect that and have no issues and nothing bad to say.
01:46:35.000 I disagree with him and his style of content, but it is what it is.
01:46:39.000 And, you know, if he can't make it out, he can't.
01:46:41.000 But we'll reach out to him again because we'd love to have him.
01:46:43.000 I think it'd be a great show.
01:46:44.000 And I think he would enjoy it too.
01:46:46.000 Also, I'd love to get like Kyle and Krystal Ball perhaps.
01:46:46.000 So we'll see.
01:46:51.000 I watched this really funny clip with Krystal Ball just scolding Elissa Slotkin on Israel committing genocide in Gaza.
01:46:59.000 And it was like, Slotkin was like, yes, you are correct.
01:47:02.000 And then Krystal was just like, but it's a genocide.
01:47:04.000 And she was like, okay.
01:47:05.000 And Krystal was like, say it.
01:47:07.000 I'm kidding, but it was kind of like that.
01:47:10.000 I was like, wow, Krystal, like, really just going after her.
01:47:15.000 And then Slotkin was like, I wrote a letter saying that they're being starved.
01:47:18.000 And I'm like, it's a genocide Olympics.
01:47:21.000 Like, who can say more about Gaza than the other person?
01:47:25.000 How fun.
01:47:27.000 Let's see what we got here.
01:47:29.000 Fuck Dirk says, if no arrest is made, then Trump term two is a failure for me.
01:47:35.000 And I will be demotivated to ever care about the right again.
01:47:38.000 I don't care how great immigration in the economy is.
01:47:40.000 He will be a failure.
01:47:41.000 You know, we live in a plutonomy, right?
01:47:44.000 This country is all for and by the wealthy and always has been.
01:47:48.000 And there was a report that was put out over a decade ago.
01:47:50.000 It's almost 20 years ago now, by Citigroup, talking about how the will of the American people has no bearing whatsoever on legislation.
01:47:58.000 And there's actually these really great infographics where it's like 80% of the country can want something, but as long as 30% of the wealthy want something, they get it.
01:48:08.000 That's amazing, right?
01:48:09.000 It's like when you watch a show, it's like you watch something involving the U.S. government or like the CIA and somebody says, it's vital to U.S. interests.
01:48:17.000 And then you say, what does that actually mean?
01:48:19.000 Like, who is the person who decides what U.S. interests actually are?
01:48:23.000 So, so what is what is he saying will determine whether or not he's successful?
01:48:28.000 I didn't get that part.
01:48:29.000 Arresting the corrupt people who sabotage the government.
01:48:32.000 Got it.
01:48:32.000 But I mean, I do think there's a bit of a fault to that because what did they do?
01:48:36.000 Stop Trump from carrying out his agenda.
01:48:38.000 So should Trump's agenda solely be on going after the people who stop his agenda or should he try and fulfill his agenda?
01:48:43.000 It'd be better than nothing, which is what we're getting now.
01:48:45.000 We're getting a little bit.
01:48:46.000 Well, we got the border shut down.
01:48:48.000 We have a good economy.
01:48:49.000 Which was the agenda.
01:48:51.000 Make America great again.
01:48:52.000 Yep.
01:48:53.000 I mean, I think that you're reducing it way down from what was promised.
01:48:59.000 Fair enough.
01:49:00.000 Perhaps.
01:49:01.000 Let's grab some more.
01:49:02.000 Andre says, what do you use to measure sleep and heart rate, Galaxy Watch?
01:49:06.000 You keep it on sleeping?
01:49:07.000 When do you charge it?
01:49:08.000 So my bed.
01:49:09.000 I have an Eight Sleep bed.
01:49:12.000 Luke recommended it.
01:49:13.000 I got it.
01:49:14.000 And it heats and cools as you sleep, which is good.
01:49:17.000 It's not like the heating and cooling thing is perfect, but it does try to adjust the temperature so you don't wake up either too hot or too cold, and it does work.
01:49:26.000 But when I wake up in the morning, it shows me everything about my sleep.
01:49:29.000 It tells me when I was in deep sleep, when I was in REM sleep, when I woke up, it's pretty amazing.
01:49:34.000 And then it gives you a score.
01:49:36.000 Are you sure that it's not going to share your biometric data?
01:49:41.000 Like that Spotify?
01:49:43.000 Did I ever say that they wouldn't?
01:49:43.000 What do you mean?
01:49:43.000 Am I sure?
01:49:45.000 Does it bother you that they almost certainly do?
01:49:48.000 No.
01:49:49.000 Okay.
01:49:50.000 That like they're going to be like, they're going to share this data and be like, sir, Tim Poole entered deep sleep at 2 a.m., lasted for one hour before entering a period of light REM sleep.
01:50:01.000 No, they're going to fact check you.
01:50:02.000 They're like, Tim Pool said he was in deep sleep for an hour and it was actually 52 minutes.
01:50:07.000 and then Snopes is going to cover it.
01:50:08.000 Knowing when you're awake or asleep is actually...
01:50:16.000 This is the most amazing thing.
01:50:19.000 There was this website in the early 2000s.
01:50:22.000 I think it was called like The Spark or something.
01:50:24.000 Do you guys remember this?
01:50:25.000 You're old enough, right?
01:50:26.000 I'm old.
01:50:27.000 And it had a bunch of tests.
01:50:28.000 And they were really rudimentary. Early websites.
01:50:31.000 There's no apps or anything.
01:50:32.000 And one of them was called the gender test.
01:50:34.000 And it would ask you weird questions.
01:50:36.000 And then it would tell you what your gender was.
01:50:38.000 And you were like, how does this make sense?
01:50:41.000 So basically, it was like, it said, I will ask you questions and then predict your gender.
01:50:45.000 And it would ask you things like, which do you prefer?
01:50:48.000 And it would show a bike, a boat, and a plane.
01:50:50.000 And you're like, a boat, I guess.
01:50:53.000 And then it would say, pick a shape.
01:50:54.000 And it would be like a blue triangle.
01:50:56.000 It would be like a red square, a green circle.
01:50:59.000 And then women tended to pick certain things for some reason that men tended to pick something else for some reason.
01:51:04.000 And at the end, it would be like, you're a woman, you're a man.
01:51:06.000 And it got it right like 90% of the time.
01:51:08.000 That last thing just sounded like a PlayStation controller.
01:51:10.000 Maybe.
01:51:11.000 But the funny thing, yeah.
01:51:12.000 The funny thing about it is that was before AI.
01:51:14.000 That was just a basic algorithm that was like 90% of the time, women pick these things, men pick these things, and so we can make that prediction.
01:51:22.000 Now, Facebook, based off of the weirdest of things, knows when you're going to poop.
01:51:27.000 That's not a joke or an exaggeration.
01:51:29.000 This was published seven or eight years ago.
01:51:32.000 Your mobile phone has Facebook on it, your Messenger app or your actual Facebook app.
01:51:38.000 It knows when you're moving.
01:51:39.000 It knows when you're sitting.
01:51:40.000 It knows when you're eating based on how you're moving and where you're at.
01:51:44.000 So it knows when you go to work and when you go to lunch.
01:51:46.000 Why?
01:51:46.000 Because it has GPS data and knows the coordinates of Burger King.
01:51:49.000 So you go to work.
01:51:51.000 You then get up.
01:51:52.000 It knows when you're going to go to lunch before you do based on the patterns of your movement at work.
01:51:57.000 Then it predicts you're going to go eat.
01:51:58.000 And based on the prediction of when you're going to eat, it knows when you're going to have a bowel movement.
01:52:02.000 And this is not an exaggeration or meant to be funny.
01:52:04.000 It literally does this without anything, but all it needs is your GPS.
01:52:09.000 Is it called the TMI?
01:52:09.000 It's crazy.
01:52:11.000 Perhaps.
01:52:12.000 But, you know, for Zuckerberg and the rest of this company, they're like, this data is invaluable.
01:52:19.000 Like, you can control populations.
01:52:20.000 You can predict their movements.
01:52:22.000 It's insane what they can do.
01:52:24.000 And I'll tell you this: I'm willing to bet AI is far more advanced than we even realize.
01:52:28.000 The commercial grade stuff they're showing us, it's probably 20 years more advanced they got behind the scenes.
01:52:33.000 You know, they had GPS for decades before it was ever commercially available, right?
01:52:37.000 That's right.
01:52:37.000 Like, I always imagine, like, if I was Google and I wanted government contracts and they said no, I would just shut down Google Maps for a day and then say, okay, let the peasants figure out where they're going.
01:52:49.000 Same with Apple.
01:52:50.000 Think about how crazy that is.
01:52:52.000 I used to have memorized, like, 20 or 30 phone numbers.
01:52:58.000 Now I've got two.
01:53:01.000 I never wrote them down.
01:53:02.000 I didn't start writing down phone numbers until I was in my early 30s.
01:53:05.000 Promise.
01:53:07.000 I memorized everybody's.
01:53:08.000 And then all of a sudden, my memory started to.
01:53:11.000 Yeah, now it's just you store it in your phone.
01:53:13.000 And it's like, I don't know your phone number.
01:53:14.000 I know your name.
01:53:15.000 Did you see that post that was going around on X last week where it was like, technology's going too far, man?
01:53:19.000 My roommate got locked out of his light bulbs.
01:53:22.000 And now we're sitting in the dark because he doesn't know the password.
01:53:25.000 Let me tell you about the worst thing about Eight Sleep: when my bed gets disconnected from the internet, I can't turn it off.
01:53:32.000 Yeah.
01:53:33.000 So you get this.
01:53:34.000 I got it.
01:53:35.000 My bed ran out of water.
01:53:38.000 So the air conditioner wasn't working or the temperature control.
01:53:43.000 And I was like, whatever.
01:53:44.000 I'm not dealing with this.
01:53:46.000 So I went to sleep.
01:53:47.000 Then I wake up with my alarm going off, which is, it vibrates.
01:53:50.000 It goes, and I pick my phone up and go to the app, and it says, can't find it.
01:53:57.000 So there's no way to turn it off.
01:53:59.000 So now I got to get up and go to the box and unplug it.
01:54:02.000 It's a nightmare.
01:54:03.000 You know what else I want to stress?
01:54:04.000 You know what I hate more than anything right now is TV.
01:54:06.000 Let me tell you.
01:54:08.000 You guys, maybe you remember this.
01:54:10.000 You youngsters.
01:54:10.000 I don't know.
01:54:11.000 When I was a kid, you know what I would do?
01:54:13.000 I'd walk up to the TV and I would grab a little knob and I would pull it out.
01:54:17.000 You pull the knob forward and the TV would turn on.
01:54:20.000 And there were two knobs.
01:54:21.000 There was, was it VHF and UHF?
01:54:24.000 And I'd go click, click, click, click, click.
01:54:26.000 But here's the best part.
01:54:28.000 It was already on channel 32 Fox.
01:54:30.000 So when The Simpsons were coming on at, what was it, like 7 o'clock or 5:30, I'd go up to the TV, I'd pull the little thing forward, the TV would turn on, and I'd sit down.
01:54:39.000 Do you know what I have to do now?
01:54:40.000 Brett?
01:54:41.000 I turn the TV on.
01:54:43.000 Then it boats up.
01:54:44.000 It starts booting up.
01:54:45.000 It takes about 10 seconds.
01:54:47.000 Then it brings me to some home screen, and instantly a thing pops up saying, Would you like to update your TV?
01:54:53.000 To which I'd say, no, I don't want to update my TV.
01:54:55.000 Then a box pops up saying, Would you like to update your remote?
01:54:58.000 No, I don't want to update my remote.
01:54:59.000 Then I click home.
01:55:00.000 Then it takes 10 seconds to load.
01:55:02.000 Then I have to press down to go to the YouTube TV app and I hit it and it says, Would you like to update the app?
01:55:07.000 And I say no.
01:55:08.000 Then it opens the app and it's on some default pre-record.
01:55:12.000 And I have to then select and find the channel.
01:55:14.000 Gone are the days where I could just click the button and it turned on to the channel I watch all the time.
01:55:19.000 Those were the days, huh?
01:55:20.000 Channel three so that you could go turn on your video games right away.
01:55:23.000 Yeah.
01:55:23.000 I guess right.
01:55:24.000 You know what's crazy is they still have, with all that, they still have you like go to each individual letter and select it and then go back.
01:55:31.000 Yep.
01:55:31.000 Yeah.
01:55:32.000 I think they would have figured that out.
01:55:34.000 Modern TVs suck.
01:55:35.000 Yeah.
01:55:36.000 I mean, what you need is a Roku TV with the very pleasant Roku City playing in the background, keeping PlayStation.
01:55:36.000 Terrible.
01:55:42.000 I just do get a Roku TV.
01:55:44.000 I turn the PlayStation on and I turn the TV on, let it boot.
01:55:48.000 I don't care.
01:55:48.000 I press the PlayStation.
01:55:49.000 I go to YouTube, enter, but it pops up.
01:55:51.000 I want them to make a movie about Roku City.
01:55:54.000 Roku City is when you have a Roku TV; it's just a city that plays in the background.
01:55:54.000 What is Roku City?
01:55:59.000 Well, it's all of these like apocalyptic scenarios that a robot.
01:56:03.000 They need to make a skibbity toilet Roku City movie.
01:56:06.000 Just load it up in the AI.
01:56:07.000 Yeah.
01:56:08.000 I am going to make that scene from Star Wars where Mace Windu was like, oh yeah, Anakin, you're right.
01:56:12.000 I better not just randomly kill the Chancellor.
01:56:14.000 That would be an assassination.
01:56:15.000 I think if I did it, I would take the movie Little Big League where Ken Griffey Jr. robs him of a home run at the end.
01:56:20.000 I'd have him actually hit the home run and they'd win.
01:56:23.000 You know what I would do is I would have Mace Windu accuse the Chancellor of colluding with the Trade Federation to steal the election and then bog him down with years of investigation so that he couldn't enact his agenda.
01:56:37.000 But he does declare him under arrest.
01:56:39.000 He does declare him under arrest.
01:56:41.000 But then he force-lightnings, and Anakin walks in, and the lightning is rebounding onto the Emperor, and he goes, he's killing me.
01:56:49.000 And then Mace Windu is like, I have to.
01:56:51.000 He's too powerful.
01:56:52.000 And then Anakin's like, no, he should be arrested and tried.
01:56:54.000 And he goes, he controls the courts.
01:56:56.000 He can't be stopped.
01:56:57.000 I have to kill him.
01:56:58.000 And then when he goes to swing to kill him, Anakin cuts his arm off.
01:57:01.000 Anakin did nothing wrong.
01:57:04.000 I'm sorry.
01:57:04.000 If a religious military faction is trying to assassinate the duly elected leader because he's of a different religion, you stop the person trying to kill the other guy.
01:57:14.000 The Chancellor didn't go to the Jedi Temple and try to murder anybody.
01:57:17.000 He was in his Chancellor's quarters or whatever when the Jedi showed up and said, We just found out you have a different religion from us, so we're going to kill you.
01:57:24.000 And it's like, what?
01:57:27.000 And Anakin was like, don't.
01:57:30.000 You know, and they make him the bad guy.
01:57:32.000 And then Obi-Wan, he's the real bad guy because he torches Anakin.
01:57:36.000 Come on.
01:57:37.000 Obi-Wan stows away in Anakin's pregnant wife's car and then jumps out standing there like a dick.
01:57:43.000 You should fight him.
01:57:45.000 You should pitch this to George Lucas.
01:57:47.000 He loves redoing it.
01:57:50.000 He never leaves it alone.
01:57:51.000 I would modify Titanic so that she chooses her fiancé and doesn't cheat on him.
01:57:56.000 I would modify Titanic so that when she's on the front and she's going, yeah, she falls in and he's like, oh, crap.
01:58:01.000 What's with the young Hayden Christensen Force ghost at the end of Return of the Jedi?
01:58:07.000 Why are Yoda?
01:58:08.000 They changed it.
01:58:08.000 I know, but why?
01:58:10.000 Did you change it?
01:58:10.000 Why not like it?
01:58:11.000 Yeah, because Obi-Wan and Yoda are old.
01:58:13.000 So why does Darth Vader get younger?
01:58:16.000 It's stupid.
01:58:16.000 Yeah, younger.
01:58:17.000 Yeah.
01:58:18.000 George Lucas is out of his mind.
01:58:20.000 The sequel movies are the stupidest things I've ever seen in my life.
01:58:23.000 In the original one, he had people that could tell him no, and there was nobody to tell him no in the later ones.
01:58:28.000 The sequels don't exist.
01:58:29.000 To me, that's just Disney.
01:58:30.000 That's a cash grab.
01:58:31.000 The prequels, they could have got it right if, you know, if the Clone Wars were episode one.
01:58:37.000 It's not the Clone Wars.
01:58:38.000 It was episode two.
01:58:39.000 Attack of the Clones was episode one.
01:58:41.000 And then they made the Clone Wars episode two.
01:58:43.000 Because episode three was good.
01:58:44.000 It was just, it was rushed.
01:58:45.000 It could be ironic.
01:58:46.000 Maybe you could use AI to make Avatar actually interesting.
01:58:49.000 You know, the big problem with episode three as well is that it is rushed.
01:58:53.000 There's no transition to the dark side.
01:58:55.000 You don't see what drives Anakin.
01:58:57.000 In the Clone Wars series, you can see him embracing darkness.
01:59:02.000 And so it makes no sense that in Revenge of the Sith, he's like, I am a good guy, and you're a Sith.
01:59:09.000 And I'll inform the Jedi.
01:59:09.000 It must be stopped.
01:59:10.000 And then 10 minutes later, he's like, I'm going to side with the Sith now and I'm going to murder children.
01:59:14.000 It's like, well, why?
01:59:15.000 Yeah.
01:59:15.000 And also General Grievous's character, like there's no buildup to that character.
01:59:19.000 You get to see him in the Clone Wars take on that kind of Vader-esque villain personality.
01:59:25.000 So they really vote with that one.
01:59:29.000 I would change episode four.
01:59:32.000 I guess it's called four.
01:59:33.000 And it would be like when Luke puts the computer away and the voice is like, use the Force, Luke.
01:59:38.000 And then they're like, Luke, is something wrong?
01:59:40.000 He's like, no, I got this.
01:59:42.000 And then he just misses.
01:59:43.000 And then they're going to be like, you moron, you turned your computer off.
01:59:46.000 I was like, but I thought magic was going to save me.
01:59:47.000 And then it doesn't.
01:59:49.000 And then the Death Star just blows everybody up.
01:59:51.000 That'd be, that's the better ending.
01:59:53.000 Actually, what I really want to do is I want to make an entire version of Star Wars.
01:59:56.000 That is the truth.
01:59:57.000 The truth is, the Empire did nothing wrong.
02:00:00.000 It's all rebel propaganda.
02:00:02.000 The religious zealots from a desert planet took a cargo ship and blew up a military base.
02:00:06.000 And then they made a movie about it where they're like, oh, but watch Darth Vader.
02:00:09.000 He blew up a planet.
02:00:10.000 And it's like, did he?
02:00:11.000 Or is that propaganda?
02:00:13.000 Yeah.
02:00:13.000 What's another series or movie you'd make a change to?
02:00:16.000 Off the top of my head, I don't know.
02:00:17.000 Probably all of them.
02:00:19.000 How about what's the one I just watched?
02:00:21.000 Happy Gilmore.
02:00:23.000 His wife should have died of cancer.
02:00:26.000 Yeah, like, come on.
02:00:29.000 But the Jedi maintained peace for millennia.
02:00:31.000 I'm sorry.
02:00:31.000 I'm not.
02:00:32.000 Go to the Star Wars thing.
02:00:33.000 They maintain peace for a millennium.
02:00:34.000 They're kidnapping children and indoctrinating them into their religion and executing anybody who's of a different religion.
02:00:40.000 Sure.
02:00:42.000 Of course, in their movies, they paint it as noble.
02:00:46.000 But like, look how they use Jedi mind tricks, which we're supposed to consider a good thing, to prey upon the minds of your everyday person because they're weak-willed, so you can get what you want.
02:00:57.000 Well, the Sith were anti-alien.
02:00:59.000 I mean, they colluded with gangsters.
02:01:01.000 I mean, they were.
02:01:03.000 That's all extended universe stuff that's been retconned.
02:01:05.000 It's no longer real.
02:01:07.000 If you actually watch the movies, the only real thing they did is they blew up, was it?
02:01:10.000 They blew up Alderaan?
02:01:12.000 Yeah.
02:01:12.000 Or was it Alderaan?
02:01:14.000 It was Alderaan.
02:01:15.000 Yeah, yeah.
02:01:16.000 Darth Vader's like, I'm going to blow up a planet for no reason.
02:01:18.000 And it's like, what?
02:01:19.000 Why?
02:01:20.000 So, like, I think you could easily remake Star Wars where you could have, you know, Darth Vader, who's a disabled war veteran, resisting these fanatical religious zealots who are trying to impose their religious will over a government, to the point where they tried.
02:01:37.000 Let's put it this way.
02:01:38.000 A religious militant sect has a high level of power in the government of the galaxy.
02:01:43.000 And when they find out the Chancellor has a different religion, what did he do wrong?
02:01:47.000 What did the Chancellor do wrong?
02:01:48.000 There's a civil war and machinations?
02:01:50.000 Okay, you got to prove that in court.
02:01:53.000 You got to present evidence.
02:01:54.000 You can't show up to his room.
02:01:55.000 It's like, kill him.
02:01:56.000 Well, the galaxy was, I should say, the universe was oppressed under the Sith.
02:02:02.000 It's just a galaxy, not the universe.
02:02:05.000 In the actual Star Wars canon, there's other galaxies.
02:02:07.000 Yeah.
02:02:08.000 That's why I said universe.
02:02:09.000 But he's not ruling.
02:02:10.000 The Sith are not ruling over other galaxies.
02:02:13.000 Yes.
02:02:13.000 Oh, really?
02:02:14.000 So one of the old extended stories was that the Emperor was actually trying to mechanize the galaxy because an external galactic threat was coming.
02:02:21.000 And there was a story written about it.
02:02:23.000 That was canon?
02:02:24.000 It used to be.
02:02:25.000 They got rid of it.
02:02:26.000 Disney was like, nah, throw it in the garbage.
02:02:28.000 And you're sure the anti-alien thing is not canon?
02:02:30.000 I think it is.
02:02:31.000 Well.
02:02:32.000 Or isn't canon?
02:02:34.000 They made this stuff after the fact because in the original Star Wars, the Empire was just an empire.
02:02:39.000 And then you only ever hear him say, oh, I hate the Empire.
02:02:40.000 And you're like, well, what did the Empire do?
02:02:42.000 They blew up Alderaan.
02:02:44.000 They blew up a planet.
02:02:44.000 They're evil.
02:02:46.000 We're going to go to the members only portion of the show, my friends, at rumble.com slash Timcast IRL.
02:02:46.000 All right.
02:02:50.000 So smash the like button, share the show with everyone, you know.
02:02:53.000 It's going to be fun.
02:02:54.000 Make sure you use promo code Tim10 if you want to get 10 bucks off your yearly membership.
02:02:58.000 You can follow me on X and Instagram at Timcast Kitser.
02:03:02.000 Would you like to shout anything out?
02:03:04.000 Check out the app on the app store.
02:03:06.000 Go to allio.com, A-L-L-I-O, alliocapital.com.
02:03:11.000 And yeah, check it out.
02:03:12.000 Give it a spin.
02:03:14.000 Go subscribe to Pop Culture Crisis.
02:03:16.000 We go live every Monday through Friday at 3 p.m. Eastern on both YouTube and Rumble.
02:03:21.000 Brett is going to sell it to you a second time.
02:03:23.000 You can also send me validation on Instagram at MaryArchived.
02:03:27.000 You can send me hate on X. That is also Mary Archived and help me get TikTok famous.
02:03:32.000 That is also Mary archived.
02:03:34.000 I think we'll just let Mary sell it.
02:03:36.000 That's good.
02:03:38.000 They're more likely to click on it if you tell them anyway.
02:03:40.000 Okay, well, if you guys want to follow me, I am on Instagram and on X at Brett Dasovic on both of those platforms.
02:03:46.000 And yeah, Pop Culture Crisis, Monday through Friday, 3 p.m.
02:03:49.000 We will see you all at rumble.com slash Timcast IRL in about 30 seconds.
02:03:53.000 Thank you.
02:03:53.000 Thanks for hanging out.
02:03:53.000 Thank you.
02:05:00.000 So, anyway, the Sith aren't bad guys.
02:05:03.000 The Jedi are the bad guys.
02:05:05.000 And I stand by it.
02:05:08.000 See, I'm just over everybody trying to switch movies around and give me some thesis as to why something was wrong 20 years ago.
02:05:15.000 I say you just leave it be for any movie.
02:05:18.000 Yeah, I guess.
02:05:19.000 I don't know, man.
02:05:22.000 If you play the games in the extended universe, they tried to balance out what the actual issues were.
02:05:27.000 The Sith were driven by passion and purpose, and the Jedi were monks.
02:05:32.000 But the Jedi are communists.
02:05:36.000 They kidnap kids at young ages and then force them to be like weird monks and celibate.
02:05:41.000 It's like religious zealotry.
02:05:42.000 And then they're forced to.
02:05:45.000 Like the line Anakin says when he's like, I think the Jedi are evil.
02:05:48.000 But they didn't really explain that transition.
02:05:48.000 He's right.
02:05:50.000 That was the problem.
02:05:51.000 So it's like Anakin could have just said, Obi-Wan, the Jedi Council just tried to assassinate the Chancellor and you're defending them.
02:05:59.000 Why are you on their side?
02:06:00.000 And Obi-Wan would have been like, because you're dealing in absolutes.
02:06:04.000 The problem is you're giving them ideas now because they love to retroactively go back to franchises and ruin it.
02:06:09.000 Remember, they made the Jedi selfless, though, and that's the key component there, where the Sith rely on their passions and hatred, anger, you know, basically the things that destroy the human psyche and soul, and that's what makes them such a... The issue with this idea of the Jedi as, like, a story is that they've created so many characters that are flawed. It's clear the Jedi are not selfless in any sense. They're power-driven.
02:06:39.000 No, but if you think of the universe through this dichotomy of love and, you know, hate, fear, let's just say fear and love, right?
02:06:47.000 Being the two things that drive the universe.
02:06:48.000 And as you can get closer to these things, you can get either, right?
02:06:51.000 Like, and the Jedi are as close as possible to love and the Sith are as close as possible to fear.
02:07:00.000 And so that's really the propaganda, though.
02:07:04.000 Like, if someone came to me and told me that... this is what we hear.
02:07:04.000 We hear like, you know, Donald Trump, he just hates.
02:07:07.000 He's so full of hate.
02:07:09.000 You have to march in lockstep with us or else.
02:07:11.000 And the Sith are like, you can do whatever you want.
02:07:13.000 Believe in yourself and follow your passion.
02:07:15.000 That's not evil.
02:07:16.000 That sounds like they're the good guys.
02:07:17.000 What in the story though, they're like, and also they kill kids.
02:07:21.000 And it's like, shut the fuck up.
02:07:23.000 And you all, and they also say Trump's a white supremacist.
02:07:25.000 So if we're telling a story and Star Trek did this too with the Romulans.
02:07:28.000 If we're telling a story and we're like one side's driven by passion, tons of people are passionate.
02:07:32.000 It'll go around murdering children.
02:07:33.000 And there are tons of religious zealots who are selfless who do.
02:07:37.000 So I think the Jedi are evil, as exemplified by the fact that those Jedi that came with Mace Windu to execute the chancellor had no problem showing up knowing they were going to murder the chancellor, who was duly elected.
02:07:52.000 That's fucked up.
02:07:53.000 Like, how do we, how do we feel about this?
02:07:55.000 How do we feel about a bunch of Democrats saying we love, and Trump is full of hate.
02:08:00.000 So they show up to the White House with rifles intending on killing Trump.
02:08:04.000 And then Trump kills a bunch of them.
02:08:06.000 And he's on the ground.
02:08:07.000 And fucking, I don't know, Hakeem Jeffries is pointing a gun at him.
02:08:11.000 And then Zoran Mamdani walks in.
02:08:14.000 And he's like...
02:08:15.000 How did he get there?
02:08:16.000 What?
02:08:18.000 This movie's completely unrealistic.
02:08:21.000 I want to make this...
02:08:22.000 Like, he's, like, what is he, a time traveler?
02:08:24.000 Can he...
02:08:25.000 What are you talking about right now?
02:08:26.000 Zoran Mamdani's right here.
02:08:27.000 Okay, so, but, okay, so...
02:08:29.000 He's a rising star of the Democratic Party.
02:08:30.000 Can he, like, can he shapeshift?
02:08:33.000 Can he move?
02:08:34.000 No.
02:08:35.000 What?
02:08:36.000 In a car.
02:08:36.000 How does he get around?
02:08:37.000 Okay, so he just, he just ended up in D.C. somehow?
02:08:40.000 He was in D.C. Okay.
02:08:42.000 And he meets with Trump.
02:08:43.000 And then Trump is like, actually...
02:08:45.000 Why isn't he in New York?
02:08:46.000 He should be in New York right now.
02:08:47.000 The point is, he's a rising...
02:08:48.000 And Anakin was not supposed to be at the temple.
02:08:50.000 Or he was not supposed to be at the chancellor's headquarters.
02:08:52.000 He was supposed to stay at the temple.
02:08:53.000 One of the most underrated scenes in Star Wars is the scene with Padmé Amidala.
02:08:57.000 She says, this is how liberty dies.
02:08:59.000 With thunderous applause.
02:09:00.000 When the chancellor is talking about the...
02:09:02.000 How he was left scarred and deformed.
02:09:04.000 And the Jedi are now traitors.
02:09:06.000 And he's giving that speech, you know, in the Senate, in the Senate rotunda.
02:09:08.000 So, like, they're in favor of a republic.
02:09:10.000 It's a dictatorship under the Sith.
02:09:12.000 It's oppression.
02:09:13.000 As long as you don't impose your view on others.
02:09:15.000 But that's what the Sith are doing.
02:09:17.000 Sure, sure.
02:09:18.000 But let's try this again.
02:09:19.000 I see what you're doing.
02:09:22.000 So, the chancellor is standing before the Senate.
02:09:25.000 And he says, the Jedi have tried to assassinate me.
02:09:30.000 And they've left me scarred and deformed.
02:09:32.000 The Jedi Order has betrayed the republic.
02:09:35.000 And he is 100% correct.
02:09:38.000 And then Padme goes, so this is how liberty dies.
02:09:41.000 With thunderous applause.
02:09:43.000 And I'm only imagining Donald Trump in the Oval Office going, Barack Obama tried to have me put in jail and falsely accused me.
02:09:50.000 Because he's a traitor.
02:09:51.000 And then fucking Elizabeth Warren is going, this is how liberty dies.
02:09:56.000 With thunderous applause.
02:09:57.000 And I'm like, yeah, you're fucking evil, bitch.
02:09:59.000 You are the evil scumbags who tried to overthrow the republic.
02:10:02.000 And you're acting like you were defending liberty the whole time.
02:10:05.000 So, when the Jedi show up and they're like, I'm literally going to kill the chancellor.
02:10:08.000 And it's like, but the machinations.
02:10:10.000 I get it.
02:10:11.000 I get it, in the truest sense of the movie: you're watching Chancellor Palpatine have these evil machinations to create a civil war to steal power.
02:10:19.000 I'm just saying, on the surface, this is the narrative in Star Wars that they give to us about Trump.
02:10:27.000 They come to say, Donald Trump colluded with Russia to steal the election and seize power in the United States.
02:10:32.000 I'm like, that's fucking Star Wars.
02:10:33.000 Shut the fuck up.
02:10:34.000 It didn't happen.
02:10:34.000 It's not real.
02:10:35.000 But from the standpoint of, like... see, the problem with the left is we live in this Orwellian society.
02:10:41.000 And so what you're propagating is not actually the truth, though.
02:10:44.000 Like, if you use the example of Elizabeth Warren, you know, they want a police state that's overly regulatory.
02:10:52.000 And, like, companies like mine wouldn't exist, because they want to over-regulate and they want you to pay, you know, three, four million dollars to set up a broker-dealer, which creates a barrier to entry.
02:11:01.000 And that was possible.
02:11:02.000 And that was the basis for the trade separatists, the separatist movement in the first place.
02:11:08.000 In the first movie, when they're like, this is not fair that you are enforcing these embargoes and regulations on us.
02:11:14.000 You are shutting our businesses down.
02:11:15.000 No, I'm saying Donald Trump, like the system that they're, this, this capitalist, you know, system that goes back to really, you can go take it back to the Puritans and the Pilgrims, setting up the system of liberty of, you know, running away from the monarchy in Europe.
02:11:29.000 The Galactic Republic set up this free system under the Constitution.
02:11:32.000 Right.
02:11:32.000 The Galactic Republic in episode one was overly regulating the trade federation, forcing, creating a separatist movement because of the way they were handling the marketplace and regulating businesses.
02:11:46.000 And it resulted in conflict, which led to the, it was the blockade of Naboo.
02:11:52.000 That was the first plot.
02:11:53.000 So the civil war started because the Galactic Republic was overbearing in their regulations and laws, and they were bureaucratic and unable to move.
02:12:03.000 The chancellor said, this is bullshit.
02:12:06.000 And through his machinations, he was just a senator at the time.
02:12:09.000 He didn't have the power, and he started to seize control.
02:12:11.000 The point I'm saying is, I get it.
02:12:14.000 Darth Vader blew up a planet.
02:12:15.000 They're evil.
02:12:16.000 We get it.
02:12:17.000 Let's take a look at the surface level politics of the narrative of Star Wars.
02:12:21.000 And this is what they said.
02:12:23.000 The Chancellor was secretly conspiring with the Trade Federation to ignite a conflict so he could steal power and shut down liberty.
02:12:30.000 Okay.
02:12:32.000 With Trump, the Democrats tried to overthrow him, smeared him, lied about it.
02:12:37.000 We've got all the documents now, and more are coming.
02:12:39.000 We've known this for years.
02:12:40.000 They call him hateful.
02:12:42.000 They call him a white supremacist.
02:12:42.000 They call him a bigot.
02:12:44.000 And they've tried to kill him several times.
02:12:47.000 So my joke, I'm joking when I say this, is that if you were to apply the politics of Star Wars outside of the battles and the narrative and the backstory of the Sith, the Jedi are Democrats.
02:12:58.000 And Padme is going, this is how democracy dies, with thunderous applause.
02:13:02.000 Just like the Democrats claimed Trump's victory was the end of democracy.
02:13:05.000 But what I'm saying is that Saul Alinsky and Hillary Clinton and Barack Obama, those are the real Sith that are trying to overthrow.
02:13:11.000 They're the ones trying to overthrow the system.
02:13:15.000 That's the Jedi Order.
02:13:16.000 No, they're trying to use their rules against the United States.
02:13:18.000 The Democrats control the institutions.
02:13:20.000 Take the Constitution and use it.
02:13:23.000 They'll use the Constitution when it serves their purposes, but then.
02:13:27.000 Which faction, Sith or Jedi, was in control of a military faction in the Republic?
02:13:33.000 I mean, both at one point.
02:13:34.000 Nope, the Jedi.
02:13:35.000 Only after the Jedi collapsed did the Sith actually take over, and it was just one guy.
02:13:39.000 Two guys, technically.
02:13:40.000 So in the Republic, before the Chancellor took power, the Jedi were a militaristic religious faction with control of all the institutions.
02:13:47.000 But they only had lightsabers.
02:13:48.000 There wasn't even a military.
02:13:49.000 They were reticent to even control the clone troops that were created for them in episode two.
02:13:54.000 You mean that they created?
02:13:57.000 Well, it goes back to, what was it, Sifo-Dyas and Count Dooku, who, you know,
02:14:02.000 were part of the Galactic Republic and went and built a clone army for the Republic and were doing tons of evil ass shit.
02:14:08.000 But that was actually, that was Palpatine actually controlling and had them.
02:14:12.000 So it was still the Sith that really set up that whole military operation.
02:14:18.000 The way they justify the Sith being evil is they say, oh, yeah, you know that bad stuff?
02:14:22.000 Emperor did it.
02:14:23.000 And it's like, okay, well, like, we didn't actually see any of that.
02:14:25.000 It was the Republic that did all those things.
02:14:27.000 So the Republic was under a senator, Palpatine.
02:14:30.000 He had the power to do that.
02:14:31.000 He wasn't the chancellor at the time.
02:14:34.000 Well, he was doing it shrouded in secrecy and darkness, right?
02:14:36.000 That's how they operate.
02:14:37.000 I mean, Yoda says, right, when Obi-Wan tries to make the case that it was a victory for the Jedi, he says, no, no, master.
02:14:44.000 My point is, this narrative that Hollywood creates, they apply it to people like Trump, when with Trump...
02:14:52.000 No, Trump's not the bad guy.
02:14:54.000 No, he's not.
02:14:54.000 The concept of the Jedi was that they had institutional power and they wielded it against their religious and ideological enemies.
02:15:02.000 Donald Trump and the right did not have institutional authority at all in this country or the media.
02:15:07.000 And the Jedi did.
02:15:08.000 My point is the Democrats and the Jedi are basically the same thing.
02:15:11.000 And that's why I'm jokingly saying all the bad stuff they claimed the Empire did was lies and propaganda, just like we say in reality, when they accuse Trump of being a white supremacist, they literally will call Trump a Sith because they equate that as the evil villain.
02:15:24.000 And it's like, oh, okay, so the Emperor wasn't really a bad guy because you're lying about Trump and you've made this up and you've claimed that Trump's victory is the end of democracy, just like Padme did.
02:15:32.000 Does that mean that Natalie Portman plays Melania in this version of the movie?
02:15:36.000 No, it means she's Elizabeth Warren or Slotkin.
02:15:39.000 It means she's definitely not Natalie Portman, then.
02:15:42.000 Liberty dying because Trump got elected.
02:15:44.000 I was just making the case that it's the system, really.
02:15:48.000 The system of the Republic, the system of the Constitution is much more in line with the ideals of the Jedi, even though I understand the argument that you're making, but I am just saying the system that allows all people to have freedom and liberty and choose their path and their religion as opposed to the groupthink mentality of the left.
02:16:07.000 The Jedi were the groupthink of the left.
02:16:10.000 But the Jedi were the groupthink faction.
02:16:12.000 That's canon.
02:16:14.000 They have temples.
02:16:16.000 They're celibate.
02:16:17.000 They take children from their families.
02:16:18.000 They raise them under a rigid religious order.
02:16:20.000 And the Sith do whatever they want.
02:16:21.000 And there's few of them.
02:16:22.000 And they don't coordinate.
02:16:24.000 But they don't impose it on others.
02:16:26.000 And Jedi.
02:16:28.000 The Jedi mind tricks.
02:16:30.000 The Jedi, the Jedi mind trick?
02:16:32.000 What?
02:16:32.000 The Jedi mind trick they use over and over again in the series on people they want to bend to their will?
02:16:37.000 We're going into rabbit holes.
02:16:39.000 This is literally a component of the Jedi.
02:16:41.000 When they want to force someone to do something they want, they can impose their will into their mind.
02:16:46.000 It's justified in their eyes.
02:16:48.000 Oh, sure, like when the commies execute people because it's justified in their eyes.
02:16:50.000 Or when they cancel you from social media because you're a bigot.
02:16:53.000 The Jedi are evil.
02:16:56.000 Let's go to callers.
02:16:59.000 Someone said, Surge, cut to Mary.
02:17:00.000 I want to see how interested she is in this argument.
02:17:04.000 I watched all of Star Wars last year, I think, because our viewers paid some wager.
02:17:10.000 I forget what it was now.
02:17:12.000 10 crisis parties?
02:17:13.000 It might have been $1,000 in donations, and I would watch all of Star Wars in one go, and I did.
02:17:19.000 And I'm more sympathetic to your view, actually.
02:17:22.000 I was like, wait, but why are the Sith bad, though?
02:17:25.000 Right.
02:17:27.000 In the first movie, the only thing bad was blowing up Alderaan.
02:17:30.000 And that one's really easy, because you could be like, if the justification for blowing up a military base, with the estimate based on its size being something around 3 million private contractors and civilian workers, because we know military bases aren't staffed with just soldiers doing everything...
02:17:45.000 So you've got this movie where they blew up a planet.
02:17:48.000 It was terrorists.
02:17:48.000 Why?
02:17:49.000 It was religious fanatics and extremists that were trying to assassinate political leaders and blow up military bases.
02:17:55.000 But they put all these other planets under tyranny, and then there's their history in the galaxy.
02:18:00.000 And I mean, the entire movie.
02:18:02.000 But if you go back to Knights of the Old Republic and the history of the... That's extended universe.
02:18:06.000 I'm saying in the first, when it first came out, the only thing you see the Emperor, the Empire, do wrong is blow up Alderaan.
02:18:12.000 Only a planet.
02:18:13.000 They only blow up a planet.
02:18:14.000 But you could make the argument that the rebels blew up a military base, killing millions of civilians with impunity and no thought.
02:18:22.000 They just said blow them up, fuck them.
02:18:23.000 Rebel base, planet.
02:18:26.000 Military base.
02:18:27.000 Military base.
02:18:27.000 But Alderaan, though, was generally a peaceful planet.
02:18:31.000 That was Dantooine.
02:18:31.000 It wasn't military.
02:18:32.000 Propaganda.
02:18:34.000 Propaganda.
02:18:35.000 That's the argument I'm making, is that history is written by the victors.
02:18:38.000 And because the Jedi blew up the military base and won, they claim the destruction of Alderaan was unjust.
02:18:43.000 And the government says Alderaan was basically a major HQ and supplier for insurgents that were terrorizing the universe.
02:18:49.000 And I want the galaxy.
02:18:51.000 And I'll stress, if we scale things down, you're talking about a galaxy.
02:18:56.000 If you scale it down to a planet, then the story is basically they nuked a country.
02:19:00.000 So I am talking about systems.
02:19:02.000 If you want to talk about the two religions and how they're juxtaposed to each other, then that is accurate.
02:19:06.000 But when you talk about systems, one is advocating a republic, a free market system that is generally peaceful, and the other is advocating for an authoritarian militant system that oppresses all those that are in its path.
02:19:20.000 And where the Jedi want to literally uphold peace with lightsabers, which are laser swords.
02:19:26.000 This is propaganda.
02:19:27.000 And the Sith create this massive clone army that's designed to literally entangle the galaxy, the Republic.
02:19:34.000 Under the guise of the Republic, though, it's the Sith that control it.
02:19:40.000 We know that now.
02:19:40.000 We know the facts now, right?
02:19:42.000 We can look back.
02:19:42.000 We saw the movies.
02:19:43.000 They wrote after the fact.
02:19:46.000 Like this is the point I'm making.
02:19:47.000 The story is written to make sure they justify why they've said they're the bad guys.
02:19:51.000 Well, if you want to then put George Lucas on trial and say that's a whole different...
02:19:56.000 My point is, if you remove the very obvious...
02:20:03.000 The actual storytelling never shows the Empire do anything wrong until Andor.
02:20:08.000 Andor is the first time the Empire has ever been shown.
02:20:11.000 Well, maybe I guess Rogue One.
02:20:13.000 Rogue One.
02:20:14.000 But what in Rogue One did the Empire actually do that was evil?
02:20:21.000 Kill a lot of people.
02:20:22.000 Who?
02:20:25.000 I don't know what they were called, but they were at that point.
02:20:27.000 In Andor, they said.
02:20:28.000 I didn't see Andor.
02:20:29.000 In Andor, they stage a false flag so that they can lock down a planet, steal all its kyber, and blow it up.
02:20:36.000 Yeah, Rogue One, then when they were handing off the disc with all the secrets on it, they killed all the people that were trying to get it.
02:20:44.000 So you mean the insurgent terrorists?
02:20:47.000 So you're saying the terrorists who stole the military plans that disabled war veteran Darth Vader was trying to get back.
02:20:54.000 The Jesus Christ captain and kill it, right?
02:20:55.000 What was his name?
02:20:56.000 Captain.