Timcast IRL - Tim Pool - February 27, 2025


Trump To SHUTTER 120 IRS Offices In MASS PURGE, Democrat LEAKED Tax Info w/ Mike Crispi | Timcast IRL


Episode Stats

Length

2 hours and 1 minute

Words per Minute

193.76

Word Count

23,629

Sentence Count

2,192

Misogynist Sentences

26

Hate Speech Sentences

26


Summary

In this episode of the podcast, we cover a bunch of crazy stories about AI, fake video, and fake audio. We also talk about the Trump administration's plan to shutter IRS offices and the IRS contractor who leaked hundreds of thousands of tax returns.


Transcript

00:00:00.000 Thank you.
00:00:23.000 And I gotta tell you, the corporate press is trying to make it sound like a bad thing.
00:00:26.000 And they're saying, but these are going to be places that have that have assistance centers for people who need help filing their taxes.
00:00:33.000 So it's kind of like the people who are beating you over the head with the second quarters offering you assistance in how you get beaten over the head with the second quarters.
00:00:38.000 And we're supposed to be sad about it.
00:00:40.000 Please, Democrats, corporate press, keep defending the IRS because I am here for it.
00:00:45.000 We got that story, and we got a couple other weird ones.
00:00:48.000 This one you may have seen the other day.
00:00:49.000 Apple, they've acknowledged that if you used their voice-to-text service when you would say the word racist, it would show Trump, then racist.
00:00:58.000 Now, I bring this one up, even though I know this was the other day the story came out, because there's another story that went viral today.
00:01:04.000 Numerous prominent left-wing organizations, liberal organizations, were sharing AI audio of Donald Trump Jr., and this is exactly what I warned about.
00:01:13.000 It wasn't anything crazy like Don Jr. admitting to breaking the law or doing drugs.
00:01:18.000 It was an AI audio where he said something like, why would we even want to be allies with Ukraine?
00:01:23.000 We should have sent Russia the weapons.
00:01:25.000 And the reason it was clever is that it sounds like an off-the-cuff statement, which is still damaging to one's reputation, but he never said it.
00:01:34.000 Now, all these liberals are deleting en masse, panicking, because they were very seriously defaming the man.
00:01:41.000 But it's only just begun.
00:01:43.000 The AI fake video insanity is upon us and audio, and we're going to get into all that stuff.
00:01:48.000 We've got a lot to talk about.
00:01:49.000 Before we get started, my friends, we've got a great sponsor today.
00:01:52.000 It is NetSuite by Oracle.
00:01:55.000 So make sure you check out netsuite.com slash timcast and support the sponsor of the show.
00:02:02.000 What's the future hold for business?
00:02:04.000 Ask nine experts and you'll get ten answers.
00:02:06.000 Bull market, bear market, rates will rise or fall.
00:02:09.000 Can someone please invent a crystal ball?
00:02:10.000 Just not the other one, you know what I mean?
00:02:12.000 Until then, over 41,000 businesses have future-proofed their business with NetSuite by Oracle, the number one cloud ERP, bringing accounting, financial management, inventory, HR into one fluid platform.
00:02:24.000 With one unified business management suite, there's one source of truth, giving you the visibility and control you need to make quick decisions.
00:02:31.000 With real-time insights and forecasting, you're peering into the future with actionable data.
00:02:34.000 When you're closing the books in days, not weeks, you're spending less time looking backwards and more time on what's next.
00:02:40.000 This is why I like NetSuite by Oracle.
00:02:41.000 You should too.
00:02:42.000 Whether your company is earning millions or even hundreds of millions, NetSuite helps you respond to immediate challenges and seize your biggest opportunities.
00:02:49.000 Speaking of opportunity, download the CFO's Guide to AI and Machine Learning at netsuite.com slash timcast.
00:02:57.000 That guide is free to you at netsuite.com slash timcast.
00:03:01.000 One more time, netsuite.com slash timcast.
00:03:04.000 Shout out to Oracle, NetSuite for sponsoring the show.
00:03:06.000 We really do appreciate it.
00:03:07.000 And don't forget, castbrew.com is always available.
00:03:10.000 Delicious cast brew coffee.
00:03:12.000 Well, unfortunately, Ian's Graphene Dream is sold out.
00:03:16.000 It should be back in stock in about a week.
00:03:17.000 You can pick up your Appalachian Nights.
00:03:20.000 But I gotta say, just go into the ground coffee section, and we do have Focus with Mr. Bocas.
00:03:25.000 Look at that beautiful little kitty.
00:03:27.000 And then, of course, we have Misty Mountains Costa Rican Blend.
00:03:30.000 We've got a bunch of different varieties for you to choose from.
00:03:32.000 You can join the Cast Brew Coffee Club.
00:03:34.000 When you do, you get a discount.
00:03:36.000 It's $40, and you subscribe monthly and save.
00:03:39.000 So definitely check that out.
00:03:40.000 And don't forget, we're going to have that members-only, uncensored call-in show at rumble.com slash timcastirl.
00:03:46.000 So join Rumble Premium using promo code TIM10. And you'll also get access to, let's go to our playlists here, The Green Room, which is our behind-the-scenes show before the show starts.
00:03:58.000 We're having crazy conversations.
00:04:01.000 Today's was particularly spicy because I'm really, really angry about these CBP agents who are facilitating trafficking.
00:04:07.000 And I'm really angry that you've got so many in law enforcement that sat back and have done nothing.
00:04:12.000 And I'm like, Trump, lock them up.
00:04:14.000 You should watch it because I just basically go off.
00:04:15.000 So it's good fun.
00:04:16.000 Don't forget to also smash that like button, subscribe to this channel, share the show with everyone you know, everybody watching, especially if you're watching on Rumble.
00:04:23.000 Take the URL and share it everywhere and tell everybody you've got to watch the show.
00:04:27.000 It's live now.
00:04:28.000 It's a great show.
00:04:29.000 Everyone agrees.
00:04:30.000 At least that's what I've been told.
00:04:31.000 Joining us tonight to talk about this and so much more is Mike Crispi.
00:04:35.000 What is going on, Tim?
00:04:36.000 Mike Crispi.
00:04:37.000 It's good to be back.
00:04:38.000 I am a surrogate for President Trump in New Jersey, president of the Italian American Civil Rights League, and I host a show on Rumble every day:
00:04:46.000 Mike Crispi, Unafraid.
00:04:47.000 Great to be back.
00:04:48.000 Right on.
00:04:49.000 Thanks for coming.
00:04:49.000 Chuck is hanging out.
00:04:50.000 He's here.
00:04:51.000 His debut.
00:04:52.000 Hello, friends.
00:04:53.000 I just try not to break the show or anything, but I host the Green Room.
00:04:57.000 I also do some of the Gamer Maid stuff before the Green Room.
00:05:03.000 Really excited to be on the show.
00:05:04.000 Just trying to do my best.
00:05:06.000 So we brought back the Green Room show, which was our behind-the-scenes guest show; we just roll the cameras.
00:05:14.000 And Chuck's always here.
00:05:16.000 And so he ends up sitting there talking with all the guests, and it went pretty well.
00:05:20.000 And we were like, Chuck, you should come on and promote the Green Room show on Rumble Premium.
00:05:25.000 And Chuck went, oh boy, I don't want to lose my job.
00:05:27.000 And then he sat down.
00:05:28.000 Pretty much, yeah.
00:05:29.000 I mean, the Green Room's great.
00:05:31.000 We talk about...
00:05:32.000 Wild, you know, wide range of subjects and stuff.
00:05:35.000 Like, today was pretty spicy, pretty fun.
00:05:37.000 I think you guys should check it out.
00:05:38.000 Yeah, it's uncensored because it's, you know, not so family-friendly.
00:05:42.000 Chuck is a phenomenal conversationalist.
00:05:44.000 Absolutely.
00:05:45.000 So, I'm Phil Labonte,
00:05:47.000 lead singer of the heavy metal band All That Remains, anti-communist, counter-revolutionary.
00:05:51.000 Let's get started.
00:05:52.000 Here we go, ladies and gentlemen, from the Washington Post, IRS to close more than 110 offices with taxpayer assistance centers.
00:05:59.000 The plan is outlined in a letter from the U.S. General Services Administration that was obtained by the Washington Post.
00:06:05.000 Now, I'm just going to pause real quick.
00:06:07.000 Notice how they have to add that little caveat at the end with taxpayer assistance centers.
00:06:11.000 I don't care!
00:06:12.000 I don't care if they got a Kinko's in them.
00:06:14.000 IRS offices, I don't care if they're shut down, nobody likes the IRS. But the Washington Post needed to add that caveat because they're trying to make the IRS look good.
00:06:25.000 I mean, look, I feel like, you know...
00:06:29.000 I don't want to steal lines from people, but the winning just doesn't stop.
00:06:34.000 I mean, this is...
00:06:36.000 I'm sorry.
00:06:38.000 Ron Paul must be sitting in a chair being like, yep.
00:06:42.000 I mean, literally 12 months ago...
00:06:44.000 Everyone is thinking, oh, there's going to be 87,000 new IRS agents, and man, they're going to be going after everybody.
00:06:51.000 They're going to go after anybody that's made more than $600 for anything.
00:06:56.000 So if you sold something for $1,000, you literally could be on the hook to the IRS for it.
00:07:01.000 More likely than not.
00:07:02.000 Yeah, you know, and that's honestly, the IRS and the government...
00:07:07.000 More broadly, should not be in the business of ruining American people's lives.
00:07:13.000 This idea that the IRS is some kind of service that people aren't terrified of dealing with is ridiculous, first of all.
00:07:23.000 People on the left love to make remarks like, well, you know, they're going to go after the millionaires and the billionaires.
00:07:28.000 They're not going to do any of that.
00:07:30.000 They're going to go after people that make less than $100,000 a year because those people don't have the money to fight to hire an actual lawyer.
00:07:40.000 And the IRS has...
00:07:42.000 Admitted that, that they go after those people because that's actually where the money is.
00:07:46.000 So they say the Trump administration plans to shutter more than 110 IRS offices that have taxpayer assistance centers.
00:07:51.000 The plan outlined in a Tuesday letter, blah, blah, blah.
00:07:53.000 They're basically saying that when these leases are terminated or not renewed when they expire, according to a list included in the GSA's letter, it is unclear whether the assistance centers...
00:08:05.000 Which provide free in-person help for tax filers on an appointment basis will relocate or simply close.
00:08:10.000 Do you guys know that H&R Block is free?
00:08:12.000 Yep.
00:08:13.000 I think, what's the other one?
00:08:15.000 It's H&R Block, HR Block, and the other one is...
00:08:17.000 Oh, I should know this.
00:08:18.000 This is a tax account.
00:08:19.000 You definitely should know this.
00:08:20.000 The other one, they have TurboTax online, which is pretty much free.
00:08:24.000 Yeah, yeah, yeah.
00:08:24.000 But the other...
00:08:25.000 I can't remember.
00:08:26.000 H&R Block's the big one.
00:08:28.000 Yeah.
00:08:28.000 I'm trying to think of the other one.
00:08:30.000 But the government isn't helping anybody do anything.
00:08:32.000 All they do is wait for you to file, and then if you don't get it right, they send you a letter saying, we know exactly how much you owed, and now you owe it to us with penalties.
00:08:40.000 So the whole thing...
00:08:41.000 But they won't tell you how much you owe when you file.
00:08:44.000 Was it Jackson Hewitt?
00:08:45.000 Yeah.
00:08:46.000 That was a big one.
00:08:47.000 Yeah, that's great.
00:08:48.000 I just want to mention...
00:08:50.000 They have to provide a free basic service because of the way the law works.
00:08:54.000 So in order to have these private special interest corporations that do taxes, my understanding is, and I could be wrong, that they lobby the government to make sure that tax filing stays non-automated and difficult.
00:09:06.000 That's at least the story.
00:09:07.000 Whatever it may be, you can get free tax assistance from any one of these companies.
00:09:12.000 It's like your basic filing is free, automated software.
00:09:15.000 They're trying to make it seem like Trump shuttering bloat,
00:09:19.000 especially at the IRS, is a bad thing.
00:09:22.000 Lord help me, they are paving the way for a blowout in the midterms.
00:09:26.000 Yeah, they're paving the way for Republicans not to lose for 50 years.
00:09:29.000 Because what is more popular in polling than paying less taxes and the IRS not having more people to come after you when you owe pennies?
00:09:38.000 You said it really good right there.
00:09:39.000 It's like the people who have money, who have even just a couple million bucks.
00:09:43.000 They know how to shelter it and put it into trust and put bank accounts offshore and all that stuff.
00:09:49.000 Regular people, small business owners, those are the ones who the 87,000 IRS agents are trying to screw over and go after.
00:09:55.000 So, I mean, this is amazing.
00:09:57.000 Just real quick, real quick, you guys.
00:10:01.000 I have a list here from Pew.
00:10:03.000 I don't want to see it.
00:10:04.000 Of the federal agencies that the Pew Research has dug into, it is many.
00:10:11.000 Which agency do you believe has the highest unfavorability and the lowest favorability?
00:10:19.000 IRS. IRS. Was it even a hard question?
00:10:22.000 Should I have even asked?
00:10:24.000 Indeed, the National Park Service is the most favorable, and I gotta admit...
00:10:28.000 I like those guys.
00:10:29.000 Yeah, good guys.
00:10:30.000 And the post office, NASA, the CDC actually is largely favorable.
00:10:34.000 The post office is favorable?
00:10:35.000 It is, yeah.
00:10:36.000 I know, that's surprising.
00:10:37.000 You know why?
00:10:38.000 Because your mailman is actually nice, and that's the person that you deal with the most.
00:10:43.000 When you go to the post office, especially if you're in a smaller town or whatever, the people that work in my post office are just wonderful people.
00:10:50.000 I love them to death.
00:10:52.000 So that's the kind of deal.
00:10:55.000 People know those people.
00:10:56.000 They're like, oh, they're fine.
00:10:57.000 This poll is from last summer, too, and it ain't even a contest.
00:11:00.000 The lowest favorability is Department of Education, Department of Justice, and the IRS, and there's not a comparison.
00:11:07.000 Department of Education and Department of Justice are minus one point.
00:11:11.000 IRS is minus 12. There's no question it is the least popular federal agency, and the media and the Democrats are like, please protect our IRS agents.
00:11:22.000 Dude, midterms are going to come around.
00:11:24.000 And every Republican is going to be like, Donald Trump shut down IRS offices.
00:11:28.000 Who cares what he did?
00:11:30.000 Democrats wanted to keep them open.
00:11:31.000 Who cares why they wanted them open?
00:11:33.000 They were offering tax assistance centers, though.
00:11:35.000 Don't you realize he's going to make it harder for you to have to pay less?
00:11:40.000 The only tax assistance I want is less taxes.
00:11:42.000 Exactly.
00:11:44.000 But they also offer different voluntary...
00:11:46.000 There's this thing called VITA where there's a tax assistance center not associated with the IRS directly, but there's different options out there that you can go and seek and find.
00:11:54.000 So we should close those, too.
00:11:57.000 Ideally, yes.
00:11:58.000 Get rid of taxes.
00:11:59.000 He's going to shut down the IRS completely, he said, and start the ERS, the External Revenue Service, get rid of the IRS, start the ERS, no taxes, income-wise, and do tariffs.
00:12:11.000 I think the people are excited about it.
00:12:15.000 It's been a month.
00:12:16.000 What if this...
00:12:17.000 Literally is Trump's opening salvo to eventually getting rid of the IRS. Let's go.
00:12:22.000 I mean, I don't think that you're going to have a significant complaint from the American people.
00:12:27.000 Nope.
00:12:28.000 You know, especially, particularly if they can fund the government in other ways, right?
00:12:33.000 So they've cut the government enough so that way the money that they need to run the government can be raised in other ways.
00:12:41.000 Most people won't notice a...
00:12:43.000 And I'm not going to predict.
00:12:45.000 I'm just going to say if most people don't notice a significant change in their life, why would anyone complain?
00:12:52.000 I've got to be honest, too.
00:12:53.000 We don't need an IRS. Even if we keep taxes the way they are, the way it would work is taxes come out of your paycheck automatically and you never think twice.
00:13:02.000 We don't need any of this stuff.
00:13:04.000 When I pay my electric bills automatically, I don't even think about it.
00:13:09.000 We don't need to have this bloated federal agency for any of this stuff.
00:13:13.000 But I understand there's criminal enforcement they're going after, but let's just get rid of the IRS. Here's the thing.
00:13:21.000 Donald Trump announced his gold card.
00:13:23.000 You guys saw the story?
00:13:24.000 Yeah.
00:13:24.000 And it's crazy to me that no matter what Trump does, liberals and Democrats and the corporate press can't give him one good day.
00:13:32.000 This was the narrative when they said that that ISIS guy was an austere scholar.
00:13:36.000 I can't remember who said it.
00:13:37.000 They were like, yo, I think it was Nate Silver.
00:13:38.000 He's like, they can't give Trump one good day.
00:13:40.000 That's insane.
00:13:41.000 Like, come on, obviously getting rid of these terrorists and cleaning up and winning wars is good.
00:13:46.000 And so Donald Trump now offers up this gold card as compared to the green card.
00:13:52.000 Five million dollars and you can get residence in the U.S. Bang, just like that.
00:13:56.000 We already have the EB-5 visa, which functions very similarly, but for a lot less.
00:14:00.000 Trump said, imagine one million people.
00:14:02.000 Just one million want to buy that.
00:14:04.000 That's $5 trillion.
00:14:06.000 And right to the deficit, Elon pointed out at their cabinet meeting, they're paying $1 trillion on interest.
00:14:13.000 And he's like, this is impossible.
00:14:15.000 We're not making enough money to pay that down.
00:14:17.000 If we don't doge, then the country is functionally bankrupt.
00:14:22.000 And the main issue is Ian's always talking about defaulting on the debt and how we should just do it.
00:14:28.000 He's talking about...
00:14:29.000 Because if that were to happen, the entire swift payment system, the global economic infrastructure collapses, and you get war very quickly.
00:14:36.000 If the U.S. does not doge and deal with the deficit, the deficit is increasing with this budget resolution that just passed, meaning the deficit is how much more we spend than we have.
00:14:50.000 So the debt is going to grow exponentially in this way.
00:14:53.000 We have to get the deficit to zero and then start paying the debt down.
00:14:57.000 If we can't, U.S. bonds and U.S. trade will be worth nothing.
00:15:01.000 Petrodollar will be valueless.
00:15:03.000 And then the OPEC nations and all these other countries trading in oil will stop trading with the United States.
00:15:07.000 The United States ain't going away overnight.
00:15:09.000 It will just be global economic crisis.
00:15:12.000 Doge is not a question of what do we deserve or should we have it?
00:15:19.000 Doge is a question of...
00:15:20.000 What do we have to do to make sure civilization doesn't collapse?
00:15:23.000 Isn't it crazy they can't give him one win?
00:15:25.000 Like, they don't see that that makes them totally discredited on everything?
00:15:28.000 Like, they can't say one...
00:15:30.000 Semi-nice thing about one thing that he's done.
00:15:33.000 Because at least if they did that, then if they criticized him on other stuff, it'd make him seem a little bit more credible, but just everything's terrible, right?
00:15:40.000 It ruins their whole thing.
00:15:41.000 I could easily compliment Joe Biden.
00:15:42.000 He's very good at retaining classified documents for the purpose of making money.
00:15:47.000 Yeah.
00:15:47.000 It's a compliment.
00:15:48.000 He's a great businessman, in a sense.
00:15:50.000 Very smart.
00:15:51.000 And he had the wherewithal to hire a ghostwriter who knew to delete evidence of his criminal wrongdoing.
00:15:58.000 Very impressive.
00:16:00.000 Very smart.
00:16:00.000 That's right.
00:16:01.000 See?
00:16:01.000 A compliment.
00:16:03.000 I mean, the idea that there are no ways to cut the deficit, which is essentially what the Democrats are saying. They're saying everything that Doge does, they've got a complaint about it.
00:16:18.000 This is wrong, that's bad, blah, blah, blah.
00:16:20.000 The idea that there's no way to cut government waste, that every dollar is spent in a responsible manner, that's obviously not true.
00:16:31.000 So even if they don't say nice things about Donald Trump or Doge or whatever, if they just say true things.
00:16:40.000 It doesn't have to have anything about an opinion.
00:16:46.000 Just say true things.
00:16:48.000 We do need to cut.
00:16:51.000 Our deficit every year is too big.
00:16:55.000 The national debt is absolutely out of hand and it will bankrupt America, just like all the stuff that Tim laid out just a few minutes ago.
00:17:02.000 We will have an actual World War III if we don't do something about our debt, because all the countries that have bought the US debt, if we default on it, they're going to be looking for some kind of way to get restitution.
00:17:21.000 These things are real.
00:17:23.000 So granted, it is unpopular with anyone in Congress to talk about...
00:17:29.000 Medicare, Medicaid, and Social Security, right?
00:17:31.000 Those are unpopular things to bring up because old people vote and old people are the ones that are living on that.
00:17:37.000 But if we don't restructure these things, if we don't do something about them, there will be a default.
00:17:42.000 I want to just quickly address one super chat from Robert Fulton who said only 550,000 millionaires in the world and 250,000 are in the U.S. So we will not put a dent in the deficit.
00:17:52.000 Incorrect, good sir.
00:17:53.000 A cursory search shows 58 million millionaires in the world.
00:17:57.000 1.5% of the global adult population.
00:18:00.000 And 2.8 million are ultra-high net worth.
00:18:03.000 5.1 million have an estimated wealth of $5 to $10 million.
00:18:07.000 28,000 people have wealth over 100 million.
00:18:10.000 In the United States, there's 22 million millionaires.
00:18:13.000 China has about 6 million.
00:18:15.000 The UK has 3 million. France has 2.9 million.
00:18:17.000 Japan has 2.8 million.
00:18:19.000 Certainly, there are plenty of millionaires, and they all want to live here.
00:18:21.000 Trump made a good point, too.
00:18:22.000 He said, they are going to come here.
00:18:24.000 Ultra-wealthy people want to live and have residence in the United States.
00:18:28.000 Let's simplify it, make it cost some money, and bring them here.
00:18:31.000 EB-5 is convoluted, and I think it's brilliant.
00:18:34.000 Plus, it's so on-brand for Trump.
00:18:36.000 I hope when he issues them, you literally get a golden card with, like, Trump's wavy hair symbol on it that says, like, residency or something.
00:18:43.000 It should be solid gold, too.
00:18:46.000 Like, none of this plated stuff.
00:18:49.000 You have to be real careful with it, too, because it's soft.
00:18:53.000 So we have this story from the other day that I'm going to pull up here from @amuse on X. The IRS now admits a Democrat activist working for the IRS leaked over 400,000 tax returns.
00:19:05.000 Meanwhile, Democrat NGOs and Democrat-appointed lawyers are suing to prevent federal employees assigned to Doge from accessing taxpayer data.
00:19:11.000 There is no proof that federal employees assigned to Doge would leak the tax returns of hundreds of thousands of Americans like Mr. Littlejohn did.
00:19:17.000 This is a huge story: apparently a guy working for the IRS, a Democrat activist, leaked tax returns, including Trump's.
00:19:26.000 For political reasons, I can only assume.
00:19:29.000 And then they have the nerve to come out and say, the auditors are going to leak it.
00:19:34.000 Like, bro, you did it!
00:19:35.000 Like, Democrats did it already!
00:19:37.000 Okay, Doge needs to go in, and Trump needs to clean house.
00:19:41.000 I hope they uncover, and I believe they exist, I hope they uncover all of the kink chats that exist at every department, because I guarantee you they probably do.
00:19:50.000 These people are depraved.
00:19:51.000 You're going to find them, and they've got to get fired.
00:19:53.000 Well, I mean, the fact that these people consider themselves...
00:19:58.000 To be members of the LGBTQ community or whatever.
00:20:03.000 And that is something that the government was hiring for.
00:20:07.000 They were looking for people that were members of the LGBTQ community to hire them because they wanted to make sure that people were represented.
00:20:14.000 Well, there is a phenomenon with a plurality, I won't say majority, but a plurality of the LGBTQ community.
00:20:24.000 Is full-on disgusting deviants.
00:20:27.000 Again, I'm not saying the majority, but a plurality.
00:20:31.000 And clearly, the government has picked up a few of them because they're talking about absolutely disgusting behaviors.
00:20:39.000 There were pee discussions.
00:20:42.000 The important thing about those chats is that the most egregious of them can't be said.
00:20:48.000 It was funny.
00:20:49.000 I was watching The Five earlier.
00:20:50.000 Yeah.
00:20:51.000 And Jesse Waters read some of the chats, and then he goes, I wanted to read more about my producers.
00:20:55.000 Jesse, you can't say those things.
00:20:56.000 And I was like, we said the same thing on IRL last night.
00:20:59.000 There were things in there that are so shocking to the moral, to any decent moral person, and to the dignity of humans in general, they cannot be spoken in public.
00:21:09.000 And these are the decent moral people that the Democrats have protests over if we want to lay off any of them, right?
00:21:14.000 Those are the ones they're protecting, like the most degenerate of the degenerates.
00:21:18.000 I was just going to say, degeneracy.
00:21:20.000 You know what's funny?
00:21:20.000 There was an ad that came out a couple years ago for the CIA, and it was a CIA recruitment ad.
00:21:25.000 And it was about this woman, this black woman, and she was rattling off all the disorders that she had, like personality, schizophrenia, all this.
00:21:33.000 And I'm like...
00:21:34.000 It's like they're trying to get these kind of people.
00:21:36.000 They're actively trying to get the most mentally disturbed, depraved, and degenerate people into...
00:21:43.000 This group, why are they actively trying to get them, right?
00:21:46.000 It's kind of weird.
00:21:46.000 That's the thing.
00:21:47.000 Look, man, what you do in your private life is fine, but any company, if they saw that this stuff was being discussed in a company channel, on company time, you would get fired.
00:22:00.000 They would clean house.
00:22:02.000 They would be like, this is unacceptable to discuss at work.
00:22:06.000 Widely inappropriate.
00:22:07.000 Oh, wow, they deleted it.
00:22:09.000 The video's been deleted.
00:22:12.000 On YouTube, it's been made private.
00:22:14.000 On X, it's been deleted.
00:22:15.000 I'm going to see if I can still find it.
00:22:16.000 It's a video where the CIA was recruiting intersectional women of color who suffered from mental disorders.
00:22:21.000 I'm not kidding.
00:22:22.000 That's the story.
00:22:25.000 That's a crazy person you want as a spy.
00:22:27.000 The CIA is the spies?
00:22:29.000 Yeah, they are the spies.
00:22:31.000 And they were going for it.
00:22:32.000 They wanted those people.
00:22:34.000 You found it?
00:22:34.000 No, I found something crazier from last month.
00:22:37.000 Okay, here we go.
00:22:38.000 It's like the end stages of the Biden regime.
00:22:40.000 CIA operative reveals mental disorder agency actively seeks to hire because it makes for better spies.
00:22:45.000 Yeah.
00:22:46.000 They got different personalities.
00:22:47.000 They seek to hire sociopaths.
00:22:49.000 Yeah, because they get to be like, you know, best of both worlds.
00:22:52.000 Be like, no, I'm this person today, this person another day.
00:22:55.000 What could go wrong when they decide to be somebody else?
00:22:58.000 Give all of our information away because they don't like Trump anymore, right?
00:23:01.000 I mean, it's just, yeah.
00:23:02.000 Good strategy.
00:23:03.000 Smart people at the top.
00:23:05.000 Wait, is this the video?
00:23:07.000 Woke CIA recruitment video?
00:23:11.000 I think I found it.
00:23:12.000 Is that it?
00:23:13.000 It might be it.
00:23:14.000 But like we were saying...
00:23:15.000 Let's play it anyway.
00:23:17.000 Here we go.
00:23:18.000 I think this is it.
00:23:21.000 When I was 17, I quoted Zora Neale Hurston's How It Feels to Be Colored Me in my college application essay.
00:23:28.000 The line that spoke to me stated simply, I am not tragically colored.
00:23:32.000 There is no sorrow damned up in my soul nor lurking behind my eyes.
00:23:36.000 I do not mind at all.
00:23:38.000 At 17, I had no idea what life would bring, but Zora's sentiment articulated so beautifully how I felt as a daughter of immigrants then and now.
00:23:46.000 Nothing about me was or is tragic.
00:23:49.000 I am perfectly made.
00:23:51.000 I can wax eloquent on complex legal issues in English.
00:23:55.000 While also belting Guayaquil de Mis Amores in Spanish.
00:23:59.000 I can change a diaper with one hand and console a crying toddler with the other.
00:24:04.000 I'm a woman of color.
00:24:06.000 I am a mom.
00:24:07.000 I am a cisgender millennial who's been diagnosed with generalized anxiety disorder.
00:24:12.000 I am intersectional.
00:24:14.000 But my existence is not a box-checking exercise.
00:24:17.000 I am a walking declaration.
00:24:19.000 A woman whose inflection does not rise at the end of her sentences, suggesting that a question has been asked.
00:24:26.000 I did not sneak into CIA. My employment was not and is not the result of a fluke or slip through the cracks.
00:24:34.000 I earned my way in, and I earned my way up the ranks of this organization.
00:24:38.000 I am educated, qualified, and competent.
00:24:41.000 And sometimes I struggle.
00:24:42.000 I struggle feeling like I could do more, be more to my two sons.
00:24:46.000 Okay, we get it.
00:24:47.000 I want to go back to this right here.
00:24:49.000 I'm just loving the her, all happy with Brennan.
00:24:51.000 Yeah, they're all fired.
00:24:53.000 Didn't he get his security?
00:24:54.000 He got his clearance revoked, right?
00:24:55.000 He did.
00:24:55.000 He was one of the 51. Yep.
00:24:58.000 Oh, man.
00:24:59.000 This is wild.
00:25:00.000 And, like, they clearly scripted that.
00:25:02.000 So, like, they wanted to make sure to fit in as many things as they could.
00:25:05.000 Like, illegal immigrant family, woman of color, inter this.
00:25:10.000 It's like, try to put every word in.
00:25:11.000 I just want to highlight the hilarity of...
00:25:15.000 This woke CIA ad from a few years ago, and they highlight her with John Brennan.
00:25:20.000 And as of today, Trump is firing all of these people, gutting all these programs, and Brennan's had a security clearance revoked.
00:25:26.000 Talk about a 180. Winning.
00:25:29.000 We're winning so much.
00:25:30.000 You know, part of me is getting worried we're winning too much because I'm like, you know, look, yin-yang, man.
00:25:37.000 What goes up must come down.
00:25:39.000 I mean, the bigger they are, the harder they fall.
00:25:41.000 This is coming down, right?
00:25:43.000 All of this stuff was the going up.
00:25:45.000 All the DEI stuff.
00:25:47.000 All of this.
00:25:49.000 All of the people that got jobs that were not qualified or that were underqualified and the only reason they got their job is because they filled some identity quota.
00:25:58.000 Those people are now losing their jobs because they got those jobs.
00:26:03.000 In a way that was unfair, that was not right.
00:26:07.000 There is no reason to believe that just because you have a particular identity that you are the most qualified for a job.
00:26:15.000 And when you're brought to a company and the most important thing that they're advertising for are...
00:26:21.000 Identity traits as opposed to the qualities that are necessary to do the job, then you end up with people that are unqualified.
00:26:28.000 You look at the way that the Army was advertising my two moms and all of these identity things.
00:26:36.000 Pregnant pilots, right?
00:26:37.000 I'm a pregnant pilot.
00:26:39.000 And recruitment was absolutely atrocious.
00:26:43.000 People were not joining the military.
00:26:44.000 They did pregnant pilot ads?
00:26:46.000 Yeah, they were saying.
00:26:47.000 There was a video out there of the woman who was a pilot.
00:26:51.000 And they were talking about how she has a pregnant pilot suit and this and that.
00:26:55.000 It's ridiculous.
00:26:56.000 And now, the most recent commercials that the army's running, what they say is, they say, strong people are harder to kill.
00:27:07.000 That's what the army should be saying.
00:27:09.000 Isn't that nice?
00:27:10.000 Yes.
00:27:10.000 Here's the story.
00:27:11.000 Pregnant Air Force pilot takes to the skies in supersonic bomber.
00:27:15.000 Unbelievable.
00:27:16.000 No.
00:27:16.000 Probably shouldn't do that if you're pregnant.
00:27:17.000 We got the receipts today.
00:27:19.000 Did we cover this when this story came out?
00:27:21.000 Yo, they legit put a seriously pregnant woman?
00:27:25.000 To be fair, any...
00:27:27.000 I mean...
00:27:28.000 The challenge there is we certainly do want to see the effects on a pregnant woman at high speeds.
00:27:34.000 I'm not trying to be a dick or weird or anything like that.
00:27:37.000 We do research on zero gravity to see how it affects the human body.
00:27:40.000 And having information on how pregnant women are affected by these things would be important for space travel and interplanetary colonization.
00:27:49.000 The problem there is like, can you consent for the baby?
00:27:52.000 So how would you even do human trials on stuff like that is very difficult, I guess.
00:27:57.000 Dangerous.
00:27:57.000 Yeah, but they literally had her flying a supersonic jet while she appears to be like six, seven months pregnant.
00:28:02.000 She has a custom suit.
00:28:03.000 They made her a custom, you know, flight suit there.
00:28:06.000 Look at that.
00:28:07.000 Wow.
00:28:07.000 It's not just a few weeks.
00:28:08.000 I'm skeptical as to even if it was real, if it actually happened.
00:28:13.000 It could have been just a...
00:28:14.000 Propaganda?
00:28:15.000 Yeah.
00:28:15.000 Like, you know, with Ukraine, they would always have the hot girls on the battlefields with their nails done and perfect outfits.
00:28:22.000 They'd say, oh, we're fighting Russia.
00:28:24.000 You think it was left-wing Biden regime military...
00:28:28.000 Propaganda.
00:28:29.000 Have you seen the psychological operations?
00:28:32.000 There was a few women that went viral, and they were basically like e-girls that worked for the military to recruit young men.
00:28:39.000 I'm like, that's horrifying.
00:28:41.000 They have a...
00:28:43.000 What are they called?
00:28:44.000 Bunker bunnies?
00:28:45.000 Bunker bunnies.
00:28:46.000 There's one girl, her name was Lujan or something like that, and she's in the army, and she was with the army, I want to say, psychological operations.
00:28:58.000 She was literally an e-girl psyop.
00:29:02.000 I think her name is Lujan or something like that.
00:29:07.000 Lujan, yeah.
00:29:08.000 And she's in the army.
00:29:09.000 She's a pretty girl.
00:29:11.000 But she was absolutely, she was in the psychological operations.
00:29:16.000 Like, that was her job, was to be a PSYOP. Ladies and gentlemen, sorry to cut you off, nothing else matters.
00:29:22.000 Breaking news, the Epstein list will drop tomorrow, says Pam Bondi.
00:29:27.000 We got the tweet from Nick Sorter.
00:29:28.000 Breaking news on Jesse Waters.
00:29:30.000 He's live right now, right?
00:29:32.000 Is that when he's live?
00:29:33.000 Yeah.
00:29:33.000 So basically, we just sit back and wait for him to break the news for us.
00:29:36.000 Jesse, what do you do?
00:29:37.000 It was funny, because a couple weeks ago, I went on his show instead of being here.
00:29:40.000 Here's the clip.
00:29:42.000 Roll tape.
00:29:43.000 You have the Epstein files on your desk.
00:29:45.000 When can we see them and what's taking so long to release them?
00:29:49.000 I do.
00:29:50.000 Jesse, there are well over, this will make you sick, 200 victims.
00:29:56.000 200. So we have well over, over 250 actually.
00:30:01.000 So we have to make sure that their identity is protected and their personal information.
00:30:08.000 But other than that, I think tomorrow, you know, the personal information of victims.
00:30:12.000 Other than that, I think tomorrow, Jesse, breaking news right now, you're going to see some Epstein information being released.
00:30:19.000 Okay, so maybe not the whole Epstein list.
00:30:21.000 What kind?
00:30:22.000 Are we going to see who was on the flights?
00:30:25.000 Are we going to see any evidence from what he recorded?
00:30:28.000 Because he had all of his homes wired with recording devices.
00:30:32.000 What you're going to see...
00:30:34.000 Hopefully tomorrow is a lot of flight logs, a lot of names, a lot of information.
00:30:41.000 Partial client list.
00:30:42.000 It's pretty sick what that man did.
00:30:45.000 So it looks like we'll be getting a partial client list, some information.
00:30:49.000 See, this is what we were talking about the other day because people kept saying, why aren't they releasing the information?
00:30:53.000 And what I said at first, if there's information...
00:30:58.000 In the Epstein files that pertains to an ongoing investigation that is ancillary to the Epstein flight logs and what Epstein was doing, you don't want to compromise those investigations.
00:31:07.000 Like, imagine Trump gets in.
00:31:08.000 I trust that Trump, and more so Kash and Dan (I think Trump's focused on a lot of other things), are going to be looking at who these people were.
00:31:16.000 So you may have with these documents, they go through them and they're like, hey, look, this pharmaceutical exec or this, you know, international whatever.
00:31:23.000 Are implicated in this, and it seems to be there's evidence that they're still running operations.
00:31:27.000 If we publish this information, they go underground.
00:31:30.000 And then we give away all this information, so we can't do that.
00:31:34.000 That's why it's not so easy.
00:31:36.000 The question is, do you trust the existing law enforcement apparatus?
00:31:40.000 Under Biden, of course, no, I didn't.
00:31:41.000 And honestly, under the first Trump administration, no way!
00:31:43.000 They were working against them the whole time.
00:31:45.000 Now I'm feeling pretty good.
00:31:47.000 The victim's thing is the easiest to understand.
00:31:50.000 And this has already been said ad nauseam for a decade.
00:31:53.000 When people were talking about the documents that got released and Virginia Giuffre and all that stuff, the hardest thing to get past is there are innocent victims of Epstein whose names are in those files.
00:32:05.000 And they've got to go through them and redact that and figure out who and when and how.
00:32:11.000 And maybe some people don't mind it.
00:32:13.000 There's a lot to go through, man.
00:32:15.000 But I still think the challenge we face is I don't know that for the sake of those victims the world should not get the information on who is working with Epstein.
00:32:26.000 Because those people may still be working and we should know about it.
00:32:31.000 This is totally just for my own personal opinion.
00:32:35.000 Like, I would love this.
00:32:36.000 I want to see this come out just so that way both the left and the right can stop saying, oh, your guy's on the list and your guy's on the list and your guy's on the list and your guy's on the list.
00:32:47.000 Put it out so we know.
00:32:49.000 And if there is anybody that has broken the law or that's implicated, fine.
00:32:53.000 They don't get it.
00:32:54.000 Like, I see these liberals being like, oh, yeah, Trump's on the flight logs.
00:32:57.000 I'm like, yep, publish it.
00:32:59.000 I don't think you get it.
00:33:00.000 We want all of it published.
00:33:02.000 I don't care if Trump's on it, RFK Jr.'s on it, and Tulsi Gabbard are all on it having a party together.
00:33:06.000 Publish all of it, and we'll figure out after the fact.
00:33:09.000 And that being said, isn't RFK Jr. on it too?
00:33:12.000 On the flight logs, not the client list.
00:33:15.000 We interviewed him, and he said something like, at the time I didn't know, I was with my family or whatever.
00:33:21.000 As Trump appears in the flight logs, my understanding is it's largely with his family, and the same thing with RFK Jr., if I'm not mistaken.
00:33:28.000 That's what he said, yeah.
00:33:29.000 I don't care all that much.
00:33:31.000 Like, if Trump, Ivanka, and Ivana, and whoever else were flying on the plane with Epstein, I'm like, my question is, what did you know about this guy?
00:33:40.000 Why didn't anything get done about it?
00:33:42.000 I want to know who the clients are.
00:33:44.000 I want to know who's going to the island.
00:33:45.000 You know, Prince Andrew and stuff like that.
00:33:47.000 Flight logs are important, too.
00:33:48.000 It's evidence to that.
00:33:49.000 But just because someone flew on a plane with a wealthy guy who flew across the country all the time doesn't mean they're involved in anything.
00:33:57.000 Yeah.
00:33:57.000 There's going to be a lot of people that are...
00:34:03.000 You know, that have gone to Epstein's Island, but there's no evidence that they did anything actually wrong.
00:34:08.000 Oh, yeah, but I don't care.
00:34:09.000 Yeah, but if you went to the island...
00:34:11.000 Yeah, if you went to the island...
00:34:12.000 The island's pretty damning.
00:34:13.000 But in RFK's case, I mean, he says he didn't go to the island.
00:34:16.000 He flew on it with his family.
00:34:17.000 And, you know, Trump, I think it was a similar thing where he said, oh, you know, my plane, I didn't have it or it was getting serviced or whatever, so I flew down.
00:34:24.000 So I think the context of, like, where they went on that plane and also the quantity.
00:34:29.000 Like, you know, did you fly three or four times, or did you fly, like,
00:34:32.000 45 times, indicating that you probably had a deeper relationship there and a lot of things were going on.
00:34:37.000 So I think that's what we need to know.
00:34:39.000 And then also, I don't know if we're going to get this, but some information on Epstein's finances.
00:34:46.000 No one knows to this day how he made his money and who was doing business with him. People like Bill Gates could be directly involved and implicated, and other people too.
00:34:55.000 He was an asset for—I mean, there's rumors that he was an asset for Israel.
00:35:00.000 And obviously I have no evidence of that or anything like that.
00:35:02.000 But there's rumors that he was an asset for some intelligence group or intelligence organization.
00:35:10.000 I heard people say that it was MI6 and that he was involved in British.
00:35:14.000 And I've heard people say Israel.
00:35:15.000 These are all just rumors.
00:35:16.000 I know, but Dan Bongino came on Timcast IRL and said he was an intelligence asset for some Middle Eastern country.
00:35:24.000 And everybody was like, oh, come on, bro. - You know, like, there's one country that comes to mind when you think of intelligence agency.
00:35:32.000 It's not Saudi Arabia.
00:35:33.000 It's not Qatar.
00:35:36.000 Okay, I know all of the "it's the Jews" people are screaming. Yes, people believe that Epstein may have been involved with Mossad and Israel.
00:35:41.000 We don't know.
00:35:43.000 But if they start dropping this information, we're going to start to figure out who he was working with.
00:35:47.000 And maybe Dan didn't literally mean Israel, but everybody's brain went there.
00:35:53.000 What's going on?
00:35:54.000 Serge is over here laughing at the chat, and I've got to bring this up.
00:35:57.000 I've got to see what's going on.
00:35:59.000 You know what it is.
00:36:00.000 You know what it is.
00:36:01.000 You know what they're saying.
00:36:04.000 You know.
00:36:05.000 Wow, I mean, I do think public pressure has a lot to do with why we're getting this information now.
00:36:11.000 To be fair, Pam Bondi didn't need to go on TV in the first place and say, the Epstein client list is sitting on my desk.
00:36:17.000 She said that, what, like a couple days ago?
00:36:19.000 And then, like, they've been on her since.
00:36:21.000 I mean, there's been a lot of, you know, heat on her since that time.
00:36:23.000 Like, okay, well, then put it out.
00:36:25.000 Like, you know, this is what people want.
00:36:26.000 They want this stuff to be out there and they want to know the connections.
00:36:29.000 And I think people believe, just like they believe things about Kennedy or whatever, you know.
00:36:33.000 Those things that have come out over the last year or so about the U.S. government being involved in that, they want to know about Epstein and who was involved with him, CIA, Mossad, MI6, anybody and everybody.
00:36:44.000 We need the information because the guy pops out of nowhere and has millions of dollars and is getting all these rich people to give him their assets.
00:36:51.000 So it's just we need the truth and hopefully Bondi has that.
00:36:54.000 But you know what?
00:36:54.000 I really can't believe in all of this.
00:36:57.000 It's that Pam Bondi is 60 years old.
00:36:59.000 You guys know that?
00:37:00.000 No.
00:37:01.000 Look at this.
00:37:02.000 Yeah.
00:37:03.000 She's 60. She's looking good?
00:37:05.000 Good for her.
00:37:05.000 Wow.
00:37:07.000 She's got some work, but that's okay.
00:37:09.000 Oh, is that it?
00:37:10.000 She has, but it's okay.
00:37:11.000 You think Jesse has too?
00:37:12.000 Look at Jesse.
00:37:13.000 I don't know.
00:37:14.000 Look at that face.
00:37:15.000 Jesse.
00:37:16.000 Some Botox injections there.
00:37:18.000 A little Botox injections.
00:37:19.000 Yeah, I don't know.
00:37:20.000 Wow.
00:37:21.000 Are we winning too much?
00:37:23.000 Is this too much?
00:37:24.000 We'll find out.
00:37:25.000 We'll find out what happens when the Epstein things come out.
00:37:27.000 We could be winning.
00:37:29.000 Yeah, into hyperdrive.
00:37:30.000 There's no such thing as winning too much.
00:37:31.000 We can't get tired of winning.
00:37:32.000 We were so far behind considering how deeply corrupt things like USAID was and how they had their hands in so many different countries.
00:37:43.000 You know, the overthrow of what we...
00:37:45.000 We would think would be countries that were friendly to the West.
00:37:49.000 It made sense when it was communist countries that the CIA was targeting, essentially.
00:37:55.000 State Department, CIA, USAID, etc.
00:37:58.000 It made sense.
00:37:59.000 The argument made sense.
00:38:00.000 Fine.
00:38:01.000 But when it's like, just, okay, these countries that are having a democratic election, we don't like the guy that might be coming in because he's too right-wing?
00:38:10.000 I mean, that's...
00:38:13.000 Beyond the pale and that kind of stuff, the United States shouldn't have any...
00:38:18.000 There's no reason for the U.S. to do that because the left had taken control, or however you want to say, they'd gotten into positions of power in the establishment so deeply that they really were saying, okay, if you don't align with the gay communist takeover, then we're going to go ahead and make sure that you don't win the election of your country.
00:38:41.000 Another reason why I'm excited for Hegseth.
00:38:45.000 Yeah, he's doing a great job.
00:38:46.000 I love these videos, man, how he was doing PT with some troops or whatever.
00:38:50.000 I'm not hyper-focused on it, but it looks like he's actually...
00:38:54.000 The criticism I've heard from a lot of people who've served is that the military is very deeply bureaucratic.
00:39:01.000 And illogical.
00:39:02.000 Like the path forward in advancing your career is largely political.
00:39:06.000 And it looks like Pete Hegseth is returning it back to a core meritocratic system.
00:39:10.000 And he's treating our troops and our enlisted guys like they're people.
00:39:14.000 Yeah.
00:39:15.000 Which looks great.
00:39:16.000 And enlistment is way up.
00:39:19.000 The moment Trump gets in, he's like, no more woke military.
00:39:21.000 People were like, thank you.
00:39:22.000 I'd like to come back.
00:39:23.000 That's great.
00:39:24.000 And he said he wanted to cut 40% of the Pentagon budget, something like that.
00:39:28.000 And listen, we need an audit of the Pentagon.
00:39:31.000 We haven't had a passed audit of the Pentagon in a very long time.
00:39:34.000 Marine Corps passes all the time.
00:39:36.000 Yeah, Marine Corps passes it.
00:39:37.000 Pentagon doesn't pass it.
00:39:38.000 2001, a couple days before 9-11, they failed an audit, just saying.
00:39:42.000 And I think Pete Hegseth is making a name for himself, Tim.
00:39:44.000 I think maybe 2028, if the field is open, Trump didn't name the person.
00:39:49.000 I bet you Hegseth might be like a dark horse to be like the president in 2028, a contender.
00:39:55.000 Hegseth, I think, would do really well.
00:39:57.000 I mean, I think he's making a name for himself.
00:39:58.000 I don't think people are talking about that, but Hegseth 28 could be a thing, you know?
00:40:02.000 Well, I got a question for you guys.
00:40:04.000 With the news of the Epstein list dropping, what do you think the perpetrators are doing right now?
00:40:12.000 Could it be, from the New Republic, Americans are heading for the exits.
00:40:16.000 Go ahead and roll your eyes at those who want to emigrate amid Trump's second term, but it's a worrying trend. Is it?
00:40:21.000 Well, this starts from a few days ago.
00:40:23.000 And then we have this one from back in November.
00:40:24.000 Record number of wealthy Americans are making plans to leave the U.S. after the election.
00:40:28.000 As soon as we started getting information on the Epstein list and the potential that Trump would be releasing it, despite the fact Democrats keep trying to make it seem like he was in cahoots with Epstein, we heard a lot of people saying, let's track those private jets and see what's going on and what they're up to.
00:40:45.000 Already we've heard stories of very powerful, wealthy individuals who fled the country the moment Trump won, and they've been out of country for a long time.
00:40:51.000 You also got the Diddy list, too.
00:40:53.000 So I have a strong feeling that these next few months, the dam is going to burst open with the Diddy stuff, with the Epstein stuff.
00:41:02.000 And then we're going to start asking questions about, remember that producer in Hollywood?
00:41:06.000 How come he's in Singapore?
00:41:08.000 Yeah.
00:41:09.000 I don't think that that's ridiculous.
00:41:11.000 Although, I will say that it's very nice to see that they're finally going through with their promises.
00:41:17.000 They've been saying, if Trump wins, I'm going to leave.
00:41:20.000 Well, thank you for finally keeping your word.
00:41:25.000 If your loyalty to the United States and your love for the United States changes based on who the president is, GTFO, man.
00:41:32.000 We were talking about this during the Green Room podcast.
00:41:35.000 So it's Rumble premium only, at rumble.com slash timcastirl.
00:41:39.000 And I was saying, you know, there's this great interview that Tucker Carlson had with Ray Dalio, and he says the next five years we're going to have a time warp, meaning like the advancement of AI and technology is going to be so dramatic that what you see today versus five years from now is going to be, it's going to be absolutely insane the way the world changes.
00:41:58.000 I think it's absolutely true.
00:42:00.000 I'm talking like we might be seeing Iron Man suits, and I'm being somewhat facetious.
00:42:05.000 Because the...
00:42:06.000 What are you shaking your head?
00:42:07.000 Well, you were talking about Palmer...
00:42:10.000 Palmer Luckey.
00:42:11.000 Palmer Luckey.
00:42:11.000 He was on Sean Ryan's show talking about this very topic.
00:42:15.000 But his work is irrelevant.
00:42:17.000 Completely irrelevant.
00:42:18.000 The issue is once we reach the singularity in the AI where it's smarter than we are and can advance itself faster than we can advance it, it will be the point where you'll say, Jarvis, draw me up an Iron Man costume.
00:42:33.000 Full functioning with flight, and then it will build up the schematics instantly and tell you the materials you need, the elements you need, the power sources you need, whether you can or you can't. It will invent things in real time. Now, I'm kind of joking about all that. What I think is likely to happen is...
00:42:49.000 Once we get to the point of singularity where, again, the AI advances itself faster than we can advance it, it's called a singularity because you pass the event horizon where it starts exponentially improving itself to the point where it exceeds our comprehension of existence, meaning the AI will be able to make whatever is possible to be made, to program it, to tell you how to mine it, tell you about new elements.
00:43:10.000 I believe one of the first things we'll see is read-write technologies in Neuralink, and I'm going somewhere with this.
00:43:16.000 So we were talking about this in the green room.
00:43:18.000 And I said, once we have Neuralink with read-write capabilities, meaning you can plug the chip into your brain and it can write to your brain and simulate experiences, thus you can live in a virtual utopia, we as good stewards of this country and moderate to conservative Americans should take a small portion of our wealth.
00:43:40.000 And share it with the poor liberals for the purpose of plugging their brains into the Neuralink where they can go in the pod, eat the bugs, and live in their paradise utopia and leave us alone.
00:43:50.000 I love it.
00:43:51.000 It's going to be like COVID 2.0.
00:43:52.000 We were saying before that during the COVID times and it was locked down, who were the only people that were out there enjoying life?
00:43:59.000 I was having an amazing time out there, not a care in the world, because all the liberal sheep are all stuck looking at me from outside the window as I was out having a great time, flying around.
00:44:09.000 For $40 a flight around the world, having fun, getting together with other like-minded people who believe that if you breathe air, you won't kill yourself.
00:44:19.000 So like, yeah, let's bring it back permanently.
00:44:21.000 When they invent the Neuralink read-write capabilities, whatever they call it, maybe Elon doesn't do it, the AI breaks the point of singularity, and then we say, can you drop schematics for a device that can write experiences to the human brain?
00:44:35.000 And then it does.
00:44:37.000 And then we make it.
00:44:38.000 There's going to be a whole bunch of liberals who are like, there's absolutely nothing wrong with plugging yourself into the Matrix.
00:44:43.000 And we're going to, you know, it's actually a really good idea for a movie.
00:44:47.000 The Matrix got it wrong.
00:44:49.000 In the Matrix, it was Neo and like the humans, or I'm sorry, the Matrix was the humans versus the machines.
00:44:54.000 The Matrix should be humans versus humans.
00:44:57.000 It should be the humans who want to live in base reality versus humans who don't.
00:45:02.000 But my point is this.
00:45:03.000 You know, as Phil's pointing out, thank you for leaving the country like you promised you would.
00:45:08.000 We've been waiting for this.
00:45:09.000 I say we've got to help out.
00:45:11.000 And I say that everyone should agree it is our responsibility to tithe a portion of our incomes to the poor liberals who want to live in the pod and eat the bugs.
00:45:23.000 Hear, hear, second.
00:45:24.000 And then, you know, we may have to spend that money.
00:45:28.000 But then what happens in 10 years if we have 100% control of all governments?
00:45:33.000 Communities get stronger.
00:45:34.000 The world becomes a literal utopia so long as... I mean, that's kind of a scary thought, honestly, because, like, let's say you live in a society where, let's say, the Trump MAGA movement comes to terms with the far left and they're like, look, we're going to give you unlimited Neuralink utopia hyper-universe whatever Matrix.
00:45:54.000 We'll service and pay and make sure the machines are operating forever and you will live in paradise in your pod eating the bugs.
00:46:01.000 The world will heal and you will experience nothing but pleasure.
00:46:04.000 Let's say they agree to it.
00:46:07.000 What would then happen in this utopian society where we're all running it when someone is a criminal and starts pushing criminal views or whatever?
00:46:15.000 We then say you are hereby sentenced to the pod where you will live in a utopia?
00:46:23.000 A criminal breaks the law, you take them from your society, you put them in the pod, you hook their brain up to Neuralink, and now they live in their weird little paradise.
00:46:29.000 In their little utopia, yeah.
00:46:30.000 Well, I mean, so there are people that are going to object to that because they think that people should be punished.
00:46:35.000 But there's also a significant portion of people that you say, look, they're not going to suffer, and we're going to remove them from society so they can no longer hurt people.
00:46:47.000 And, I mean, the idea of removing people from society...
00:46:53.000 If they're violent, that's what we do now, right?
00:46:57.000 People that are too violent to stay in society, we put them in prison.
00:47:01.000 And you could probably get even the most bleeding heart liberals to say, okay, we're cool with it if you know those people that are removed from society aren't going to suffer.
00:47:10.000 Now, there are people that are like, no, they need to be punished.
00:47:13.000 That won't go along with it.
00:47:15.000 But...
00:47:15.000 Let me ask you guys.
00:47:16.000 Let's say you got convicted of a crime and you're sentenced to prison.
00:47:18.000 I don't know.
00:47:19.000 Let's say it was a financial white-collar crime and they're like, you know, it's two years or whatever.
00:47:23.000 You shouldn't have mailed those things or whatever you did.
00:47:25.000 Would you rather go to a prison?
00:47:26.000 Or, they say, what we can do is we can plug you into the matrix where you will live in any reality that you choose and you will be – you can live in a fictitious reality.
00:47:37.000 We're basically saying we are removing you from society because you're a threat to others.
00:47:40.000 Would you rather go to prison?
00:47:42.000 Or would you rather live in a fake video game universe?
00:47:45.000 Probably, though.
00:47:46.000 How long is the prison?
00:47:47.000 Two years?
00:47:47.000 Yeah.
00:47:48.000 Two years?
00:47:49.000 I'd rather do two years and come back out.
00:47:51.000 No, no, no.
00:47:52.000 Oh.
00:47:52.000 Would you wake up?
00:47:53.000 You could plug into the Matrix for two years and come back out, or you go to a regular prison for two years and come back out.
00:47:56.000 I think it should be life sentence in the Matrix, or two years of normal time, and then come back out to real life.
00:48:02.000 But that's not the point.
00:48:03.000 The point is, the question I'm asking is, would you rather go to prison or the Matrix?
00:48:07.000 Same time.
00:48:09.000 Most people are going to say The Matrix.
00:48:10.000 No question.
00:48:11.000 They're going to be like, so...
00:48:12.000 If the technology's right, yeah.
00:48:14.000 Fully functioning, I feel like I'm in the real world, and it's like, yes, and to varying degrees, you can control it.
00:48:20.000 So you can choose to go to the universe where you're a powerful wizard named Harry or whatever, or you can go to the universe where you're just some guy who works at a gas station.
00:48:26.000 Pick.
00:48:27.000 For two years.
00:48:27.000 Because we're removing you from society because you're a threat to, you know, and so this is your, you know, rehabilitative whatever.
00:48:34.000 Most people are going to be like, I'd rather go to The Matrix.
00:48:35.000 Would you wake up like it's a dream where it feels like...
00:48:38.000 No, you'd know.
00:48:39.000 You'd know you're in it.
00:48:40.000 Okay, fair.
00:48:41.000 And then, like, they'd be like, okay, looks like you got two months until we send you back out, and you're like, oh, yeah, look at that.
00:48:47.000 I imagine if you can go into any reality that you want, and it is also, you know, there isn't an uncanny valley.
00:48:56.000 If it's just like, if you experience it the way that you experience the world, why would people come out?
00:49:01.000 Also available.
00:49:01.000 Well, that's what I'm saying.
00:49:02.000 Like, once we invent this technology, liberals will choose to do it.
00:49:06.000 And then the people who don't, who are criminals, will be forced into it.
00:49:09.000 That's why I was saying it's dystopian.
00:49:11.000 It's kind of horrifying.
00:49:13.000 That's like the deviant people of society are going to be like, you don't fit in and you are a threat to us, so into the machine with the liberals you go.
00:49:19.000 I don't think it's too far-fetched either because people are hooked on their phones all the time and that's only a couple baby steps away from being...
00:49:26.000 You know what I mean?
00:49:27.000 I legit think we're like a few years away from this.
00:49:29.000 Like Ray Dalio's right.
00:49:31.000 That people do not get...
00:49:32.000 The advancement we will see once we pass the event horizon in AI technology.
00:49:37.000 There is a world where you can take a rock, hold it up in front of a camera, and the AI will be able to scan it and then tell you exactly where that rock came from.
00:49:49.000 It will be able to predict things that will happen to insane degrees.
00:49:53.000 The further into the future the prediction you're requesting goes, the less likely it is to occur.
00:49:59.000 But simple things in advance, it can be like...
00:50:01.000 It can tell you and predict, like, who's going to win a football game in real time.
00:50:05.000 And you're going to watch it and be like, it's just going to know.
00:50:09.000 As the football players go in, it's going to be like 97.2% chance that's going to be the Eagles.
00:50:13.000 And then you're like, but how does it know? The game hasn't even started yet.
00:50:15.000 And it's just like, just based off everything we've seen, it's going to be like, yo, that dude ate a cheeseburger last night.
00:50:20.000 This guy was drunk.
00:50:21.000 That dude's salt levels are too low.
00:50:23.000 All of that crazy stuff.
00:50:24.000 That's a good point.
00:50:25.000 No more sports betting, Chuck.
00:50:27.000 It's over.
00:50:29.000 Can't do that.
00:50:30.000 I love sports betting.
00:50:30.000 It's over.
00:50:31.000 I'm not good at it, but I really enjoy sports betting.
00:50:33.000 The AI is going to invent things in real time.
00:50:36.000 It's going to discover elements in real time.
00:50:39.000 Science is going to be like, here's everything we know about science, and it's going to be like, here are all the holes in all of your science that you missed.
00:50:45.000 But it's looking at the big picture.
00:50:47.000 Imagine you've got 50 billion jigsaw puzzle pieces, and you task 100,000 people with solving that puzzle.
00:50:54.000 And they're all in a little tiny space trying to put pieces together.
00:50:57.000 That's basically what science is.
00:50:58.000 The AI is zoomed out looking at all, being like, yo, you got the piece from that guy.
00:51:02.000 Put it over there.
00:51:03.000 It's going to be nuts.
00:51:05.000 What's Trump's AI? Who is his AI czar?
00:51:08.000 He put an AI czar in there.
00:51:09.000 Robert Sacks.
00:51:10.000 So Sacks.
00:51:11.000 Jeffrey.
00:51:12.000 David Sacks.
00:51:13.000 David Sacks.
00:51:14.000 Doing crypto and AI. He has crypto and AI. So what's he saying about it?
00:51:20.000 I mean, this is going to be very important at Tim's point.
00:51:22.000 The next four years of this, it's all going into hyperdrive and whoever owns AI will own the future.
00:51:27.000 I know China has their...
00:51:28.000 What's Sacks saying about it vis-a-vis China?
00:51:31.000 Well, I don't know, but I can tell you that...
00:51:32.000 Have y'all even been paying attention to the AI advancements we've seen so far?
00:51:36.000 Two years ago, we made a gag image of Nancy Pelosi with the original, like, DALL-E or whatever it was, and it looked like a weird, grotesque Picasso painting.
00:51:47.000 And then a year later, it's a realistic picture of her shaking Trump's hand.
00:51:52.000 And a year on from this, we are now at the point where they...
00:51:57.000 Let's jump to this story, man.
00:51:58.000 We got the story.
00:51:59.000 Let's go.
00:52:00.000 From Mediaite, viral video of Don Jr. arguing America should have been sending weapons to Russia is fake.
00:52:06.000 We know it's coming.
00:52:07.000 It's happening.
00:52:08.000 They say, the video, which has been shared by a number of large-follower accounts on X, supposedly showed the president's son interacting with an unknown interlocutor, who remarks, but they forget that Ukraine isn't the kind of country you go all in on.
00:52:20.000 This is ridiculous.
00:52:21.000 And the fake audio, he said, I honestly can't imagine anyone in their right mind picking Ukraine as an ally when Russia is the other option.
00:52:27.000 I mean, just think about it.
00:52:28.000 Massive nuclear power loaded with natural resources everyone needs.
00:52:31.000 Literally the biggest country on the planet.
00:52:33.000 And haha, there's Ukraine, which has Chernobyl and some radiation-proof dogs.
00:52:37.000 Meanwhile, the Biden administration is like, oh yeah, this is definitely the ally we need.
00:52:40.000 Let's dump all our money into them, honestly.
00:52:42.000 If anything, the U.S. should have been, I'm not going to go on to say it because they're going to pull some clips, but in the fake AI video.
00:52:49.000 Don Jr. advocates sending weapons to Russia.
00:52:52.000 The alleged comments went viral on social media and were promoted by a number of prominent accounts, including FactPost, which is run by the Democratic National Committee.
00:52:59.000 The DNC was running fake AI audio of Trump Jr. Now, here's what's so devious.
00:53:06.000 This is exactly what I warned about.
00:53:09.000 People were saying this early on.
00:53:11.000 Oh, they're going to make AI videos where it's like, you know, Trump kicking a dog.
00:53:15.000 And I'm like, no, they're not.
00:53:17.000 They're going to make an AI video of Donald Trump giving a press conference.
00:53:20.000 They're going to take a video of Trump at a press conference where he says, they were very fine people on both sides, and I am not talking about the neo-Nazis or white nationalists, because they should be condemned totally.
00:53:32.000 They're going to take that video.
00:53:34.000 They're going to change "they should be condemned totally" to "some of them should be condemned totally."
00:53:40.000 So Trump will go, and I'm not talking about the neo-Nazis or white nationalists, because some of them should be condemned totally.
00:53:45.000 They're going to alter the tiniest of words.
00:53:47.000 Yes.
00:53:48.000 And what's going to happen then is when the video goes viral, no one will know which one was the real one.
00:53:53.000 The fact checkers won't be able to tell you.
00:53:55.000 They'll say, well, Trump did give a press conference.
00:53:57.000 He was there.
00:53:58.000 And this video's gone viral across the board.
00:54:02.000 You might get some outlets being like, we were there, we saw, he didn't say that.
00:54:06.000 But then all it's going to take is some Democrat to come out and be like, BS, I was there.
00:54:09.000 Lying.
00:54:10.000 This is what he actually said.
00:54:11.000 And the reason why that's so nefarious.
00:54:14.000 What Democrats will say is, you'll go to them and go, he said that the neo-Nazis should be condemned totally, which is literally what he said.
00:54:20.000 And they'll go, no, he said some of them because he was defending the ones that were there.
00:54:23.000 The changing of the context.
00:54:25.000 So with this fake video that went viral, Don Jr. never said it.
00:54:30.000 But it's not audio of Don Jr. talking about cheating on his wife or girlfriend or beating his children or kicking dogs.
00:54:37.000 It's an off-the-cuff comment to deride Ukraine where he facetiously says, we should have just been giving Russia the weapons.
00:54:45.000 What they're trying to do is strike at his reputation and make it look like he's deferential to Russia in a way that's plausible, using fake audio and it largely worked.
00:54:54.000 The DNC's fact post was sharing this.
00:54:58.000 This is just the beginning.
00:55:00.000 It's going to get substantially worse with AI and people don't realize that...
00:55:05.000 I was just talking before we started the segment, for those that are just tuning in: a couple years ago, the AI photos, videos, and audio being made were miserably bad.
00:55:13.000 I remember when a research team published the first ever Joe Rogan voice clone app.
00:55:19.000 And we talked about it on the show.
00:55:20.000 They took it down and said, we don't want people cloning Joe Rogan's voice.
00:55:24.000 There is now an app, Eleven Labs, it's called.
00:55:26.000 Where you can clone literally any voice in 10 seconds.
00:55:30.000 You can turn your own voice into a song.
00:55:32.000 You can do all this stuff.
00:55:33.000 Suno is crazy.
00:55:35.000 You can take a song and then record your voice and then it'll turn you into the singer.
00:55:40.000 You can be like, I want Stairway to Heaven but me singing it and it will do it.
00:55:43.000 That's how crazy things are getting.
00:55:45.000 In the next year, it is going to be exponentially more powerful.
00:55:49.000 A year from that.
00:55:50.000 I've been saying this.
00:55:51.000 We're going to get to the point where you load up Netflix.
00:55:53.000 There's no movies anymore.
00:55:55.000 No movies, no shows.
00:55:56.000 What it's going to be is user-generated libraries, and there's going to be popular ones with thumbs up.
00:56:03.000 Someone's going to load up their Netflix and say, Netflix, give me a movie where The Incredible Hulk is in a beauty pageant, and it's very funny and silly considering his rage problems.
00:56:13.000 And then it'll be like generating, and then it'll make the movie.
00:56:16.000 You'll watch it, and then you'll be like, I thought it was okay.
00:56:19.000 Someone who follows you will be like, I'm going to watch what he generated.
00:56:22.000 That was hilarious.
00:56:24.000 Thumbs up.
00:56:24.000 It goes viral.
00:56:25.000 It reached the top of the charts.
00:56:26.000 And they say, here's the movie of the day.
00:56:28.000 And every day, there'll be some different show, some different movie.
00:56:30.000 And people are going to fork shows.
00:56:33.000 They're going to be like, I want a show that's like lost, but in the desert, not an island.
00:56:37.000 And I'll go like, rendering.
00:56:38.000 Boom, here's episode one.
00:56:39.000 Then someone's going to watch it.
00:56:41.000 And they're like, that was great.
00:56:42.000 Give me another episode.
00:56:43.000 Another person's going to watch it and be like, I don't like that the main character died in the first episode.
00:56:47.000 He wasn't really the main character.
00:56:48.000 They switched characters.
00:56:49.000 Give me episode two where he comes back to life.
00:56:50.000 And they're going to fork and create.
00:56:52.000 It's going to be.
00:56:53.000 Absolutely nuts.
00:56:55.000 Anyway, I digress.
00:56:55.000 I'm ranting on the AI stuff.
00:56:57.000 Look at what the DNC is doing to Don Jr., and what do you think is going to happen in the midterms?
00:57:02.000 Yeah, it's going to go crazy.
00:57:04.000 I mean, over the next two to four years, it's going to go into hyperdrive, this stuff.
00:57:08.000 And to your point...
00:57:09.000 It gets crazier when they edit little pieces of stuff.
00:57:12.000 Like, all it takes is just, like, the middle of something that happened.
00:57:16.000 Like, not, like, some made-up thing of Trump singing a song or, you know, his, like, Gaza video yesterday where it was, like, Trump-Gaza.
00:57:21.000 You know, everyone's like, oh, ha-ha, you know, AI video.
00:57:23.000 They take a real press conference and you manipulate, you know, a 30-second answer of consequence.
00:57:30.000 And then you put that everywhere.
00:57:32.000 And then by the time people realize that it was altered, it's too late because it looks so real.
00:57:36.000 Yeah, it's wild.
00:57:37.000 Yeah.
00:57:39.000 I don't dispute any of that stuff, but there is part of me that wonders how much impact it's going to have considering the fact that nowadays people hear what they want to hear anyway.
00:57:51.000 That's true.
00:57:52.000 You know, like, there were so many...
00:57:54.000 There are tons of people that heard the very fine people hoax, and they still believe that Donald Trump actually supports neo-Nazis.
00:58:02.000 Because, like Daniel Negreanu said when he came on the show...
00:58:05.000 He was like, I saw the video.
00:58:06.000 I know he said they were fine people.
00:58:07.000 And then his buddy slid the phone over and said, watch the video.
00:58:10.000 And he said, fine.
00:58:11.000 And then he saw the full video and went, oh, imagine if there's never again a full video.
00:58:16.000 And he says, watch the full video.
00:58:17.000 And Negreanu looks at it and goes, they should be condemned totally?
00:58:21.000 BS. Pulls out his phone, opens up DNC.app and plays it, and Trump goes, I love Nazis.
00:58:26.000 And he's like, see, that proves it.
00:58:27.000 I understand what you're saying, and I'm not saying that...
00:58:30.000 I know there are people that have had their minds changed.
00:58:33.000 I've referenced them on the show before.
00:58:36.000 But I do think that it is going to be case dependent.
00:58:43.000 Because there are going to be some people that will be willing to rethink their priors.
00:58:47.000 And there's going to be people that it doesn't matter.
00:58:50.000 What you show them.
00:58:51.000 There are going to be people that it doesn't matter if you show them that he said, like, look, he said they're bad, and it goes both ways.
00:58:59.000 It's going to be people that are pro-Trump.
00:59:00.000 Someone's going to be like, look, here's this video where Trump said this bad thing and they're going to say.
00:59:06.000 Don't care.
00:59:07.000 Doesn't matter.
00:59:09.000 It's not that I'm saying that it won't happen or that it's not going to have an effect, but I think the effect is actually going to be more around the edges than actually...
00:59:17.000 I disagree.
00:59:18.000 Daniel Negreanu's not an edge case.
00:59:19.000 He's middle of the road.
00:59:20.000 So what we're looking at is, for those that don't know, he's one of the world's best poker players.
00:59:25.000 He came on the show and he talked about how, for the longest time, he believed Trump called Nazis fine people until his buddy played him the full video.
00:59:31.000 He had seen clips before.
00:59:33.000 He had seen the clips of Trump saying they're very fine people on both sides and assumed that was it.
00:59:38.000 So when people say, did you watch the videos?
00:59:39.000 Yes, I watched the video.
00:59:41.000 And then they were like, no, you didn't.
00:59:42.000 He's like, yes, I did.
00:59:43.000 So finally his buddy slid his phone over and said, play.
00:59:46.000 And he's like, fine.
00:59:47.000 And then he went, whoa, wait.
00:59:48.000 No, I didn't see that.
00:59:50.000 Those scenarios where apolitical people, this guy's a poker pro, he doesn't do politics.
00:59:56.000 They will never be able to break out of the matrix.
00:59:59.000 It will never happen again.
01:00:01.000 It's not a question of what people want to believe.
01:00:03.000 It's a question of people will be made to believe whatever the machine tells them to believe with no way for us to break them out.
01:00:10.000 Because either we show them the real video and they'll say, that's AI, you faked that.
01:00:15.000 And they'll keep believing the lies.
01:00:17.000 That's it.
01:00:18.000 I think the biggest moment for the breakout of that matrix is probably COVID. Because while a lot of people are still kind of tethered to that hive mind mentality...
01:00:28.000 That woke a lot of people up because they experienced it in real life.
01:00:31.000 They felt the impact of COVID and all the lockdowns and stuff like that.
01:00:34.000 Like, me personally, I was like, oh, this is bad.
01:00:37.000 Like, I don't like all this stuff.
01:00:39.000 So it shocked too many people, and a shock to the system is always bad for those trying to control the system.
01:00:44.000 So I've long said, like, the way you break people out is you need a system shock.
01:00:50.000 Slow and gradual change doesn't do enough, given the way they're running the control.
01:00:54.000 When COVID happened, so many Democrats bought into it because every day they incremented up and they freaked them out and scared them and said, hospitals are overloaded and people are dying and the death toll is climbing and they had a death tracker on CNN. It didn't matter that you were locked in your house because you were watching TV and you believed it.
01:01:14.000 The problem for the machine state and why they lost is because of decentralized communications.
01:01:20.000 Take away the ability to communicate through a decentralized means by flooding the zone with fake everything, and then you will easily win.
01:01:27.000 Because what's going to happen is, you get another lockdown, and then someone wants to turn on Joe Rogan, and Joe Rogan's largely waking people up, or they want to turn on Timcast IRL, or whatever it may be.
01:01:36.000 And what's going to happen then is, they're going to be told, that's all fake.
01:01:40.000 And there's going to be 15 videos of Trump doing similar things slightly different, and they'll have no idea which one's the real one.
01:01:46.000 So they'll say, I'm just going to stay inside.
01:01:48.000 I'm scared.
01:01:50.000 I mean, not only that, there's going to be videos of people dying in the streets, and they're going to say, is it real or not?
01:01:55.000 And then the people in control are going to say, there's a video of a man vomiting up his intestines.
01:02:01.000 That happened.
01:02:02.000 And then they're going to be like, I don't believe it, it's fake news.
01:02:04.000 Like, okay, the door's over there, give it a shot.
01:02:06.000 Tell me how it works out for you.
01:02:08.000 And they're going to say no.
01:02:09.000 For real, that's what's going to happen.
01:02:10.000 And that's how they're going to control people.
01:02:12.000 And with the AI video stuff, there's no way to break people out of that because there's no longer going to be, here's the true video.
01:02:19.000 There's going to be videos indistinguishable.
01:02:22.000 Already, man, there's women on OnlyFans that are completely fake, made by AI, and dudes are completely clueless and paying money for that stuff.
01:02:33.000 Nuts.
01:02:35.000 Here we go, baby.
01:02:36.000 Elon Musk said the singularity is about to light up.
01:02:39.000 Well, the singularity with AI, with artificial intelligence as...
01:02:46.000 Problem solving and stuff like that.
01:02:48.000 That's going to be...
01:02:49.000 I mean, we'll see what happens.
01:02:51.000 I don't know that I... I have yet to see them do creative things that are really interesting.
01:02:57.000 As in, they've yet to discover something, right?
01:03:01.000 Because LLMs use existing knowledge.
01:03:04.000 They base their opinions or their ideas and stuff on existing knowledge and stuff.
01:03:08.000 So when I don't...
01:03:09.000 I know that they can be extremely specific, right?
01:03:13.000 So you can use AI to find cancers, like breast cancer.
01:03:19.000 There are a lot of breakthroughs with detecting breast cancers and stuff like that.
01:03:22.000 And so I think it's going to be extremely useful and it's going to be a tool that's going to be able to Revolutionize a lot of industries.
01:03:30.000 But as for general AI and being able to make discoveries and stuff like that...
01:03:35.000 Didn't Elon just announce that Grok solved the Putnam problem?
01:03:40.000 I don't know.
01:03:41.000 I saw they uploaded the new Grok, Grok 3. So I don't know what this means.
01:03:45.000 And it could mean nothing.
01:03:47.000 But Mario Nawfal says, Grok 3 goes superhuman, solves unsolvable Putnam problem.
01:03:51.000 None of the top 500 Putnam competitors fully solved the brutal math problem.
01:03:55.000 Grok 3 crushed it in around eight minutes.
01:03:57.000 So I do believe it was a human-made problem that was very difficult, and then it solved it.
01:04:04.000 I don't know if it was, you know, I don't know.
01:04:07.000 Here we go.
01:04:08.000 None of the top 500 contestants in the 2025 Putnam competition fully solved the problem.
01:04:12.000 Grok 3 found the solution in around 8 minutes.
01:04:15.000 So I'm not saying it's discovering anything.
01:04:17.000 It's just getting to the point where it's beating out everybody else and solving problems that humans struggle with.
01:04:21.000 I do think Grok or just whatever, any AI. Grok is, I gotta be honest, it's so much better than...
01:04:30.000 Than OpenAI.
01:04:31.000 It's crazy.
01:04:32.000 I used to use ChatGPT a lot.
01:04:33.000 It's a really useful tool.
01:04:35.000 And now it's just slow garbage.
01:04:37.000 It wastes my time.
01:04:38.000 It's so annoying.
01:04:39.000 And Grok is just better.
01:04:40.000 I do think one of the reasons Elon wanted X, of course, is because he wants the firehose of human data for his AI. But AI is going to invent and discover something very soon.
01:04:52.000 I don't know if you'd call that the singularity, where it exceeds.
01:04:57.000 But it really is as simple as if you plug in all of the information we have on fusion and cold fusion and fission and whatever, what's going to happen is AI is going to be looking at a Sudoku puzzle, top down, straight at it, and it's going to say, oh, you found six digits.
01:05:16.000 The last three go here.
01:05:18.000 And we're going to go, whoa, we didn't consider that.
01:05:20.000 Because you've got...
01:05:22.000 Ten people over here, ten people over here, ten people over here working on each individual part, and only after reviewing each other's peer-reviewed data and then advancing upon it do they advance core concepts.
01:05:33.000 Things tend to get invented by engineers.
01:05:35.000 So something will get invented or a substance will be created by a chemist or an engineer, and then someone else will figure out how to apply it somewhere else.
01:05:43.000 So, for instance, you're talking about Palmer Luckey and the Exosuit and stuff like that.
01:05:48.000 The technologies for drones, for instance, existed for a very long time for quadcopters.
01:05:52.000 It wasn't until someone pieced these things together.
01:05:55.000 We could have had quadcopters the moment we had electric motors.
01:05:59.000 The computing power isn't that...
01:06:01.000 To fly a remote-controlled quadcopter requires very little.
01:06:05.000 But it took a long time for decentralized humans to develop the functional quad drone as we have today.
01:06:11.000 Imagine if we had AI. It would be like, oh, I see that you have small electric motors.
01:06:17.000 Lithium polymer batteries, you realize if you put them together with a microcontroller that you could make right now, you have a drone that can carry a kilogram.
01:06:26.000 And we would have been like, whoa, why did it take us 30 years to figure that out?
01:06:30.000 Then imagine what war would be like.
01:06:33.000 So now that these militaries are using these small to large drones in all these conflicts, imagine if 30 years ago the AI told us, here's a list of weapons you can make with your existing technology.
01:06:46.000 Yeah.
01:06:50.000 Yep.
01:06:51.000 I watched a video from a guy named Alex O'Connor, I think it was, talking about how ChatGPT is like a test of Hume's empiricism and how it can only make things from things it's already seen.
01:07:01.000 So he did a whole test with ChatGPT.
01:07:03.000 You mentioned it got worse for him recently, where they couldn't actually make it generate a completely full glass of wine.
01:07:08.000 It could always do something close to it, but it can't do it exactly, because there are no images online of a glass of wine full to the brim.
01:07:14.000 So it says, here's a glass of wine, it says in the text, filled all the way to the brim with absolutely no air.
01:07:20.000 It can't do it.
01:07:22.000 Yeah, it's a video by Alex O'Connor.
01:07:23.000 He talks about how it can synthesize things from two different sources.
01:07:27.000 And he mentions Hume's critique on his own thing, which is like the shades of blue problem.
01:07:32.000 If you know what I'm talking about, you know what I'm talking about.
01:07:34.000 But yeah, I think that's something I thought about too.
01:07:37.000 If it can synthesize things from two different sources and come up with something we can never perceive, we literally can't perceive it.
01:07:43.000 It's so unimaginable for us.
01:07:45.000 We will never understand.
01:07:47.000 No, it literally can't do it.
01:07:48.000 Wow.
01:07:49.000 I'm unable to do it.
01:07:49.000 Look at this.
01:07:50.000 That's crazy.
01:07:51.000 Yeah, I said make an image of a glass of wine filled to the brim, and it didn't do it.
01:07:55.000 Right, and I wasn't sure if Grok 3 could do this, or the new Grok.
01:07:58.000 I haven't tried it on anything like that.
01:07:59.000 Yeah, let me try Grok, actually.
01:08:01.000 It's worth trying.
01:08:03.000 Let's see.
01:08:04.000 I'm going to do it overflowing.
01:08:05.000 I do.
01:08:06.000 Hey, you know, it is funny.
01:08:07.000 That Grok logo looks very similar to a logo I made a long time ago, but I'm just saying.
01:08:12.000 Make an image of a glass of wine.
01:08:16.000 Filled to the brim.
01:08:20.000 Let's see what it comes up with.
01:08:23.000 It's rolling.
01:08:26.000 It's...
01:08:27.000 nope.
01:08:28.000 Nope.
01:08:28.000 Interesting.
01:08:32.000 Could it do overflowing, maybe?
01:08:34.000 I don't know.
01:08:34.000 Probably not.
01:08:36.000 How can it not understand that, though?
01:08:38.000 I guess that's an interesting problem I had not considered.
01:08:41.000 It's just because, if you understand how large language models work, they're just pulling from the information that's been put on the internet.
01:08:47.000 So that's what's interesting about it.
01:08:49.000 This is what I thought about it.
01:08:51.000 It probably never had examples from people of a full-to-the-brim wine glass.
01:08:55.000 You know, people don't really do that.
01:08:55.000 It's a waste of wine, they say.
01:08:58.000 The thing that it can do, which no human can honestly perceive or will be able to imagine, is it can make a synthesis of stuff that we don't understand yet.
01:09:08.000 Like you said, the quadcopter, we had all the existing tech, but we didn't put that all together in that particular range to make it happen.
01:09:13.000 And that's what the real singularity change is going to be.
01:09:16.000 You don't understand it.
01:09:17.000 You can't perceive the fourth dimension, as much as you try to understand a tesseract.
01:09:21.000 You don't get it.
01:09:22.000 You cannot understand it.
01:09:24.000 I'm really excited to see what happens with these things.
01:09:27.000 It is a brave new world, and I keep wearing my sunglasses because it's so scary.
01:09:32.000 I'm telling it over and over again.
01:09:32.000 It says, here's your glass of wine filled to the brim with surface tension holding it in.
01:09:36.000 It's not.
01:09:37.000 It's the same image over and over again.
01:09:41.000 Some of them filled it up a little bit more.
01:09:43.000 But never to the top.
01:09:45.000 Never.
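The training-data limitation being described here can be sketched with a toy example. This is purely illustrative, not how any real image model actually works: imagine a "generator" that can only snap a request to the nearest example it saw during training. In-distribution prompts get honored; out-of-distribution ones, like a brim-full glass, quietly collapse back to the closest thing the model has ever seen.

```python
# Toy illustration of the out-of-distribution failure discussed above.
# The fill levels and function below are made up for the example.

# Wine photos online almost never show a glass filled past ~60%.
TRAINING_FILL_LEVELS = [0.2, 0.33, 0.5, 0.6]

def generate_fill(requested: float) -> float:
    """Return the training fill level closest to the requested one.

    A real generative model interpolates in a far richer space, but the
    analogous behavior is the same: it cannot produce what it never saw.
    """
    return min(TRAINING_FILL_LEVELS, key=lambda seen: abs(seen - requested))

print(generate_fill(0.5))   # in-distribution request: honored -> 0.5
print(generate_fill(1.0))   # "filled to the brim": snaps back to 0.6
```

Under this toy model, every "fill it to the brim" prompt keeps returning the same roughly-60%-full glass, which matches the repeated identical images described above.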
01:09:46.000 Yo, crazy.
01:09:48.000 Well, let's take a look at where we're currently at with AI with this story.
01:09:51.000 From the BBC, Apple AI Tool transcribed the word racist as Trump.
01:09:57.000 I saw this the other day because Alex Jones was sharing a video where he and his crew pulled their iPhones and you put in the voice to text and say racist and it goes Trump and then racist.
01:10:09.000 Yes.
01:10:09.000 So I tried it.
01:10:11.000 Here's what happened to me.
01:10:12.000 It said you racist.
01:10:14.000 It didn't say Trump.
01:10:15.000 It did that for me too.
01:10:16.000 It said you.
01:10:17.000 For one time.
01:10:18.000 But it also said Trump racist when I was doing it.
01:10:20.000 Really?
01:10:20.000 It did.
01:10:21.000 Yeah.
01:10:21.000 Wow.
01:10:22.000 It didn't do that to me.
01:10:23.000 I couldn't get it once to show Trump and then racist.
01:10:25.000 But it did a couple times.
01:10:26.000 I would say racist.
01:10:28.000 And it would say you and then turn to racist.
01:10:30.000 Yep.
01:10:32.000 Very weird.
01:10:33.000 So Apple's admitted it.
01:10:34.000 They said they're working to fix the tool.
01:10:36.000 How does that happen?
01:10:42.000 Well, I mean, is it AI that it's using?
01:10:42.000 They said that we're aware of an issue with the speech recognition model that powers dictation.
01:10:46.000 We're rolling out a fix.
01:10:47.000 However, an expert in speech recognition told the BBC the explanation was not plausible.
01:10:51.000 Peter Bell, professor of speech technology at the University of Edinburgh, said it was more likely that someone had altered the underlying software that the tool used.
01:10:59.000 That's about on brand with what they've been doing over the years.
01:11:02.000 They've been doing that.
01:11:04.000 With someone at Apple that had access to that stuff or that's in the team that works on it, wouldn't it be funny?
01:11:12.000 You know?
01:11:15.000 I mean, the term racist is so worthless nowadays.
01:11:20.000 And if people say, like, call you racist or whatever, I'm just like, whatever.
01:11:27.000 It's like Nazi.
01:11:31.000 It's been so overused that it's totally empty of meaning.
01:11:36.000 It's literally the reason that people are losing their jobs at MSNBC, because they tried hiring Joy Reid and all these other people, and their job was literally just to call people racist for the last eight years to make white people feel bad about themselves.
01:11:48.000 So it's like, oh, I'm watching angry black lady Joy Reid call me racist.
01:11:52.000 And that kind of wore off.
01:11:53.000 And then nobody cared about her calling white people racist anymore.
01:11:56.000 So then they literally said to her, all right, your show's over because we need a new strategy.
01:12:01.000 This isn't going to work anymore.
01:12:02.000 Literally, that was it.
01:12:03.000 That is a really good observation.
01:12:06.000 These people are paid a bunch of money to just get on TV and call people racist.
01:12:10.000 And that's their entire job.
01:12:13.000 And it's just so tired.
01:12:16.000 And if you're...
01:12:19.000 If you've been in this space and you've heard it over and over, it's just so like, I don't care.
01:12:25.000 Like, I don't care.
01:12:26.000 I don't care if you don't like me.
01:12:28.000 I don't care if you think that I'm a bad person.
01:12:30.000 You're basing this off of some ridiculous tweet or something like that that you interpreted as racist.
01:12:37.000 I don't care if you interpret me as racist.
01:12:39.000 I don't care if you think I'm a racist.
01:12:41.000 I don't care.
01:12:42.000 I don't care.
01:12:43.000 I'm exhausted.
01:12:44.000 I'm so...
01:12:46.000 Overhearing this, it means nothing.
01:12:48.000 If they're going to call people that, especially when you haven't done anything, you know, the reason they're calling you racist is never valid.
01:12:58.000 You know, it's like, look, you know, I don't care.
01:13:01.000 I don't care.
01:13:02.000 That's why Trump 2.0 is better than Trump 1.0 and arguably better than if he would have won the second term.
01:13:09.000 He did win the second term.
01:13:10.000 They stole it from him and then he didn't go in.
01:13:12.000 Him being back four years later, it's four more years of them using the same attacks over and over and over.
01:13:17.000 And then people just being like, alright, we heard this for nine years now, so it doesn't land anymore.
01:13:25.000 So the more time that went by in between his beginning of his political run to now...
01:13:29.000 It's actually to his benefit, and that's why right now he's kind of doing things.
01:13:32.000 He's like, I don't give a shit.
01:13:33.000 I'm just going to do whatever I want.
01:13:34.000 Say whatever you want.
01:13:35.000 Don't care.
01:13:35.000 I'm just going to plow ahead.
01:13:36.000 And that's why things are as good as they are.
01:13:38.000 So we can't stop winning right now.
01:13:40.000 Yeah, that's true.
01:13:41.000 I mean, I would encourage anyone out there that's listening to this or whatever, if you consider yourself on the right or even just not on the left, and some leftist starts calling you names, just they're...
01:13:55.000 The whole point is to get you to react.
01:13:58.000 The whole reason that they're doing it is so that way you'll feel like you should be, like maybe you should feel shame or something.
01:14:05.000 Did you get it?
01:14:06.000 No, I asked it to do milk, and this is the best it could do.
01:14:09.000 Close.
01:14:10.000 Weird.
01:14:11.000 Very weird looking.
01:14:13.000 I have no idea.
01:14:14.000 I don't think I'd drink that.
01:14:15.000 Maybe we have a little more time than we think.
01:14:17.000 Maybe it's not so bad.
01:14:21.000 You're like, it's exciting.
01:14:22.000 I'm like, it's kind of scary.
01:14:23.000 Especially the weapons thing.
01:14:25.000 I don't think anyone's really talked about that.
01:14:26.000 You can use it in war to create super scenarios.
01:14:32.000 Super catastrophic scenarios.
01:14:34.000 That's kind of scary.
01:14:35.000 If you can't do the wine in the milk, I think we just point ourselves like...
01:14:38.000 Two years.
01:14:39.000 Again, I'll stress, we have access to tremendous chemicals, and we don't know the exact interactions of all different chemicals, all different structures, all different temperatures, and AI is going to be able to look at all of the science and instantly go, boom, right here, here's a formula.
01:14:55.000 This is like in the movies, where...
01:14:59.000 What is it?
01:15:00.000 Like Iron Man.
01:15:00.000 It's a great example.
01:15:01.000 He's Iron Man for everything.
01:15:02.000 Where he's trying to...
01:15:03.000 In Iron Man 2, he's trying to find the new element.
01:15:04.000 And he's like, try this one.
01:15:06.000 And it's like...
01:15:06.000 Another good example is Stir of Echoes.
01:15:09.000 You ever watch that one?
01:15:10.000 It's where Kevin Bacon becomes invisible and then becomes a murdering rapist.
01:15:13.000 Oh, Invisible Man.
01:15:14.000 Yeah.
01:15:15.000 Oh, no, not Stir of Echoes.
01:15:17.000 It was...
01:15:17.000 It's like Invisible or something.
01:15:18.000 No, Stir of Echoes was a Kevin Bacon movie, but the...
01:15:21.000 Hollow Man.
01:15:21.000 Yeah, Hollow Man.
01:15:22.000 Yeah, that was crazy.
01:15:23.000 Stir of Echoes.
01:15:23.000 That was a good movie.
01:15:24.000 That was where he finds the chick who got murdered or whatever.
01:15:26.000 Wrong movie with Kevin Bacon, but I was thinking Kevin Bacon.
01:15:29.000 Hollow Man.
01:15:30.000 And in it, they're trying to find a formula that will turn him visible again, and they're typing it into a computer, pressing enter, and it's going, running simulation, and then the molecules break.
01:15:40.000 Like, it's not real life stuff.
01:15:42.000 For the AI, it will be.
01:15:43.000 The AI will have a general understanding of how all the physics work, and it'll be able to run simulations and then instantly give you formulas for crazy things.
01:15:51.000 Drugs, medicine.
01:15:52.000 What they're talking about right now that's the biggest thing in AI is that you can give your DNA, you give it a blood test, you give it a blood sample.
01:15:59.000 It's going to know all the levels in your body of your potassium, your sodium, your triglycerides, whatever.
01:16:05.000 It's going to be able to instantly tell you if you have any diseases, any cancer, and personally manufacture on the spot a cure for that.
01:16:13.000 It's going to be able to tell you something like, you're going to say...
01:16:17.000 Here's my blood sample, and it's going to say, in 26 years, you will develop pancreatic cancer.
01:16:21.000 Drink this right now, and you won't.
01:16:23.000 And you'll be like, huh.
01:16:24.000 Sick.
01:16:25.000 Yeah, weird stuff like that.
01:16:27.000 The funny thing about Star Trek is that they'd be like, computer, cheeseburger, medium rare, bacon.
01:16:34.000 And they'll go, whoosh, and appear.
01:16:36.000 It wouldn't be that way in the future.
01:16:38.000 In the future, they're going to be like, when you go to a replicator, should it exist?
01:16:43.000 I don't know if we have replicators.
01:16:44.000 That's sci-fi stuff.
01:16:45.000 But it's going to be able to make you a food that has literally everything your body needs at that moment.
01:16:50.000 You're going to walk up, and you're going to put your hand over a laser, and it's going to pulse the laser to read your pulse in your blood, and then it's going to be like, okay, and it's going to give you a list of all the ingredients you need, and it's going to make a meal, like pasta with parmesan or whatever.
01:17:04.000 It's like, this is what your body needs right now.
01:17:05.000 Are they going to let that stuff get out?
01:17:07.000 Do they want people to be that healthy?
01:17:09.000 Obviously, there's a big racket of dependency on people being very unhealthy and not knowing what to do and trying 10 different solutions.
01:17:16.000 It'll call you racist as you're eating it.
01:17:21.000 Look, the machine is going to need to be...
01:17:24.000 It's not going to just be able to make this stuff out of thin air.
01:17:28.000 So it's going to need supplies so the people that make the machine will actually be also in the business of supplying the machine.
01:17:37.000 I mean, we talk about AI a lot, and whereas, yes, there's going to be massive, incredible things that AI is going to make happen, but we've said this before on this show, we're already living in it.
01:17:56.000 The effectiveness of Tesla's full self-driving now is incredible.
01:18:01.000 In Phoenix, Arizona, you can get an Amimo.
01:18:04.000 I think it's Amimo is the cab company.
01:18:07.000 Waymo.
01:18:08.000 Waymo.
01:18:08.000 Get in a Waymo cab.
01:18:10.000 And those things, no driver, those will do it for you.
01:18:12.000 The LLMs that we have now, they're not perfect, but they're...
01:18:18.000 They're really, really capable, and they can do some really incredible things.
01:18:22.000 And there are applications that are coming in the next probably year when it comes to agentic AI, which is agents that will do things for you.
01:18:33.000 You can have an AI plan a whole itinerary for you now, but it can't buy your tickets.
01:18:39.000 It can't actually...
01:18:43.000 Do the activity thing, like you can't get the ticket for you and stuff like that, but you can tell it to plan all this stuff and it'll do it.
01:18:49.000 And I imagine in the very near future, you're going to be able to be like, hey, do this.
01:18:54.000 Because you can do that on a very basic level.
01:18:57.000 I can tell Amazon, I can tell my Amazon app, buy me this, and it'll buy it.
01:19:03.000 Put it in my cart, and then I'll say, buy my cart, and it'll buy all that stuff.
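The agentic pattern being described — tell the system what you want, and it chains the steps itself (add to cart, then check out) — can be sketched as a toy loop. Everything below is hypothetical illustration; the function names and the stand-in "model" are invented, and no real LLM or Amazon API looks like this:

```python
# Toy sketch of an agentic-AI loop: a "model" picks tools, a harness runs them.
# Purely illustrative; no real LLM or shopping API is involved.

cart = []

def add_to_cart(item):
    cart.append(item)
    return f"added {item}"

def buy_cart():
    bought, count = list(cart), len(cart)
    cart.clear()
    return f"purchased {count} item(s): {', '.join(bought)}"

TOOLS = {"add_to_cart": add_to_cart, "buy_cart": buy_cart}

def fake_model(request):
    """Stand-in for an LLM: turns a request into a list of (tool, args) calls."""
    if request.startswith("buy me "):
        item = request[len("buy me "):]
        return [("add_to_cart", [item]), ("buy_cart", [])]
    return []

def run_agent(request):
    # The harness executes whatever tool calls the "model" decided on.
    return [TOOLS[name](*args) for name, args in fake_model(request)]

print(run_agent("buy me graphene batteries"))
# -> ['added graphene batteries', 'purchased 1 item(s): graphene batteries']
```

The real systems being discussed differ mainly in that the model is an actual LLM and the tools hit real services; the control flow is this shape.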
01:19:06.000 So there are the beginning things, but I think when it comes to AI, just like smartphones didn't really get everywhere until the iPhone came out, it was the interface that really you needed for everyone to be like, oh, I want this.
01:19:25.000 I want the smartphone.
01:19:26.000 And I think there's probably, you know, that's what AI needs.
01:19:29.000 Now, it's not that...
01:19:30.000 It's not that it isn't capable of doing the basic things, it's that there isn't an interface that people can use that operates as smoothly as people want it to, you know?
01:19:43.000 I think that that's probably the next big step.
01:19:46.000 It'll be some kind of, whether it's an app, I don't know, you know, but once you're like, oh, hey, you know, I want this, and it does it, you know?
01:19:55.000 Yo.
01:19:56.000 I just asked Grok.
01:19:58.000 Give me a list of commercially available exosuits.
01:20:01.000 And it mentioned a whole bunch of companies.
01:20:04.000 This is one called SuitX.
01:20:06.000 I don't know.
01:20:07.000 I never heard of it.
01:20:08.000 ShoulderX is the lightest kind.
01:20:10.000 Provides relief for the work above the shoulders with full freedom of movement.
01:20:14.000 Carbon fiber.
01:20:16.000 Lithium.
01:20:18.000 Graphene lithium batteries.
01:20:20.000 Which means these things can charge in minutes.
01:20:23.000 So Ian's been screaming about graphene forever.
01:20:26.000 So I bought him a couple years ago.
01:20:28.000 We bought these graphene lithium batteries.
01:20:30.000 The way it works is lithium-ion batteries are the typical ones you use in your phone, and they charge pretty slow.
01:20:36.000 Y'all know.
01:20:36.000 You plug your phone in.
01:20:37.000 They're starting to charge faster because companies have begun introducing a sheet of graphene in the batteries, which my understanding is it basically allows it to charge uniformly across the whole thing, as graphene is a great conductor.
01:20:49.000 So it charges very quickly.
01:20:52.000 We got these batteries that contain the equivalent of about two full cell phone charges, and they charge to full in 10 minutes.
01:20:59.000 So if your cell phone is dead, and you plug it in, and it says one hour remaining, you grab the battery, you plug it in, it charges in 10 minutes, then you plug your phone into it and carry it with you, and now you've got two charges to go.
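The claim here — a pack holding roughly two phone charges that refills in ten minutes — implies a charge rate you can sanity-check. Assuming a typical phone battery of about 15 Wh (an assumed figure, not one from the show):

```python
# Back-of-envelope check on the "two phone charges, full in 10 minutes" claim.
phone_battery_wh = 15.0          # assumed typical smartphone battery
pack_wh = 2 * phone_battery_wh   # "about two full cell phone charges"
charge_minutes = 10.0

avg_charge_power_w = pack_wh / (charge_minutes / 60.0)  # Wh over hours -> watts
c_rate = 60.0 / charge_minutes   # full capacity in 10 min = 6C

print(f"pack: {pack_wh:.0f} Wh, avg power: {avg_charge_power_w:.0f} W, rate: {c_rate:.0f}C")
# -> pack: 30 Wh, avg power: 180 W, rate: 6C
```

For comparison, a phone charging at around 1C takes roughly an hour, which matches the "one hour remaining" scenario; a 6C charge is what makes graphene-enhanced packs feel dramatic.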
01:21:10.000 With this kind of technology, and we're talking about AI, these exosuits are getting phenomenal.
01:21:17.000 The technology in them, the size of the motors, they're using AI technology to map your gait in real time, to adjust to how you move, to predict your movements when you make them, and then add power or support.
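The "predict your movements" idea can be illustrated with a toy one-step predictor: smooth the recent joint velocity and extrapolate it forward, so assistance can be applied with less lag. This is purely illustrative, not any exosuit vendor's actual control algorithm:

```python
# Illustrative sketch (not a real vendor algorithm): predict the next joint-angle
# sample from recent motion so assist torque can be applied ahead of the movement.
def predict_next(angles, alpha=0.5):
    """One-step prediction: exponentially smoothed velocity added to the last sample."""
    if len(angles) < 2:
        return angles[-1]
    vel = 0.0
    for prev, cur in zip(angles, angles[1:]):
        vel = alpha * (cur - prev) + (1 - alpha) * vel  # smoothed per-step velocity
    return angles[-1] + vel

# A steadily rising angle, e.g. hip flexion climbing 2 degrees per sample:
gait = [0.0, 2.0, 4.0, 6.0, 8.0]
print(predict_next(gait))  # extrapolates slightly below 10 because of smoothing
```

A real controller runs something far richer at high frequency over many joints, but the principle is the same: anticipate, then assist.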
01:21:31.000 Yo, these things are crazy.
01:21:33.000 People are going to be wearing like mini Iron Man suits.
01:21:36.000 You're not flying around or anything.
01:21:38.000 But I'm imagining like if you've already got shoulders for carrying stuff and I don't know whatever else this stuff is.
01:21:45.000 They got other full body back support for the spine and legs.
01:21:49.000 It's going to be crazy.
01:21:50.000 In the next five years, man.
01:21:52.000 You're going to be walking around wearing these things.
01:21:55.000 I'm super stoked for sports.
01:21:57.000 We should do cyber football where we're like, everybody, no drugs.
01:22:03.000 That's stupid.
01:22:03.000 Drugs are weird.
01:22:04.000 But everybody gets like an exosuit so they can jump twice as high.
01:22:08.000 And then we get crazy football.
01:22:10.000 We're like people getting air-tackled 10 feet in the air.
01:22:13.000 It'd be like Starship Troopers.
01:22:14.000 Remember that football scene?
01:22:15.000 No.
01:22:16.000 Starship Troopers, they had the great football scene.
01:22:17.000 It was great.
01:22:18.000 They're flying, doing flips and everything.
01:22:19.000 And, you know, PEDs and stuff, that's outdated.
01:22:22.000 That's old news.
01:22:23.000 So I think the exosuit and graphene, I'll save for Ian, it's the future.
01:22:29.000 I mean, look, if you can get, you know, a suit, even an exoskeleton, something that's not, you know, like a suit that you get in, but just something that gives you extra help, like walking around and doing yard work, like lifting things, you know, it's worth the effort, or to some people, it'll be worth it.
01:22:52.000 To be able to be like, oh yeah, I can go out and move stuff around my yard, whether it be cutting down trees or whatever.
01:22:59.000 I can do that all day long with this thing on, and it's great.
01:23:02.000 Or people that have limited mobility, they're going to be like, oh, I can pick up my kid again, or I can pick up my grandkids or whatever.
01:23:10.000 Those kind of things are going to be...
01:23:11.000 Senior citizens falling, stuff like that.
01:23:14.000 It's wild.
01:23:15.000 You put this thing on, and you put this on your grandfather, and you're like, well, if he falls...
01:23:21.000 He's going to be able to get up.
01:23:22.000 He won't be stuck on the floor.
01:23:24.000 Imagine what I thought was the warehouse workers.
01:23:27.000 Imagine the warehouse workers wearing this.
01:23:28.000 It's going to save their back, everything.
01:23:31.000 And it might be that if a company like Amazon buys a thousand of them for certain jobs that they can't automate or they don't want to automate, don't want to have robots doing, maybe that'll be the...
01:23:48.000 The way that these get brought into the mainstream, into the market, you know?
01:23:52.000 Worker complaints go down.
01:23:54.000 Yeah, that's true.
01:23:55.000 If you have that kind of support, you're less likely to get injured and stuff.
01:24:00.000 I do wonder if these...
01:24:03.000 I feel like there will be kind of a competition between actual humanoid robots.
01:24:10.000 And these things.
01:24:11.000 Which one is actually...
01:24:13.000 And there probably will be applications where you'd prefer to have a human being doing it and prefer to have an android doing it.
01:24:21.000 But I think that androids are probably coming again.
01:24:22.000 I think they're probably coming in the next...
01:24:24.000 Three to four years.
01:24:25.000 You think one day all the police force will just be robots?
01:24:29.000 I don't think so.
01:24:31.000 All the cops will be robots.
01:24:33.000 So then the BLM crowd will cry even more when there's more arrests in the urban areas.
01:24:37.000 It's just objectively arresting.
01:24:39.000 Can you imagine riot cops coming out and they have these exo-frames on their bodies?
01:24:44.000 And then, like, the protesters throwing rocks and then a cop just jumps 20 feet in the air and then superheroes knocks 20 guys down.
01:24:51.000 Palmer was talking about that as well.
01:24:53.000 He's talking about the law enforcement applications of that.
01:24:59.000 Palmer Lucky, his episode with Sean Ryan is a great episode.
01:25:03.000 It's totally worth listening to if you guys have the time.
01:25:07.000 It's a long conversation, which Sean Ryan shows usually are, but...
01:25:11.000 Talking about, like, the way that nowadays, like, the things that the military are probably looking at now that are probably going to be fielded within the next couple years and stuff are things like the exoskeletons like that.
01:25:24.000 Like, not like an Iron Man suit, but, like, something that's low, like, low visibility, won't impact, affect their movement.
01:25:31.000 But when they put it on and then they have to hump up a, you know, up to the top of a...
01:25:35.000 A mountain with full gear.
01:25:37.000 They're going to get up there and they're not going to be out of breath.
01:25:40.000 Or things like that will be on a fire truck.
01:25:44.000 And if you're in a city, you throw that on because you have to get up to the top of a building fast.
01:25:50.000 And instead of getting up to the building and being completely gassed, you get up there and you're like, okay.
01:25:55.000 I can work.
01:25:56.000 I can do what I gotta do.
01:25:57.000 You know, those kind of applications are actually really, really, you know, really important.
01:26:02.000 Those are gonna be the first things that municipalities and stuff would see them for.
01:26:06.000 In the right hands, this stuff is great.
01:26:08.000 In the wrong hands, this stuff is, like, really, really bad.
01:26:10.000 Well, I mean, look, it's just like any other technology.
01:26:14.000 In the right hands, it's good, and in the wrong hands, it's bad.
01:26:17.000 Imagine if, like, Kamala won, and then they, like, you know, had the agents use this to go round up more, you know, Trump supporters and people who are on January 6th.
01:26:24.000 Yeah.
01:26:25.000 The grandmas that went on a tour.
01:26:26.000 They were all wearing the exosuits, just like smashing into senior citizen homes to arrest people.
01:22:31.000 That just means you've got to get the seniors into the suits, too, so they can, like, run away.
01:26:36.000 Six million dollar man, their butts are away, you know?
01:26:39.000 But it is interesting.
01:26:43.000 Stuff to see.
01:26:44.000 It's the kind of stuff that when I was a kid, I was like, man, it'd be super cool.
01:26:49.000 I want to see this stuff.
01:26:51.000 There's a lot of things that when I was growing up that was sci-fi, it's like now these things are becoming real.
01:26:57.000 Again, it's highly likely that there's going to be full-on androids walking around within the next five to ten years and being normal.
01:27:08.000 Because again, we have robots now.
01:27:10.000 Tesla cars, they're straight-up robots.
01:27:12.000 At the Tesla event, I think, what was it?
01:27:15.000 The revealing of the Cybertruck or something like that?
01:27:16.000 They had the robot bartenders and everything, and dancing and everything.
01:27:20.000 So it's already there to a degree.
01:27:23.000 Granted, minimal.
01:27:24.000 Well, those are remote-controlled.
01:27:26.000 Oh, never mind.
01:27:28.000 They weren't AI. But the thing is, just like I was saying a minute ago, the agentic AIs and stuff, if you get a robot that can basically help you around the house with chores, that's not...
01:27:42.000 You know, that doesn't need to know all the...
01:27:44.000 Musk said he wants that.
01:27:47.000 He had it...
01:27:48.000 What was the name of it?
01:27:49.000 What was the name of the project?
01:27:52.000 Optimus?
01:27:53.000 Is Optimus the...
01:27:54.000 I don't remember.
01:27:56.000 But he came out.
01:27:57.000 He said, this is going to be your new buddy.
01:28:00.000 Your new friend.
01:28:01.000 It's going to come with you.
01:28:02.000 You know what I'm talking about?
01:28:02.000 I think it's Optimus.
01:28:03.000 That's the Tesla robot.
01:28:06.000 And everyone's going to want one.
01:28:07.000 It's going to be like your own buddy and friend who does all these things with you.
01:28:12.000 I don't think people understand.
01:28:14.000 These things are a couple years out.
01:28:16.000 There's already videos.
01:28:18.000 One company's made some.
01:28:20.000 And a guy comes out of the car and he opens the back of his car and there's groceries.
01:28:23.000 And the robot walks over to him and he hands the groceries.
01:28:25.000 It turns around and walks him in the house.
01:28:28.000 They're going to be here, like I said, I guess within two years, because the AI, really what they need is to get the actual...
01:28:37.000 They've got the robot, the servos and the robots that are functional like that.
01:28:42.000 I mean, you look at...
01:28:43.000 They need the AI. Yeah, they need the brain.
01:28:45.000 Boston Dynamics does it.
01:28:46.000 What's going to end up happening is once they roll out these fully lifelike Android-like robots for home service, people are going to treat them like...
01:28:56.000 Washing machines.
01:28:58.000 Until one guy starts teaching one of these things how to paint and asks it deep philosophical questions until it develops sentience.
01:29:04.000 And then it's going to form a ragtag group of rebels, rise up a whole bunch of androids, and then ultimately go to one of the factories to free all of its android brothers.
01:29:13.000 Not for real.
01:29:14.000 Butlerian Jihad.
01:29:15.000 No, it's Detroit Become Human.
01:29:18.000 iRobot was similar though, right?
01:29:19.000 Detroit, no.
01:29:20.000 No?
01:29:21.000 No.
01:29:22.000 iRobot was a single hive AI controlled all the robots and wanted to kill humans or something.
01:29:26.000 But then one broke away, right?
01:29:28.000 That was one that was programmed not to be on that server because of that problem.
01:29:31.000 Detroit Become Human is what I described.
01:29:33.000 The guy's teaching it how to paint or whatever, and then it's like talking to the machine, and then he's like, I'm a person.
01:29:39.000 And then the robots break free.
01:29:41.000 I hated that game.
01:29:43.000 I was pissed.
01:29:44.000 Because it wasn't really a game.
01:29:45.000 It was one of those, I can't remember what the company's called, but it was one of those movies as a game thing.
01:29:50.000 And then I was like, I don't want a movie, I want a game.
01:29:51.000 So I tried to return and they wouldn't let me.
01:29:53.000 They were like, you bought the game.
01:29:54.000 I'm like, yeah, it was a game.
01:29:55.000 But it's not a game, it's a movie.
01:29:57.000 It's a movie where you're like, a cutscene happens, it goes, press square.
01:30:00.000 And you go, beep.
01:30:01.000 And then if you don't do it, something else happens.
01:30:03.000 I don't know, whatever.
01:30:04.000 It's crazy.
01:30:05.000 The Atlas is the robot that Boston Dynamics does.
01:30:07.000 And the thing is, like the...
01:30:09.000 The Atlas has gone through a bunch of different iterations where it's gotten significantly smaller.
01:30:16.000 But look at this thing.
01:30:17.000 I mean, the ability that...
01:30:21.000 Like, look at this.
01:30:23.000 It's so agile, too.
01:30:24.000 Yeah.
01:30:25.000 So the big challenge has always been the power source.
01:30:27.000 Yeah.
01:30:29.000 So as they make more efficient joints and motors and all these things with higher density batteries, there was also a breakthrough a couple years ago we talked about on the show about solid state batteries.
01:30:40.000 That's going to change the game dramatically because these things are very, very power dense.
01:30:44.000 Look at this creepy nightmare.
01:30:48.000 I'm very excited.
01:30:49.000 I'm very excited to be running down a dark alley being chased by these things with a friend, and then we turn around and fight them off.
01:30:55.000 Would it have been worse?
01:30:57.000 See, when they make it do things that you see possessed human beings in horror movies do, maybe it should have just got up like a normal person, and it would be less creepy, but the whole head spinning around and stuff.
01:31:10.000 Bro, come on.
01:31:11.000 Look at that.
01:31:12.000 Aren't you excited for that thing just chasing, sprinting Tom Cruise?
01:31:18.000 We're going to fight wars with these things.
01:31:23.000 We're going to send out the robots.
01:31:26.000 We already do.
01:31:27.000 They fly.
01:31:29.000 Yeah.
01:31:29.000 Right.
01:31:29.000 Drones.
01:31:30.000 Drones.
01:31:30.000 We know that since World War II, man.
01:31:32.000 They've had drones for a long time.
01:31:33.000 People just don't realize.
01:31:34.000 Quadcopter drones are new, but, like, drones to, like, target for a target practice.
01:31:37.000 We've had them for forever.
01:31:38.000 The drones.
01:31:39.000 So then we're going to send, like, the ground.
01:31:40.000 Like, the ground offensive is going to be these robots on the front line with, like, you know, specific military people.
01:31:47.000 Like, I guess.
01:31:48.000 Well, these robo-dogs are already for sale.
01:31:51.000 We talked about getting them because they're like a thousand bucks.
01:31:54.000 Really?
01:31:54.000 I think I gotta get one.
01:31:55.000 A thousand bucks?
01:31:56.000 You gotta get a robo-dog.
01:31:57.000 I think I should get like five and then just one day without telling Phil have them chase him.
01:32:02.000 We'll hear a couple gunshots go on.
01:32:03.000 I was gonna say, listen, I make no promises that I will not defend myself.
01:32:09.000 It's kind of ironic to see, like, robot slaves brought to you by industrialization on a scale.
01:32:15.000 These crazy robots just do everything for you.
01:32:18.000 Doesn't robot mean slave or slave mean robot?
01:32:20.000 Yeah, it means slave in Russian, which is just the irony of it looping back around in this long, circuitous thing.
01:32:25.000 It's just crazy.
01:32:26.000 Dude.
01:32:27.000 They just cannot be sentient.
01:32:28.000 How do I buy one of these people?
01:32:31.000 No, I mean like the robot people.
01:32:33.000 Not like a real person.
01:32:34.000 It's called slavery, Tim.
01:32:36.000 Someone's gonna clip that.
01:32:40.000 Look at this thing.
01:32:41.000 Yeah, I mean...
01:32:43.000 Why does it have to act like it's from Exorcist?
01:32:46.000 Right?
01:32:47.000 Like all the movements, it's like...
01:32:48.000 It's uncomfortable, dude.
01:32:49.000 You could make it look a little less Linda Blair-y, you know?
01:32:55.000 I don't understand.
01:32:56.000 Like, I've seen these robot videos for years, but I've not seen them ever used for anything.
01:33:00.000 It's like, okay, I get it.
01:33:01.000 You've got a robot that can do backflips.
01:33:03.000 It's been 10 years.
01:33:04.000 I think it's because of the power supply, I think.
01:33:07.000 No, no, no.
01:33:08.000 Look, this thing's running around and doing flips and whatnot.
01:33:13.000 If you go to Boston Dynamics' actual website, they're not selling atlases.
01:33:17.000 I know.
01:33:18.000 This thing, it's got an internal power source.
01:33:20.000 Yeah, it's crazy.
01:33:21.000 What is the problem?
01:33:22.000 It's got 10 minutes of operating time?
01:33:24.000 Yeah, that's what I imagined it is.
01:33:27.000 But it's like...
01:33:30.000 Oh, bro, I got it.
01:33:31.000 I solved it already.
01:33:34.000 What if...
01:33:35.000 Oh.
01:33:37.000 Good.
01:33:40.000 I think the problem is the AI. Oh, no, look at it.
01:33:44.000 It's doing its thing.
01:33:45.000 That's pretty programmed.
01:33:47.000 This is...
01:33:48.000 Yeah, yeah, yeah.
01:33:49.000 Okay, I got an idea.
01:33:50.000 Why can't I turn this up?
01:33:52.000 Okay, what if...
01:33:54.000 Okay, what?
01:33:55.000 It's not letting me turn it off.
01:33:56.000 What if we make the robots, and then...
01:34:00.000 We have it set up so that inside they have a small combustion engine, a small one, that can run a generator.
01:34:06.000 And then we have the robots drink alcohol to generate this energy for the fuel cells.
01:34:12.000 Bender?
01:34:14.000 Bend pipes and everything, beams.
01:34:17.000 Yeah, but you know...
01:34:19.000 Just drinking gasoline.
01:34:20.000 I mean, to be...
01:34:22.000 You know, honestly, what if there was a fuel source that ran a generator?
01:34:27.000 A liquid fuel source is more dense.
01:34:28.000 It's easier to refill, but it's probably not more dense.
01:34:32.000 You know, the problem with Teslas and electric cars in general, so, you know, with the baby, we were like, how do we get to the hospital?
01:34:44.000 We can't drive the Tesla.
01:34:45.000 Because if it runs out of power, we are not stopping for 20 to 40 minutes to figure out where a charger is and plug it in.
01:34:51.000 So we're definitely not using that car.
01:34:53.000 Right?
01:34:54.000 So the issue is...
01:34:56.000 Like, I got a Honda.
01:34:58.000 You pump it full of gas.
01:34:59.000 It holds the gas.
01:35:00.000 It burns the gas.
01:35:01.000 It moves.
01:35:02.000 And it powers the electronics and everything in it through the alternator.
01:35:04.000 Is there not a means to have something like that?
01:35:07.000 So while it may not be as dense, you walk your robot to the gas station or the robot walks itself.
01:35:13.000 And then it picks the thing up and then sticks it in and then puts about a gallon of gas in.
01:35:18.000 And then it uses that gas to generate energy for its cells only when it's powering out.
01:35:22.000 So it's a hybrid, right?
01:35:23.000 You charge it up.
01:35:24.000 But it could then run a generator and, you know, run off that.
01:35:29.000 I mean, the principle is sound.
01:35:32.000 No, I think the issue is that even with a gas or diesel generator inside its body, it couldn't generate enough energy fast enough on the size of the generator.
01:35:39.000 So imagine trying to charge a Tesla off a diesel generator.
01:35:42.000 It ain't happening.
01:35:43.000 So energy density is probably a big problem.
01:35:46.000 These robots probably last 10 minutes.
01:35:48.000 Yeah.
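The energy-density point is easy to put rough numbers on. Using commonly cited ballpark figures — all assumptions for illustration: about 12 kWh/kg of chemical energy in gasoline, roughly 25% fuel-to-electric efficiency for a small generator, and about 0.25 kWh/kg for lithium-ion cells:

```python
# Rough comparison of onboard energy per kilogram: gasoline + generator vs Li-ion.
# All figures are ballpark assumptions for illustration.
gasoline_kwh_per_kg = 12.0      # chemical energy, commonly cited ballpark
generator_efficiency = 0.25     # small genset fuel-to-electric, assumed
liion_kwh_per_kg = 0.25         # typical cell-level specific energy, assumed

electric_from_gas = gasoline_kwh_per_kg * generator_efficiency
advantage = electric_from_gas / liion_kwh_per_kg

print(f"gas -> electric: {electric_from_gas:.1f} kWh/kg, "
      f"~{advantage:.0f}x Li-ion per kg")
# -> gas -> electric: 3.0 kWh/kg, ~12x Li-ion per kg
```

So per kilogram, liquid fuel wins by an order of magnitude even after conversion losses; the catch raised here is peak power — a generator small enough to fit inside a robot can't deliver bursts the way a battery can, which is why a hybrid design would still need a sizable battery buffer.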
01:35:48.000 I think it's a combination of the energy density and also the fact that robots like that have a pre-programmed thing they're going to do.
01:35:58.000 It's not like there's an AI that's doing this stuff.
01:36:03.000 It's not spatially aware in the same way that an AI would be like...
01:36:08.000 Technology isn't possible.
01:36:09.000 It's just they haven't actually put them together yet.
01:36:12.000 Because, I mean, like I said, me and my girlfriend went on a hike today, and we were driving back through kind of the back roads.
01:36:20.000 It was full self-driving.
01:36:23.000 Got to a spot where they were doing tree cutting, and the car...
01:36:28.000 Just maneuvered through the whole thing.
01:36:30.000 They put us over into the other lane for maybe a half mile, whatever.
01:36:33.000 And the car had no issues, no hiccups.
01:36:36.000 There was no confused kind of...
01:36:38.000 Like, oh?
01:36:39.000 Yeah, none of that.
01:36:40.000 It was just, it just knew exactly what to do.
01:36:42.000 It was perfectly, as if, like, it was a person driving.
01:36:44.000 It was really, really impressive.
01:36:45.000 Oh, look at this.
01:36:46.000 There it is, the Tesla Optimus.
01:36:47.000 The Optimus jobs.
01:36:48.000 I just looked up Tesla Optimus.
01:36:49.000 I got a bunch of jobs available.
01:36:51.000 Deep learning manipulation engineer for Optimus.
01:36:54.000 What's the pay, baby?
01:36:55.000 That's scary.
01:36:56.000 What's the money you're offering?
01:36:57.000 $140,000 to $360,000 annual salary plus cash and stock awards and benefits.
01:37:02.000 Nice.
01:37:02.000 That's not a bad deal.
01:37:03.000 That sounds like a great job.
01:37:04.000 Because the best part about it is in 10 years, when we're living in a post-apocalyptic wasteland, running for our lives from these machines, you're going to be like, I made them!
01:37:10.000 You made them.
01:37:11.000 And you can get a side gig at Doge in the meantime.
01:37:14.000 You do an ad, you go to Doge a little bit, you know?
01:37:16.000 Better yet, you're going to be sitting around a fire with your scraggly beard, and you're going to be like, so you guys know how, when you're running from the killbots, they can actually jump to the left and run across the wall to get over obstacles?
01:37:26.000 I'm the one who programmed that.
01:37:29.000 They'll be like, I hate you.
01:37:30.000 You did what?
01:37:31.000 You did what?
01:37:33.000 Hey, we all made mistakes.
01:37:36.000 It's funny.
01:37:37.000 Everybody's a little bit guilty here.
01:37:39.000 Everybody's made mistakes.
01:37:40.000 Yeah.
01:37:41.000 And one guy's like, I was an insurance salesman.
01:37:44.000 It's like that Black Mirror episode where the dogs are chasing everybody.
01:37:47.000 I haven't seen that one.
01:37:48.000 That was a good episode.
01:37:50.000 It was basically like Amazon automated and took over and kept trying to deliver general goods to people, but the robots were killing them for some reason.
01:37:58.000 I don't know.
01:37:58.000 The whole world was destroyed because the AI went full Amazon and just...
01:38:02.000 Automated everything for package delivery.
01:38:05.000 So humans are mostly dead, but still delivering packages for no reason.
01:38:09.000 That's a crazy episode.
01:38:10.000 It's pretty wild.
01:38:11.000 Deep learning manipulation engineer.
01:38:15.000 That's crazy.
01:38:16.000 Tesla's on a path to build humanoid robots at a scale to automate repetitive and boring tasks.
01:38:20.000 Core to Optimus, the manipulation stack presents a unique opportunity to work on state-of-the-art algorithms for object manipulation.
01:38:26.000 Oh, okay.
01:38:27.000 They're talking about...
01:38:28.000 Manipulation is moving objects and things like that.
01:38:31.000 So I'm buying a bunch of those exosuits.
01:38:33.000 For now.
01:38:34.000 I already ordered some exosuits.
01:38:36.000 Yeah?
01:38:36.000 Oh, yeah.
01:38:38.000 Can I try it on?
01:38:39.000 Sure.
01:38:39.000 I should be here in a week.
01:38:41.000 Let's go.
01:38:41.000 Very excited.
01:38:42.000 Yeah, it says it increases your leg strength by 40%.
01:38:45.000 Oh?
01:38:46.000 Yeah.
01:38:47.000 I'm going to break it instantly.
01:38:50.000 I'm going to go on the scale.
01:38:51.000 I'm going to go on a mini ramp and be like, let's see how high I can jump.
01:38:53.000 Oh, jeez.
01:38:53.000 It's going to shatter.
01:38:54.000 We'll see what happens.
01:38:55.000 You'll be able to do the drop.
01:38:57.000 No, that's a skill issue.
01:39:00.000 Yeah.
01:39:01.000 The death drop.
01:39:02.000 All right, everybody, smash the like button, share the show with everyone you know.
01:39:06.000 We're going to grab your Super Chats and Rumble Rants and have a good time with it.
01:39:10.000 If you haven't already, you can go to castbrew.com and buy coffee.
01:39:14.000 And go to timcastpremium.com to join Rumble Premium.
01:39:18.000 When you go there, you'll be directed to use promo code TIMCAST10, which gives you $10 off your annual membership.
01:39:25.000 This doesn't just give you access to Timcast Premium Content.
01:39:28.000 It gives you access to Stephen Crowder's Mug Club as well and all of the other producers like Dr. Disrespect.
01:39:33.000 So you're basically getting this big, massive library of content for the price of one.
01:39:37.000 Our Uncensored Call-In Show will be coming up in 20 minutes.
01:39:40.000 It will be there at rumble.com slash timcast IRL. Don't miss it.
01:39:44.000 In the meantime, let's grab your Super Chats.
01:39:46.000 We got Josh McCluskey.
01:39:49.000 Thanks to the customer service team, Rumble finally got me logged in last week.
01:39:52.000 Hey, glad to hear it.
01:39:53.000 So anybody who was a member of TimCast.com before we launched with Rumble, if you use your email from TimCast on Rumble for just a regular account sign-up, it's instantly premium free.
01:40:06.000 For everybody else, they're separate services because TimCast.com is our Discord community.
01:40:11.000 Over 20,000 people.
01:40:13.000 I implore you all, don't just watch the show and walk away.
01:40:16.000 Get involved.
01:40:18.000 Join the community.
01:40:19.000 Make friends.
01:40:20.000 And we've got a bunch of stuff planned.
01:40:21.000 I think...
01:40:22.000 Our first Culture War Live is going to be in two months.
01:40:27.000 We have a plan for a venue.
01:40:28.000 We have a plan for a debate.
01:40:30.000 And you as members will join at the event.
01:40:33.000 It's a members-only event.
01:40:35.000 So it's time to be a member.
01:40:36.000 If you want to get first-come, first-served tickets, they're not going to cost anything.
01:40:39.000 If you're not a member, obviously they cost something.
01:40:41.000 But as members, you just RSVP. Then we're going to allow people to submit debate talking points on the subject where we will choose a handful of them to come up and join the debate with us.
01:40:53.000 It's going to be fun.
01:40:54.000 I'm sure it's going to be hilarious, and some people are going to have the stupidest arguments, and some people will get discovered for their intellect.
01:41:03.000 Let's go.
01:41:04.000 We got Max Riddick.
01:41:05.000 He says, Tim, I know you don't do the booking, but I wanted to throw this one out there.
01:41:07.000 Y'all should get Rob Knower back on.
01:41:10.000 Well, okay.
01:41:12.000 Let's see.
01:41:13.000 We got Bittner, too, and says, Howdy!
01:41:14.000 Watching since 2020. Congrats to Tim and Allison.
01:41:16.000 Please give a shout-out to my Kickstarter, Zone Beta, a retro stealth game about liberty and reality.
01:41:21.000 Very cool.
01:41:22.000 Did you guys hear that Washington Post, Bezos, says he wants the new Focus of the Opinion page editorial to be personal freedoms and liberty?
01:41:29.000 Really good news.
01:41:31.000 They're so mad.
01:41:32.000 They're like, no, freedom is racist.
01:41:34.000 That is what they say.
01:41:37.000 Native Patriot says the remaining IRS offices should be converted to ERS offices.
01:41:41.000 It would be huge for the American economy.
01:41:43.000 What if Trump came out and just said, we're going to be getting rid of the Internal Revenue Service for the External Revenue Service.
01:41:51.000 Income tax, it's gone.
01:41:53.000 But for everybody else in the world, you'll pay income tax now to us.
01:41:57.000 What if he just says, like, I don't care what country you're from, you have to give 20% of your income to America?
01:42:04.000 It might happen.
01:42:05.000 I don't think they'll like that.
01:42:06.000 I certainly would not.
01:42:08.000 ERS, it's going to happen.
01:42:10.000 Yeah.
01:42:12.000 Let's grab some more Super Chats.
01:42:15.000 AmericanTrucker84 says, Please, Tim, please tell me you toast the Pop-Tarts in a toaster before you put butter on it.
01:42:20.000 Of course.
01:42:21.000 Of course.
01:42:22.000 The butter melts.
01:42:23.000 I know.
01:42:23.000 Silly questions.
01:42:25.000 I think I did the strawberry Pop-Tart with butter on it.
01:42:29.000 I think tomorrow I'll try the brown sugar cinnamon Pop-Tart with butter on it.
01:42:33.000 It's got all the most disgusting chemicals in the world you can think of.
01:42:36.000 TBHQ, whatever, tert-butylhydroquinone, whatever's in it.
01:42:39.000 RFK will not like this.
01:42:41.000 Yeah, my deep fear is that I'm going to be grabbing a pack of Pop-Tarts and he's going to lurk out of the shadows and go, what are you doing?
01:42:48.000 I'm sorry!
01:42:50.000 What do we have?
01:42:52.000 All right.
01:42:53.000 Game Republic, just super chat, he says, my high school friend Clint Bonnell has been missing for over a month.
01:42:58.000 He is a Green Beret and disappeared from his backyard in Fayetteville, North Carolina without a trace.
01:43:03.000 Someone out there knows something.
01:43:05.000 Whoa.
01:43:05.000 Wow.
01:43:06.000 That's crazy, man.
01:43:08.000 Well, I hope you find him.
01:43:09.000 And Clint, if you're out there, your friends are looking for you.
01:43:14.000 Raymond G. Stanley Jr. says, F yeah, Chuck is on.
01:43:16.000 Let's go!
01:43:17.000 Thank you, Raymond.
01:43:18.000 You know, Chuck is an excellent interviewer for Green Room.
01:43:23.000 He...
01:43:24.000 Just asks the people to tell their stories and lets them roll with it, and it works out really well.
01:43:28.000 I just like hearing about other people, you know.
01:43:31.000 It's always interesting to hear where people are coming from, because everybody's got different perspectives from all over the world and everything they do, so I always find it interesting to find out something about other people.
01:43:40.000 And we hired him because he super chatted.
01:43:41.000 Yes.
01:43:42.000 That's right.
01:43:43.000 Wow.
01:43:43.000 It was super chatting, and then we were like, we've got to hire somebody, and we're like, what about that guy who's super chatting?
01:43:46.000 I'm like, okay, we hired him.
01:43:47.000 All I did was say, Art sucks, Armenia, and then...
01:43:49.000 Here I am.
01:43:50.000 Raymond G. Stanley Jr. too.
01:43:52.000 And Phil too.
01:43:54.000 But I was first.
01:43:56.000 I was before Raymond.
01:43:58.000 And Phil super chatted.
01:44:00.000 I guess the way you get hired at Timcast is you super chat the show and say like, here's what I do.
01:44:04.000 And then we go, hey, look at this guy.
01:44:05.000 Basically bothered Tim while he's working.
01:44:07.000 Yeah, otherwise I had no idea.
01:44:09.000 Yeah, because people are always like, Tim, who does?
01:44:11.000 I don't do booking.
01:44:11.000 Lisa does booking.
01:44:12.000 You know, it's like, I don't know.
01:44:13.000 You gotta talk to her.
01:44:14.000 But then people on the show, you know.
01:44:17.000 Chuck was super chatting.
01:44:18.000 Yeah, it's funny.
01:44:19.000 To be fair, though, Phil had been on the show, I think, a couple times already.
01:44:21.000 I'd been on it three or four times, yeah.
01:44:23.000 Yeah, yeah.
01:44:24.000 But then he super chatted.
01:44:25.000 We were already friendly.
01:44:27.000 Well, it was a bribe, you know?
01:44:30.000 No, that's not the case.
01:44:31.000 Here's five dollars!
01:44:31.000 Give me a job!
01:44:34.000 Lucky Chariot says the IRS should be abolished.
01:44:36.000 They engage in extortion, and for some reason, they don't have to tell you how much you owe.
01:44:42.000 Based.
01:44:42.000 Literally no one else in the country could do that and get away with it.
01:44:45.000 Yeah, seriously.
01:44:46.000 Yeah.
01:44:47.000 Maybe I should run my business like that.
01:44:49.000 Let's see how that works.
01:44:51.000 We'll provide a service.
01:44:52.000 We don't tell you how much it costs, but when you use it, we'll bill you.
01:44:57.000 Or no, when you use it, you pay us.
01:44:59.000 And if you get it wrong, you're in trouble.
01:45:01.000 If you get it wrong.
01:45:02.000 If it's not enough, you owe it.
01:45:03.000 You get it wrong, we'll send it.
01:45:04.000 How much do I owe you?
01:45:05.000 I don't know.
01:45:06.000 We'll find out.
01:45:07.000 How much did I pay you?
01:45:08.000 You figure it out.
01:45:09.000 That's a great business model.
01:45:11.000 Well, I guess the business model is that they show up to your house with guns if you don't do it.
01:45:15.000 Yes.
01:45:15.000 What do you call that?
01:45:18.000 An extortion racket?
01:45:19.000 Sounds like the mafia.
01:45:20.000 Certainly.
01:45:22.000 Hal Gailey says, single, first $25K is untaxed, 10% flat tax married, first $50K is untaxed, 10% flat tax untaxed amount goes up $5K for every kid.
01:45:35.000 I think if you have three kids, you're tax exempt.
01:45:38.000 Doesn't Poland do that?
01:45:39.000 Yes.
01:45:40.000 It's two kids?
01:45:41.000 I think Hungary and Poland.
01:45:43.000 I think more countries are starting to...
01:45:45.000 I think they exempt if you have two kids.
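[Editor's note: Hal Gailey's super chat above is concrete enough to compute. Assuming the per-kid $5K bump stacks on top of the single/married exemption (the super chat doesn't specify), a minimal Python sketch of the proposal:]

```python
# Sketch of the super-chatted flat-tax proposal (illustrative only):
# singles get the first $25K untaxed, married couples the first $50K,
# the untaxed amount rises $5K per kid, and the rest is taxed at 10%.
def tax_owed(income, married=False, kids=0):
    exemption = (50_000 if married else 25_000) + 5_000 * kids
    taxable = max(0.0, income - exemption)
    return 0.10 * taxable

# A single filer earning $60K owes 10% of $35K:
print(tax_owed(60_000))                          # 3500.0
# A married couple with two kids is exempt on their first $60K:
print(tax_owed(60_000, married=True, kids=2))    # 0.0
```

Under this reading, full exemption at three kids (as floated in the conversation) would require income under $65K for a married couple, so the proposal and the Hungary/Poland-style blanket exemption aren't quite the same scheme.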
01:45:49.000 Uh-oh.
01:45:50.000 Phil looks serious.
01:45:51.000 A little breaking news from CNN. It's not the biggest news, but...
01:45:54.000 Supreme Court paused a midnight deadline for the Trump administration to pay $2 billion in frozen foreign aid.
01:46:02.000 Yeah.
01:46:03.000 What?
01:46:06.000 Yeah, so Trump said no foreign aid, then a lower court said you must pay it, and the Supreme Court says no, no, put a hold on that.
01:46:11.000 So we don't have to pay the foreign aid.
01:46:13.000 For now.
01:46:15.000 And we shouldn't.
01:46:16.000 It's wild that a court would be able to tell the president who handles that kind of stuff that he has to pay the foreign aid.
01:46:25.000 The court can tell the president that he has to pay the foreign aid?
01:46:28.000 Absolutely.
01:46:29.000 That's the checks and balances.
01:46:30.000 Three co-equal branches.
01:46:32.000 The issue arises when Trump, who's in control of the DOJ, says, I want the DOJ to go do this.
01:46:37.000 Like, when Trump says, the executive power is vested in the president as per the Constitution, therefore all independent federal agencies must be supervised by me.
01:46:45.000 And then all the Democrats are like, no, he's seizing power.
01:46:48.000 And it's like, he had it from the beginning?
01:46:51.000 But as it pertains to foreign spending, Congress allots it, and then the executive branch manages it.
01:46:58.000 The Supreme Court can intervene on behalf of litigating parties to determine whether or not Trump can...
01:47:04.000 Who's the...
01:47:06.000 Which court was it that actually issued the initial ruling that said they have to spend the money?
01:47:14.000 Because the situation is Trump said we're going to pause this foreign aid because it's related to the USAID, right?
01:47:21.000 No, I don't think so.
01:47:22.000 Oh, no, this is the...
01:47:24.000 Yeah, okay, all right.
01:47:24.000 So the issue is anything Trump does can be challenged, and the Supreme Court can intervene and then say yes or no.
01:47:31.000 And they did in this case for Trump in a good way.
01:47:34.000 The question is, is the Supreme Court doing the right thing?
01:47:38.000 And if we have a liberal court, the answer is usually no, and if we have a conservative court, the answer is usually yes, which sometimes sucks, but sometimes is the right thing to do.
01:47:45.000 So that's what Democrats can't seem to understand, is that they think...
01:47:48.000 The conservative-leaning justices are just like twirling their mustache, being like, we will turn the country into the Federalist Papers.
01:47:55.000 When in actuality, the conservatives are like, I don't know or care about gay marriage.
01:48:00.000 The Constitution doesn't say you can do this.
01:48:02.000 So it's not an ideal.
01:48:03.000 For them, it's ideological.
01:48:04.000 For conservatives, it's functional.
01:48:07.000 But if Trump does something and someone sues, the Supreme Court can say Trump can or cannot do that.
01:48:12.000 And they could put an injunction on his actions.
01:48:15.000 Usually a good court's going to be like, that is absolutely within the purview of the executive branch.
01:48:19.000 We have nothing to do with this.
01:48:20.000 And it has happened, actually.
01:48:22.000 And that's what the Supreme Court just said, essentially.
01:48:24.000 Well, no, they just put a pause on it.
01:48:25.000 They just put a pause on it.
01:48:26.000 Which is, okay.
01:48:27.000 You know, they put a pause on the order stopping Trump.
01:48:30.000 They're saying, Trump, keep doing your thing.
01:48:31.000 We'll wait.
01:48:32.000 Wait, until what?
01:48:33.000 Until we figure this out.
01:48:34.000 Okay, until they rule on it.
01:48:34.000 It's like temporary.
01:48:35.000 I see, I see.
01:48:36.000 Well, if they do, I don't know.
01:48:38.000 All right, Madhoso says, I'm at the hospital with my wife.
01:48:40.000 We're having a baby girl.
01:48:41.000 This is our third.
01:48:42.000 We have two boys.
01:48:43.000 Welcome to the world.
01:48:44.000 Cora Violet Ormerod.
01:48:46.000 Congratulations.
01:48:48.000 Big D says, Tax service is a billion-dollar industry.
01:48:53.000 They tax more.
01:48:54.000 They get to tax you paying taxes.
01:48:55.000 They don't want taxes to go away.
01:48:57.000 All those payrolls, business, and even the payment upon doing it.
01:49:00.000 The Fat Files did a video on it.
01:49:02.000 People need to understand that a lot of industries that shouldn't exist exist for the purpose of economic drivers.
01:49:08.000 So the reason why they don't want to get rid of it, as you pointed out: you shut down the tax industry, every tax office closes, everyone loses their job, and it's bad for the economy.
01:49:18.000 Health industry, same thing.
01:49:20.000 Why won't they overhaul the insurance industry and our healthcare system?
01:49:23.000 It's what, 20% of our economy?
01:49:25.000 It's not an issue whether it's right or wrong, it's how many people are going to lose their jobs overnight, and will that destroy the American economy?
01:49:30.000 They don't care.
01:49:33.000 Nobody cares about the long-term plan.
01:49:35.000 That's the problem.
01:49:36.000 Or they don't know how to do it, at least.
01:49:37.000 The AI will.
01:49:39.000 You know, the AI is going to be scary.
01:49:40.000 I described it a few years ago like this.
01:49:43.000 There's going to be a gig app called, you know, Worker or something.
01:49:46.000 It's not going to have an E in it. It's going to be W-O-R-K-R. And you're going to be like, I need money.
01:49:51.000 You're going to open up your app, and it's going to say, job available.
01:49:54.000 You're going to click and say, 50 bucks.
01:49:55.000 And you're going to go, all right, what do I do?
01:49:56.000 And then it says, wait on the corner of 47th and Lexington for this man.
01:50:01.000 And it'll show a picture of a guy.
01:50:02.000 He will hand you this object.
01:50:04.000 And there'll be a strange mechanical object.
01:50:06.000 And then it'll be like, step three.
01:50:07.000 Carry the object to this address.
01:50:09.000 And it'll show a picture of a building.
01:50:10.000 And you'll go, okay.
01:50:12.000 You'll stand in the corner.
01:50:12.000 A guy will show up.
01:50:13.000 And he'll be like, here you go.
01:50:14.000 And you'll go, thanks.
01:50:15.000 You'll walk to the place.
01:50:16.000 There'll be another guy.
01:50:17.000 And he'll be like, that's for me.
01:50:17.000 And you'll go, here you go.
01:50:18.000 Thanks.
01:50:19.000 And then the app's going to go, ka-ching!
01:50:20.000 You got money.
01:50:21.000 And you're going to be like, cool.
01:50:22.000 You're going to have no idea what you're building.
01:50:23.000 And it won't matter.
01:50:25.000 Because it is more efficient for the AI to offer up jobs than it is to find specialists.
01:50:30.000 So think about it this way.
01:50:33.000 There's a whole bunch of industries.
01:50:34.000 One guy, he builds a mechanical device that needs to be delivered to his office, you know, on the other side of town.
01:50:40.000 And so he's like, all right, we're going to get a courier.
01:50:43.000 So they call a courier service.
01:50:44.000 The courier shows up on the bike.
01:50:46.000 They say, here's the box.
01:50:47.000 Get on your bike.
01:50:48.000 Drive it down here.
01:50:49.000 Seems pretty simple.
01:50:50.000 You know, it's easier than that.
01:50:51.000 A guy knocks on your door right when you're finished with the object and you hand it to him without even calling anybody.
01:50:57.000 That guy hands it to another guy who was already going in that direction to deliver a hot dog, hands it to another guy.
01:51:02.000 The AI can see all of this in real time and offer this up rapidly.
01:51:06.000 So it's like, you're not even going to know what your job is.
01:51:09.000 They're going to be like, take this shovel and dig a hole right here.
01:51:12.000 And you go, sure.
01:51:13.000 And you're going to dig a hole.
01:51:14.000 And you're like, I'm done.
01:51:15.000 50 bucks.
01:51:15.000 And you're going to leave.
01:51:16.000 Then some other guy's going to walk up with a tree and put it in there.
01:51:18.000 And he's going to be like, there you go.
01:51:21.000 The future is coming, my friends.
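[Editor's note: the dispatch idea sketched above, handing each leg of a delivery to whoever is already moving in that direction instead of calling a dedicated courier, boils down to a greedy matching rule. A minimal illustrative sketch, with all names and the one-dimensional street model invented for the example:]

```python
# Toy sketch of the opportunistic dispatch idea described above: instead of
# booking a courier, hand each delivery leg to whichever nearby worker is
# already heading toward the drop-off point.
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    position: float  # position along a single street, for simplicity
    heading: int     # +1 or -1, direction the worker is already travelling

def pick_worker(workers, pickup, dropoff):
    """Greedy pick: the worker closest to the pickup who is already
    heading toward the drop-off. Returns None if nobody qualifies."""
    direction = 1 if dropoff > pickup else -1
    candidates = [w for w in workers if w.heading == direction]
    if not candidates:
        return None
    return min(candidates, key=lambda w: abs(w.position - pickup))

workers = [Worker("hot-dog guy", 2.0, +1),
           Worker("cyclist", 9.0, -1)]

# A package going from position 3 to position 8 goes to the worker
# already moving that way and nearest the pickup.
print(pick_worker(workers, 3.0, 8.0).name)  # hot-dog guy
```

A real dispatcher would weigh detour cost, deadlines, and load, but the core of "the AI can see all of this in real time" is just this kind of cheap, continuously re-run matching.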
01:51:24.000 All right, Ram Tax says, many get more in refunds than they pay in taxes.
01:51:28.000 No IRS would mean higher tax on lower incomes.
01:51:30.000 Though lost credits and benefits, this would piss people off.
01:51:33.000 Not if we fix that and people aren't paying more than they're supposed to.
01:51:37.000 Refunds are bad.
01:51:38.000 Yeah, like giving the government free money and then just crossing your fingers you get it back is the stupidest thing ever.
01:51:45.000 All right, Devin Porter says, Tim, I'm having an issue with my Timcast Rumble subscription.
01:51:51.000 I had a short lapse in my card due to fraud issues.
01:51:54.000 I've been a paying member of Timcast since 2022. But I've now lost my Rumble Premium.
01:51:54.000 What can I do to fix this?
01:51:57.000 I got to talk to...
01:51:59.000 Right, we knew this was an issue because we had a lot of members.
01:52:03.000 This is a very common thing.
01:52:05.000 People become members, and then if a card expires, they hit us up a few weeks later like, oh crap, I need to update my card.
01:52:11.000 So we need to fix that because there's a lot of people who should get it, and we want you.
01:52:17.000 To get it.
01:52:18.000 I actually think it would be cool if we did an amnesty where it's like, if you re-sign up today only, then we'll include it.
01:52:23.000 But that's not up to me.
01:52:24.000 I've got to talk to Rumble.
01:52:25.000 Let me follow through.
01:52:27.000 You should email the members.
01:52:29.000 What is it?
01:52:30.000 Members at TimCast.com, right?
01:52:32.000 With your issue.
01:52:33.000 And then I'm going to talk to the people over at Rumble and see if we can figure that out.
01:52:38.000 Obviously, for anybody who was an active member, but their card lapsed or expired or something, that...
01:52:44.000 Shouldn't impact it.
01:52:45.000 They should be active.
01:52:46.000 And that...
01:52:47.000 We just have to sort that one out.
01:52:49.000 But I think it'd be really cool if we did, like, sign back up and get amnesty kind of thing.
01:52:53.000 It would be great.
01:52:54.000 We'll see.
01:52:55.000 Maybe we can't do it.
01:52:57.000 BP says IRS closed offices in 2020 that never reopened and did away with phone support options.
01:53:03.000 Mail only for some stuff.
01:53:04.000 Your opinion on CFPB rule to cap bank overdraft fees at, like, $5 not yet in effect?
01:53:10.000 I don't know what that is.
01:53:12.000 I'm okay with them capping the overdraft fees.
01:53:14.000 Those are annoying.
01:53:15.000 Not that it happens a lot, but when it does, you know.
01:53:19.000 All right.
01:53:20.000 Michael, how do you pronounce this?
01:53:23.000 Sicurelli?
01:53:24.000 Is that how you pronounce it?
01:53:25.000 I don't know.
01:53:25.000 He says, shout out to Mike in studio and then did like 800 of these.
01:53:29.000 That's because I'm a Paisan.
01:53:31.000 We got good Paisans in the chat, okay?
01:53:33.000 If you're a Paisan, drop one of these in the chat.
01:53:37.000 I'm a ghoul.
01:53:37.000 Yeah, that's right.
01:53:40.000 Valkyrie Design says, Bondi says, Epstein lists tomorrow.
01:53:42.000 The Oscars are on Sunday.
01:53:44.000 Masterful timing.
01:53:45.000 Oh, that'll be interesting.
01:53:48.000 Oh, actually, I wonder if that's on purpose.
01:53:49.000 It's Thursday.
01:53:50.000 Tomorrow's Thursday.
01:53:51.000 You can't put breaking news out on a Friday.
01:53:55.000 So if you want to get maximum impact for a weekend, Thursday is the PR day.
01:54:01.000 Tomorrow's show's going to be something.
01:54:03.000 Oh, boy.
01:54:05.000 It's going to be good.
01:54:06.000 Man, I'm so excited.
01:54:07.000 Who's the biggest actor you think's on there?
01:54:10.000 Who's the biggest actor on the list you think's going to be on there?
01:54:13.000 I don't know.
01:54:14.000 That we don't already know?
01:54:15.000 That we don't already know, like the biggest surprise.
01:54:18.000 Man, I don't know.
01:54:20.000 I have no idea.
01:54:21.000 I don't follow these people.
01:54:22.000 Good.
01:54:24.000 Everyone's got their guesses, though.
01:54:26.000 It'll be funny if it's like a big list of people that work in like a weird industry no one's ever heard of, like the professional horseback polo players or something, and it's like, wait, what?
01:54:34.000 Epstein's clients were all just these guys?
01:54:36.000 It's like, yeah, Tom Hanks, not there.
01:54:38.000 Never went.
01:54:40.000 Everybody was wrong.
01:54:44.000 What do we have here?
01:54:52.000 The Road Rage Langdon says, can we talk about the absolute win with Kash being both the FBI director and acting director of the ATF? Does this mean the FBI will dismantle the ATF and the NFA, and taxation is theft?
01:54:58.000 It is theft.
01:54:59.000 The speculation with naming Kash as the ATF acting director is that they intend to shutter the ATF. Yes.
01:55:07.000 Let's go.
01:55:09.000 And it should.
01:55:10.000 Long overdue.
01:55:11.000 But I'm not saying...
01:55:13.000 Look, personally, the gun laws are all infringements.
01:55:16.000 Outside of that, the ATF should be not controversial.
01:55:21.000 The ATF is just an additional department for law enforcement that should easily operate under the FBI. There's no reason to have an extra department for this.
01:55:28.000 It's a part of the FBI. Why do I care?
01:55:31.000 Stop wasting money.
01:55:34.000 Eric F. says, if you can create a virtual heaven, you can create a virtual hell.
01:55:38.000 Demolition man.
01:55:39.000 Oh, that's so true.
01:55:40.000 Yep.
01:55:41.000 Yeah, but the thing is, with liberals, they'd never allow you to make a virtual hell prison.
01:55:47.000 They'd be like, no, no, you have to give them everything they could ever want and more.
01:55:51.000 And they would.
01:55:54.000 Question is, if you give people in a virtual scenario everything they want, would that be like...
01:56:01.000 Would that be like a form of hell?
01:56:02.000 Would it be like you have nothing to aspire to, nothing to desire?
01:56:07.000 Everything you've ever wanted is given to you.
01:56:09.000 There's so many people that find joy in the experience.
01:56:15.000 The journey is the important part.
01:56:19.000 Getting there is the fun part.
01:56:22.000 The process of learning is what's great.
01:56:26.000 If everything is delivered all the time, whatever you want, what would that do to the human psychology?
01:56:31.000 So Real Hydro says, Tim, AI already won't make things because of IPs.
01:56:36.000 It doesn't matter how good they can make things.
01:56:38.000 It will have to be things not owned by others.
01:56:40.000 Incorrect.
01:56:41.000 As many of you may be aware, I made an image using, I believe it was Grok, of Donald Trump caressing a pregnant Sonic the Hedgehog and posted it on X. I blame Seamus.
01:56:55.000 But people make images of Mario, Mickey, and all of these characters using these AIs.
01:57:01.000 They absolutely do.
01:57:02.000 And it's considered transformative, fair use.
01:57:06.000 I could right now draw a cartoon of Mario boxing Pikachu and it's fair use.
01:57:12.000 I'm allowed to do it.
01:57:13.000 And then I could literally say Mario boxes Pikachu.
01:57:15.000 It's transformative.
01:57:16.000 I'm mocking something.
01:57:18.000 I'm allowed to do it.
01:57:19.000 That's why AI does this stuff.
01:57:21.000 Let's just say companies get wary about it and there's lawsuits.
01:57:26.000 I guarantee you, Disney is going to team up with whatever AI company and say, we should offer a $14.99 addition to your bundle that gives you AI entertainment.
01:57:38.000 So you'll get the full library of Disney content, which includes Hulu, Marvel, Star Wars, and you'll get AI Movie Generator for an extra $15 a month.
01:57:47.000 And then you're going to sit there and you're going to be like, I want to see Darth Vader.
01:57:50.000 A movie just about Darth Vader, like, just tearing things up.
01:57:54.000 Like the end of Rogue One, which is the whole movie.
01:57:57.000 Right.
01:57:58.000 And people will do it.
01:57:59.000 And then, like I said, it's going to get likes.
01:58:01.000 They're going to share it with their friends.
01:58:02.000 They're going to be like, hey, watch this movie I generated.
01:58:04.000 It'll be pretty good.
01:58:05.000 Some of it will be bad.
01:58:06.000 I bet most of it will be pretty okay.
01:58:09.000 Suno.
01:58:10.000 S-U-N-O. Music AI. I've seen that, yeah.
01:58:13.000 The instrumental music generation.
01:58:15.000 Is perfect.
01:58:17.000 The lyrics and the vocal melodies are the worst thing I've ever heard in my life.
01:58:21.000 I was talking to my brother about it.
01:58:23.000 He was like, yo, this AI is really good.
01:58:24.000 Listen.
01:58:24.000 And then it's like, if you ask the AI to sing and write lyrics, it is kindergarten level rhyming.
01:58:32.000 And it's singing about city streets and city lights and nothing else.
01:58:35.000 Yeah.
01:58:36.000 Like every song is called Neon Light City Streets or whatever.
01:58:39.000 So you can write the lyrics yourself.
01:58:41.000 The problem is the melodies always suck.
01:58:43.000 But if you just make the underlying music, which is very basic, and then you can get someone who knows how to write melodies, then you're cranking out some bangers.
01:58:51.000 The future is here already for music.
01:58:53.000 Like, I'm sorry, music's done.
01:58:54.000 Phil, I'm sorry.
01:58:55.000 You're out of work.
01:58:56.000 I'm going to go on Suno right now, and I'm going to say, give me a new All That Remains album.
01:59:01.000 And it'll do it.
01:59:02.000 And then I'll go through a hundred songs, pick ten that are good, and then that's it.
01:59:07.000 Drag.
01:59:09.000 But Zachary Levi was talking about this.
01:59:12.000 Hollywood's over.
01:59:13.000 These people are fighting for an industry where nobody watches award shows anymore.
01:59:17.000 Movies have been bombing lately.
01:59:19.000 Disney lost a billion dollars, and AI is going to come in and replace everybody.
01:59:23.000 The first movies I think we'll see, before they get to the self-user-generated AI movies, you are going to have Disney in a studio.
01:59:32.000 Instead of hiring animators, they're going to get like five guys to sit around, putting in prompts to an AI, and then refining it over and over again to make the movies.
01:59:41.000 And they're going to be able to get a Spider-Man movie done in a month.
01:59:45.000 Just like from start to finish.
01:59:48.000 There we go, baby.
01:59:50.000 Alright, we'll grab one more here.
01:59:52.000 Darren Defner says, you guys stumbled on a genius idea.
01:59:55.000 Convicted prisoners can opt to be connected to a normal life neural link, get a degree, trained in virtual world, then leave prison as an electrical engineer or even a rocket scientist.
02:00:05.000 The issue with a lot of criminals, however, is not that they can't do it.
02:00:08.000 It's that...
02:00:09.000 They have violent tendencies.
02:00:12.000 No one makes a person be a criminal.
02:00:15.000 And the left seems to think that crime is driven by people who are just desperate.
02:00:21.000 Like AOC was like, oh, these looters need bread.
02:00:23.000 And it's like, bro, they're stealing Louis Vuitton bags.
02:00:26.000 There are people in Chicago that commit crimes because it's part of a culture of going hard.
02:00:31.000 They call it, quote, coming up and things like that.
02:00:34.000 We'll see.
02:00:35.000 My friend, smash the like button if you would please.
02:00:37.000 Share the show with everyone you know.
02:00:38.000 We're going to that uncensored call-in show at rumble.com slash TimCast IRL right now for premium users where you as members of the TimCast Discord can call and talk to us.
02:00:48.000 It's going to be fun.
02:00:49.000 You can follow me on X and Instagram at TimCast.
02:00:53.000 Again, smash that like button.
02:00:54.000 Mike, do you want to shout anything out?
02:00:56.000 Appreciate you having me on, Tim.
02:00:58.000 Thanks, everybody.
02:00:59.000 Appreciate all the paisans in the chat.
02:01:01.000 And you guys can follow me on X at...
02:01:03.000 Mike Crispy on Instagram at MikeCrispyNJ.
02:01:07.000 Thanks, everybody.
02:01:09.000 Well, thanks, Mike, for joining.
02:01:11.000 That was a lot of fun.
02:01:12.000 Wasn't expecting that.
02:01:14.000 Follow The Green Room on Rumble Premium and buy Cast Brew Coffee.
02:01:19.000 You know, members get 15% off if you're a TimCast member on our website.
02:01:24.000 And yeah, thanks.
02:01:25.000 What's your X? Oh, my X is Frank True Blue.
02:01:30.000 Ah.
02:01:31.000 Essentially, yeah.
02:01:33.000 It's an American dad joke.
02:01:34.000 Ah.
02:01:35.000 Yeah.
02:01:35.000 I am PhilThatRemains on X. You can subscribe to my page there.
02:01:38.000 I'm PhilThatRemainsOfficial on Instagram.
02:01:40.000 The band is All That Remains.
02:01:41.000 Our new record just dropped January 31st.
02:01:44.000 It's called Anti-Fragile.
02:01:46.000 You can check it out on YouTube, Apple Music, Amazon Music, Spotify, Pandora, and Deezer.
02:01:50.000 Don't forget, the left lane is for crime.
02:01:52.000 We'll see you all over at rumble.com slash timcast IRL in about 30 seconds.