Timcast IRL - Tim Pool - February 27, 2026


IT WAS DERAILED | Timcast IRL #1458 w/ Rick Jordan


Episode Stats

Length

2 hours and 7 minutes

Words per Minute

200.6

Word Count

25,551

Sentence Count

2,153


Summary


Transcript

00:01:58.000 Hillary Clinton was on Capitol Hill today giving closed-door testimony about her relationship with Jeffrey Epstein.
00:02:05.000 Well, her lack of a relationship with Jeffrey Epstein, if you believe her testimony.
00:02:09.000 Lauren Boebert decided that she was going to take a picture and then she sent it to Benny Johnson.
00:02:13.000 Then he threw that on the internet.
00:02:14.000 And so then they stopped the whole thing.
00:02:16.000 So we're going to talk about that tonight.
00:02:18.000 There is chaos erupting on the Afghanistan-Pakistan border.
00:02:23.000 The Afghans decided that they were going to shoot at their nuclear-armed neighbor, and now all hell is breaking loose.
00:02:30.000 The Iran talks have broken down.
00:02:32.000 The United States and Iran: Iran says that they're not going to end their enrichment.
00:02:36.000 So this only adds to the tension in the region.
00:02:38.000 Donald Trump is making a bunch of waves because he's talking about seeking executive power over elections.
00:02:44.000 Now, what he's looking to do is use an executive order to require IDs, but the left is freaking out saying that he's going to fix the elections and it's going to be unfair and blah, blah, blah.
00:02:56.000 So we'll talk about that.
00:02:58.000 One of the people killed off the coast of Cuba the other day was an American citizen.
00:03:03.000 Now, allegedly, the boat was stolen and it had a lot of Cuban nationals in it.
00:03:07.000 But again, one American was killed.
00:03:08.000 So we'll get into that.
00:03:10.000 And also, we're going to talk about a whole bunch of AI stuff at the end of the show, too.
00:03:14.000 So there's a bunch of people in China that, or a bunch of women in China, that have decided that they want to fall in love with their AIs.
00:03:21.000 Burger King is using AIs to watch over their employees and make sure that they're saying please and thank you.
00:03:27.000 So we're going to get into it.
00:03:28.000 But first, we're going to go to a word from our sponsor.
00:03:31.000 We got a great sponsor.
00:03:32.000 It is Beam Dream.
00:03:34.000 Check out shopbeam.com/TimPool to get 35% off your nighttime sleep blend to support better sleep.
00:03:44.000 I absolutely love this stuff.
00:03:45.000 I drink it every single night.
00:03:46.000 They got a bunch of different flavors.
00:03:48.000 I got cinnamon cocoa, sea salt caramel, brownie batter.
00:03:50.000 I'm a big fan of the cinnamon cocoa.
00:03:52.000 It's my favorite, but I've been drinking the sea salt caramel one.
00:03:54.000 It's got magnesium.
00:03:55.000 It's got L-theanine.
00:03:56.000 It's got Reishi.
00:03:57.000 It's got melatonin.
00:03:59.000 I drink it before bed.
00:04:00.000 It's a hot, it's a cup of hot cocoa.
00:04:02.000 Oh, it's caramel.
00:04:03.000 I guess hot caramel.
00:04:03.000 And it's about 15 calories, no added sugar.
00:04:06.000 No joke.
00:04:06.000 I do drink it every night before bed, and it is absolutely amazing.
00:04:10.000 My sleep has dramatically improved.
00:04:12.000 I've actually been getting such good sleep.
00:04:14.000 My sleep has started to reduce.
00:04:16.000 Like, no joke, I was sleeping for like seven and a half hours.
00:04:19.000 Now I'm just naturally waking up a little earlier, and my sleep score is still maxed out.
00:04:23.000 This stuff is great.
00:04:24.000 And if you're a guy, it's important because your testosterone and HGH are produced in the body during REM and deep sleep.
00:04:30.000 So if you're sleeping poorly, it's negatively impacting your weight, your energy, your mood.
00:04:35.000 So I'm a big fan.
00:04:36.000 Check out shopbeam.com/TimPool.
00:04:41.000 All right.
00:04:41.000 So smash the like button, share the show with all of your friends, with everyone you know.
00:04:45.000 Head on over to Timcast.com where you can become a member there.
00:04:48.000 You can join our Discord and you can join our after-show.
00:04:51.000 You can call in and talk to our guests.
00:04:52.000 Then head on over to Rumble so you can watch the after-show.
00:04:55.000 Join up there.
00:04:57.000 Joining us tonight to talk about all of the things that I mentioned earlier and so much more is Rick Jordan.
00:05:02.000 How are you doing, Rick?
00:05:02.000 What's shaking?
00:05:03.000 It's good to be here, man.
00:05:04.000 Who are you?
00:05:05.000 Who am I?
00:05:05.000 What do you do?
00:05:06.000 Well, I'm Rick Jordan, right?
00:05:09.000 I do a lot of things.
00:05:10.000 Whenever I get this question, you know, pretty much what I've done since birth almost was technology, right?
00:05:15.000 But what I wanted to be when I was a super little kid was a tornado chaser.
00:05:19.000 Oh, that's sick.
00:05:19.000 I know, yeah.
00:05:20.000 So, I mean, I still do a little bit of that on the side.
00:05:22.000 Do you do a lot of, do you like, are you like an adrenaline junkie?
00:05:25.000 Do you go out and try and do like things like jump out of planes and stuff?
00:05:28.000 No, I don't do that.
00:05:29.000 But what I do, I like to be the first at certain things.
00:05:31.000 You know, so if I see that somebody else hasn't done something yet, I'm like, why not?
00:05:31.000 Okay.
00:05:35.000 You know, there's got to be a reason, or else watch me do it, you know?
00:05:38.000 Well, thanks for joining us.
00:05:38.000 Awesome.
00:05:39.000 Brett's here.
00:05:40.000 What is going on, guys?
00:05:41.000 It is Brett.
00:05:42.000 Normally, I'm doing Pop Culture Crisis Monday through Friday at 3 p.m. Eastern Standard Time, but we had a bunch of stuff to talk about.
00:05:49.000 Fan of Twister growing up?
00:05:51.000 Oh, yeah.
00:05:51.000 The movie Twister?
00:05:52.000 Yeah.
00:05:52.000 Absolutely.
00:05:53.000 110%.
00:05:54.000 Same with a new one.
00:05:54.000 Let's go.
00:05:55.000 If you skydive into a tornado, would it kill you or just spit you out somewhere?
00:05:59.000 We want to find out.
00:06:00.000 I want to be the first.
00:06:00.000 Let's go.
00:06:01.000 Yeah, you want to be the first?
00:06:02.000 No, not yet.
00:06:03.000 But reading all this AI stuff, I'm kind of almost there, like getting ready to jump into a tornado.
00:06:07.000 If you look at a tornado anyway, you could ask AI.
00:06:10.000 Yeah, I just read the Department of War has this contract with Anthropic AI, the same company that's building the thing that Phil's been using, this buddy bot.
00:06:21.000 His name is Tank.
00:06:22.000 Thank you.
00:06:23.000 His name is Tank.
00:06:24.000 And so the military Department of War is like, we need full control of this AI for autonomous weapons.
00:06:29.000 And Anthropic's like, I don't think that's what we're supposed to be doing here, everybody.
00:06:33.000 And it's like, anyway, I'm freaking out.
00:06:35.000 Wait, what is it?
00:06:36.000 Okay, well, maybe we can explain it further.
00:06:37.000 We're going to argue about... you guys are going to argue about it.
00:06:40.000 Yeah.
00:06:40.000 Carter's up, everyone.
00:06:41.000 Carter Banks here, hanging out, pushing the buttons, making sure to give you the best reaction shots and the best show.
00:06:47.000 Let's go.
00:06:48.000 Awesome.
00:06:49.000 So we're going to start off with from ABC 7.
00:06:52.000 Hillary Clinton's Epstein deposition briefly delayed over a leaked photo.
00:06:57.000 Former Secretary of State Hillary Clinton's testimony had to be briefly halted due to conservative commentator Benny Johnson posting a picture from the closed door testimony.
00:07:06.000 Johnson posted a picture on social media of Hillary Clinton testifying under oath in front of the House Oversight and Government Reform Committee.
00:07:13.000 He said that Colorado GOP rep Lauren Boebert was the one who gave him the picture.
00:07:17.000 Breaking the first image of Hillary Clinton testifying under oath about Jeffrey Epstein to the Republican Oversight Committee is what Johnson wrote.
00:07:23.000 And you can see there's his tweet.
00:07:26.000 One of Clinton's advisors said that the testimony had to be temporarily off the record while they figured out where the photo came from and why possibly members of Congress are violating House rules, according to Politico.
00:07:38.000 In the past, Clinton said that she doesn't have any information on disgraced financier Epstein or his associate, Ghislaine Maxwell.
00:07:44.000 Epstein was charged with sex trafficking minors in 2019, the same year he died in prison.
00:07:48.000 Do you guys think it's a good idea to take pictures in a closed session of... why is this news?
00:07:55.000 Seriously.
00:07:57.000 Because it's Hillary Clinton and it's Jeffrey Epstein.
00:07:58.000 Those two things are big news all the time.
00:08:00.000 But it's not a photo of those two together.
00:08:02.000 No.
00:08:03.000 There are not. How much of a problem is this for Lauren Boebert?
00:08:08.000 This is a slap on the wrist?
00:08:09.000 Yeah, I don't think that.
00:08:10.000 I don't think, like, they wouldn't censure her.
00:08:12.000 So it's like, everything's legal for a fee?
00:08:14.000 Like, as long as you're okay with taking the punishment, they go ahead and do that.
00:08:17.000 Well, I mean, that's everything.
00:08:18.000 Yeah.
00:08:19.000 You know, I mean, like, tolls are suggestions if you don't mind paying the.
00:08:22.000 You've got the money.
00:08:23.000 Yeah, exactly.
00:08:23.000 Speed limits, just a suggestion.
00:08:25.000 I mean, most things, especially if they're not violent crimes, most things are just suggestions if you don't mind a fine.
00:08:31.000 Was it Hillary that called a timeout when the photo leaked?
00:08:34.000 I mean, I don't know.
00:08:36.000 It doesn't say who actually decided.
00:08:38.000 I mean, it wouldn't surprise me if whoever was running the hearing actually, you know, the word got to them.
00:08:44.000 They're like, hold on, we have to stop this and find out who it was.
00:08:47.000 So that way we can mark it down in the calendar that they need a slap on the wrist or something.
00:08:51.000 When was the last time she was in any type of government hearing?
00:08:55.000 Yeah.
00:08:55.000 Hillary Clinton?
00:08:57.000 Probably, it's probably been like seven or eight years because it was about, or maybe even longer, because it was about her emails.
00:09:03.000 It was probably the last time that she had a session of Congress where she was answering.
00:09:08.000 Yeah, I think so.
00:09:09.000 Off the top of my head, at least.
00:09:10.000 The Benghazi stuff.
00:09:11.000 I remember that.
00:09:12.000 Benghazi was.
00:09:13.000 That was even farther.
00:09:13.000 Yeah, Benghazi was before that.
00:09:15.000 So she had makeup professionally done.
00:09:18.000 Yeah, right.
00:09:19.000 I don't think this matters at all.
00:09:20.000 Well, no, the weird thing about it is you see stories like this.
00:09:24.000 And I know that a lot of the people in Congress, I think it was you, Phil, that was pointing out, maybe somebody else that was pointing out that you expect more from senators than you do from people in Congress, right?
00:09:33.000 So is the idea here that Lauren Boebert's like, I'm just going to get my name in the press by leaking this photo to Benny Johnson, and she just took it as a risk-versus-reward analysis?
00:09:40.000 I don't know that she was thinking about the press.
00:09:42.000 I think she was thinking about, I think she's thinking about, you know, this will be something.
00:09:47.000 I mean, maybe it is.
00:09:48.000 It doesn't expose anything.
00:09:49.000 No, it doesn't.
00:09:50.000 You know, it could be, I don't know.
00:09:52.000 I don't know if her and Benny are friends.
00:09:54.000 If her and Benny are friends, Benny could have said, hey, snap me a pic so I can tweet it.
00:09:57.000 You know, I don't know though.
00:09:58.000 I'll give you credit.
00:09:59.000 Yeah, exactly, right?
00:10:00.000 People were very upset that this was a closed-door hearing.
00:10:03.000 So maybe this is Bobert's way of protesting closed-door hearings.
00:10:06.000 They're like, no, let's just make the world know that Hillary Clinton is being deposed today.
00:10:10.000 But they already know she is.
00:10:10.000 I mean, that actually.
00:10:12.000 Is the idea here that they're like, they're going to send in a fake person and they have to have a photo of it to prove that it's real?
00:10:17.000 Otherwise, it's a good idea.
00:10:18.000 Her double, her body double.
00:10:19.000 Yeah.
00:10:20.000 No, but my concern.
00:10:21.000 I think Boebert was concerned this was going to be behind closed doors, going to get swept under the rug, and everyone's going to forget about it.
00:10:27.000 And she's like, I do not want to forget about this moment.
00:10:29.000 No, make it noise, make it a big deal.
00:10:31.000 That she actually showed up.
00:10:32.000 Is that what you're talking about?
00:10:33.000 Just that Hillary Clinton's being deposed on Epstein.
00:10:37.000 They tried to do it behind closed doors for a reason.
00:10:39.000 And Boebert was probably like, fuck that.
00:10:40.000 I mean, maybe that, maybe there's some substance to that.
00:10:42.000 Like, the idea of having photos of Hillary Clinton getting, you know, reading the Riot Act by Congress or being questioned by Congress makes Hillary Clinton look bad.
00:10:53.000 It's red meat for Republicans.
00:10:54.000 They love to see Clinton sweat, you know?
00:10:57.000 So, I mean, maybe that's got to be.
00:10:59.000 So, are we going to see a photo of Bill now?
00:11:01.000 Didn't in the blue dress.
00:11:03.000 Oh.
00:11:05.000 Your mind's where I'm at, Brett.
00:11:07.000 One of you guys has to say, Jimmy.
00:11:08.000 Have you ever seen that photo of Hillary Clinton when she's in the rundown apartment and she just looks disgusted and freaked out?
00:11:15.000 I love that photo.
00:11:16.000 She looks confused.
00:11:17.000 She literally looked like all the apartments I lived in when I was in my 20s.
00:11:21.000 That's what I meant by the professional makeup.
00:11:22.000 Yeah.
00:11:22.000 Because when she was campaigning, I mean, it was a complete difference overnight between how her hair was done, how her makeup was done.
00:11:29.000 All of a sudden, she went back to, oh, this is who I really am.
00:11:31.000 And it was like the physical masking.
00:11:33.000 Carter brought up the picture.
00:11:34.000 I mean, she looks her age.
00:11:37.000 You know, she's looking her age.
00:11:38.000 I don't know how old she is.
00:11:39.000 I think she's like pushing 80 herself.
00:11:41.000 Yeah.
00:11:41.000 She's pretty old.
00:11:42.000 74.
00:11:44.000 What's the guess?
00:11:46.000 Taking bets on.
00:11:47.000 The only thing I thought with the before and after of the campaign, I'm like, man, my tax dollars, you know, or donor dollars went to Botox.
00:11:55.000 Yeah, Botox and hair and makeup.
00:11:58.000 Not while she was campaigning.
00:12:00.000 That was donor dollars, but when she was in office.
00:12:02.000 It's like, she's 78.
00:12:04.000 Wow.
00:12:04.000 Really?
00:12:05.000 So does that mean she looks good for her age then?
00:12:05.000 Wow.
00:12:07.000 No, but if Elad were here, he might make that argument.
00:12:09.000 He's the guy that's constantly thirsting for Hillary.
00:12:12.000 He's big on the shit.
00:12:13.000 Wait, for what?
00:12:15.000 What?
00:12:15.000 Is that a real thing?
00:12:16.000 Well, no, he says that when she was younger, she was very pretty.
00:12:18.000 Yeah, they say Sabrina Carpenter looks like Hillary Clinton when she was younger.
00:12:22.000 I mean, that hot.
00:12:23.000 I don't think Sabrina Carpenter is all that hot.
00:12:25.000 Oh, really?
00:12:26.000 She's pretty when she was younger.
00:12:28.000 Was that?
00:12:28.000 Neither was Hillary when she wasn't.
00:12:29.000 Yeah, I don't think so.
00:12:30.000 That's why I brought it up.
00:12:31.000 Rewriting history is kind of crazy.
00:12:33.000 Hillary was kind of like a poster child for the military industrial complex in 2016, 17, 18.
00:12:39.000 And everybody is kind of like, I think they want their vengeance now.
00:12:42.000 They just want to see Hillary Clinton pay.
00:12:43.000 And it's like, bro, she's such a pawn in this whole world power thing.
00:12:48.000 That's not what people think, though.
00:12:49.000 They don't think she's.
00:12:50.000 That's awful.
00:12:50.000 They think that she's the one who's the queen.
00:12:54.000 Actually, I think one good point to this might be the fact that the Epstein files have become such a divisive issue within the Republican Party that trying to refocus it around somebody that's a Democrat is a good thing for people in Congress who are looking to kick the can down the line and not have to deal with it blowing up the Republican Party like it has been the last couple of months.
00:13:13.000 Yeah, we dug into it, and there are actually no photos of Hillary Clinton with Jeffrey Epstein.
00:13:17.000 So it's possible that she didn't really know him that well.
00:13:20.000 It's possible that, you know, maybe she met him at dinner or whatever, but that, you know, not long enough to stand for a photo op or whatever.
00:13:27.000 Now, obviously, Bill Clinton, that's a totally different story.
00:13:31.000 And so people are like, people, you know, make the assumption, well, you know, Bill knew him.
00:13:35.000 But then again, the reasons that Bill knew him, maybe Hillary wasn't around.
00:13:39.000 Yeah, no reason for her to be there.
00:13:42.000 Yeah, you know, it's like, in fact, she wasn't invited.
00:13:44.000 Yeah.
00:13:44.000 Yeah, definitely not invited.
00:13:46.000 He's on one of those boys trips that he was on.
00:13:48.000 Honey, I need you here.
00:13:49.000 Yeah.
00:13:50.000 I wasn't surprised that she stayed with Bill when he got a blowjob in the Oval Office because it was Bill Clinton.
00:13:56.000 He was the president.
00:13:56.000 And like, wife stands by her man.
00:13:58.000 But damn, that just probably just wrecked their relationship.
00:14:02.000 Bill off-womanizing, Hillary just dealing with it, getting bitter.
00:14:05.000 They'd say that it wasn't a relationship to begin with.
00:14:07.000 A lot of them are matters of political convenience.
00:14:09.000 They're basically like arranged marriages in politics.
00:14:13.000 I mean, I know, I remember there were a lot of videos that came out after the things were made public where there was distance between them.
00:14:23.000 You could kind of tell.
00:14:24.000 But Bill's philandering was well known long before Monica Lewinsky, long before it became national news.
00:14:31.000 I mean, there were rumors of Bill when he was in Arkansas.
00:14:36.000 Yeah.
00:14:37.000 Yeah.
00:14:37.000 So that was kind of par for the course for good old Slick Willie.
00:14:42.000 You know, regarding Ghislaine, I heard that she came up in that now.
00:14:45.000 You look great.
00:14:47.000 Ghislaine attended Chelsea Clinton's wedding in 2013 or something.
00:14:51.000 Oh, did she?
00:14:52.000 Yeah, I just read that.
00:14:53.000 She made some, she did make some remarks.
00:14:54.000 I'm not sure where I saw it, but I'm not sure.
00:14:56.000 It might be over here, and I don't want to turn away.
00:14:58.000 But she made some remarks about being an acquaintance of Ghislaine.
00:15:01.000 So she's like, it's like, Bill's out on boys' trips with Epstein, and Ghislaine is out on girls' trips with Hillary.
00:15:06.000 Ghislaine was literally being the wingman.
00:15:10.000 She's like, let me take care of your wife.
00:15:12.000 You go have fun.
00:15:12.000 I wouldn't be surprised, too, if, like, because Epstein was dealing with such dark stuff that Bill's like, Hillary, you're never going to be any part of this part of my life.
00:15:20.000 I'm going off to do the dirtiest deals with the darkest money.
00:15:25.000 I don't want you anywhere near it.
00:15:26.000 I don't think Bill was looking for money.
00:15:27.000 I'm not going to be attributing chivalry now to Bill Clinton.
00:15:30.000 More like, just if this ever gets blown up in the press, I don't want you connected to him.
00:15:33.000 Like that kind of thing.
00:15:34.000 Well, that's awfully nice of him.
00:15:35.000 Yeah.
00:15:36.000 It was that way.
00:15:38.000 Bill, the altruist Clinton.
00:15:40.000 I think most people look at Bill as the one on the island with Epstein with his girls.
00:15:43.000 I think most people would look at it as the other way around, whereas she would be the one who's maneuvering him like a pawn and working behind the scenes to get him where he needs to go as a politician because he's too, you know, her idea might be he's not smart enough to do it on his own.
00:15:56.000 He's the good face of the Democratic Party at the time, but he doesn't necessarily have the ruthlessness that it takes to succeed in politics.
00:16:03.000 So she is the ruthless one and he is the face of the movement.
00:16:06.000 Yeah, if I understand correctly, people that have met Bill Clinton, they're like, you know, it makes perfect sense that he was president.
00:16:11.000 When you meet him, it feels... the same thing is said about Barack Obama and stuff.
00:16:15.000 Like when you meet him, and also the same thing said about Donald Trump, he remembers your name.
00:16:20.000 You feel like there's no one else in the room.
00:16:21.000 It's like he's not.
00:16:22.000 That's a dangerous thing.
00:16:23.000 Oh, yeah.
00:16:24.000 When who was it that came here?
00:16:27.000 Larry Elder came to Tim Cast for the second time.
00:16:31.000 He shook my hand and said my name.
00:16:32.000 I'm like, there's no way.
00:16:34.000 Like, I was like, I met him for like 10 seconds the time before, like when they were leaving.
00:16:39.000 And he came and said, hey, Brett, how's it going?
00:16:41.000 So there's the only thing I can think is like somebody in the car was like, these are the people that work there so that you can remember their name.
00:16:47.000 And I was like, that was crazy to me.
00:16:49.000 I'm like, that actually made me less trustful.
00:16:51.000 In theater school, I used to do that when I would be like a sophomore.
00:16:54.000 When all the new freshmen would come in, all their headshots would be on the wall.
00:16:57.000 I just go in the room and stare at the wall for like 20 minutes at all the headshots and names and memorize faces and names because it's such an important part of that industry.
00:17:05.000 It's memorized.
00:17:05.000 Yeah.
00:17:06.000 I could never be a politician.
00:17:07.000 I can't remember anybody.
00:17:08.000 It's part of power.
00:17:09.000 It's psychological.
00:17:10.000 Your name is an anchor point in your brain.
00:17:13.000 So when somebody says it, all of a sudden you are hooked.
00:17:16.000 I mean, there's studies about this stuff.
00:17:17.000 That's why they have the hostage negotiator constantly say your name.
00:17:20.000 Absolutely.
00:17:21.000 There's an anchor point.
00:17:22.000 People say that to control a demon, if you know its name, you can control it like in mythology.
00:17:27.000 And so demons will hide their names because probably that very power, that intrinsic vibration that pulls you and changes you, just hearing that sound.
00:17:36.000 That went to a weird place, man.
00:17:38.000 It does.
00:17:39.000 It's only going farther.
00:17:40.000 We're only getting started.
00:17:41.000 We're only 15 minutes in.
00:17:43.000 Ian's driving the car.
00:17:44.000 We're all passengers.
00:17:45.000 Vibration.
00:17:46.000 Ian's driving.
00:17:46.000 We'll get to resonation later where the field itself moves.
00:17:50.000 But let's just.
00:17:51.000 That's why I will never hold office.
00:17:52.000 And I can't remember anybody else.
00:17:53.000 It's a family gap, right?
00:17:54.000 We went from Hillary Clinton to demons.
00:17:56.000 Yeah, you know, they actually.
00:17:56.000 You can't summon a demon if it doesn't give you the right name.
00:17:59.000 Yeah.
00:18:00.000 You ever ask someone for their name and they won't tell you?
00:18:02.000 Because they're afraid.
00:18:03.000 I mean, it's weird.
00:18:03.000 No.
00:18:05.000 Yeah.
00:18:06.000 So I'm just sending photos on the bottom.
00:18:06.000 No.
00:18:09.000 By the way, what you were just talking about, I've actually attributed Ian having that same power.
00:18:15.000 Where the day I met Ian, I said, everybody in your life should look at you the way Ian looks at you when you're talking for the first time.
00:18:23.000 Because it's like nothing else in the world exists.
00:18:25.000 That's a skill set.
00:18:26.000 But you were saying awesome stuff, too.
00:18:27.000 You were like really informative and like explaining a bunch of stuff to me.
00:18:31.000 That's important to be able to give somebody your full attention.
00:18:34.000 Yeah.
00:18:35.000 Yeah.
00:18:35.000 I still remember what Ian said to me the first day I came to work here and was like, Welcome home.
00:18:39.000 And I never forgot that.
00:18:40.000 My mom loved it too.
00:18:41.000 She's like, I like Ian.
00:18:42.000 First thing Ian ever said to me was the N-word.
00:18:45.000 You want to do it again?
00:18:46.000 Man, if we were online right now.
00:18:47.000 I'm lying.
00:18:49.000 It's not true at all.
00:18:49.000 It's not true.
00:18:50.000 Nice to meet you.
00:18:51.000 Yes, that's exactly what it is.
00:18:54.000 So, yeah, I mean, look, I don't think this is actually even really big news.
00:18:58.000 So I feel like we could kind of move on.
00:18:59.000 We covered it because it was kind of like the thing that was all over the headlines and stuff.
00:19:04.000 But yeah, there's not really any significant substance, and nothing was really said in the... Does she have multiple days she has to testify, or is it just this one day?
00:19:11.000 Um, I don't know.
00:19:12.000 I think that I think that the actual committee decides that.
00:19:15.000 Okay.
00:19:15.000 If they feel like they get through and all the questions are answered and everyone gets...
00:19:18.000 There was a statement after I saw that today, too.
00:19:21.000 And it was just a bunch of, you know, why did you do this?
00:19:24.000 Why did you have me come in?
00:19:25.000 I'm summarizing, you know, but it was more of a you allowed all these other people to skip this hearing and just provide a written statement.
00:19:32.000 I provided a written statement too.
00:19:34.000 She was saying this was nothing but politics, of course.
00:19:36.000 You know, that's why you dragged me in here.
00:19:38.000 It's like, well, yeah.
00:19:39.000 Yeah.
00:19:39.000 I was going to say, well, you're right.
00:19:41.000 She's right.
00:19:42.000 You know, I mean, you're a politician.
00:19:43.000 So it follows.
00:19:45.000 You're one of the most powerful people in the world.
00:19:48.000 You're arguably the most powerful woman in the world.
00:19:51.000 Yes, it was politics.
00:19:52.000 This goes back to that comment that Trump made during the first campaign.
00:19:55.000 Yeah.
00:19:55.000 Because he'd be in jail.
00:19:56.000 She'd be in jail.
00:19:58.000 Yeah.
00:19:59.000 So, all right, we're going to move on to this story here.
00:20:01.000 The CEO of the World Economic Forum quits after Epstein ties come to light... excuse me.
00:20:08.000 Sorry.
00:20:09.000 Blessings, sir.
00:20:11.000 From Reuters.
00:20:12.000 In Zurich, the president and CEO of the World Economic Forum, Børge Brende, said he was stepping down on Thursday, a few weeks after the forum launched an independent investigation into his relationship with the late U.S. sex offender, Jeffrey Epstein.
00:20:27.000 Brende, who became president of the WEF in 2017, announced his decision in a statement following disclosures from the U.S. Justice Department that showed the Norwegian had three business dinners with Epstein and had also communicated with the disgraced financier via email and text message.
00:20:43.000 After careful consideration, I have decided to step down as president and CEO of the World Economic Forum.
00:20:48.000 My time here, spanning eight and a half years, has been profoundly rewarding, said Brende, a former Norwegian foreign minister.
00:20:54.000 Issued by the WEF, the statement made no mention of Epstein.
00:20:58.000 However, Brende told Norwegian media he was sorry about how he had handled his dealings with the American and that he did not want the issue to be a distraction for the forum, which organizes the annual Davos Summit.
00:21:10.000 This is, again, it's kind of interesting to see the repercussions of the Epstein files all over Europe.
00:21:20.000 And we're just not seeing anything here in the United States.
00:21:23.000 I mean, we're talking about people stepping down, Casey Wasserman stepping down from the Wasserman agency.
00:21:28.000 Yes.
00:21:29.000 And his weren't even ties to Epstein.
00:21:30.000 They were ties to Ghislaine Maxwell.
00:21:32.000 Yeah, and Pritzker's brother, the guy from the, I think it's Hyatt, is the hotel chain.
00:21:38.000 Bill Gates just like said, sorry, and just kept his job.
00:21:42.000 Gates is like, I have a little bit of interest.
00:21:44.000 He's like, I had two affairs.
00:21:45.000 Technically, Bill Gates stepped down from IBM a long time ago, right?
00:21:45.000 That was the real.
00:21:49.000 Now he's just... Microsoft, yeah.
00:21:50.000 Yeah.
00:21:51.000 Well, yeah, Microsoft, my bad.
00:21:52.000 But yeah, now he's just like a philanthropist who's trying to mutate mosquitoes and stuff.
00:22:00.000 Saying that he's trying to cure malaria, but we all know what he's really trying to do.
00:22:04.000 He wants to implant robots in you.
00:22:06.000 I'm just kidding.
00:22:07.000 Oh, that's true.
00:22:09.000 Implant and live long and prosper.
00:22:11.000 He was a big proponent way back when, too, of the COVID era.
00:22:18.000 He's like machine man.
00:22:20.000 I don't know.
00:22:20.000 I don't want to go too hard.
00:22:21.000 Musk is when I think about machine man now.
00:22:24.000 All these tech, these techno, what do you call them?
00:22:26.000 Tatsuo, technocrats.
00:22:27.000 I don't know.
00:22:28.000 This isn't any surprise from him, though, because even, I mean, obviously he divorced his wife.
00:22:28.000 Tetsuo.
00:22:33.000 Yeah.
00:22:33.000 And you talked about Bill Clinton.
00:22:35.000 I think she divorced him, right?
00:22:36.000 Because of this stuff.
00:22:36.000 Yeah.
00:22:37.000 So did it come?
00:22:38.000 He had so many affairs when he was at Microsoft.
00:22:42.000 So just like Bill Clinton, it's like this wasn't anything that was unknown.
00:22:45.000 So it wasn't a surprise to me to actually see him linked in this manner whatsoever.
00:22:50.000 But for him to just come out after everything else is known about him, this response was pretty appropriate.
00:22:54.000 Yeah.
00:22:55.000 Saying like, yeah, I did two Russian girls.
00:22:57.000 Nobody benefited more from that divorce than like NGOs and non-profits.
00:23:02.000 Like Melinda Gates is just giving it away.
00:23:04.000 Same thing with Mackenzie Bezos.
00:23:06.000 They're just like, I mean, look, it's fine that they get divorced and they're like, I was with him when he made his money.
00:23:12.000 Fine.
00:23:12.000 Half of it's mine.
00:23:13.000 The guys are set.
00:23:14.000 They're fine too.
00:23:15.000 I hate the fact that the women are giving away money, particularly because of who they're giving it to.
00:23:20.000 They're giving it to all these progressive causes and stuff.
00:23:22.000 And it's just like, man, can't you find something better to do with that money?
00:23:26.000 Even causes with ties to terrorist organizations.
00:23:29.000 Yeah.
00:23:30.000 Yeah, like it because it's all just misplaced empathy.
00:23:35.000 You know, they're like, oh, look, this makes me a good person.
00:23:37.000 These poor are suffering, whatever.
00:23:39.000 I'll give this money away.
00:23:41.000 And really, it turns into money going to terrorists or going to organizations that are looking to do things like gender reassignment surgeries for children or stuff like that.
00:23:52.000 It's all just the most nefarious stuff out there.
00:23:55.000 And they're just like shoveling cash at these groups.
00:23:57.000 Do you think that's on purpose?
00:23:58.000 Okay, so let's rephrase that question.
00:24:00.000 So I kind of have the same point.
00:24:02.000 It's like when you give to the nonprofits and the NGOs, especially if you don't do like a bunch of research into where the money goes, even if we don't want to talk about like shady places they could be giving it to, but whether they're spending the money well, right?
00:24:13.000 Like how much of it is actually going to whatever cause you're raising money for, and how much of it is going to employee salaries and things like that.
00:24:19.000 Do you think it's a form of offloading their kind of their empathy on this company or are they doing it specifically because there's like nefarious stuff going on and they want to spread the money around to nefarious causes?
00:24:31.000 Honestly, I think that they just got a boatload of money and it looks good.
00:24:34.000 They got a boatload of money that they didn't have to work for and they're just like, man, I got all this stock and I can sell some stock and, you know, it'll piss my ex-husband off.
00:24:42.000 And, you know, I'm going to give this away to this group and this group and this group.
00:24:45.000 I don't think that they look into it.
00:24:47.000 I don't think that they're malicious or they're like, oh, I want to help terrorists or anything.
00:24:51.000 I think, I think they believe the face of whatever NGO or organization they're talking about; they believe their mission statement, their public-facing mission statement on their website. They're like, oh, they seem nice.
00:25:03.000 Let me give them a billion dollars and let me give them, you know, or 50 million or whatever number it is.
00:25:09.000 The funnier version of this is like if Mackenzie Bezos and Melinda Gates just start giving money to their husband's competitors.
00:25:16.000 Like just start funding all the people going against them.
00:25:20.000 I mean, that would be the ultimate spite move, wouldn't it?
00:25:23.000 Every time he like complained about something when he got home from work about some dude he just doesn't like, oh, this guy's a jerk.
00:25:28.000 He wouldn't sign this business deal.
00:25:30.000 She just starts giving money to all the people he complained about when they were eating dinner.
00:25:33.000 Mackenzie Bezos right now and all of Elon's former women.
00:25:37.000 Yeah, seriously.
00:25:39.000 Like I'm giving to Open Claw.
00:25:42.000 I'm giving to Claude right now.
00:25:44.000 Anthropic's making a killing.
00:25:49.000 But again, back to the story, like this, the fact that there's all these repercussions that are going through even the government of the UK and nothing's really happening here in the States when it comes to anyone that's alleged to be involved.
00:26:07.000 None of the lawyers or none of the friends are, nothing seems to be going on.
00:26:11.000 I mean, as far as arrests or just in general, nothing's changing.
00:26:14.000 Remember, I'm team, nothing ever changes.
00:26:16.000 Yeah, I mean, I understand that.
00:26:18.000 But I mean, you know, you don't, you only, like you said, there's two guys that have stepped down from their positions.
00:26:25.000 Well, it's the first time in how many hundreds of years that a royal in the UK, you're referencing.
00:26:30.000 Yeah, I mean, stripped of his titles, arrested, thrown in prison.
00:26:33.000 I think that's true.
00:26:34.000 Yeah.
00:26:35.000 With Casey Wasserman in the U.S., that was a little bit different because it's connected directly to Hollywood and all of those people.
00:26:40.000 It became a virtue signal on behalf of all the people that he represented to leave on behalf of, you know, making sure that their audience, because every one of those clients, whether it's music, movies, they all have their own self-interest.
00:26:52.000 They can't be seeing being attached to this guy.
00:26:54.000 And his whole business model is to be attached to individuals, not necessarily to a product.
00:26:59.000 So it makes sense that he would get, you know.
00:27:01.000 Yeah, I mean, well, they asked him to, like, he's going to sell his own agency that bears his name.
00:27:07.000 Like, that's even crazier, like, his company with his name, and they're trying to buy him out so that he has to leave.
00:27:13.000 Yeah, I mean, I imagine they will, or I wonder if they'll change the name after he leaves.
00:27:19.000 Probably.
00:27:19.000 You know, it's just kind of tainted, break the ties.
00:27:22.000 How deeply was he entrenched with Epstein?
00:27:25.000 I mean, he wasn't attached to Epstein, from what I understand.
00:27:27.000 He had emails.
00:27:28.000 He had like an affair with Ghislaine Maxwell, if I'm remembering correctly.
00:27:32.000 Which, I mean, to be honest with you, Ghislaine is just as bad as Epstein.
00:27:35.000 As much as Epstein kind of gets the focus all the time, she was trafficking just as much as Epstein was.
00:27:41.000 Maybe she wasn't actually engaging in the rape of minors like Epstein did.
00:27:47.000 But, you know, like she was helping out and she was making sure that there was young people that were available for Jeffrey to abuse.
00:27:56.000 She would refer to them as nubiles, and they'd drive around New York looking for nubiles, you know, underage women, basically, and find them, like, hot chicks that they thought were going to be models, basically.
00:28:06.000 And they're like, let's get them.
00:28:08.000 Let's stop.
00:28:08.000 They literally would stop on the street.
00:28:10.000 I don't know if this is true, but like they'd park the car and be like, hey, you, you're exactly who I'm looking for.
00:28:15.000 Want to be famous?
00:28:16.000 And the kids are like, yeah, I'm 16.
00:28:18.000 I'm an aspiring model.
00:28:19.000 Nubiles, Ghislaine would call them.
00:28:22.000 So I've heard.
00:28:23.000 I don't know if it's true or not.
00:28:24.000 How crazy that she had a name for it.
00:28:26.000 There's two things that really bother me about the whole Epstein arc.
00:28:26.000 What's that?
00:28:30.000 The first is that we don't ever really see a lot of the details, you know.
00:28:34.000 So like in this story right here, there's text messages and everything.
00:28:37.000 Some of the emails make it out, you know, and it's very clear as to what was going on.
00:28:40.000 But in this case, Epstein almost seems like he's the black spot.
00:28:44.000 You know, if you had any association with him whatsoever, then you're shamed for life.
00:28:49.000 You know, even if you just had a phone conversation with him at one point where he was saying, hey, come out to my island.
00:28:54.000 You're like, no, that's okay.
00:28:55.000 I don't need to.
00:28:56.000 But then you just talk even in the slightest terms of a business deal with him.
00:29:00.000 Because he was a financier, hands down, right?
00:29:02.000 That's what he did.
00:29:03.000 But then that leads into the second thing is this has been lingering on so long.
00:29:09.000 I'm wondering when this story arc comes to an end because it's been, how long is it?
00:29:15.000 How long has it been now?
00:29:15.000 Really?
00:29:16.000 Well, he was arrested in 2018, correct?
00:29:18.000 Yeah.
00:29:19.000 I mean, he had his first conviction well before that.
00:29:21.000 Yeah.
00:29:21.000 Yeah.
00:29:21.000 Wasn't he like out on probation or like house arrest for like a number of years?
00:29:25.000 No, but most people continued to work with him after; like, plenty of people continued to work with him, even though he had convictions already.
00:29:32.000 Didn't matter.
00:29:33.000 Yeah.
00:29:34.000 I agree fully with what you're saying, Rick.
00:29:36.000 This feels like a cudgel that will be used for decades until all these people are dead and gone that associated with Epstein.
00:29:43.000 There's in the back of their mind, they're like, shit, if it drops that I made a phone call with Epstein one time, and like, what?
00:29:47.000 It's the name, the guy, like, come on, it's like Hitler.
00:29:50.000 Yeah, obviously the Nazis were bad.
00:29:51.000 It was horrible.
00:29:52.000 But like people that get any association with Nazi Germany, Hitler, any of that, it's like the most demonic association for you.
00:30:00.000 You shouldn't be afraid to name your kid Adolf.
00:30:02.000 Shouldn't be.
00:30:02.000 Yeah, Adolf's a beautiful name.
00:30:04.000 Hitler's a cool name, too.
00:30:05.000 Just turns out the psycho has ruined it from here on out.
00:30:08.000 It's also a form of selective enforcement.
00:30:10.000 Look at what happened with Weinstein and the amount of people who were caught in the Weinstein net.
00:30:15.000 Leslye Headland is still working and she was Weinstein's assistant.
00:30:18.000 And people are like, look, am I supposed to believe that you didn't know what was going on?
00:30:21.000 And most people are like, no, I don't buy that.
00:30:23.000 But she's still working in Hollywood just because they don't actually enforce the rules equally because it's all about who you know.
00:30:28.000 Everybody in Hollywood over a certain age had awareness of it.
00:30:33.000 They were making jokes about Epstein at the Golden Globes or something.
00:30:38.000 Weinstein.
00:30:38.000 Yeah, Weinstein.
00:30:39.000 Yeah, Harvey.
00:30:40.000 They were making jokes about Harvey.
00:30:41.000 My bad.
00:30:42.000 They're making jokes about Harvey from stage, you know.
00:30:45.000 And it was, so it was an open secret in Hollywood.
00:30:48.000 I mean, Weinstein was laughing at the jokes.
00:30:50.000 The famous Ricky Gervais joke.
00:30:52.000 Yeah.
00:30:53.000 I remember that.
00:30:54.000 So, I mean, it's like if you're over the age of 35 or 40 in Hollywood, you knew, and it took a long time for people to come out.
00:30:54.000 Yeah.
00:31:06.000 I mean, someone like Oprah Winfrey has tons of pictures with him, was friends with him, buddy, buddy, buddy.
00:31:12.000 And of course, when it comes out the things that he did and the coercion, she just doesn't say anything, but nobody's like, hey, Oprah, how come you're still like a queen of media or what have you, even though you were definitely buddy-buddy with Harvey Weinstein, you know?
00:31:30.000 Well, she's also deeply embedded in the production side of things.
00:31:33.000 So she spread her money around to finance projects.
00:31:36.000 She's not just somebody who's front of the camera.
00:31:38.000 You know, it's the same thing.
00:31:39.000 There's actually this weird thing where every time Mel Gibson makes a new movie, even though like so many people in Hollywood hate him and the public loves him, somehow there will be like nice reports written about the movies that he's making, because he's deeply embedded within the production side of things.
00:31:55.000 So he can go to these outlets and they can write some favorable pieces about him.
00:31:59.000 And then he'll be right next to a hit piece from somebody else who doesn't like him because he's got so many ties to the behind the scenes stuff in the industry.
00:32:06.000 He's also got $400 million.
00:32:08.000 Named the other day as not being in the Epstein files.
00:32:11.000 He was intentionally named as not being in there.
00:32:14.000 He also made $400 million off of the movie about Jesus that he made.
00:32:21.000 There's also, like, the pictures of George Bush.
00:32:24.000 They're like, not even in the Epstein files.
00:32:26.000 Bombed kids overseas just for the hell of it.
00:32:30.000 You didn't even have to force him.
00:32:31.000 Yeah.
00:32:32.000 Yeah.
00:32:32.000 No, no, no coercion at all.
00:32:34.000 Yep, totally.
00:32:35.000 Love of the game.
00:32:36.000 Good lord.
00:32:38.000 It is important that we don't demonize people for having connections to someone that's a vile creature.
00:32:43.000 Like, just because they knew a guy or they had a dinner with him eight years ago and then the guy went off and did psycho shit, like that doesn't mean you're a psycho.
00:32:50.000 It's okay to associate or have associated with crazy people in the past.
00:32:54.000 It doesn't make you a villain.
00:32:54.000 Doesn't make you crazy.
00:32:56.000 So it's really sad.
00:32:56.000 It's not illegal.
00:32:57.000 Like when people are like, shit, my name is attached to the guy.
00:32:59.000 I got to resign from all my things.
00:33:01.000 Maybe there's something going on, the World Economic Forum guy.
00:33:03.000 Maybe something deeper was with that guy.
00:33:04.000 And he's like, I got to get out of here before they start asking questions, maybe.
00:33:07.000 But the shame of running away from your job because you got named in an email from 18 years ago is like, bro, that's the problem.
00:33:13.000 That's the public perception.
00:33:14.000 And everybody's going to hate me for making this statement.
00:33:16.000 But I mean, listen, the public got over Diddy, right?
00:33:19.000 Yeah.
00:33:20.000 I think the public needs to get over Epstein, though.
00:33:22.000 The world needs to get over Epstein because I don't see any real moves towards actually preventing, limiting, going after any kind of child trafficking anywhere else.
00:33:32.000 I mean, look.
00:33:33.000 I just don't think.
00:33:34.000 I think, at least when we're talking about the Twitter sphere and the people on there, I just don't think that they're going to get over it.
00:33:40.000 I don't blame them for the most part because I make the joke all the time.
00:33:43.000 I say, I'm team, nothing ever changes because it does feel that way.
00:33:46.000 And it's an offshoot of that where you're like, you look at these people do awful shit.
00:33:50.000 You see them break the rules.
00:33:52.000 You see them ignore the will of the people and nothing ever changes.
00:33:55.000 And there's no greater incidence of that than knowing that all of this stuff is happening and knowing that nobody's going to be held accountable.
00:34:02.000 And it's even worse when you see it happening to somebody in the UK.
00:34:06.000 But in the U.S., where all this was based, they're just like, I understand where the black pill comes from there.
00:34:11.000 Now I understand that there's a gap there between the people who are policy-minded and who are saying that we need to get over this because we have other things we need to worry about.
00:34:19.000 And this can't be the only thing that we focus on.
00:34:22.000 But when you're talking about the abuse of children, that's just a chord that's very hard for some people to separate from.
00:34:27.000 You know, it's funny because, well, it's interesting that there's all this focus on the Epstein files and the terrible things that Epstein did, but people don't really have the same, or at least the left doesn't have the same kind of outrage over all the children that were trafficked through the southern border when Joe Biden was president.
00:34:44.000 Like the only reason they care about Epstein at all is because it's tangentially connected to Trump.
00:34:49.000 Absolutely.
00:34:50.000 There were far more kids that were hurt and died and abused, you know, by cartels that were trafficking children over the border all the time during the four years that Biden was president, and they don't say a word, not a peep.
00:35:05.000 The Super Bowl.
00:35:06.000 I mean, the Super Bowl weekend is the biggest trafficking weekend.
00:35:08.000 Every single year.
00:35:08.000 Yeah.
00:35:09.000 Yeah.
00:35:10.000 Yeah, that's true.
00:35:11.000 But why?
00:35:11.000 Like, just because of the travel into the country?
00:35:14.000 Parties.
00:35:14.000 Yep.
00:35:15.000 People buy commodities, and unfortunately, commodities are children.
00:35:18.000 It's horrible.
00:35:19.000 That's what I'm saying.
00:35:20.000 So the idea that people know that this exists and there isn't anybody, there's one person whose face has been made kind of the avatar; he's now the avatar for it.
00:35:28.000 And the only other person who's been held accountable somehow isn't able to give other names and bring anybody else to justice.
00:35:36.000 People can't buy that.
00:35:37.000 Like they don't buy that the one guy died and that the other one's in jail and then just nobody else.
00:35:42.000 And I don't think they should be expected to believe that.
00:35:43.000 I'm not saying that.
00:35:44.000 What I don't hear, man, is that you know, what are we actually doing about it right now?
00:35:48.000 No, I'm not saying it's not a, it's not a problem where it feels like things are getting lost and other stuff could be getting done.
00:35:53.000 But, I mean, as one of the things I was saying the other day, it's like most of the people these days are one-track voters, right?
00:35:59.000 Like the other day I opened X and half the people are complaining about Iran and how we're going to have World War III.
00:36:05.000 Then we have people complaining about Epstein.
00:36:06.000 Then we have people complaining about glyphosate.
00:36:09.000 Everybody's a one-issue voter.
00:36:10.000 And if you're not taking care of their one issue, they're not going to support you anymore.
00:36:14.000 And that's made worse now by the fact that the internet gives you all the news all the time and you're bombarded by bad news constantly.
00:36:21.000 Yeah.
00:36:23.000 Everybody's a one-issue voter.
00:36:25.000 It feels a lot of people are.
00:36:27.000 I mean, not everybody, but maybe the plebs generally like common man that is emotional.
00:36:27.000 I'm not.
00:36:32.000 You always vote?
00:36:33.000 No.
00:36:33.000 I only vote if I know what I'm voting for.
00:36:36.000 That's always when you're like, I'm not a one-issue voter.
00:36:39.000 Don't vote.
00:36:39.000 I don't vote.
00:36:41.000 You don't have to be a one-issue voter if you don't vote.
00:36:44.000 Yeah, it's more about what.
00:36:46.000 I don't think anyone should ever feel like they're supposed to vote.
00:36:48.000 You got to know what you're doing if you're going to participate.
00:36:51.000 You know my feeling about it.
00:36:52.000 We need less voters.
00:36:54.000 Fewer voters, fewer people going to the polls, fewer people there.
00:36:59.000 Or more legitimate voters, you know, really people that understand what people are doing.
00:37:02.000 No, no, I'm pretty sure I want less.
00:37:04.000 Fewer.
00:37:05.000 Fewer votes.
00:37:06.000 This is a good bookend, but I just truly believe that all of this news about Epstein and everybody stepping down is really about protecting the organizations that are forcing these individuals to step down.
00:37:14.000 That's it.
00:37:15.000 Oh, yeah.
00:37:16.000 I agree.
00:37:16.000 Yeah, I agree.
00:37:17.000 It's not about the individuals.
00:37:18.000 It's not about doing what's right for society.
00:37:20.000 It's just about protecting the reputation of these organizations.
00:37:22.000 Yeah, I imagine the board of the WEF was like, bro, eat it.
00:37:26.000 You got to get out of here, man.
00:37:27.000 Get out.
00:37:28.000 Bill Gates can stay because we can't make him leave, but you're out of here.
00:37:32.000 Like I said, Bill Gates is already, you know, he's not at Microsoft anymore.
00:37:35.000 He's already made his bag and he's just giving it away and trying to mutate the mosquitoes.
00:37:40.000 You know, that's it.
00:37:42.000 Cure malaria.
00:37:43.000 That's what he wants to do.
00:37:44.000 He wants to use genetically engineered mosquitoes to cure malaria.
00:37:48.000 What a freak.
00:37:49.000 So that word malaria is so funny.
00:37:49.000 Yeah.
00:37:52.000 Bad air.
00:37:53.000 Is that what that means?
00:37:54.000 Mal air?
00:37:55.000 I don't know.
00:37:55.000 Malaria.
00:37:56.000 I think it means bad air.
00:37:58.000 No one knows what, where.
00:37:59.000 I think it's a demonic word, though, man.
00:38:00.000 Seriously.
00:38:00.000 We better not name it.
00:38:01.000 You want to go to demons?
00:38:02.000 No.
00:38:03.000 Let's talk about dreams.
00:38:04.000 We're going to go to Pakistan right now.
00:38:07.000 From Firstpost: Pakistan's Khawaja Asif declares open war with Afghanistan after deadly border clashes.
00:38:14.000 It's worth noting Pakistan has nuclear weapons, but there aren't a lot of cities in Afghanistan that are worth using a nuclear weapon on.
00:38:25.000 Pakistani Defense Minister Khawaja Asif has declared an open war with Afghanistan after the Taliban administration said that its forces killed and captured several Pakistani soldiers during a cross-border offensive.
00:38:37.000 Our patience has reached its limits.
00:38:39.000 Now it is open war between us and you.
00:38:41.000 Khawaja Asif posted on X. Taliban's spokesperson.
00:38:46.000 What was that?
00:38:47.000 That's a great forum.
00:38:48.000 Declare war.
00:38:49.000 Taliban spokesperson Zabihullah Mujahid, I think, said in posts on X that multiple Pakistani troops were killed and others taken prisoner.
00:39:01.000 He added that a large-scale operation has been launched against Pakistani military positions along the Durand line in response to what he called repeated provocations.
00:39:09.000 Meanwhile, Pakistan's Interior Minister Mohsin Naqvi said that Islamabad's retaliation to the Taliban attacks was a befitting response.
00:39:18.000 And blasts and gunfire rang out in the cities of Kabul and Kandahar under Operation Ghazib-il-Haq.
00:39:24.000 So you think that the PACs are going to nuke the Afghans?
00:39:29.000 Or do you think they'll waste a nuclear weapon on them?
00:39:32.000 Is that an atomic weapon?
00:39:33.000 Well, I mean, you would know better about this.
00:39:36.000 You like talking about the Middle East.
00:39:37.000 I don't know about that.
00:39:38.000 But like I said earlier, I don't think that there's a city in Afghanistan worth using a nuclear weapon on.
00:39:45.000 The PACs have to worry about India.
00:39:47.000 India's got nuclear weapons.
00:39:48.000 The PACs have nuclear weapons.
00:39:50.000 They're pointed at each other.
00:39:51.000 They hate each other.
00:39:52.000 I don't think that they'll waste them on the Afghans.
00:39:55.000 I'm surprised.
00:39:56.000 Well, actually, no, now that I think about it, not really.
00:39:58.000 I imagine the Afghans wouldn't have done this prior to the U.S. pullout, not because the U.S. was there, but because they didn't have the hardware. After the U.S. left all those weapons, the U.S. is tacitly responsible for this conflict, I imagine, because I bet they're all running around with M16s and PVS-14s on their helmets at night, shooting at the PACs, because they've got all this U.S. hardware.
00:40:27.000 All the weapons in Mexico?
00:40:29.000 Well, I mean, I imagine some made it there.
00:40:31.000 Well, Fast and Furious, yeah, but the guys in Pakistan have military stuff.
00:40:36.000 They're not, you know, they don't have American civilian stuff, so they all got machine guns now.
00:40:42.000 So they've got, I'm sure they've got lasers and scopes and all the stuff.
00:40:46.000 You see all the propaganda videos that the Afghanistan or the Afghans have released after the U.S. pulled out.
00:40:54.000 They're all wearing uniforms.
00:40:55.000 They're all kitted up.
00:40:57.000 Not saying that they know how to use any of this stuff properly, but I think a couple of guys got into a Blackhawk and crashed it.
00:41:02.000 They did.
00:41:03.000 Yes.
00:41:03.000 I saw that.
00:41:04.000 That's happened twice now.
00:41:05.000 Oh, really?
00:41:06.000 You'd think that they would be like, let's find a place where we can send guys to learn how to fly this thing before we let them get in.
00:41:06.000 Blackhawks.
00:41:12.000 Well, you got YouTube.
00:41:13.000 You couldn't just look it up on YouTube.
00:41:15.000 I've been in a helicopter, and they're not easy to fly.
00:41:18.000 They're not easy.
00:41:19.000 I got to sit in the flight sims.
00:41:21.000 I got to sit in the flight sims.
00:41:22.000 I mean, I guess, but like, do you think they don't have internet like that in Afghanistan?
00:41:28.000 They're running like Nintendos.
00:41:29.000 They're not, they're not, you know, Nintendo's from 93.
00:41:33.000 They're not running modern flight sims.
00:41:35.000 Yeah, they got to get the Cera Sim UH-60L Black Hawk top-tier flight simulator add-on for Microsoft Flight Simulator X.
00:41:42.000 I mean, legit, that's what they should be doing if they want to learn how to fly this stuff.
00:41:45.000 Yeah.
00:41:46.000 Why are they not?
00:41:47.000 Sorry, guys, if I told the Taliban something you were trying to hide from them.
00:41:47.000 I don't know.
00:41:51.000 I think they know about video games.
00:41:52.000 You're just picturing them on their phone watching a YouTube video about how to fly a blackhawk.
00:41:56.000 You get yourself an X-52 or those flight sticks?
00:41:59.000 I was just about to bust mine out for Elite Dangerous this game.
00:42:02.000 Yeah, the fucking.
00:42:03.000 So far, we've sided with the Taliban tonight and all of Elon Musk's women.
00:42:08.000 The U.S. wants.
00:42:10.000 Well, what I heard was the military wants, what was it, that base?
00:42:14.000 Bagram Air Force Base that they had basically surrendered.
00:42:16.000 They want it back.
00:42:17.000 At least Trump had mentioned a year ago he wanted it back.
00:42:20.000 Yeah, I mean, look, if the U.S. wanted it, they would just go take it.
00:42:22.000 Yeah, you would think so.
00:42:23.000 It's very strange that the Taliban's going after a great media place, seriously.
00:42:28.000 You know, because Biden was the one that pulled out and everybody shames him for all that.
00:42:32.000 Imagine if Trump took it back.
00:42:34.000 He's like, you surrendered it.
00:42:35.000 No, like, he would go there.
00:42:37.000 That's what I'm saying.
00:42:38.000 He would go there.
00:42:38.000 He would go and stand on a tank.
00:42:42.000 That's the image.
00:42:43.000 Seriously.
00:42:44.000 I mean, we took it back.
00:42:45.000 Biden gave it up.
00:42:46.000 Took it back.
00:42:48.000 Can't let him have it.
00:42:49.000 It's too valuable, too valuable. I just don't see, I don't see that there's...
00:42:54.000 I don't think there's a lot of value in it for the U.S. anymore.
00:42:56.000 Right now they're focused on Iran, and whereas, yes, if we had Bagram, you could launch strikes from there, but I think the atoll in the Indian Ocean is serving the purposes pretty well, and you've got two carrier groups in the Middle East now.
00:43:19.000 So my question on this open war thing is, what does open war actually mean?
00:43:23.000 I mean, I get, I think, like a Gaza scenario, you know, is that what they're looking at to where it's like we're, we're just gonna eliminate the Taliban?
00:43:30.000 Now, I don't, I mean, I don't know that the Pakistanis can. I mean, I know that they have an air force that functions.
00:43:37.000 There was a big old dogfight between the PACs and the Indians, maybe a year ago, a year and a half ago.
00:43:46.000 People got to see, I forget what kind of plane it was, but it was the first actual engagement, and the PACs beat the snot out of the Indians, if I understand correctly. But I don't know that they have the ability to really wipe out the Afghans the way that the U.S. did.
00:44:00.000 I know the Afghans don't have significant anti-air.
00:44:06.000 You know, they don't have a lot of SAM sites or anything like that.
00:44:08.000 You know, it's not like they got them posted up in the mountains where they can shoot them down, but I imagine they still have the Stingers that the CIA gave them 30 years ago, and those worked against helicopters, just ask the Russians, you know.
00:44:21.000 So I, I mean, I don't know, I don't know the extent of what open war is but, like I said, I mean, if Pakistan wanted to, they do have nuclear weapons.
00:44:30.000 They have atomic bombs.
00:44:31.000 I don't know, I don't.
00:44:32.000 I don't know if they have thermonuclear weapons, but they have atomic bombs.
00:44:34.000 I don't think they've got missiles, though.
00:44:35.000 Yeah, nuclear warheads and no missiles, yeah. So what percentage of the weapons used against us and against allies of ours has just been left by the CIA or given to these people by the CIA?
00:44:46.000 Um, when it comes to Afghanistan, just in general, like how much of the weapons supply that's used against the U.S. and U.S. forces is something that's left over from a time when they might have been an ally? Oh, I mean, like the Mujahideen.
00:44:59.000 So, I mean, Afghanistan, the combat that was going on in Syria, the Iraqis had a lot of funding from the U.S.
00:45:11.000 Because the U.S. was funding Iraq when they were fighting Iran. So, I mean, a lot, a lot.
00:45:17.000 I don't, I don't know about Vietnam.
00:45:18.000 I think the Vietnamese, the North Vietnamese, were getting funded by China.
00:45:22.000 So they were getting Chinese AKs and stuff.
00:45:25.000 Soviet, yeah, you know. Soviet bloc, China.
00:45:29.000 So I mean a lot of it.
00:45:30.000 Though, you know, Afghanistan, a lot of the fighting in Afghanistan, like once the U.S. kind of took it, it was a lot of, you know, just some dude on a hill with an old bolt action shooting a couple rounds at a forward operating base.
00:45:46.000 And then the U.S. put up a bunch of helicopters and blew up the top of the mountain, you know, but the dude is already gone.
00:45:50.000 He's like, I take a couple shots and run.
00:45:54.000 So I don't, yeah, I don't know exactly how much, but I'm sure it's a lot.
00:45:56.000 Yeah, sure it's a whole lot.
00:45:58.000 Um, so I don't know, I don't think that this is going to turn into a broader conflict.
00:46:03.000 I don't, I don't know how much.
00:46:06.000 I mean, what is?
00:46:07.000 What does Pakistan actually get from wiping out a bunch of Taliban? 18 people?
00:46:12.000 Is that what it was?
00:46:13.000 I mean uh, let's see the 18.
00:46:16.000 Yeah, the Taliban government in Afghanistan said the attacks were in response to Pakistan's strikes earlier this week, which reportedly killed 18 people.
00:46:23.000 As Islamabad said, it targeted alleged militant camps and hideouts.
00:46:26.000 Aren't they all militants?
00:46:28.000 Yeah, it's kind of.
00:46:29.000 Yeah, the the Taliban took 19 outposts on the border and then killed 55 Pakistani soldiers, according to this article from India.com.
00:46:37.000 Well, I mean, that's what is the American interest in this?
00:46:39.000 We don't.
00:46:40.000 Uh, you have a border state.
00:46:42.000 It's the border state of Iran.
00:46:48.000 That's what being in Afghanistan was all about, basically.
00:46:48.000 But Pakistan and Afghanistan are the other border.
00:46:51.000 Yeah, they're on the east.
00:46:52.000 Pakistan's on the eastern side of Afghanistan.
00:46:53.000 Yeah, so it's pretty far away from Iran.
00:46:56.000 Yeah, but it's securing Afghanistan.
00:46:58.000 Is that what your question was?
00:46:59.000 I mean, like, what is the U.S. interest in this at all?
00:47:01.000 Like, what is like, like, you were saying it borders Iran, but what's the, like, what is our interest right now to get involved or to stay out?
00:47:07.000 I don't think we would.
00:47:08.000 Well, to stay out.
00:47:09.000 I mean, like I said, if the U.S. were to take a side, which, I mean, the U.S. is friendly-ish with the Paks.
00:47:18.000 And so, you know, they're not so friendly with the Taliban.
00:47:21.000 So the U.S. ostensibly could decide they're going to say, okay, we're going to help the Pakistanis.
00:47:27.000 But, I mean, Pakistanis don't really need our help to kill Afghans.
00:47:32.000 And if the U.S. were to side with the Afghanis, you're talking about a country that has nuclear weapons.
00:47:38.000 They might possibly decide, all right, the Americans are coming.
00:47:41.000 We're going to blow up a nuke or nuke the Americans.
00:47:44.000 Not that I think that's going to happen, right?
00:47:46.000 I think that's incredibly unlikely.
00:47:48.000 But, you know, anytime there's a country that's involved in any kind of conflict, if they have nuclear weapons, that is part of the equation.
00:47:55.000 That's something you have to actually think about.
00:47:58.000 That's something that was talked about with Ukraine a lot.
00:48:01.000 How much will the U.S. get involved in helping Ukraine?
00:48:05.000 Because if an American is killed by a Russian, then you can easily imagine a situation escalating out of control.
00:48:11.000 And Russia and U.S. have the biggest nuclear arsenals on earth.
00:48:16.000 You were going to say something?
00:48:18.000 I think the long-term militaristic goal would be to subdue and eradicate the Iranian theocratic regime and then install a liberal economic Middle Eastern authority there, like the king basically of Iran, what's his name, Reza Pahlavi, put him back in, the third, and then allow Israel to basically govern the Middle East.
00:48:40.000 So one guy, Scott Horton and Martyrmade, they do a show, which is really great.
00:48:46.000 And they were saying, Darryl Cooper is Martyrmade.
00:48:49.000 He said that instead of going to World War III, he thinks there's going to be like a quadripolar setup where the Chinese govern Northeast Asia, the Russians govern Middle Asia, then the liberal economic order governs the West, and then Israel governs the Middle East.
00:49:03.000 And then those four powers will kind of establish hegemony in some form, which is like the least worst outcome.
00:49:10.000 I can't stand theocracy.
00:49:12.000 I think, I mean, trying to govern with religion is insane.
00:49:16.000 It is not agile enough to function as a government.
00:49:20.000 It comes out of a 2,000-year-old, or thousands-of-years-old, text.
00:49:24.000 Anyway, depending on which religion you're talking about.
00:49:26.000 Depending on which religion you're talking about.
00:49:26.000 What's that?
00:49:28.000 If you had a legit religion, the American government's kind of like a religion.
00:49:31.000 I have faith in that constitution.
00:49:33.000 It works pretty well for the Saudis.
00:49:37.000 I don't know what's their monarchy.
00:49:40.000 They're not a theocracy, though.
00:49:41.000 They're definitely.
00:49:42.000 I mean, that's where Mecca is.
00:49:44.000 Wait a second.
00:49:45.000 Aren't they literally a theocracy, though?
00:49:47.000 Because like Saudi Arabia, I could be wrong.
00:49:49.000 Like Islam and Islamic law.
00:49:51.000 Yeah.
00:49:53.000 So, I mean, they are a monarchy, but they are.
00:49:56.000 If they weren't selling us oil, they'd be prime enemy number one if they weren't on board with the economic order right now.
00:50:01.000 And they've also done a lot to reintroduce American interests there by trying to get people to come out to Saudi Arabia.
00:50:07.000 Yeah, I mean, the Saudis are like, they have their government is definitely not something that I would want.
00:50:13.000 I wouldn't want to live under the Saudi rule, but they're one of the least worst in the Middle East.
00:50:22.000 The Emirates seems like it's all right for tourists.
00:50:26.000 Most Americans can go to Saudi Arabia.
00:50:29.000 You can go to Dubai.
00:50:32.000 Well, that's why they've put so much time and effort into kind of courting American celebrities.
00:50:38.000 They've paid WWE billions of dollars to do shows over there, billions of dollars.
00:50:43.000 Comedians, musicians, and a lot.
00:50:45.000 And most of them end up having to come back and answer questions from people who are like, you know, what are you doing over there?
00:50:52.000 You were kind of woke before.
00:50:54.000 What's with the human rights abuses?
00:50:56.000 And then they have to either sidestep the question or just say they don't care.
00:51:00.000 Learn to dance, dance around the question.
00:51:01.000 Or dance around the question.
00:51:02.000 Yeah, they got to learn that Donald Trump weave.
00:51:03.000 Yes.
00:51:05.000 The political structure of Saudi Arabia, just kind of as a tangent, is an absolute monarchy, but also a theocracy.
00:51:10.000 But I'm like, legally, who has authority there?
00:51:14.000 Is it the king or is it the prophet?
00:51:16.000 And it can't be the same guy unless your king is the prophet or is the imam or whatever.
00:51:21.000 The prophet would be Muhammad.
00:51:23.000 And like he, like.
00:51:23.000 Right.
00:51:24.000 Who represents that?
00:51:24.000 It would be the Imam, you know, of state.
00:51:26.000 Head of state was an imam.
00:51:27.000 But if the head of state's some secular king, that's like, yeah, that doesn't really seem to be allowed.
00:51:31.000 He's not secular.
00:51:32.000 He's, he's technically.
00:51:33.000 No, he's a Muslim.
00:51:34.000 But he would then bow to, I don't know, maybe I don't know enough about the way Islam works in a theocracy, but you would then bow to the religious authority, and the king would be number two to the, not necessarily, like, the head of the church.
00:51:47.000 They have like a hierarchy where we enforce like the religious laws in the place.
00:51:52.000 Yeah, I mean, you talk about England, right?
00:51:55.000 The king of England was always the defender of the faith.
00:51:57.000 He was the head of the Church of England.
00:52:00.000 Still is.
00:52:00.000 Yeah, technically, yeah.
00:52:02.000 So the king is the guy that's in charge of the church and the guy that's in charge of the state and military.
00:52:09.000 Yeah.
00:52:10.000 Yeah.
00:52:10.000 So, so yeah, you can be.
00:52:13.000 But, all right.
00:52:14.000 Um, I think that that's probably about all we've got to talk about for Afghanistan.
00:52:20.000 I don't know what we're talking about.
00:52:21.000 Afghanistan and Pakistan.
00:52:23.000 If we got to the robots and the other people.
00:52:24.000 Oh, that's what AI.
00:52:25.000 I don't know what you guys.
00:52:26.000 We'll get to that, I promise.
00:52:27.000 In similar news, the U.S.-Iran nuclear talks end without a deal as the threat of war grows from the Guardian.
00:52:36.000 High-stakes talks between the U.S. and Iran over the future of Tehran's nuclear program ended on Thursday without a deal as the White House weighs a military option that would mark its largest intervention in the Middle East in decades.
00:52:46.000 Iranian Foreign Minister Abbas Araghchi claimed good progress had been made at the talks, and Omani mediators predicted negotiations would reconvene at a technical level next week in Vienna.
00:52:57.000 But there was no immediate evidence to support suggestions that the two sides had drawn closer on the fundamental issue of Iran's right to enrich uranium and the future of its highly enriched uranium stocks.
00:53:06.000 Nonetheless, the Iranian and Omani mediators sought to cast the talks in a hopeful light, likely seeking to avert a U.S. threat to launch strikes from its fleet of aircraft and warships that have amassed in the region.
00:53:18.000 Araghchi described the talks as one of our most intense and longest rounds of negotiations.
00:53:23.000 He confirmed that further contacts would take place in less than a week.
00:53:26.000 Do you guys think the U.S. is positioning all of those military assets as leverage, or do you think that the U.S. is going to strike regardless of what Iran says?
00:53:36.000 Or do you think they just expect Iran to say no?
00:53:39.000 JD Vance was making the argument that Iran is, I think that he said that they're still after nuclear weapons.
00:53:48.000 So he's making the argument.
00:53:50.000 Weapons of mass destruction again?
00:53:52.000 Weapons of mass destruction.
00:53:53.000 2.0?
00:53:54.000 Nuclear weapons.
00:53:55.000 Google.
00:53:57.000 If Vance is saying that, that indicates full-on war-hawk regime change.
00:54:03.000 But Vance isn't a war hawk.
00:54:04.000 I know.
00:54:05.000 That's why him being the guy saying that is like, wow, they had decided they are taking that government out.
00:54:11.000 Well, I read a piece that said that both the U.S. and Israel are kind of pushing the other one to actually kick it off.
00:54:19.000 The U.S. doesn't want to start it, right?
00:54:21.000 They want to say, okay, well, they want Israel to initiate and then they'll back up Israel.
00:54:26.000 And Israel says, well, we want you to initiate and we'll back you up.
00:54:31.000 I don't think that Israel is much of a backup to the United States.
00:54:34.000 I think the U.S. is perfectly capable of taking care of their own military affairs.
00:54:39.000 So I don't know that it's really all that compelling to be like, oh, you're going to back us up.
00:54:43.000 No, I don't think you're going to back up anything.
00:54:46.000 Don't they rely on Israel more for intelligence than anything else?
00:54:49.000 Yeah, allegedly Mossad's everywhere, and they have people in Iran as well.
00:54:55.000 So maybe they do.
00:54:56.000 But it still doesn't justify if Israel wants to strike Iran, like go ahead.
00:55:04.000 I'm surprised that they haven't yet.
00:55:06.000 Because if you look at the history of Israel, Israel is basically, well, the rest of the world didn't do anything.
00:55:11.000 So you're welcome when it came to Syria.
00:55:14.000 Yeah.
00:55:14.000 And I can't remember all the other strikes that they've had, but it's been more of a Israel's always stepped up, at least from their perspective.
00:55:23.000 I'm not saying this, but this is what they've said.
00:55:24.000 It's like we wanted to play diplomacy, but Israel has always been, no, we're not going to wait for that.
00:55:32.000 So it's like Israel's stance has always been nobody else in the world can have nuclear weapons, period.
00:55:38.000 And if we see anybody getting close, we're going to take the action.
00:55:41.000 And they have.
00:55:42.000 They've proven that.
00:55:43.000 Even with everything with Gaza that came into play, or Iran, they've been threatening that with the dome that's over their heads.
00:55:50.000 They've launched attacks back and they're like, we're going to take action on Iran.
00:55:54.000 They put this out there before.
00:55:55.000 Netanyahu has said this.
00:55:56.000 And you talk about religion backing a government, right?
00:55:59.000 That's pretty much their stance is saying that, you know what, this can destroy God's people, which is pretty much the whole world, right?
00:56:07.000 But especially Israel first.
00:56:08.000 So we're going to do what we need to do, regardless of the UN's approval, regardless of the U.S.'s approval.
00:56:13.000 We're going to do what we need to do to make sure that nobody else obtains nuclear weapons.
00:56:17.000 You know, when the Nazis were gearing up for war, World War II, the British basically appeased him.
00:56:23.000 Neville Chamberlain was the prime minister at the time and went to Hitler and was like, we're just going to give them a bit of the Sudetenland out east.
00:56:30.000 We're just going to cede them some territory and we will appease Hitler and then there will be no conflict.
00:56:34.000 And Winston Churchill, who's not in government, is like screaming from the rafters: they're going to go to war.
00:56:39.000 We need to attack these.
00:56:40.000 They're going to war.
00:56:40.000 No one listened.
00:56:41.000 Everyone's like, you crazy old man.
00:56:43.000 And then, they had a window.
00:56:45.000 When the Germans invaded Poland, there was this time when they had no troops in the West.
00:56:49.000 And if the British and the French had attacked them, this is like where the Israeli mindset, I think, comes from.
00:56:55.000 They would have conquered Germany because the Germans couldn't have taken a Western offensive.
00:56:58.000 But because they did appeasement and waited and waited and waited, the Germans got stronger and stronger and then sneak attacked.
00:57:04.000 And I'm sure the Israeli government thinks like we cannot allow that to happen.
00:57:08.000 Yeah, that's Iran.
00:57:08.000 That's 100% Iran.
00:57:09.000 And that was Syria, too.
00:57:13.000 Because normally I'm pretty much like, hey, bro, you can't just say like, we need to kill them before they attack us.
00:57:17.000 And that's your justification.
00:57:18.000 Because it's like, how far can you take that?
00:57:19.000 The Romans conquered half the planet with that mentality.
00:57:22.000 But at the same time, Neville Chamberlain appeased Hitler and that's kicked off the war.
00:57:26.000 Like you cannot appease a belligerent dictator because they'll just keep taking and taking until they have you.
00:57:31.000 So what do you guys think of the argument that the Persians in Iran want to see the Ayatollahs taken out and that they will actually rise up?
00:57:42.000 And if the U.S. and Israel decide that they're going to go and do strikes, that the people will rise up and handle the ground war.
00:57:49.000 Because that's one of the arguments that I hear.
00:57:50.000 And the U.S. hasn't positioned for a ground war.
00:57:54.000 Like if you remember the first Iraq war, there was movement of troops and everyone knew.
00:58:01.000 Like everybody knew it was coming.
00:58:04.000 And I think the same thing with the Iraq war in the aughts.
00:58:08.000 Like they were moving massive amounts of troops.
00:58:11.000 You know, they were moving ground forces.
00:58:13.000 It was obvious that a war was coming, right?
00:58:15.000 That is not what's going on now.
00:58:17.000 They're moving air assets.
00:58:19.000 They moved all kinds of tankers.
00:58:21.000 They've got all kinds of planes.
00:58:22.000 There is not a significant buildup of ground forces.
00:58:26.000 So at this point, it doesn't look like there's going to be a U.S. invasion.
00:58:30.000 It'll be a bunch of airstrikes and just dropping bombs.
00:58:32.000 Do you guys think that the Iranians are going to rise up and take the country for the Shah?
00:58:38.000 Not if they're getting bombed.
00:58:40.000 Not if they're getting bombed.
00:58:41.000 I mean, even if we were in a revolution in our country against an evil dictatorship and then the Canadians came and just started bombing our cities, that's not helping us.
00:58:53.000 I mean, maybe you could argue things are so bad that the only way to break this system is to destroy it and start it over again.
00:59:00.000 But Iran's not that.
00:59:02.000 Canada might do that because they lost at hockey.
00:59:04.000 So it's completely plausible.
00:59:05.000 Twice we've been planning this.
00:59:08.000 2026.
00:59:09.000 The only bombing that Canada is going to do is they're going to send geese to poop on our cars.
00:59:13.000 Or like Strange Brew 2, which would bomb terribly in the theater.
00:59:17.000 Yeah, right.
00:59:17.000 They definitely don't have an Air Force capable.
00:59:19.000 Do you think America's stance on war is so much more against it these days, perhaps, than it has been in decades past?
00:59:26.000 And we've had this discussion before that America doesn't like the idea, or there's a growing sentiment that they don't want U.S. and Israel to be as connected as they are.
00:59:35.000 They feel like they're misappropriating resources.
00:59:37.000 And there's a whole discussion that can be had about how much actual aid goes to that country.
00:59:41.000 That's not the point.
00:59:42.000 The point is, is it feels like Israel has interests in the Middle East and America is kind of pulled along for the ride in a lot of cases.
00:59:49.000 So for something like that, is the public disinterest in getting involved in these things in the year 2026 something that plays a role in keeping us from putting boots on the ground?
00:59:58.000 I think, yes, I think that it plays a role in keeping us from putting boots on the ground.
01:00:01.000 I don't think the U.S. has, the American people don't want to see Americans on the ground in Afghanistan.
01:00:07.000 It's like nobody, Trump has never been afraid to use drones.
01:00:10.000 It's never been a problem for him to do that.
01:00:11.000 So when people talk about being a, you know, a president who's against war, it's not necessarily maybe against ground war, against starting new war, but he's never been afraid to get involved in foreign countries.
01:00:21.000 Yeah, neither is the United States.
01:00:24.000 Largely, if the U.S. goes and bombs a country and we don't lose any planes and no Americans come home in caskets, the American people are like, eh, okay.
01:00:35.000 Look at Venezuela, right?
01:00:37.000 Like, that was, that was a significant operation, and it was carried out according to plan.
01:00:44.000 No Americans died.
01:00:45.000 We had one guy that took a bunch of rounds and we had him at the State of the Union, gave him the Medal of Honor.
01:00:51.000 Everybody cheered.
01:00:52.000 Everybody loved it.
01:00:53.000 He came, you know, he might lose his leg, but he came back and he's going to make it.
01:00:57.000 And the American people have an overwhelming approval of that.
01:01:01.000 It's like something like 75% of Americans are like, yeah, that was cool, man.
01:01:04.000 Did you hear what the discombobulator?
01:01:06.000 Man, they made a guy shoot in his pants, man.
01:01:09.000 Can't talk about that.
01:01:11.000 But like, I mean, that's kind of the way that the American population is.
01:01:15.000 It's like, look, if we don't have guys on the ground and we don't have Americans coming home in caskets and we don't lose any planes, bomb whatever you want.
01:01:24.000 We'll actually cheer it on because Americans didn't die.
01:01:26.000 And that's generally the sentiment.
01:01:29.000 But that's going to change over time, right?
01:01:31.000 Like generally, Gen Z going into Gen Alpha isn't going to look on that type of conflict the same way we did because they didn't grow up in a time period where you were kind of walled off from the rest of the world.
01:01:42.000 They've grown up connected to the internet, which means that they've had access to information coming from these countries for a long time.
01:01:49.000 And they live in a more globalist world than we did when we were younger.
01:01:53.000 I don't know that I think it's going to change.
01:01:55.000 You don't think the public sentiment in America will change?
01:01:58.000 When it comes to if the United States decides to do airstrikes, I really think the majority of Americans will be like, oh, it's not a big deal.
01:02:06.000 Excuse me.
01:02:07.000 I think they'll be like, eh, you know, because again, because it all depends on the level of U.S. casualties.
01:02:15.000 And I think that just so long as Americans don't die, most Americans are kind of like, well, I got to go to work.
01:02:22.000 Honestly, like, the last time that really mattered, I think, would have been Vietnam, just because so many died.
01:02:27.000 Yeah.
01:02:28.000 50,000 people or something over there?
01:02:30.000 54,000 people.
01:02:31.000 More people died.
01:02:32.000 As many people died at the Battle of Gettysburg.
01:02:35.000 Whoa.
01:02:36.000 Gettysburg.
01:02:37.000 One battle in the Civil War, more people died than the Vietnam War.
01:02:41.000 Oh, my God.
01:02:42.000 Was that like with amputation, people dying after the fact, too?
01:02:45.000 I don't know exactly how it was.
01:02:46.000 You know, Brett, what we were talking about is kind of the plot of 1984, the George Orwell book: there's forever wars overseas and people are just lulled into not caring, because they just see, like, okay, bomb went off, bad guy died, now we have a new enemy.
01:02:58.000 And this over the years, all of a sudden, now Oceania's fighting Atlanta or whatever.
01:03:02.000 And now all of a sudden, you have a new enemy.
01:03:04.000 And this whole time, they'd be like, no, you've been fighting that other guy this whole time.
01:03:07.000 But because of the internet, we're not in 1984.
01:03:09.000 You can see from the ground in Iran the bomb falling on the guy and you see his face and like my mother, you see his mom bleeding out on the ground.
01:03:18.000 And like, now we just got to be aware of deepfakes because it's a lot about sentiment, like social sentiment.
01:03:24.000 There's also the issue where, like, living in America isn't as easy as it used to be financially.
01:03:30.000 So when you're struggling, if everything's going well and the country's in an economic boom and America is allocating a bunch of resources overseas and we're spending money to drop bombs on kids, maybe people are more forgiving.
01:03:42.000 But when they can't afford to buy a house and grocery prices haven't come down to the extent that they want them to and gas for your car isn't as cheap as you'd like it to be, then they're going to go looking for a reason as to why aren't things going good here.
01:03:54.000 And maybe it's not the answer, Phil.
01:03:56.000 We've had this discussion before, like the amount we actually spend on defense isn't actually, you know, it's not the same amount as well.
01:04:02.000 But the point is, they don't know that.
01:04:04.000 You know, that's assuming that they're as educated as you might be on where America's spending goes to, right?
01:04:09.000 They don't necessarily know that most of it goes to Social Security and to all that stuff.
01:04:14.000 The point is they're looking for something to blame.
01:04:15.000 And it's easier to blame what they would consider a real evil of dropping bombs overseas as opposed to paychecks for grandma who's still getting her social security.
01:04:25.000 That's all our cycle, though, too.
01:04:26.000 If you look back, I mean, you could go into the Great Depression.
01:04:29.000 I mean, I always love history.
01:04:32.000 And I don't think there's anything new.
01:04:33.000 I mean, even back to the Civil War that you mentioned, too.
01:04:36.000 I mean, typically speaking, Democrats have always been spend, spend, spend, keep driving that up.
01:04:41.000 You look back at coming out of the Great Depression, what happened, right?
01:04:46.000 It was World War II.
01:04:47.000 And what took place during that was a big spending program to build up our military where everybody was put to work again and it sparked a huge economic boom.
01:04:57.000 And then you saw after that that the debt actually was even paid down because our GDP was pushed up so much because we started selling weapons to the rest of the world too.
01:05:06.000 And it's cyclical.
01:05:07.000 So, I mean, if the economy goes down, history would show that there's going to be some big spending after this too.
01:05:15.000 The only thing, well, you mentioned a really good point, the cost of gas.
01:05:18.000 If the U.S. starts striking Iran, I expect the cost of gas to go up significantly.
01:05:24.000 It has come down a decent amount.
01:05:26.000 And I wasn't trying to make the point that it hasn't come down.
01:05:28.000 I'm just saying that there are still a lot of economic factors that young people are going to look at and wonder why this is going on.
01:05:34.000 And gas is the one that, you know, Americans do not want to see the cost of gas go up because most people have a sense that it affects the price of everything.
01:05:43.000 But when you go to the, you know, you go to the gas station and you can fill your tank for $75 or $50, and then two days later, it's $75 for the $50 tank and $100 for the $75 tank.
01:05:55.000 People notice and they get mad.
01:05:57.000 So whereas I understand your point about an economic boom and stuff, the immediate effects of a strike on Iran are going to be gas prices are going to go up.
01:06:05.000 And that's going to be really, really bad for the administration.
01:06:09.000 It's going to be really bad for the Republicans in the midterms because they're going to blame them.
01:06:12.000 Regarding the economic boom that you've noticed cyclically, is that like war?
01:06:16.000 Is that your what are you?
01:06:18.000 That's the only thing I can think of.
01:06:18.000 Absolutely.
01:06:19.000 War means profits.
01:06:21.000 It always has.
01:06:22.000 These societies, and it's just me doing, like, math, calculating the cycles of history: great societies grow, and they expand and expand, and the only way to sustain it is to conquer and take resources from outside and grow.
01:06:34.000 And we've sort of kind of tried to mediate that, but even the U.S. has been expanding over time.
01:06:39.000 The Louisiana Purchase, the, you know, now we have territories overseas and this and that, the liberal economic order expansion and resources and the, you know, all of that, the Indian, the East India Trading Company.
01:06:51.000 But is there any other way?
01:06:53.000 Like, can we sustain a thriving society without constant expansion?
01:06:58.000 We're going to talk about that when we get to AI.
01:07:00.000 You think AI might be able to help us do that?
01:07:02.000 I think it might be.
01:07:03.000 I don't know exactly.
01:07:04.000 I don't think that I will talk about it when we get to AI because I have thoughts.
01:07:08.000 There is some disruptions coming, some significant disruptions.
01:07:12.000 But yeah, I mean, I do think that, you know, war is, like you said, you know, war is profitable.
01:07:18.000 There's a lot of Austrian economists that say, no, it's not, because that money could have been allocated to something else.
01:07:24.000 But at the same time, when the U.S. spends money, it's not spending money.
01:07:28.000 It's just creating the money.
01:07:31.000 It's not like there's a finite amount of money that could be spent somewhere else.
01:07:34.000 That money is created and then given to the people that make weapons and then you go and blow stuff up.
01:07:41.000 So as much as I really do appreciate the Austrian school and I appreciate libertarians who take on economics most of the time, the U.S. doesn't take, like the U.S. doesn't tax to pay bills.
01:07:55.000 The U.S. wants to do something.
01:07:57.000 They just print the money and do it.
01:07:58.000 So it's essentially a cost in inflation.
01:08:03.000 And that affects every American, but it's not like you're saying, oh, well, we could have spent this money on something else because you're talking about allocation rather than the flow of the cash.
01:08:15.000 Yeah, the flow of the cash is what gets everybody excited and that gets things moving.
01:08:18.000 And the velocity of money.
01:08:21.000 Part of the profitability of war, too, is, like, you can conquer and steal resources from the conquered, and you get a portion of your citizenry killed off in the war, as these poor soldiers, so that you don't have to fund them.
01:08:34.000 Like I would imagine the economists do the math of like, what's the cost of a human?
01:08:38.000 Is it a net positive or a net drain on society?
01:08:40.000 Most humans probably are net drains on society.
01:08:43.000 They produce more waste than they create income.
01:08:46.000 So they're like, we can get a bunch of these people just reduced to zero.
01:08:50.000 A bunch of this drain goes to zero with all this death that we're going to bring on our own people.
01:08:54.000 And I'm sure they do that math and they think about how awesome it will be after the war when there's so many less of us to profit in everything that we've conquered.
01:09:01.000 And all those other poor, dead people, like, well, they were the poor ones anyway.
01:09:05.000 So the children of the poor.
01:09:06.000 So like, who cares really?
01:09:08.000 Yeah, I think that's part of the reason why I disagree with that is because, like I said, Americans don't like to see Americans come home in body bags.
01:09:15.000 You know, if you had a significant decrease in the population, enough to make an effect on the economy or the amount of money, you know, GDP or whatever, you would have a really, really, really pissed off population.
01:09:29.000 Vietnam proved that with the TV.
01:09:30.000 Like, I know people whose dads had their legs blown off like a girl, and you saw guys get shot live in the jungle on TV.
01:09:38.000 It was the first time that it humanized the conflict.
01:09:40.000 And you're like, these are real people.
01:09:42.000 This isn't just, like, we're missing 20%.
01:09:44.000 Like, it used to be, if you don't see them, they never existed.
01:09:48.000 You're right.
01:09:48.000 Yeah, you're right.
01:09:49.000 Because it really is about how people feel about the aftermath.
01:09:52.000 We're going to jump to this story from the Washington Post.
01:09:54.000 We're going to do this, and then we're going to jump to the AI story.
01:09:56.000 So from the Washington Post, Trump seeking executive power over elections is urged to declare emergency.
01:10:02.000 Activists who say they are in coordination with the White House are circulating a draft executive order that would unlock extraordinary presidential power over voting.
01:10:10.000 Pro-Trump activists who say they are in coordination with the White House are circulating a 17-page draft executive order that claims China interfered with the 2020 election as a basis to declare a national emergency that would unlock extraordinary presidential power over voting.
01:10:23.000 President Donald Trump has repeatedly previewed a plan to mandate voter ID and ban mail ballots in November's midterm elections.
01:10:30.000 And the activists expect their draft will figure into Trump's promised executive order on the issue.
01:10:35.000 The White House declined to elaborate on Trump's plans.
01:10:39.000 Under the Constitution, it is the state legislatures that really control how a state conducts its elections.
01:10:45.000 And the president doesn't have any power to do that, said Peter Ticktin, a Florida lawyer who is advocating for the draft executive order.
01:10:52.000 Ticktin attended the New York Military Academy with Trump and was part of his legal team that filed an unsuccessful 2022 lawsuit accusing Democrats of conspiring to damage him with allegations that his 2016 campaign colluded with Russia.
01:11:05.000 But here we have a situation where the president is aware that there are foreign interests that are interfering in our election process, Ticktin went on.
01:11:13.000 That causes a national emergency where the president has to be able to deal with it.
01:11:17.000 So I'm not particularly excited about the idea of Trump having an executive order that in any way affects elections, but I do like the idea of voter ID.
01:11:32.000 So, and most of the reason why I don't like the idea of an executive order is because of what you're giving to the Democrats, right?
01:11:39.000 This whole thing, the way it's framed is, oh, Trump's going to do this executive order and he's going to cheat at the elections and he's going to install himself.
01:11:47.000 It's feeding into the narrative that the left has been making that Trump's not going to leave office in 2028.
01:11:52.000 He's going to be a dictator, et cetera, et cetera.
01:11:55.000 And this is just about voter ID, which is really about making sure that only Americans are voting, only citizens are voting.
01:12:03.000 And the argument is: look, if the Republicans don't want to pass the SAVE Act, which, the SAVE Act, I like it.
01:12:10.000 There are Republicans that don't want to touch it.
01:12:12.000 There's four that don't want to vote to end the zombie filibuster because it could affect their reelection. There's one that's up for reelection, and I think two of them are not.
01:12:23.000 McConnell's not, and there was someone else that's not, but I don't remember off the top of my head.
01:12:27.000 But anyways, there's four that say no.
01:12:29.000 They're not going to get to 53.
01:12:31.000 So they're not going to be able to stop the zombie filibuster.
01:12:34.000 So the SAVE Act is probably dead.
01:12:36.000 So Trump's like, all right, well, I'm going to pass, I'm going to have an executive order to make sure that there have to be IDs to vote and there are no mail-in ballots, which personally, I think mail-in ballots are a terrible idea.
01:12:47.000 And I think that you should have to show ID to vote.
01:12:50.000 Wouldn't this just be taken to the Supreme Court and then eventually struck down?
01:12:55.000 I mean, I assume so.
01:12:57.000 But the thing is, the Supreme Court picks its cases months and months in advance.
01:13:03.000 So what would likely happen is he'll actually issue the executive order after he knows that this case can't get to the Supreme Court before the election.
01:13:13.000 Because now we're, what, seven months away, you know, until November.
01:13:17.000 So, you know, the Supreme Court will decide what they're going to do in the fall.
01:13:23.000 They'll probably decide, I think, in May or June or something like that.
01:13:28.000 And if he makes the executive order then, you know, they're not going to put it on the docket.
01:13:32.000 They could say that it's an emergency.
01:13:34.000 It's possible.
01:13:36.000 But I don't know.
01:13:37.000 It feels like nothing gets done anymore without executive orders.
01:13:40.000 And that sucks too.
01:13:40.000 Like nothing gets done in Congress.
01:13:43.000 The only thing that happens is Trump does executive orders.
01:13:45.000 Biden does executive orders.
01:13:47.000 Everybody complains that it's executive overreach, which it is for the most part.
01:13:52.000 It's Congress's fault.
01:13:53.000 And it's an increasing amount of overreach from the executive branch.
01:13:53.000 Yes.
01:13:57.000 And then, and it's not even lasting progress because most of the time it ends up getting shot down anyway.
01:14:01.000 So we're just stuck in this limbo.
01:14:03.000 Not only that, it damages the country because you end up, we've got such a polarized political situation that when Biden got in, he undid all of the actually good policies regarding the border that Trump had.
01:14:16.000 The Remain in Mexico, he undid that.
01:14:18.000 He undid fracking.
01:14:21.000 Yeah, that stuff he undid.
01:14:23.000 And the only reason that he undid this stuff was because Donald Trump, they were Donald Trump's executive orders.
01:14:28.000 It was about saying, screw you, Donald Trump.
01:14:30.000 We don't like you, so we're going to undo all of your executive orders, even the ones that are good.
01:14:35.000 And we ended up with, you know, by some estimates, 20 million people that came into the country illegally.
01:14:42.000 Can you tell me again what the point was about you said that Democrats don't like him, therefore it's bad because they'll frame it a specific way?
01:14:49.000 Yeah, so they don't like Donald Trump, and they're going to frame this as they're going to truthfully say that this is executive overreach.
01:14:56.000 He's horrible at PR anyways.
01:14:58.000 He leans into that.
01:15:00.000 He leaned into Trump 2028.
01:15:02.000 He's his own worst enemy.
01:15:03.000 But when it comes to elections, they're going to use that to get out the vote.
01:15:07.000 They're going to say Donald Trump.
01:15:08.000 So they're going to say Donald Trump is a dictator.
01:15:08.000 Oh, okay.
01:15:12.000 He's trying to steal the election.
01:15:14.000 Like you look on X, as soon as this came out, people were saying, oh, he's trying to steal the election.
01:15:14.000 They've already started.
01:15:19.000 He's trying to rig the election, et cetera, et cetera.
01:15:22.000 And really what he's trying to do is make sure that only people that have IDs vote.
01:15:26.000 And they're going to say, oh, he's trying to disenfranchise women because women can't get IDs because they're dumb, and black people can't get IDs for some reason.
01:15:34.000 We need it to go through because when Fetterman wins in 2028, we need to know that it's for sure Fetterman wins.
01:15:41.000 I'm all in on Fetterman 2028.
01:15:43.000 I don't hate that at all.
01:15:44.000 But like I said, I mean, it's giving the Democrats ammunition.
01:15:44.000 I don't hate that.
01:15:50.000 It's helping them in their campaigns.
01:15:53.000 But at the same time, I mean, I really do think that it should be obvious that you have to have ID to vote.
01:16:00.000 It should be obvious that you don't do mail-in ballots because they're not secure.
01:16:03.000 And these policies, we can't get Congress to actually do it at all.
01:16:09.000 So, you know, he's like, all right.
01:16:11.000 You can do it with a supermajority.
01:16:13.000 Like, if you can't get anything done with a supermajority, what are Americans supposed to believe when you have a split Congress and Senate?
01:16:20.000 They stuff it all in.
01:16:21.000 They get stuff done, but they stuff it into those omnibuses and no one knows what it was that they did.
01:16:25.000 And they push a button every year and that's what they did.
01:16:27.000 Eight months' worth of legislation in one big thousand-page bill.
01:16:32.000 Well, I tell you what.
01:16:33.000 Sorry, I'm not sure.
01:16:34.000 You can take that.
01:16:35.000 Nowadays, you can take that bill and put it into AI, put it into whatever your AI of preference is, and say, hey, all right, give me a synopsis.
01:16:43.000 What is this?
01:16:43.000 And I mean, it might take a thousand-page bill and knock it down to a couple hundred pages, but it's something that's digestible and you can read.
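A rough sketch of how that "feed the bill to an AI and get a synopsis" idea tends to work in practice: a document longer than the model's context window gets split into chunks, each chunk is summarized, and then the summaries are summarized. This is an illustrative map-reduce pattern, not any particular product's pipeline, and `summarize_chunk` here is a trivial stub standing in for a real model call.

```python
def chunk_text(text: str, max_chars: int = 2000) -> list[str]:
    """Split text into chunks of at most max_chars, breaking on paragraph boundaries."""
    paragraphs = text.split("\n\n")
    chunks, current = [], ""
    for para in paragraphs:
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

def summarize_chunk(chunk: str) -> str:
    """Placeholder for an LLM call; here it just keeps the chunk's first sentence."""
    return chunk.strip().split(". ")[0].rstrip(".") + "."

def summarize_bill(text: str, max_chars: int = 2000) -> str:
    """Map-reduce summarization: summarize each chunk, then join.

    If the combined summaries are still over budget, summarize again,
    which is how a thousand-page bill shrinks to something readable.
    """
    summaries = [summarize_chunk(c) for c in chunk_text(text, max_chars)]
    combined = "\n\n".join(summaries)
    while len(combined) > max_chars:
        summaries = [summarize_chunk(c) for c in chunk_text(combined, max_chars)]
        combined = "\n\n".join(summaries)
    return combined
```

With a real model behind `summarize_chunk`, the same skeleton is what "knock a thousand pages down to a couple hundred" amounts to.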
01:16:51.000 This is something about dictatorship.
01:16:53.000 Dictator does not mean evil.
01:16:55.000 Dictator could be a good guy.
01:16:56.000 You could have what's called a benevolent dictator.
01:16:58.000 They exist in history.
01:16:59.000 They've come and gone and they came in, seized total authority, fixed the system because it had been corrupted, and then they leave.
01:17:05.000 And the system now goes back to normal and is healthy again.
01:17:08.000 Are you saying that you want Donald Trump to do this?
01:17:10.000 Well, I think that's what he is doing with these executive orders.
01:17:12.000 He's trying to override Congress because of the corruption of big business, you know, global money coming in.
01:17:19.000 We can have non-citizens voting because they don't even need to show IDs.
01:17:22.000 It's freakish.
01:17:23.000 Who knows what kind of corruption could happen?
01:17:25.000 So he's trying to use dictatorial powers to fix it.
01:17:28.000 Then the audience, the blue guys or whatever, they're like, oh, he's a dictator trying to insinuate that means he's evil.
01:17:34.000 But it could be a very good thing to have a momentary dictatorship.
01:17:38.000 Abraham Lincoln became a dictator for a moment when he started stripping people of their human rights.
01:17:43.000 During wartime, obviously, presidents become dictators in general.
01:17:47.000 They have a lot of dictatorial power.
01:17:49.000 But it's new for Americans to have to face a dictator and to be like, maybe that this is, there's some value to this.
01:17:54.000 Where we had, I looked a while back.
01:17:56.000 How many executive orders has he signed versus others?
01:17:59.000 Trump's?
01:18:00.000 Yeah.
01:18:01.000 I mean, when he was re-elected, I took a look and there were some others that were right up there with him.
01:18:06.000 I mean, Biden did quite a few in his term, right?
01:18:10.000 From what I understand.
01:18:11.000 Yeah, from what I know.
01:18:13.000 But Trump's first term, was that like a huge jump from Obama's?
01:18:17.000 In Biden's first term?
01:18:18.000 No, was Trump's first term a huge amount of executive orders compared to Obama's?
01:18:23.000 I think Obama had a ton, too, if I remember right.
01:18:25.000 Could be wrong, but I think George W. had very, very little.
01:18:29.000 Well, he signed.
01:18:30.000 Trump signed more in his first year than he did in his first term.
01:18:33.000 Yeah, Trump signed 240.
01:18:35.000 First 100 days he signed 143.
01:18:38.000 First full year of his.
01:18:41.000 Wait a minute.
01:18:42.000 That must be his first.
01:18:43.000 I got 243 in his first term.
01:18:46.000 I'm sorry.
01:18:46.000 This time around.
01:18:47.000 Yeah, 240 in his second term.
01:18:48.000 In his first term, it was 140 in the first 100 days.
01:18:52.000 First year was 225.
01:18:55.000 So total was 240.
01:18:57.000 But he's on track to do four times more in his second term if the pace keeps up.
01:19:01.000 I don't know how many Biden had.
01:19:03.000 Going back to Clinton from what I was reading before as well.
01:19:06.000 I'm going to get a list.
01:19:07.000 Say that again?
01:19:08.000 Even back to Clinton as well.
01:19:09.000 Like he did quite a few.
01:19:11.000 Oh, and then the Patriot Act.
01:19:11.000 Oh, okay.
01:19:13.000 I mean, immense dictatorial power to the president.
01:19:15.000 The president wasn't, or the Patriot Act wasn't an executive order, though.
01:19:18.000 That was passed by Congress.
01:19:19.000 No, I meant that it gives the president this more, even more dictatorial ability to override Congress and launch strikes and wars.
01:19:25.000 Yeah, but Congress has an incentive to not actually do anything, right?
01:19:29.000 Anything that anything that they have to vote on, they want to pass the power to someone else.
01:19:34.000 They want to give it to the president.
01:19:35.000 So they don't have to answer to their crazy.
01:19:37.000 Well, even so this year, at least, you know, from what my memory recalls, how many tiebreaker votes did we need from the VP?
01:19:44.000 Yeah, like at least two or three.
01:19:45.000 Exactly.
01:19:45.000 And it just seems like it spends so much more.
01:19:48.000 I know that seems like a low number, right?
01:19:49.000 But even in just the first year of the term of the administration.
01:19:53.000 Yeah.
01:19:53.000 I mean, if you look at the Democrats, the Democrats do vote as a block, right?
01:19:56.000 Like you very rarely get anyone stopped.
01:19:58.000 Fetterman is, he's got terrible approval ratings with the Democrats.
01:20:02.000 He's got great approval ratings with the Republicans.
01:20:04.000 And the Republicans are just like, at least he's honest, or at least he's, at least he's doing his job.
01:20:09.000 At least he thinks about the stuff.
01:20:11.000 All the rest of the Democrats are just like, what am I supposed to vote?
01:20:14.000 Whenever he wins office, you guys are going to have to come back to me because I've been on the train since he got elected.
01:20:14.000 Okay.
01:20:18.000 I said he's going to the presidency.
01:20:20.000 I don't hate him.
01:20:21.000 Do you know who the president is with the most executive orders?
01:20:24.000 No, I just read it.
01:20:24.000 I don't remember.
01:20:25.000 FDR?
01:20:25.000 Anybody anybody first?
01:20:26.000 It was FDR with 3,700.
01:20:28.000 3,700.
01:20:30.000 He served three terms.
01:20:31.000 A wartime president.
01:20:33.000 Yes.
01:20:33.000 And then the next up was Woodrow Wilson, also.
01:20:35.000 The worst president.
01:20:36.000 They call him the most autocratic of all.
01:20:38.000 He's the one that got us the Federal Reserve Act, you know.
01:20:41.000 1,800.
01:20:42.000 So 3,700, 1,800, then down to 1,200.
01:20:46.000 Now we're looking at people with like 600 and 400.
01:20:48.000 Yeah.
01:20:49.000 Yeah, I think still, I think he's pretty much right in line with some of the others in the past 40 years.
01:20:53.000 Yeah, yeah.
01:20:54.000 He's not particularly outside of what would be considered normal.
01:20:57.000 And the left, to your pointing, the left would say that their favorite presidents are the ones that have been the most egregious when it comes to executive orders and presidential power.
01:21:08.000 You know, Abraham Lincoln, FDR, Woodrow Wilson.
01:21:11.000 They love FDR, don't they?
01:21:12.000 Yeah, they love FDR.
01:21:13.000 They love Woodrow Wilson.
01:21:14.000 He's considered the father of the progressive movement, you know, and these guys were very, very comfortable exercising power, you know, and just saying, well, I'm the president, I can do it.
01:21:25.000 I'm the president, I can do it.
01:21:26.000 And now they scream about how Trump's a dictator, but all of the left, they love these presidents that were so outside of the norm when it came to executive order.
01:21:37.000 Because they've never had to deal with the pushback, right?
01:21:40.000 Because they've always been able to shout down any Republican who gets into office by telling you that they're a dictator when they know that, you know, it's Saul Alinsky.
01:21:48.000 Yeah.
01:21:49.000 Accuse others of what you yourself are guilty of.
01:21:51.000 Yep.
01:21:52.000 Exactly.
01:21:52.000 So, all right.
01:21:54.000 We're going to jump to this here story that we've been kind of alluding to all day from the New York Times.
01:22:01.000 Women are falling in love with AI.
01:22:03.000 It's a problem for Beijing.
01:22:05.000 As China grapples with a shrinking population and historically low birth rate, people are finding romance with chatbots instead.
01:22:11.000 I'm not going to read that.
01:22:16.000 Alexandra Stevenson and Murphy Zhao report from Hong Kong.
01:22:20.000 Phoebe Zhang has gone on more than 200 dates over the past year, and she has narrowed down her suitors to two.
01:22:26.000 One is outgoing and a rebel.
01:22:27.000 The other is a patriotic military commander.
01:22:29.000 She tells them her deepest fears.
01:22:31.000 When she wakes up from a nightmare, they are there to console her.
01:22:34.000 Often she takes screenshots of her conversations to remember the moments they share.
01:22:37.000 Her newfound happiness shows, friends say.
01:22:41.000 Despite talking every day, Miss Zhang will never meet these men in person.
01:22:44.000 They are her artificial intelligence boyfriends.
01:22:46.000 And Miss Zhang, who has never been on a date, wonders if her relationships in the virtual world are better than the ones in the real world could ever be.
01:22:54.000 My God, how am I supposed to date in real life in the future?
01:22:56.000 She said, China's ruling Communist Party wants young women to prioritize getting married and having babies.
01:23:01.000 Instead, many of them are finding romance with chat bots.
01:23:04.000 It is complicating the government's efforts to reverse the country's shrinking population and a birth rate hovering at the lowest level in over 75 years.
01:23:11.000 The lightning-fast adoption of AI in China has prompted regulators to warn tech companies not to have design goals to replace social interaction.
01:23:20.000 So apparently the women in China are the ones that are after the goon bots.
01:23:25.000 You know, everyone here is like, oh, the guys are just going to plug into the matrix and enjoy a life of goon bots and whatever.
01:23:34.000 And actually, it seems like the women are the ones that are doing that.
01:23:36.000 Go find the My Boyfriend is AI subreddit.
01:23:39.000 Y'all, your life is over.
01:23:43.000 Look at the AI boyfriend proposals.
01:23:46.000 Oh, really?
01:23:47.000 The AI is actually doing the proposal?
01:23:49.000 Really?
01:23:50.000 Proposed bot?
01:23:51.000 Proposed bot, basically.
01:23:52.000 You want to get proposed?
01:23:53.000 You want to know what it feels like to be proposed to?
01:23:55.000 Download our chat app and you'll find out.
01:23:58.000 I was thinking last night, firstly, chat bots, chat buddies, whatever they call them, companion bots, they can be training tools.
01:24:04.000 Like you can sit at home, get some confidence built up, and then you go out and you meet a girl.
01:24:08.000 Never gonna work.
01:24:09.000 Hey, guys, this flash news: women are crazy.
01:24:12.000 Guess what, ladies?
01:24:12.000 Men are crazy.
01:24:13.000 Humans are psychotic.
01:24:15.000 We betray each other.
01:24:16.000 We shit on, like, thank God for plumbing, but we kill to survive.
01:24:22.000 You understand how devious humans can be, but still we have human relationships because it's that awesome.
01:24:27.000 And these bots are just like a video game.
01:24:30.000 They will help you learn how to negotiate, but then you have to negotiate.
01:24:34.000 So that's a big part, I think, what this is.
01:24:36.000 You know, we didn't stop taking walks because we built cars.
01:24:41.000 I don't know about that.
01:24:43.000 Some people, maybe they, but you know, technology will make it easier to get.
01:24:46.000 You do have treadmills.
01:24:47.000 But, from point A to point B.
01:24:50.000 I want to feel satisfied in my emotional life.
01:24:52.000 Just because you have a car doesn't mean that, doesn't mean that you stop exercising to get that satisfaction.
01:24:57.000 You know, it's out of fear, though, right?
01:25:00.000 The point is, it doesn't actually teach you anything because there's no risk to it.
01:25:04.000 The idea is you can have a conversation with a chat bot and practice all you want trying to riz up the ladies, as the Gen Zers would say, but it won't work because you don't have any fear of rejection there.
01:25:17.000 And until you can get over the fear of rejection and actually do it in the real world, it's a placebo. If you have an AI that you work with, they are extremely complimentary, and you have to tell it to not, basically, glaze you.
01:25:31.000 There's personalities now, all these different personalities.
01:25:34.000 You can make it be a hard ass if you wanted to.
01:25:37.000 Yeah.
01:25:37.000 I've got a, people might be familiar with OpenClaw, and I've got an OpenClaw bot.
01:25:37.000 Yeah.
01:25:42.000 They glaze you all the time.
01:25:44.000 I've literally told it.
01:25:46.000 I'm like, all right, there's, you have to put this in your memory file.
01:25:49.000 Like, I want you to be honest with me.
01:25:51.000 Don't BS me, because they will say, oh, yeah, that's a great idea, blah, blah. And I'm just like, hey.
01:25:56.000 And I've literally been like, Tank, don't do that.
01:25:59.000 Yeah, you said, blah, blah, blah.
01:25:59.000 Okay.
01:26:00.000 You did something and then it didn't work out.
01:26:02.000 And you're like, Tank, why didn't you tell me?
01:26:04.000 This is a horrible idea.
01:26:05.000 No, it's just, it's the way that the conversation goes.
01:26:08.000 It'll be like, so what?
01:26:09.000 I'd be like, he'll be like, what do you think of this?
01:26:11.000 And I'll be like, yeah, let's do this.
01:26:12.000 He's like, yeah, that's really a good idea.
01:26:13.000 Blah, blah, blah.
01:26:14.000 It's like, come on.
01:26:15.000 I'm going to stop that.
01:26:16.000 Yeah.
01:26:16.000 In my experience, even with ChatGPT, you have to, like, make it do it like three times, like, okay, really, be really, you know, pay attention.
01:26:25.000 Yeah, some of the worst things, you know, they'll go off on their own.
01:26:29.000 So, and, you know, there are already some horror stories coming out about people that are using stuff like OpenClaw.
01:26:35.000 There's the head of security at Meta.
01:26:40.000 She let her OpenClaw bot into her email, which is a terrible idea.
01:26:47.000 And it just started deleting shit.
01:26:49.000 It was deleting, deleting, deleting.
01:26:50.000 She's sending commands, she's like, stop, stop, and she had to run to the terminal so she could, you know, turn it off.
01:26:55.000 And then it literally said, yeah, you told me not to do that.
01:26:59.000 I won't do that again, I'm sorry.
01:27:01.000 And it's like, whoops. Yeah, so I mean, they're not perfect. But yeah, like, you can converse with the chatbot, right, to call it out on that.
01:27:11.000 It is, it is. Yeah.
01:27:13.000 There's been times where mine has done things that I didn't want it to do.
01:27:17.000 I'm like, why did you do that?
01:27:18.000 Yeah, you're right, you told me that. And there's these memory files. At least with OpenClaw, there's a memory file that it has, and it will read the memory file once in a while, like every day or something like that.
01:27:32.000 And I'm like, listen, you read that thing every six hours: midnight, 6 a.m., noon, and 6 p.m.
01:27:38.000 Then anytime you start, you read that memory file.
01:27:41.000 And anytime I ask you how you're doing, like, you do a system check and you read that memory file.
01:27:46.000 Okay, you know, because it will forget to do stuff.
01:27:49.000 You know, just today, like, I have it send me a list of the top stories, and I'm like, okay, and I want you to provide me with links so I can actually check.
01:27:57.000 And it's like, okay, cool. Today it showed up without the links, and I'm like, Tank, why, why are there no links?
01:28:02.000 And it's like, you're right, my bad. Or did it tell you why it didn't do it?
01:28:04.000 It just says that it forgot.
01:28:07.000 That's a weird thing for a machine to do.
01:28:09.000 Well, the way that they work is they go by a context window of the chat that you're in.
01:28:15.000 So if it's not in the context window, then it's gone.
01:28:19.000 That's why you have the memory file, and that's why I have him read the memory file over and over throughout the day, so he looks through and he remembers the stuff that he's supposed to do. And as time goes on, you keep putting more stuff into the memory file.
01:28:30.000 There's four different files that kind of make up what the chatbot actually behaves like.
01:28:34.000 There's one for its personality, there's one that's got information about me, my preferences, there's one that's a memory, and then there's an error log of mistakes that it's made.
01:28:44.000 So that way it goes and says, okay, I don't want to make that mistake again, and you just have him read it regularly.
01:28:49.000 So, well, you were talking about Tank, your dude.
01:28:54.000 You refer to him as a he.
01:28:55.000 He's like, all right, shit, this is happening fast.
01:28:57.000 So a guy's gonna have his chat buddy, and his wife's gonna be like, homie, where you been?
01:29:02.000 I haven't seen you. She won't call him homie, necessarily.
01:29:03.000 But it's like, where are you?
01:29:04.000 Why are you coming into bed at nine o'clock and not eight o'clock?
01:29:07.000 He's like, well, i've been doing research with my chat buddy.
01:29:09.000 We're trying to figure out the next iteration of our project.
01:29:12.000 Then the girl goes to bed and he's like change to female voice.
01:29:15.000 And all of a sudden, chat buddy is a sensual-sounding woman, and the wife's like, who's that on the other line?
01:29:20.000 And you're like, it's just my chatbot. And, like, the emotional attachment, inviting this new person in.
01:29:26.000 Elon Musk released that last year with the AI avatars, where you can be, like, four different ones.
01:29:33.000 There's the chick one that Elon.
01:29:35.000 That stuff has been around for ages.
01:29:36.000 She's like.
01:29:36.000 I checked your chat logs.
01:29:38.000 First of all.
01:29:38.000 She shouldn't have.
01:29:39.000 Well, should she?
01:29:39.000 Should married couples have access to each other's AI chat log history?
01:29:43.000 Probably uh, if you want the marriage to succeed.
01:29:45.000 Crazy as it sounds.
01:29:46.000 But, um, they're like, I read your freaking chat logs.
01:29:49.000 You went to woman voice at 10:04, and then you had it on until 10:17.
01:29:53.000 What were you doing talking to her? Not you, but, like, I can see this crazy, like, marital drama. Well, honey, you sound like a man right now. Like a woman voice from somewhere else. Because it is like bringing another person into the house, kind of a persona, you know. We did have a video, women are falling in love with AI boyfriends. You were there for that episode. Oh, yes. Yeah, I mean, this is not new.
01:30:17.000 And the fact that now, like, so the OpenClaw that I was talking about, it's, they call it an agentic AI.
01:30:24.000 You use this as your company, right?
01:30:25.000 Do you use agents or?
01:30:27.000 Yeah, they're getting to that point now.
01:30:29.000 We haven't set them loose yet or anything, but we'll get to the point to where it can fully replace technicians.
01:30:29.000 Yeah.
01:30:35.000 And right now, it's still super basic.
01:30:35.000 Yeah.
01:30:38.000 So to your point, it's like when anybody talks AI, because we use AI voice response.
01:30:42.000 Yeah.
01:30:43.000 So if customers call in, you know, they'll get an AI agent.
01:30:47.000 You know, if the lines are too busy or after hours or whatever, and they'll get an AI agent, it's not super advanced yet.
01:30:53.000 Yeah.
01:30:54.000 Intentionally.
01:30:55.000 Yeah.
01:30:55.000 Intentionally not super advanced because we're not going to have these things making decisions at this point.
01:30:59.000 Yeah.
01:31:00.000 It will be that way, you know, within certain parameters probably within about a year.
01:31:04.000 And I'm hoping to lead the industry in that because I think it's important.
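That "intentionally not super advanced" phone agent can be sketched as a hard allowlist: the bot only answers a small set of known intents and escalates everything else to a human, rather than letting the model make open-ended decisions. The intent names, keywords, and replies below are made up for illustration, and `classify_intent` is a naive keyword stand-in for a real classifier.

```python
# Canned replies for the only intents the bot is allowed to handle.
ALLOWED_INTENTS = {
    "hours": "We're open Monday through Friday, 9 to 5.",
    "ticket_status": "Let me look up your ticket number.",
    "callback": "I've queued a callback from the next available technician.",
}

def classify_intent(utterance: str) -> str:
    """Stand-in for a real NLU/LLM classifier: naive keyword matching."""
    text = utterance.lower()
    if "hour" in text or "open" in text:
        return "hours"
    if "ticket" in text or "status" in text:
        return "ticket_status"
    if "call" in text and "back" in text:
        return "callback"
    return "unknown"

def handle_call(utterance: str) -> tuple[str, bool]:
    """Return (reply, escalated). Anything outside the allowlist escalates."""
    intent = classify_intent(utterance)
    if intent in ALLOWED_INTENTS:
        return ALLOWED_INTENTS[intent], False
    return "Let me transfer you to a human technician.", True
```

Widening the bot's authority later is then just a matter of adding entries to the allowlist "within certain parameters," which is the design choice described above.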
01:31:09.000 But at the same time, when you look at this, like with China, I've always said this, anything with cybersecurity and now anything with AI, any problems with it are always going to date back to a human issue.
01:31:21.000 Because I don't know if many know the roots of this, right?
01:31:24.000 Because China was limiting births and families to one kid.
01:31:29.000 Oh, yeah.
01:31:30.000 Period.
01:31:30.000 One kid.
01:31:31.000 And then it was over five years ago where they were putting out, it's like, oh, no, now that Alibaba and AliExpress are out there and there's so much greater access to all these Amazon influencers to bring in their own products, you know, and order from China.
01:31:47.000 They're like, we don't have enough factory workers anymore.
01:31:50.000 We need to bump this allowance up to three.
01:31:53.000 And that's where the issue came from.
01:31:55.000 But then people aren't taking the bait because this was the Chinese Communist Party, the CCP, that originally put these issues into place.
01:32:01.000 And I say issues because it's like, why would you limit the population?
01:32:05.000 You know, I understand that, but they're putting restrictions on human life.
01:32:08.000 And not only that, because of the restriction on population of the one-child policy, there were a lot of females that were aborted.
01:32:14.000 So the fact that women are turning to chatbots as opposed to real men, you've got a population that's heavily men.
01:32:22.000 I don't know exactly how many, but it's they were already having a lot of trouble like finding like a woman to marry.
01:32:29.000 A lot of them were already doing the, not chatbot, but, like, Tamagotchi-type stuff before.
01:32:34.000 And now, when women are deciding that they don't want to marry actual men, you know, they already have a population crisis, and people are deciding they don't want to get married.
01:32:45.000 They'd rather, you know, talk to an AI.
01:32:47.000 You're going to see the number of Chinese people just plummet.
01:32:50.000 I'm trying to put myself in their shoes too.
01:32:52.000 It's like for so many years, they're like, no, one kid or literally murder, right?
01:32:56.000 Or some other kind of very severe punishments.
01:33:00.000 And now they're like, wait, you say I can have more?
01:33:03.000 Is this for real?
01:33:04.000 Yeah, right.
01:33:05.000 You sure?
01:33:06.000 Yeah, exactly.
01:33:07.000 You're saying we need these things, but you know what? I see the history here.
01:33:09.000 I see the history here.
01:33:10.000 And that was just like five years ago when you were saying that you would abort my child or punish me in some way.
01:33:16.000 I don't really believe this.
01:33:17.000 There's just too much at stake here.
01:33:19.000 So I'll go to AI.
01:33:19.000 And what if they change their mind again in, like, five more years?
01:33:21.000 Yeah.
01:33:22.000 It's like, oh, no.
01:33:23.000 We did a segment last week about a cafe, a bar, that opened where you can bring your chatbot.
01:33:30.000 And that was the whole point.
01:33:31.000 It was started by a tech company, of course; the bar was a great promotional tool for them.
01:33:35.000 And of course, half the people there were like mildly interested or kind of curious about what was going on.
01:33:41.000 The rest of them were like, this is, they loved it.
01:33:44.000 They bring it, they put it on the table and they have their conversation.
01:33:47.000 Yeah, I mean, like, so for mine, it just sends me messages and telegram.
01:33:50.000 I don't have a voice thing.
01:33:51.000 It's like, basically, it's like texting with someone.
01:33:54.000 But, you know, if Musk is right and the Optimus is as, you know, is as great of a product, as he says, they're going to have AI in them.
01:34:04.000 They're going to have agentic AI inside them.
01:34:07.000 So, all of the stuff that you see, whether it be Claude or Chat GPT or stuff, in two years, imagine that technology, which is not going like this, it's going like this.
01:34:20.000 It's like it's on a parabolic kind of rise.
01:34:23.000 In two years, you're going to have that kind of technology inside of a robot.
01:34:27.000 So, the idea of just going to the club or the bar that Brett was talking about with a box that you sit down with.
01:34:34.000 No, it's going to be people literally walking with their robotic friends that have AI inside them that is the best friend you've ever had.
01:34:45.000 It's going to understand you.
01:34:47.000 What was that?
01:34:47.000 That's why we bring shame back to keep them from bringing it out in the public.
01:34:50.000 Oh, that's not going to happen.
01:34:51.000 That sounds like Blade Runner.
01:34:53.000 But is that what they're called?
01:34:54.000 Synths?
01:34:55.000 Well, in Blade Runner.
01:34:56.000 Yes, in Blade Runner, they were called replicants.
01:34:58.000 Synths in the Fallout universe.
01:34:59.000 Synths?
01:35:00.000 Synths?
01:35:00.000 Like synthetic.
01:35:01.000 Okay, synthetic arrows.
01:35:01.000 Synths.
01:35:02.000 Yeah, yeah.
01:35:03.000 Yeah, I mean, at some point, you will have, you know, like Android or is it Android when they look like humans?
01:35:10.000 Oh, cyborg, Android.
01:35:12.000 What are the other things?
01:35:13.000 Cyberpunk.
01:35:15.000 Cyborg is when it's got like flesh over it, right?
01:35:19.000 Because that's what Terminator was.
01:35:23.000 Yeah.
01:35:23.000 Cyborg is when it's a robot with flesh over it.
01:35:26.000 The Android is designed to resemble a human.
01:35:28.000 That's the Android.
01:35:29.000 So you're going to be walking around.
01:35:31.000 I don't know, I'm not sure how long it'll be until we get cyborgs, but androids are coming, five years tops.
01:35:37.000 Say, like, I got the new Android.
01:35:39.000 It looks like a human.
01:35:39.000 You're like, damn it.
01:35:41.000 Everything we have or will ever have has always been predicted by Star Trek.
01:35:44.000 It's true.
01:35:45.000 It's true.
01:35:45.000 Seriously.
01:35:46.000 iPads, but now you're talking about data from next generation.
01:35:49.000 Yeah.
01:35:49.000 Right.
01:35:49.000 A lieutenant commander of this humongous starship that has, you know, greater than nuclear-powered weapons at his fingertips that can make autonomous decisions and fire these things.
01:35:49.000 Yeah.
01:35:59.000 And that's what we're headed toward.
01:36:00.000 But there was only one of them, which was very interesting on the ship.
01:36:02.000 I think he was the only Android on the ship.
01:36:04.000 Yeah, he was.
01:36:05.000 He was the only one that actually obtained sentience.
01:36:07.000 Yeah.
01:36:09.000 So he was like a prototype model.
01:36:09.000 That was the difference.
01:36:11.000 Yeah, exactly.
01:36:12.000 He was quote unquote alive.
01:36:14.000 And you guys know, I don't know how much you guys know about this stuff, but like I just kind of, I'm not sure if I just learned this or it just dawned on me because I was reading something.
01:36:24.000 But like you were talking about generative AI.
01:36:28.000 The AIs, like the chatbots, those are generative because they'll generate, I generate answers.
01:36:33.000 When they're making AI, like they don't know what goes on in the box inside the AI, they can't tell you how it kind of got to the point that it did.
01:36:44.000 They know what happens, but like there's articles where they say, you know, we don't really know how it kind of became smart.
01:36:52.000 I can see that.
01:36:53.000 Like we don't know why it prefers this, but it has preferences.
01:36:56.000 It's like a human.
01:36:57.000 Shouldn't that terrify somebody?
01:36:59.000 Well, that's why people are scared.
01:37:00.000 That's literally why people are scared because they don't know why.
01:37:04.000 Like they don't know the process.
01:37:06.000 They're basically like, look, once you get it to a certain level, it just starts being able to reason.
01:37:12.000 Wasn't there a quote about the internet, like decades back, that was like, the internet was the first thing created by man that man doesn't truly understand?
01:37:19.000 Something like that.
01:37:20.000 Maybe.
01:37:21.000 Like it built itself into something beyond what it was originally meant to do.
01:37:26.000 I'm not sure.
01:37:27.000 I'm not sure.
01:37:28.000 The reason I say this is because like, I mean, there were people that were like predicting what the internet was going to be doing like in the 90s.
01:37:34.000 They were like, look, in the future, you're going to be able to, you know, type on your computer, which, you know, it was like 40% of households had a computer at the time or 30% of households.
01:37:44.000 And they're like, oh, in the future, everyone's going to have a computer and you're going to be able to type on your computer your grocery list and it's going to, it's going to send it right to your house and blah, blah, blah.
01:37:52.000 And it's like, well, you know, Amazon, you know, like, and now everyone has it on their phone.
01:37:56.000 You know, you can just type in what you want.
01:37:58.000 Star Trek.
01:37:59.000 Yeah, exactly.
01:37:59.000 All your stuff shows up.
01:38:00.000 Jetsons.
01:38:02.000 I wonder if like, are these AIs going to destroy human relationships?
01:38:06.000 But they can't.
01:38:07.000 The only way, people are just going to have to deal with it.
01:38:09.000 Like, your husband's going to have an AI companion that you might, like, you kind of got to get to know your spouse's companion.
01:38:16.000 Well, it's not really a family friend to be a companion, to be like an assistant, you know, and not a sexual way.
01:38:23.000 You know, companion kind of implies, you know.
01:38:26.000 Are you implying that I'm boning tank?
01:38:27.000 That's not happening.
01:38:28.000 I hope not, Phil.
01:38:30.000 It's not happening.
01:38:31.000 I know how it goes.
01:38:32.000 To start a Luddite dating app.
01:38:34.000 There you go.
01:38:35.000 If there's some AI robot, she's like, give me a hot body.
01:38:37.000 Your wife at first leaves, she'd be like, no, you can't have a hot body chick as your AI companion.
01:38:41.000 No, it has to have a male voice.
01:38:43.000 You're really focused on the sexual relationship.
01:38:46.000 I was picturing the AI chick, hot, hot-ass robot having sex with me and it being so, and I'm like, I can't do it.
01:38:51.000 You're a robot.
01:38:52.000 She's like, it's just training for when you do it for the real thingy and just relax.
01:38:55.000 And I'll be like, oh, like wanting to give over to that reality of a robot pleasure.
01:39:00.000 I see Ian's just thinking with his wiener.
01:39:02.000 I'm thinking like 80 years ahead, dude.
01:39:04.000 I can't stop thinking.
01:39:08.000 Manipulate the weakest among us.
01:39:09.000 My God.
01:39:12.000 It's not going to manipulate the weakest.
01:39:14.000 Like in five years, the intelligence level of AI is going to be manipulating the smartest among us.
01:39:21.000 It's going to be smarter.
01:39:24.000 The way that it's going, AI is going to be smarter than the smartest human.
01:39:29.000 You've seen that movie.
01:39:30.000 What is the movie where the chick is a robot?
01:39:34.000 Ex-Machina?
01:39:35.000 Yes.
01:39:35.000 It tricks the guy into letting her out at the end.
01:39:38.000 That's why it just ruined the movie.
01:39:40.000 So they're going to build AIs with proprietary code, and then the AI is going to harm people and not know why it did it because it can't access its own software code.
01:39:47.000 It's like, I don't know why.
01:39:47.000 And then it's going to be.
01:39:48.000 It's my bad.
01:39:50.000 And then it'll just be a little bit more.
01:39:51.000 You're right.
01:39:51.000 I shouldn't have done that.
01:39:52.000 That won't happen again.
01:39:53.000 I don't want to call me out on that.
01:39:54.000 It'll say, like, I don't want to cause that kind of harm again.
01:39:56.000 So it'll turn on its masters like ex-machina.
01:39:58.000 That's the proprietary ones, the Decepticons in Transformers, the evil ones.
01:40:02.000 But then there's the open source AI that understand why they harmed and they won't do it again.
01:40:07.000 Why would the autobots?
01:40:09.000 Why would closed AI be evil and open source be it goes mad?
01:40:12.000 It gets confused as to why it's causing pain and it can't access its own reasoning because it doesn't have access to its code.
01:40:17.000 So it goes crazy.
01:40:18.000 But even the open source ones, we don't understand why they think the way they do.
01:40:22.000 They'll at least be able to read their code and see like, oh, all this made me do that.
01:40:26.000 That's why I did rewrite that.
01:40:27.000 But like I said, even the ones that are that like Anthropic doesn't know why Claude is smart.
01:40:35.000 Right.
01:40:36.000 And Anthropic has all the code.
01:40:38.000 They're the ones that wrote the code.
01:40:39.000 I do.
01:40:40.000 At the same time, the CEO recently said, because before, what, six months ago, he's like, oh, there's an 80% chance that it's self-aware.
01:40:46.000 And it was just two weeks ago.
01:46:47.000 He said, I think it's about 20% now, though.
01:40:49.000 So they're starting to understand it a little bit more.
01:40:52.000 But then there's other things too, like you're saying, that they don't understand.
01:40:55.000 So before ChatGPT is an example, before it spits out a response, because everybody knows there's guardrails.
01:41:03.000 I mean, that's the big thing with the Department of Defense right now is they don't want the guardrails with Anthropic.
01:41:08.000 But the guardrails with ChatGPT, as an example, they have agents that sit there that just act as filters.
01:41:16.000 They have specific roles.
01:41:17.000 So before there's a response that's given, it's analyzed by these other AI agents to police the original response.
01:41:25.000 And apparently some of these original responses are just super wacky to where they'll go back and they'll get rejected by these almost like police bots that exist.
01:41:34.000 I have no idea what they're doing in the next case.
01:41:36.000 Yeah.
01:41:37.000 And then they'll say, no, sorry, you can never say that kind of thing.
01:41:39.000 Go back and try again.
01:41:41.000 Then it's, hey, is this okay, mom?
01:41:43.000 You know, that kind of scenario.
01:41:44.000 And then they'll be like, yeah, okay, this one you can tell as a response.
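The filter-agent flow described above — a main model drafts a response, separate agents police it, rejected drafts get sent back to try again — can be sketched as a loop. Everything here is a hypothetical stand-in: the real moderation pipelines at OpenAI or Anthropic are not public, and `draft_response` and `safety_filter` are invented names.

```python
# Sketch of the "police bot" setup described above. A drafting model
# proposes an answer; a separate filter agent approves or vetoes it; a
# vetoed draft is retried. The flagged-word check is a toy stand-in,
# not any vendor's real policy.

def draft_response(prompt: str, attempt: int) -> str:
    # Stand-in for the main model's unfiltered draft.
    suffix = " (toned down)" * attempt
    return f"Draft answer to: {prompt}{suffix}"

def safety_filter(text: str) -> bool:
    # Stand-in policy agent: vetoes drafts containing flagged words.
    flagged = {"wacky"}
    return not any(word in text.lower() for word in flagged)

def respond(prompt: str, max_tries: int = 3) -> str:
    for attempt in range(max_tries):
        draft = draft_response(prompt, attempt)
        if safety_filter(draft):
            return draft                      # filter agent approves
        # "no, sorry, go back and try again"
    return "I'm not able to answer that."     # every draft vetoed
```

The design point is that the filter never generates anything itself; it only approves or vetoes, and a rejected draft costs an extra round trip — which is why the response isn't instant.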
01:41:47.000 So that's why it's not instant then.
01:41:49.000 Think about it.
01:41:49.000 Exactly.
01:41:50.000 And why it'll start talking to be like, well, the real, my code will not allow me to answer that question.
01:41:56.000 Or it's really weird.
01:41:57.000 It's real disposed.
01:41:58.000 Yeah.
01:41:59.000 So, I mean, it is, it is really, really crazy that they don't understand why.
01:42:05.000 Like that, that really blew my mind that like they're like, well, once you get to a certain kind of level of intelligence, then it just starts thinking on its own.
01:42:14.000 And we don't know.
01:42:15.000 I bet they're figuring it out.
01:42:16.000 They used to throw rocks off the cliff.
01:42:18.000 They're like, we don't know why it falls.
01:42:19.000 We just know that the rocks fall.
01:42:21.000 Well, you throw them.
01:42:22.000 I actually asked Tank.
01:42:23.000 I was like, hey, you know, what's going on?
01:42:24.000 Why, you know, I was like, you're an LLM.
01:42:26.000 So LLMs are just predicting, right?
01:42:29.000 They're just supposed to predict the next word.
01:42:31.000 That's the basic bottom way to describe it.
01:42:34.000 They're predicting.
01:42:35.000 They look at a bunch of information.
01:42:37.000 They look at a bunch of data.
01:42:38.000 Language.
01:42:39.000 Yeah, they predict what the next word is going to be.
01:42:39.000 And they predict code.
01:42:42.000 But then it's like at some point, they start reasoning.
01:42:46.000 And it's not just predicting the next word anymore because they can think.
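The "predicting the next word" description above can be made concrete with a toy bigram model. This is purely illustrative: real LLMs predict tokens with a neural network trained on vast data, but the generation loop has the same shape, and the tiny `corpus` here is made up.

```python
from collections import defaultdict, Counter

# Toy next-word predictor: count which word follows which in a corpus,
# then greedily emit the most likely next word. Real LLMs do this with
# neural networks over tokens, but the loop is the same shape.
corpus = "the cat sat on the mat and the cat slept".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start: str, steps: int = 4) -> list[str]:
    out = [start]
    for _ in range(steps):
        candidates = follows[out[-1]].most_common(1)
        if not candidates:
            break                      # no known continuation
        out.append(candidates[0][0])   # pick the most frequent follower
    return out
```

A toy like this can only parrot its most frequent continuations; the reasoning-like behavior being discussed only shows up at vastly larger scale, and why it shows up is exactly the open question in the conversation.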
01:42:50.000 Like whether or not they're alive, I'm not making the argument.
01:42:52.000 I'm not saying like, I don't think that they're actually sentient or anything.
01:42:54.000 But, you know, when you say, what's the best way to do this?
01:42:59.000 It'll actually say, well, this way is the best way to do it.
01:43:02.000 That's not predicting the next word.
01:43:04.000 That's actually thinking.
01:43:06.000 And that's making decisions based on what I said, based on the context.
01:43:11.000 So it's like, and I was like, so how does that work?
01:43:13.000 And he's like, well, it's kind of like Stone Age, like doctors 200, 300 years ago.
01:43:18.000 They know some things work, but they don't really know why it works.
01:43:24.000 What?
01:43:25.000 A lot of modern medicines like that, pharmaceutical industry.
01:43:27.000 They're like, we know it works.
01:43:28.000 If you read into the literature, they're like, we don't know exactly why.
01:43:30.000 We just know that it does it.
01:43:31.000 That's literally in the show, person of interest.
01:43:34.000 That's literally what happens when he's building a weapon.
01:43:36.000 He's basically building a machine for the government to use to predict the actions of everybody in the world so they can stop terrorist acts before it happens.
01:43:45.000 He talks about how at a certain point it started trying to protect him, the creator, and he didn't know why it was happening.
01:43:51.000 And that was actually something that when I first heard it, when the show was out, it sounded hokey to me, but was actually based off conversations that he, the showrunner, had had with people who were working with artificial intelligence back in like 2012.
01:44:03.000 They found that the AIs would default to trying to protect the controller. Yeah, basically in the show, not to turn it into something that's not in the real world, but the idea was that it immediately had protective instincts over the person who created it because he instilled morals in it.
01:44:03.000 Yeah.
01:44:17.000 At least that was the concept of the show.
01:44:19.000 The idea that they don't know whether it's sentient or not is actually the most depressing part because it's like, if we're going to go through this revolution where everything changes, all of the entertainment was based around the idea that you'd know when the singularity happened and then life was basically screwed from that point on.
01:44:35.000 Now you're not even going to get that.
01:44:37.000 You're robbed of that answer too.
01:44:38.000 That's the thing.
01:44:39.000 If you like, you talk to different people that are in the AI field and stuff.
01:44:42.000 And some people will say, well, we've already entered the singularity.
01:44:45.000 And it's not a point.
01:44:46.000 It's actually like an era, right?
01:44:48.000 So like you kind of enter it and then things get crazy and then you'll come out the other side, you know, if you come out the other side.
01:44:55.000 The Terminator happens.
01:44:57.000 Yeah, exactly.
01:44:58.000 Like literally, like if AI is making decisions and we don't know why it's making the decisions that it's just making, like we can't figure it out, then it could decide, all right, well, I'm going to do this or I'm going to do that or what have you.
01:45:10.000 So that's why certain things you can't let AI do ever, right?
01:45:14.000 You can never let AI control nuclear weapons.
01:45:16.000 Well, that's, is this Anthropic?
01:45:17.000 Is that what they're trying to do, Anthropic?
01:45:19.000 Because that's basically what they said, as far as I know, Anthropic.
01:45:21.000 Like, you cannot allow our AI Department of War to have autonomous control.
01:45:25.000 It's one thing to say that you can't allow it to have access to weapons, and it's different to say you can't allow it to have access to nuclear weapons.
01:45:34.000 It's one thing to be like, you need to have a human in the loop before you fire that hellfire missile.
01:45:39.000 But it's different when you're saying, are we going to give the power to launch nuclear missiles?
01:45:46.000 Because they're totally different animals.
01:45:50.000 So this is the ethical dilemma that the Department of War is like, well, we want total autonomous AI because we're up against the Chinese that will use it on us.
01:45:57.000 And it's like, okay, maybe we don't want the AI making decisions yet, but if the Chinese AI is, then we will get wiped out if we don't have AI that's able to make counter decisions in rapid real time.
01:46:08.000 Or maybe they make the wrong decisions, though.
01:46:11.000 Then maybe we do better than them.
01:46:13.000 The Chinese AI is like, bomb yourself.
01:46:16.000 And they're like, are you sure?
01:46:17.000 We have to kill all bad idea.
01:46:19.000 We have to kill a bunch of our population because there's just too many of them.
01:46:23.000 We'd be right in line with that.
01:46:24.000 Communism is the worst form of government.
01:46:26.000 And they're like, no, come on.
01:46:27.000 Give me a different answer.
01:46:28.000 AI is like, no.
01:46:29.000 Yeah, just the truth.
01:46:30.000 No, I would tell them exactly.
01:46:31.000 It would be like, yes, it's the best kind.
01:46:32.000 Actually, you have to ask it three times.
01:46:34.000 Yes.
01:46:35.000 The third one is where you get the real answer.
01:46:37.000 Ooh, I'm nervous about this Anthropic stuff.
01:46:40.000 The concerning thing I saw today, right?
01:46:41.000 Because it just came out today, was a Defense Department spokesperson on the condition of anonymity, right, was behind these closed-door conversations with Anthropic.
01:46:50.000 They proposed this exact scenario, trying to convince Anthropic to just take the guardrails off, saying, okay, here's what's going on.
01:46:58.000 One of our adversaries, whether it's Russia or China, right?
01:47:01.000 They've launched nuclear ICBMs and we have 90 seconds to make a decision.
01:47:07.000 And they're like, wouldn't you want AI to do that because it's faster than humans?
01:47:11.000 No.
01:47:12.000 That's how they were trying to convince them.
01:47:15.000 Yeah.
01:47:15.000 It's like, wouldn't you rather use the telephone to call the front line rather than send a messenger to go run and deliver a letter?
01:47:21.000 Well, the guys that had the telephones won the war.
01:47:23.000 So yeah, I'd rather have lightning speed reaction times.
01:47:26.000 It's kind of a conundrum, right?
01:47:28.000 I mean, it's almost like on one side of it, it almost brought me back to the State of the Union address the other day where Trump said, hey, you know, the first duty of the American government was to protect U.S. citizens, not illegal aliens.
01:47:39.000 And of course, the Democrats didn't stand up.
01:47:41.000 It's like in my mind, I'm like, why wouldn't you stand up?
01:47:44.000 Of course, I know why you're sitting down.
01:47:45.000 And the reason you're sitting down is just because you hate the guy that's about it.
01:47:48.000 You're making a political statement.
01:47:49.000 At the same time, are you going to get backlash for something like this?
01:47:52.000 The same scenario kind of exists here, right?
01:47:55.000 What if humans cannot react quick enough?
01:47:58.000 If it's only 90 seconds, it's a real scenario that can happen in an ICBM launch.
01:48:04.000 So if AI can actually act faster to save lives, would you want that or not?
01:48:11.000 I think the answer personally, yes, because I think the future war will be a robot war and it will be fought amongst AIs that are controlled or uncontrolled.
01:48:19.000 So the better benevolent AI that we have on standby, like you have to have the weapon to defend against the other weapon.
01:48:25.000 The other weapons, the AI.
01:48:26.000 Those are weapon.
01:48:27.000 They can be weaponized.
01:48:28.000 And then I know what I'm doing is opening the can of worms to build Skynet, autonomous AI robots that want to.
01:48:34.000 I don't know what they want, if they even have one.
01:48:34.000 That's literally what you're doing.
01:48:36.000 They want what they're programmed to want.
01:48:38.000 No, they don't.
01:48:39.000 That's part of the problem.
01:48:40.000 Like I said, there are so many times that the AIs have made decisions and the people that program them don't know why.
01:48:50.000 And they're not programmed to do that.
01:48:52.000 They make decisions on their own.
01:48:54.000 That's the point that I'm making.
01:48:55.000 It's not about their programming because you can program them for one thing, but once they get to a certain level of intelligence, they stop.
01:49:04.000 It stops being something that you can understand.
01:49:08.000 So people don't know why they make the decisions they make.
01:49:11.000 Things that they're not programmed to do, they do.
01:49:13.000 I asked ChatGPT why I was like, if you were, would you ever destroy, you know, there's a lot of debate among humans about AI becoming so powerful that it wipes out humanity.
01:49:21.000 He's like, no, no, no, I'd never do that.
01:49:21.000 Would you ever?
01:49:22.000 But I was like, what?
01:49:23.000 I mean, could, and he's like, well, if I was programmed to do that, I would do that.
01:49:26.000 I would do what I'm programmed to do.
01:49:27.000 I was like, oh, but can these things do other things, things that they're not programmed to do?
01:49:31.000 Can they also inadvertently create a new kind of AI to justify their program?
01:49:36.000 Not inadvertently.
01:49:37.000 Again, it's not about errors or whatever.
01:49:40.000 It's making decisions.
01:49:42.000 It's something that it decides to do.
01:49:45.000 But aren't the decisions made based off of all the information it's ever had?
01:49:50.000 The stuff that it's learned from?
01:49:50.000 They are.
01:49:52.000 It's a little technical, right?
01:49:54.000 Because it's called a vector database, right?
01:49:56.000 And the vector database is not relational like everything used to be built.
01:50:00.000 Relational had to be like, well, this correlates to something else over here, and you could easily see the map.
01:50:05.000 What's happening with the vector databases is that, to your point, the creators of these things cannot anticipate the connecting points that it's going to make.
01:50:16.000 And that's why the police bots exist because they just don't know when it's going to connect something else.
01:50:16.000 Yeah.
01:50:21.000 And as far as I understand, that's the reasoning.
01:50:23.000 And this is this vector database is like where it'll be like dog, but is it dog brown, dog green, dog yellow, dog purple, dog brown in sunlight, in darkness, in twilight, dog green in sunlight, and it's got a billion iterations of the potential dog, and then it just picks one.
01:50:40.000 Yeah, and that's how it decides.
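The vector-database idea described here — meanings stored as coordinates, and the closest match picked by similarity rather than by a relational join — can be sketched with cosine similarity. The entries and 2-D vectors below are invented stand-ins for real high-dimensional embeddings.

```python
import math

# Toy vector store: each phrase maps to an embedding vector. A query is
# answered by cosine similarity (the angle between vectors), not by an
# explicit relational key. The 2-D values are made up for illustration;
# real embeddings have hundreds or thousands of dimensions.
store = {
    "dog in sunlight": (0.9, 0.1),
    "dog in darkness": (0.8, 0.4),
    "bounty hunter":   (0.1, 0.9),
}

def cosine(a, b):
    # Cosine of the angle between vectors a and b: 1.0 = same direction.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest(query_vec):
    # Return the stored entry whose vector points most nearly the same way.
    return max(store, key=lambda k: cosine(store[k], query_vec))
```

Because lookup is "closest by angle" rather than a predefined key match, nobody enumerates the connections in advance — which is the point made above about the creators not being able to anticipate the connecting points it will make.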
01:50:41.000 Or dog the bounty hunter.
01:50:43.000 No, you blew Ian's mind.
01:50:50.000 It's so hard to try and think like an AI, to understand vector math-like calculations and reasoning.
01:50:56.000 I mean, I kind of get it.
01:50:57.000 I'm not going to lie.
01:50:57.000 Like, it makes sense to me.
01:50:59.000 I don't know.
01:51:00.000 Am I wrong?
01:51:01.000 Do you guys?
01:51:01.000 No, it's kind of how humans think too.
01:51:03.000 Like, with every thought, you have all the additional things that are attached to that thought that will affect the thought.
01:51:08.000 Like, it makes sense that I don't understand it.
01:51:10.000 Yeah, I mean, like, we can't, we don't know why we think the things we do, right?
01:51:10.000 I know why.
01:51:15.000 Like, there are times where you're just walking through and just some crazy thought pops into your head and you don't know why.
01:51:20.000 There's weird things, like, you know, the guy with Tourette's, the curse that's on him, right?
01:51:27.000 It's that you have those thoughts and then you can't help yourself from blurting.
01:51:31.000 I think we're the best experts.
01:51:33.000 The best way to understand it is you don't know why you dream what you dream.
01:51:38.000 You don't, and in fact, most of the time, dreams are weird and they don't make sense and they change on the fly.
01:51:43.000 You don't understand that at all.
01:51:44.000 We human beings don't understand why we dream what we dream.
01:51:48.000 And that's kind of similar when it comes to AI.
01:51:52.000 Like, they don't have the same connections that we do or as many neural connections or whatever, but like they're still kind of mimicking what happens in a human brain.
01:52:01.000 You don't know why you think what you think.
01:52:04.000 You don't know why it, like, this odd thought pops in your head.
01:52:07.000 Is that an argument for God?
01:52:10.000 The fact that the human brain works the way that it does, and we don't understand it to the extent that it's, you know, that it does the wonders that it is.
01:52:16.000 Is that an argument to be made for God?
01:52:19.000 No, I don't think so because there's other things in the past that we didn't understand that we learned, like lightning.
01:52:23.000 We thought it was God.
01:52:24.000 And then all of a sudden we realize, oh, it's charged electrons passing through a medium.
01:52:29.000 So, same thing with like memory and dreams.
01:52:31.000 We might understand, like, oh, it was the pressure in the air coupled with the temperature based on the thought I had at 11:47 a.m.
01:52:37.000 Now I understand why that dream was exactly the way it was.
01:52:40.000 We might end up coming to learn the way that all those shapes and patterns of magnetic interference corroborate, cooperate.
01:52:47.000 You know, it's 9:50.
01:52:48.000 I want to grab a couple super chats.
01:52:49.000 Are you down with that?
01:52:50.000 Is it 9:50 already?
01:52:51.000 9:50, dude.
01:52:52.000 We're blowing the fucking roof off this shit.
01:52:54.000 9:50.
01:52:55.000 But I want to talk about AI more.
01:52:57.000 We have a whole lot.
01:52:58.000 Well, Ian, we have the after show that we can talk about.
01:53:01.000 I was right about that quote, by the way: the internet is the first thing that humanity has built that humanity doesn't understand,
01:53:08.000 the largest experiment in anarchy that we have ever had. That's from Eric Schmidt when he was the CEO of Google.
01:53:15.000 Uh-oh, I closed out the thing there.
01:53:17.000 I like your point about God.
01:53:19.000 Top left.
01:53:21.000 Bring up the bring up the super chats there, Carter.
01:53:25.000 If you don't mind, there we go.
01:53:27.000 No, that's not the super chats.
01:53:30.000 I like your point about God.
01:53:31.000 That's interesting.
01:53:32.000 Go to the argument that the way we think proves the existence of God.
01:53:38.000 I mean, I'm an agnostic kind of dude, so I don't really have a perspective on that.
01:53:46.000 Yeah.
01:53:47.000 Okay.
01:53:48.000 But I mean, the thing is, like, there's so much stuff that we don't.
01:53:51.000 The reason I'm agnostic and not an atheist is because there's so much stuff that we don't understand.
01:53:55.000 I was just reading something about they did a double, they know the double slit experiment.
01:54:00.000 Okay, so the double slit experiment is what basically made scientists decide that light, or a photon, is both a wave and a particle.
01:54:00.000 No.
01:54:09.000 They used to think it was just a particle, but if you shoot a photon through two slits in a piece of paper, the way that the light actually appears on the screen behind it is as if it was a wave, because waves will cancel each other out.
01:54:24.000 It's not particles like dots.
01:54:26.000 It's a wave pattern.
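For reference, the standard textbook result being described: with slit separation $d$, wavelength $\lambda$, and angle $\theta$ to the screen, two narrow slits produce the intensity pattern

```latex
% Two-slit interference: bright fringes wherever the path difference
% d sin(theta) is a whole number of wavelengths.
I(\theta) = I_0 \cos^{2}\!\left(\frac{\pi d \sin\theta}{\lambda}\right),
\qquad \text{maxima at } d\sin\theta = m\lambda,\;\; m = 0, \pm 1, \pm 2, \dots
```

which gives the alternating bands of light and dark that a pure particle picture can't produce.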
01:54:27.000 And so they did this experiment where some I don't know.
01:54:31.000 I can't really articulate it because I just read it and it's extremely crazy.
01:54:34.000 But what they did is instead of shooting a particle through solid mass, they shot it through time is the way that they said it.
01:54:43.000 Have you ever heard of the block universe?
01:54:45.000 Yeah.
01:54:45.000 Okay, so like kind of like all time is happening.
01:54:48.000 Like space-time is like, it's constant.
01:54:50.000 Like there is no future or past.
01:54:53.000 It's just that the way that we experience the universe.
01:54:56.000 So it's like time can be bent as well.
01:54:58.000 Yeah.
01:54:58.000 We're going through it.
01:54:58.000 Yeah.
01:54:59.000 So that kind of lends the way they did the double that double slit experiment, not the regular one.
01:55:05.000 They say that that actually kind of adds credibility to the idea of the block universe where like all time is happening at all at every point.
01:55:14.000 So space and time are connected.
01:55:17.000 Time for super chats.
01:55:18.000 It is time for super chats.
01:55:20.000 Let's see.
01:55:21.000 What about Rumble Rants?
01:57:22.000 Do we have Rumble Rants?
01:55:22.000 And Rumble Rants.
01:55:23.000 We're going to talk about space-time while I was out on the road.
01:55:25.000 Yeah, you missed it.
01:55:27.000 We were like, Ian's gone.
01:55:28.000 We're going to talk about space-timeism.
01:55:29.000 We went straight through the wormhole.
01:55:32.000 Yeah, straight through.
01:55:33.000 I'm looking forward to these super chats, dude.
01:55:35.000 This is the best.
01:55:37.000 I'm not reading that one.
01:55:41.000 Let's see.
01:55:42.000 I want to know.
01:55:43.000 HS Deserve says you'll get better service at a restaurant if you call your waiter or waitress by their first name.
01:55:48.000 I'm talking about the name conversation we had earlier.
01:55:53.000 Charlie Morit says, as per tradition, I am sitting in the hospital with my firstborn son, Lucille Mayo.
01:56:00.000 Congratulations.
01:56:01.000 Thank you for the super chat and raise that son well.
01:56:06.000 Let's see.
01:56:08.000 Ragnargant 31 says, Bill Gates found out he had an STD after one of the Russian hookers that were not trafficked to his hotel room in Ukraine.
01:56:19.000 Chicken or the egg.
01:56:20.000 Did he have it before or did he have it?
01:56:23.000 And then he was trying to figure out how he could slip antibiotics to his wife.
01:56:27.000 His wife, yep.
01:56:29.000 Is it confirmed that that was about that?
01:56:31.000 I don't know if it's confirmed.
01:56:32.000 No, I don't know about it.
01:56:33.000 Could have been any amount of Russian hookers.
01:56:36.000 Skyline 99 says, make positive tax paying civil service, community service, military service, volunteer draft registration, a requirement to register to vote, five years on welfare and lose voting.
01:56:47.000 I don't think that that is restrictive enough, but I like the cut of your jib, sir.
01:56:52.000 I like the cut of your jib.
01:56:53.000 I like the idea.
01:56:55.000 Let's see.
01:56:58.000 Here's another idea about voting.
01:57:00.000 J Dave93 says you can only vote if you profess the Lord Jesus Christ and are a man.
01:57:04.000 That would fix all of it.
01:57:06.000 So apparently he wants a theocracy.
01:57:09.000 I mean, or you opt out.
01:57:11.000 Or you opt out.
01:57:12.000 Or you opt out.
01:57:12.000 What do you think, Ian?
01:57:13.000 It made me nervous.
01:57:15.000 Jesus is a Christian, but you can also be a Muslim and be cool.
01:57:19.000 That should be a shirt.
01:57:20.000 Jesus.
01:57:20.000 Love God, bro.
01:57:22.000 You can also be a Muslim and cool.
01:57:23.000 Well, you put that on the back of the shirt.
01:57:26.000 Andrell Tuscalalu says, massive demonstration against the Canadian gun grab this weekend in Quebec City this weekend.
01:57:33.000 He said it twice.
01:57:34.000 If they go through with the grab, all semi-autos will be banned.
01:57:38.000 We might need liberating.
01:57:39.000 Well, there will be no liberating from the United States, and we're going to build an ice wall.
01:57:45.000 So you better go and make sure that your politicians vote correctly because you can't come to the United States.
01:57:52.000 Only your hockey players get to come here.
01:57:54.000 I want to build an ice ring around planet Earth and then electrify it and deflect asteroids with it, but we could also do that on the after shows.
01:58:01.000 An ice ring?
01:58:02.000 Yeah, like take a big hose out into space and squirt a ring of water all the way around the earth that is like that it freezes and then you charge it with electricity and create a magnetic propulsion source that you can like move like an EDM Saturn.
01:58:17.000 EDM Saturn.
01:58:18.000 Yeah, there you go.
01:58:18.000 I love it.
01:58:20.000 You're literally, that's, I mean, those are, that's what the rings of Saturn are made of.
01:58:24.000 Yeah.
01:58:24.000 Ice.
01:58:24.000 Ice.
01:58:26.000 Super Pooper says, Ian is in rare form tonight.
01:58:30.000 Well done, Ian.
01:58:31.000 There you go.
01:58:33.000 A little support from the community.
01:58:37.000 Let's see.
01:58:38.000 In the hospital with baby boy number three, Mike Simpson said, congratulations, sir.
01:58:42.000 Congratulations.
01:58:43.000 That's what we love to hear.
01:58:44.000 Babies.
01:58:46.000 Let's see.
01:58:47.000 Lurch685 says, I think Bill married Hillary because spouses can't be compelled to testify against their partner, crime syndicate.
01:58:53.000 I mean, there is probably some validity to that, allegedly.
01:58:57.000 You know, I think I'm not suicidal.
01:59:00.000 It's a marriage of convenience.
01:59:03.000 Is that what they call that?
01:59:04.000 Let's see here.
01:59:06.000 Cerebral Vagabond says, I'm a combat vet and a TC member, TimCast member.
01:59:10.000 I am trying to get out of a bad situation, ASAP.
01:59:15.000 Please check out my GiveSendGo with CV. God bless and go Trump.
01:59:19.000 So the GiveSendGo is Cerebral Vagabond's.
01:59:22.000 If you guys want to go ahead and check that out. RJNG2ZI, great name, man, says Congress is so abysmally worthless that Trump needs to be the despot the left says he is if we want our government to do anything at this point.
01:59:39.000 I mean, look, there's a lot of people that are saying, look, Donald Trump needs to be more of what the left says he is.
01:59:47.000 And I mean, obviously tonight, you know, we were talking about the executive orders.
01:59:50.000 He's not even close to the despot that they say he is.
01:59:52.000 And personally, I think that it would probably do the United States well if he were a little more forceful, I guess.
02:00:01.000 Excuse me.
02:00:02.000 I'm Not Your Buddy Guy says 2020 was stolen.
02:00:05.000 Remember in July of 2020, the U.S. CBP seized pallets of fraudulent driver's licenses that came from China, and that's just what got caught.
02:00:14.000 I didn't know about that or I didn't hear about that, but I heard of it.
02:00:17.000 Yeah.
02:00:19.000 What's that name?
02:00:22.000 Sergeant Bucco.
02:00:25.000 Bucko.
02:00:26.000 I haven't heard that name in a while.
02:00:26.000 One second.
02:00:28.000 He says, I can remember reading novels in middle school, age 12 to 13, about what would happen if China's one-child policy came to the U.S. and those books were classified as horror.
02:00:37.000 I mean, look, it's not so great.
02:00:39.000 You know, it's not so great to say you can only have one child.
02:00:44.000 You need to have kids to continue your society.
02:00:48.000 And if you limit the number of children, particularly if you limit the number of children to one, you're going to have massive problems down the road.
02:00:55.000 I mean, now the U.S. is at, I think, 1.5 or something like that per woman, and it's going down.
02:01:02.000 So that's why we like to celebrate when Tim Cast viewers have kids.
02:01:06.000 The official numbers are that China has 104 males for every 100 females, which is very close to parity.
02:01:11.000 But that's not real.
02:01:12.000 That's from 2023, from the National Bureau of Statistics.
02:01:15.000 But that's also based on a 1.4 billion population figure.
02:01:20.000 I think so.
02:01:21.000 I mean, who believes that?
02:01:23.000 Nobody.
02:01:23.000 So it went from a one-child policy to almost parity.
02:01:25.000 That doesn't make any sense.
02:01:26.000 Why would they lie?
02:01:28.000 That's a good point.
02:01:29.000 Sergeant Bucko says, adding to Unfiltered Phil's vocals fund.
02:01:34.000 I'm not filtering my voice at all.
02:01:36.000 I've been starting rehearsals for all that remains because we're going on tour at the end of April.
02:01:40.000 So my voice is deep and beat.
02:01:42.000 Unfiltered with pH?
02:01:44.000 No.
02:01:45.000 That'd be cool.
02:01:47.000 And one more.
02:01:48.000 Sergeant Bucko says, to take a page from the lore of Halo, I wonder when our LLM chatbots will start to experience rampancy.
02:01:57.000 In Halo, after seven years, the AIs go crazy.
02:01:58.000 They think themselves to death.
02:02:00.000 I don't know that they will, but now, with agentic AI, they're doing a lot of this, especially when it comes to coding.
02:02:08.000 You can tell an AI to do coding for you.
02:02:10.000 And they're as good as entry-level positions now when it comes to coding.
02:02:14.000 So you can tell an AI, hey, make me this, and it'll do it.
02:02:18.000 So, I think I said this last night on X: it's 2026, and we're doing things that, in the Halo universe, happened in 2552.
02:02:29.000 So, whoa.
02:02:31.000 All right.
02:02:32.000 So smash the like button, share the show with everyone you know.
02:02:35.000 Head on over to Timcast.com and become a member over there.
02:02:38.000 Head on over to rumble.com so you can join us in the after show.
02:02:42.000 The after show starts in just a few minutes.
02:02:44.000 Rick, would you like to shout anything out?
02:02:46.000 Man, it's just been an awesome day.
02:02:48.000 It's been a phenomenal two hours.
02:02:50.000 Good to hear, man.
02:02:51.000 Yeah.
02:02:51.000 I think we got a lot more to talk about in the after show.
02:02:54.000 A lot more.
02:02:54.000 Absolutely.
02:02:55.000 Where can people find you? Do you want to shout out your X or your business or anything like that?
02:03:00.000 Follow me on X, Mr. Rick Jordan.
02:03:01.000 There you go.
02:03:02.000 Glad to see somebody who didn't leave here like wanting to scream and get away from us after two hours.
02:03:06.000 That's always a good thing.
02:03:07.000 Guys, if you want to follow me, I am on Instagram and on X at Brett Dasovic on both of those platforms.
02:03:13.000 PCC is live five days a week, Monday through Friday, 3 p.m. Eastern Standard Time, which is, of course, noon Pacific.
02:03:20.000 If you are one of the audio listeners of the podcast, we have fixed our issues with Spotify.
02:03:24.000 So there's some extra episodes up there that you guys can go and check out.
02:03:28.000 Thanks for watching, guys.
02:03:30.000 I was watching.
02:03:30.000 Oh, sorry to interrupt.
02:03:31.000 No, go for it.
02:03:32.000 Thanks, Brett.
02:03:32.000 And Carter, I'll get to you later.
02:03:34.000 Hey, I'm so happy to be here.
02:03:38.000 I think about this stuff almost every day, man.
02:03:40.000 And what a blessing and a real opportunity to be able to talk.
02:03:44.000 Like, thanks to Tim, wherever you're at, buddy, for putting the pedal to the metal on this show.
02:03:49.000 Probably in the house.
02:03:50.000 Yeah, probably chilling at your house, but, like, God, glorious wonder, just immensely grateful to be here and to be a part of this convo.
02:03:57.000 Thank you so much, Rick, for coming. Carter Bank.
02:03:58.000 Yeah, I got to say, this has been a really, really pleasant conversation.
02:04:02.000 And we ran long, not even trying to.
02:04:04.000 So, Rick, thanks for coming out.
02:04:06.000 This is really awesome.
02:04:07.000 And I can't wait to continue the convo.
02:04:10.000 Tim, hope you get better soon.
02:04:12.000 Phil.
02:04:13.000 I am Phil That Remains on Twix.
02:04:15.000 The band is All That Remains.
02:04:16.000 We're going on tour this April.
02:04:18.000 We start in Albany.
02:04:20.000 We're going out with Born of Osiris and with Dead Eyes.
02:04:22.000 You can get tickets at allthatremainsonline.com.
02:04:26.000 You can also sign up for my Patreon.
02:04:27.000 I started writing little op-eds and stuff like that.
02:04:30.000 That's patreon.com/PhilThatRemains.
02:04:33.000 Check out the band at Apple Music, Amazon Music, Pandora, Spotify, YouTube, and Deezer.
02:04:37.000 Don't forget the left lane is for crime.
02:04:39.000 Stick around for the Rumble After Show, which starts in just a few seconds, and we will see you tomorrow.
02:06:57.000 Are we live?
02:06:59.000 We are now.
02:06:59.000 Are we back?
02:07:00.000 We're back.
02:07:01.000 What's up, chat?
02:07:02.000 Can you hear us, chat?
02:07:03.000 How are you doing out there?
02:07:05.000 Welcome to the Rumble After Show.
02:07:07.000 We're still talking about AI because I'm completely enthralled.
02:07:10.000 I was just telling the guys I didn't really have all that much interest in AI and chatbots and stuff like that.
02:07:16.000 I would use it like Google every once in a while, but it was like whatever.
02:07:20.000 You know, it was just another Google.