Timcast IRL - Tim Pool - March 09, 2024


Marine Father ARRESTED At SOTU For Calling Out Biden Over Son's Death w/ Abe Hamadeh | Timcast IRL


Episode Stats

Length

2 hours and 6 minutes

Words per Minute

209.14

Word Count

26,428

Sentence Count

2,154

Misogynist Sentences

79

Hate Speech Sentences

46


Summary

A Gold Star father who called out President Joe Biden for the death of his son in Afghanistan was arrested and charged with a misdemeanor. We talk about why Joe Biden should have done more to protect this man. Plus, a special guest joins us for a post-mortem on the State of the Union address.


Transcript

00:00:00.000 A Gold Star father who called out President Biden for the death of his son in Afghanistan
00:00:17.000 was arrested and charged with a misdemeanor.
00:00:20.000 And I can't, I honestly, I didn't believe it when I saw the story because we knew that the guy, that this father had been yelling during the State of the Union.
00:00:27.000 We knew he had been escorted out, but to find out later he was arrested and charged with a misdemeanor is shocking.
00:00:33.000 Joe Biden and his administration should have intervened immediately to protect this guy, recognizing what he was upset about.
00:00:38.000 I think the most important thing is, this was not even the most disruptive yelling of the night during the State of the Union, so it is the most shockingly offensive, but why is that surprising?
00:00:50.000 Now, I'm not gonna say, you know, definitively that Joe Biden went there, banged on the door, and demanded the guy be arrested.
00:00:55.000 He was speaking when they brought the guy out and charged him.
00:00:57.000 But certainly by now, he could have intervened and said, are you nuts?
00:01:01.000 This is a man whose child died in Afghanistan under Joe Biden's failed administrative policies and military policies.
00:01:09.000 Naturally, he's upset.
00:01:10.000 It's surprising.
00:01:11.000 Now, Joe Biden's actually getting heavily criticized for calling Laken Riley "Lincoln Riley."
00:01:19.000 And it's just, it's, you know, it's funny to see the corporate press say, wow, Joe Biden was so strong.
00:01:26.000 He was so on point.
00:01:28.000 So we're going to be doing a follow-up post-mortem on the State of the Union address and where we're at.
00:01:33.000 And shout out to Donald Trump who had the memes.
00:01:36.000 He posted this video on Instagram where he was using like Snapchat filters on Joe Biden and Kamala Harris.
00:01:41.000 It's just so funny.
00:01:42.000 So we're gonna have a fun time tonight talking about all that.
00:01:45.000 Before we get started, my friends, head over to castabrew.com and buy our coffee.
00:01:48.000 Cast Brew Coffee is our coffee company.
00:01:51.000 When you buy from Cast Brew, you are supporting the work we do.
00:01:54.000 It is our, we sponsor ourselves.
00:01:55.000 It's our coffee.
00:01:56.000 So you can pick up your Rise With Roberto Jr.
00:01:57.000 Appalachian Nights is currently still sold out.
00:02:00.000 People buy it too much.
00:02:01.000 They're buying it like crazy.
00:02:02.000 Stand Your Grounds and Mr. Bocas' Pumpkin Spice Experience are all very delicious as well.
00:02:07.000 And also head over to TimCast.com, click join us, become a member to support our work directly.
00:02:12.000 And you are helping make this show possible because we are principally funded by members.
00:02:17.000 So seriously, if you like the show, you like all the content we put out,
00:02:20.000 and it's all just available to you free of charge, we do have the members only uncensored show
00:02:24.000 and a massive library of uncensored content available on TimCast.com.
00:02:28.000 But click join us, become a member, and you'll also get access to our Discord server, where you can check out a whole bunch of other Hangouts and shows produced by members of TimCast.com.
00:02:37.000 And that's a 24-7 Hangout, talk to like-minded people, argue with them, agree with them, whatever it is you want to do.
00:02:43.000 Smash that like button, subscribe to this channel, share the show with your friends.
00:02:47.000 Joining us tonight to talk about this and everything else is Abe Hamadeh.
00:02:51.000 Tim, thank you so much for having me.
00:02:52.000 So a little bit about myself.
00:02:54.000 I'm a former prosecutor at the Maricopa County Attorney's Office, also a former Army Reserve intelligence officer who served in the Middle East.
00:03:01.000 I ran for attorney general in 2022 in Arizona.
00:03:04.000 You know, with Kari Lake and, you know, we saw what happened with that election.
00:03:08.000 We had our election taken away from us by 280 votes out of 2.5 million, believe it or not.
00:03:13.000 So we're still fighting the election lawsuit.
00:03:15.000 But now I'm running for Congress in Arizona's 8th congressional district.
00:03:18.000 I'm endorsed by President Trump and Kari Lake.
00:03:20.000 And, you know, I quickly see our country going to hell.
00:03:23.000 So we need some courage in there because Everything's on the line this November.
00:03:28.000 Those are some great endorsements.
00:03:30.000 Well, they're powerful.
00:03:31.000 And that's why, you know, Kari Lake, President Trump, myself, we've been fighting for honest elections because we know what's going on.
00:03:38.000 I mean.
00:03:38.000 It's so despicable.
00:03:40.000 Arizona, especially.
00:03:41.000 That's like the state.
00:03:42.000 It's like a third world country.
00:03:44.000 Yeah.
00:03:44.000 It's embarrassing.
00:03:45.000 But, you know, I think people are waking up a lot faster than people realize.
00:03:49.000 So hope it's always darkest before dawn.
00:03:52.000 And that's what I'm looking forward to.
00:03:54.000 But these next six, seven months, everything's at stake.
00:03:57.000 Right on.
00:03:58.000 Well, this will be fun.
00:03:58.000 Thanks for hanging out.
00:03:59.000 We got Hannah-Claire hanging out.
00:04:00.000 Hey, I'm Hannah-Claire Brimlow.
00:04:01.000 I'm a writer for SCNR.com.
00:04:03.000 I'm happy to be back tonight.
00:04:04.000 Ian's here, too.
00:04:04.000 Yes.
00:04:05.000 Hello, everyone.
00:04:05.000 I'm a huge proponent of voter integrity, so I'm glad you're here to talk about this.
00:04:09.000 Maybe we'll go deep on it a little bit.
00:04:11.000 Machine voting makes me very nervous.
00:04:12.000 Electronic voting at all.
00:04:13.000 We don't have the code.
00:04:14.000 We can't see what the machines are doing.
00:04:15.000 Drives me nuts.
00:04:17.000 So, hey, good to see you, man.
00:04:18.000 You too.
00:04:18.000 Let's go deep.
00:04:19.000 Surge.
00:04:19.000 Take me deeper.
00:04:20.000 I'm at Surge.com.
00:04:22.000 I'm going to be Surge.net soon.
00:04:24.000 This is the last Surge.com.
00:04:25.000 Yeah.
00:04:26.000 Cheers.
00:04:26.000 Right on.
00:04:28.000 Here's the big story from last night.
00:04:30.000 It's one of the most shockingly offensive stories I've seen.
00:04:33.000 I mean, but I'm not surprised.
00:04:34.000 Gold Star Father Arrested for State of the Union Heckling.
00:04:39.000 Steve Nikoui.
00:04:40.000 How do you pronounce that?
00:04:40.000 Nikoui?
00:04:41.000 I've been saying Nikoui.
00:04:42.000 I don't know if that's right.
00:04:43.000 Nikoui.
00:04:44.000 51, Arrested for Heckling Biden During State of the Union.
00:04:46.000 Yeah, but barely.
00:04:48.000 Barely.
00:04:50.000 He's a gold star dad who lost his son in the 2021 Kabul airport bombing.
00:04:55.000 He was a guest of Florida rep Brian Mast who was outraged by the arrest.
00:04:59.000 So what do they have what he yelled?
00:05:01.000 Abbey Gate.
00:05:01.000 He yelled Abbey Gate and United States Marine.
00:05:04.000 And it was not even the most disruptive thing yelled that night.
00:05:07.000 You had people yelling liar at Joe Biden.
00:05:10.000 Nothing.
00:05:12.000 It almost feels like it was intentionally meant to offend the country.
00:05:17.000 Members of Congress, who most people have utter disdain for, yell things all they want, nothing happens.
00:05:23.000 A guy whose child died serving this nation simply says, Abbey Gate, United States Marine Corps, and he gets criminally charged.
00:05:30.000 They really want us to hate them, that's all I can say.
00:05:33.000 Yeah, it's terrible optics for the Biden administration.
00:05:37.000 I'm trying to pull the name right now, but this father was there as a guest of another congressman.
00:05:41.000 Yeah, Brian Mast.
00:05:42.000 Right.
00:05:42.000 So the fact that this wasn't a secret, Biden knew he was going to be there and then continues to say, well, it was a great success when we pulled out of Afghanistan.
00:05:52.000 It went really well.
00:05:53.000 Like this is the message that his administration has been telling all of these families, despite the fact that they know differently.
00:05:58.000 They personally suffer the consequences of his ineptitude.
00:06:00.000 Surge tracked slurs last night.
00:06:04.000 What was it?
00:06:04.000 Was it a hundred and... What was the final number?
00:06:07.000 113?
00:06:07.000 Yeah, 113.
00:06:08.000 But that was a low estimate because there were some he was slurring.
00:06:10.000 No, yeah, we were being nice.
00:06:12.000 We were being nice.
00:06:13.000 So here's how we counted it.
00:06:15.000 And y'all were watching, you probably noticed when we were counting.
00:06:18.000 If Joe Biden said something like, you know, we got to go buy the country and we got to get it.
00:06:25.000 We'd be like, okay, there's one.
00:06:27.000 Or maybe two, depending on how heavy it was.
00:06:30.000 But it really had to be incomprehensible.
00:06:32.000 He'd have to say something like, we've got to determine when the flag goes up.
00:06:38.000 And I'm like, okay, this is a slur.
00:06:39.000 But there were several words where he slurred, but we're like, we know what he said.
00:06:45.000 So he's like, flag.
00:06:45.000 And we're like, okay, he said flag, we get in the context, so we wouldn't count that.
00:06:49.000 I think we easily could have counted 150.
00:06:51.000 Yeah, and it was multiple per minute.
00:06:53.000 I mean, there wasn't a stretch of time where he wasn't stumbling.
00:06:56.000 I mean, anyone who's public speaking for a long time is gonna make some kind of error.
00:06:59.000 I'm not trying to be harsh.
00:07:00.000 That's fine.
00:07:01.000 I mean, if- I mean, look- Multiple per minute is kind of ridiculous.
00:07:05.000 Is 30 acceptable?
00:07:06.000 No.
00:07:09.000 Maybe six is when it becomes unacceptable for a president, you know, State of the Union.
00:07:12.000 Well, it highlights, you know, the special counsel report that said he wasn't mentally fit, right?
00:07:16.000 And how here he is giving the State of the Union, not really, you know, I think Republicans have set the bar so low for Joe Biden.
00:07:24.000 So anything of him just going and actually, you know, having a complete sentence is actually positive for him.
00:07:30.000 And that's kind of what we make our mistakes on because we've been bashing Joe Biden.
00:07:33.000 But so many people, Americans, are not paying attention to every single slip that he's making.
00:07:38.000 And that's what's, you know, but the State of the Union yesterday was really, it was a campaign speech.
00:07:43.000 I mean, the amount of times he was mentioning President Trump, the former president.
00:07:47.000 But this is the radical left.
00:07:48.000 They know that everything is on the line this November.
00:07:51.000 So that's why they're going to go so aggressively using the government's power and the government institutions to go after the political enemies.
00:07:57.000 And that's why, you know, his son died and there's never been accountability for what happened in Afghanistan, and yet they go and charge him with a misdemeanor.
00:08:06.000 But nothing with the Gaza protesters who actually, you know, stood and- Blocked the motorcade.
00:08:12.000 Blocked the motorcade.
00:08:13.000 Nothing happened with that.
00:08:14.000 But here this father who yelled out, Abbey Gate, and you see the media, how they say he's heckling him.
00:08:18.000 I don't think that was a heckle.
00:08:20.000 I think that was just an emotional outburst from a father who wants accountability for what happened in Afghanistan.
00:08:27.000 He called him out when he said murders are down under my administration, and then he yelled, United States Marine Corps, and I'm surprised they actually pulled him out of the room, to be honest.
00:08:39.000 I was like, oh wow, they're taking somebody out, because people were yelling to arrest him.
00:08:43.000 The first thing Joe Biden should have done is, after this was over, be like, what was it, you know, no, no, no, no, no, that's gonna, look, it's gonna look bad for you.
00:08:49.000 You know what I mean?
00:08:50.000 So politically, political strategy-wise, he should have been like, no, no, no, no, no, that man is released tonight, do not do that.
00:08:57.000 I mean, it goes to show that he had all of these political talking point guests in the box, right?
00:08:57.000 Right.
00:09:03.000 He was like, this lady's tried to get an abortion, this person's part of the union, but he doesn't have any of the family members of these Marines, right?
00:09:11.000 And that's because this is one of the big flaws of his administration.
00:09:14.000 This is one of their biggest failures that they have not been able to negotiate around.
00:09:18.000 He really put service members in harm's way.
00:09:20.000 And we're on the brink of, you know, what did he say last night?
00:09:24.000 We're going to put up a temporary pier in Gaza.
00:09:27.000 I mean, but no boots on the ground, just on the water.
00:09:29.000 He is willing to sacrifice even your children and then lie about it.
00:09:34.000 Right.
00:09:35.000 He's not willing to be honest about the dangers that he put service members into and is willing to put them in again.
00:09:39.000 What do you guys think about the speech overall?
00:09:42.000 Right.
00:09:42.000 Right now, you got these Democrat pundits being like he was strong.
00:09:46.000 There was a video from Joe Scarborough from before the State of the Union that is getting roasted because he's like, this version of Joe Biden is the best version of Joe Biden.
00:09:56.000 He is strong.
00:09:57.000 He is bright.
00:09:58.000 And it reminded me of the scene from Tenacious D with the open mic host.
00:10:02.000 You guys remember?
00:10:03.000 You ever see that movie?
00:10:04.000 So in the movie, When Jack Black and Kyle Gass, you know, go to play for the first time, the open mic host is like, this next band asked me to read this, so okay, whatever.
00:10:17.000 This band, and it's like he's not into it.
00:10:19.000 And then later on, when Jack Black is sleeping as a dream, open mic host goes, this next band asked me not to read this, but I'm gonna read it anyway!
00:10:29.000 This band is the best band ever!
00:10:32.000 That's what it sounded like.
00:10:33.000 Like, clearly Joe Biden is out of his mind and Joe Scarborough is going on TV, looking at the camera to stare you dead in the eyes and go, Joe Biden is strong!
00:10:44.000 Believe it!
00:10:45.000 You know, if you don't watch Joe Biden, you think he is.
00:10:47.000 Well, it's not just, you know, Joe Scarborough.
00:10:49.000 If you look at every single media headline, it's really creepy.
00:10:52.000 The regime's media apparatus is nuts.
00:10:54.000 I mean, it's everything said fiery speech, fiery speech over and over and over.
00:10:58.000 And is that how you describe that speech?
00:11:00.000 Is that your adjective for this?
00:11:01.000 It was a something.
00:11:02.000 I think there was a lot of gaslighting, if that's a fiery.
00:11:05.000 I mean, for him to suggest that the violence is down in our country and crime is down.
00:11:09.000 I mean, all of it was complete lies.
00:11:11.000 But I mean, will the American people actually understand that?
00:11:15.000 Or are they just going to be listening to the mainstream media?
00:11:17.000 So it's going to be a challenge.
00:11:19.000 And that's what we're facing right now is that I don't think we've ever witnessed before a mainstream media regime
00:11:25.000 basically propping up a president.
00:11:27.000 Biden's not in control of this presidency.
00:11:29.000 We all know that.
00:11:30.000 But the media is carrying the water for him very aggressively.
00:11:33.000 Yeah, it's definitely an interesting position that the media has boxed themselves into because Joe Biden has been
00:11:39.000 speaking publicly for decades.
00:11:41.000 I mean, you pull video of him on the campaign trail with Obama.
00:11:44.000 I bet he would seem stronger, more with it, have more energy than he does now.
00:11:49.000 So how could you say this is the Joe Biden we want when, you know, even eight years ago, 10 years ago, you have evidence that there is a difference between how he presents.
00:11:57.000 I mean, he's not getting better with time.
00:11:59.000 No, he's not as per biology, but he's getting exponentially worse and faster.
00:12:06.000 That commercial we played just before the State of the Union where it's like, can Joe Biden even survive till 2029?
00:12:12.000 And I think, of course he can't.
00:12:17.000 He's almost 10 years past standard life expectancy.
00:12:20.000 He can, but I mean, at what value?
00:12:23.000 Is he just gonna be like, I'm alive?
00:12:25.000 A human being hypothetically can, and we can extend Joe Biden's life through great leaps in medical technology, but will he be there?
00:12:35.000 He might be alive, but is he really living?
00:12:37.000 Well, it's like the question of like Ruth Bader Ginsburg, right?
00:12:40.000 There were people who said that she really should have stepped down way earlier than she did because Obama could have appointed someone or whatever else, and then she died in office.
00:12:48.000 She did not seem like she was in great health towards the end, and yet someone, the powers that be, said no, no.
00:12:54.000 Either she herself decided she didn't want to go to the power, which maybe that's why Biden won't step down, or someone in the background was like, I need you to stay in office so I can continue to have whatever form of power I currently got.
00:13:03.000 And I'm pretty sure they artificially extended her life to the extent of modern science.
00:13:08.000 Because she, like, disappeared for a while.
00:13:09.000 And everyone's like, what's going on with her?
00:13:12.000 And they probably hooked her up to a bunch of machines where they were like, technically she's alive!
00:13:17.000 So you cannot replace her yet!
00:13:19.000 And then finally the doctor's like, look, we've done everything we can.
00:13:21.000 This lady is not gonna be alive anymore.
00:13:23.000 This is one of my favorite conspiracy theories.
00:13:25.000 People were like, because, I mean, with the Supreme Court justices, their aides do write a lot of their opinions and do a lot of research and stuff like that.
00:13:31.000 So hypothetically, you had this staff that was like, she is fine.
00:13:35.000 Thank you so much.
00:13:36.000 We're just going to sign her name to this thing for a long time.
00:13:39.000 And it was similar with Dianne Feinstein, who, you know, got sick.
00:13:42.000 She didn't even know she was in the hospital.
00:13:44.000 No, it's crazy.
00:13:46.000 And she was absent for so long that Republicans on her committee were like, we would like to replace her.
00:13:51.000 Well, and it's not even an age issue.
00:13:52.000 I mean, you had Secretary Lloyd Austin, you know, in the hospital.
00:13:55.000 Nobody knew who was running the Pentagon at the time.
00:13:58.000 So we have some real problems where we don't know who's running the government.
00:14:02.000 Well, we do.
00:14:03.000 It's the deep state.
00:14:04.000 It's a swamp.
00:14:05.000 But it's this, you know, mirage of these supposed elected officials or these appointed officials.
00:14:11.000 And that's where I think the American people are really looking into this.
00:14:14.000 And they're realizing, like, who is in actual control?
00:14:17.000 Because our border's open.
00:14:19.000 Our election's a mess.
00:14:21.000 We're about to enter World War III.
00:14:22.000 Do we have a competent commander-in-chief who can lead us there?
00:14:25.000 You know, it's not just the 13 service members who died in Afghanistan with that pullout, but remember the three Army Reserve soldiers who died in Jordan?
00:14:32.000 There's so much happening, and Biden never recognized them appropriately in his State of the Union speech because he knows those will, you know, hurt him in an election.
00:14:41.000 Yeah, Biden, you know, I want to say he's the boss.
00:14:46.000 That's that's the way I'll phrase it.
00:14:48.000 He's the boss, but no one's in control.
00:14:50.000 I don't think there's anyone telling Biden what to do, and I don't think there's anyone telling his staff what to do.
00:14:54.000 And I think the failures in Afghanistan and the failures of the administration are proof there's no one pulling the strings.
00:15:00.000 A lot of people like to think that Obama is still living in D.C.
00:15:02.000 telling Biden what to do and Biden's a puppet president.
00:15:04.000 I'm like, no, no, no, no, no.
00:15:06.000 If that were true, there would be cohesive strategy.
00:15:10.000 There would be things happening.
00:15:12.000 It's chaos because Joe Biden is president.
00:15:15.000 He doesn't remember things, he misspeaks, and the staff around him are all just looking at each other like, what did he just say?
00:15:21.000 And there's no cohesive plan.
00:15:24.000 So everything's just chaos.
00:15:26.000 That's it.
00:15:28.000 Well, if you look, Obama's apparatus is still in the White House.
00:15:30.000 I mean, you had Susan Rice, who is the head of Domestic Policy Council up until May of last year, you have Antony Blinken, all these remnants from the Obama administration.
00:15:39.000 So I think there's no leadership with this administration, and they're just taking advantage of it.
00:15:45.000 You know, it's not even an age issue.
00:15:46.000 You see President Trump, he's in his late seventies and man, he has a lot of energy, right?
00:15:51.000 So, you know, I wouldn't even classify this going after him because of his age.
00:15:55.000 You know, there's been, I know people who are in their nineties who are all with it, but Joe Biden clearly isn't.
00:16:00.000 Let's jump to this story.
00:16:01.000 This is from the post-millennial.
00:16:02.000 Pathetic.
00:16:03.000 Laken Riley's mom blasts Biden for butchering name of daughter killed by illegal immigrant gang member in State of the Union address.
00:16:10.000 Biden does not even know my child's name.
00:16:13.000 Now, here's the important thing.
00:16:18.000 Democrats are furious right now over this.
00:16:20.000 They are furious over this.
00:16:22.000 That Joe Biden would say illegal.
00:16:26.000 No, no, I'm not kidding.
00:16:27.000 Democrats are outraged that Joe Biden said, killed by an illegal or killed by illegals.
00:16:32.000 Yeah.
00:16:32.000 You saw Nancy Pelosi.
00:16:33.000 What did she, what did she say?
00:16:34.000 Oh yeah.
00:16:35.000 She said she wishes he didn't use that term.
00:16:38.000 I agree though.
00:16:39.000 I completely agree with Nancy Pelosi.
00:16:40.000 I am offended and I wish Joe Biden did not use that term.
00:16:43.000 Did you see the reporter that asked him?
00:16:45.000 I wish Joe Biden did not say illegal.
00:16:48.000 He should have said criminal alien rapist and murderer.
00:16:52.000 Stop protecting criminal aliens by calling them illegals.
00:16:57.000 I hate that term.
00:16:58.000 Defining an immigrant as illegal makes no sense.
00:17:01.000 I don't understand.
00:17:02.000 It makes no sense.
00:17:03.000 If someone is, like, shoplifting, we don't call them an illegal shopper.
00:17:06.000 It doesn't make any- an illegal shopper came by today.
00:17:08.000 No, no!
00:17:09.000 What- shoppers are fine.
00:17:10.000 It's the shop- it's a shoplifter.
00:17:12.000 You know, call them a, uh, a border breaker, at the very least.
00:17:15.000 You know what I mean?
00:17:16.000 But I think rapist and murderer, uh, is fitting, and you can put criminal alien- alien in there, so I- I agree with Nancy Pelosi.
00:17:22.000 Yeah, don't call them an illegal.
00:17:23.000 Yeah.
00:17:23.000 The AP news ran this headline that was like, Biden says her name, and then there's like a hyphen, Laken Riley, at the request of Marjorie Taylor Greene, which I think is funny because they're trying to cover up the fact that he very clearly said Lincoln, which, you had one job, right?
00:17:38.000 You knew the entire country.
00:17:39.000 He had a pen with her name on it.
00:17:40.000 I just don't understand.
00:17:41.000 I don't understand how we got here where you had one job, you had weeks, months to prep for this.
00:17:45.000 This case happened, you know, a couple weeks ago.
00:17:47.000 It's fresh on everybody's mind.
00:17:48.000 You knew you were probably going to have to reference this and you could not get this person's name right.
00:17:52.000 Let's play it. Let's play it.
00:17:54.000 Lincoln... Lincoln Riley.
00:18:02.000 An innocent young woman who was killed.
00:18:05.000 by an illegal.
00:18:07.000 By an illegal.
00:18:09.000 Did you see Benny Johnson posted the guy asking Biden, the reporter,
00:18:13.000 do you regret using the word illegal to describe him?
00:18:15.000 And Biden's like, well, I probably, uh, I don't regret, uh, I technically am not supposed to be here.
00:18:22.000 I think I did justice to how he said it too.
00:18:25.000 That's exactly what he says.
00:18:26.000 He's gonna filibuster his way out of this by verbal slurs.
00:18:28.000 He wouldn't say he didn't regret it, and he didn't say he regrets it, but he did reinforce that they're not supposed to be here, technically.
00:18:36.000 I don't know why I said technically.
00:18:37.000 Well, what's he gonna do about it?
00:18:39.000 He's the one who gave them parole.
00:18:42.000 At the border, 10 million people have come across, more like 15 million, in the last three years.
00:18:47.000 And this is all on Joe Biden.
00:18:48.000 And I think this is what concerns me about the State of the Union address, where so many Americans just watch that and they're not keeping up with everything and they say, oh, maybe the Republicans are unreasonable.
00:18:58.000 That's maybe the takeaway that they're going to come out of that with, which is scary.
00:19:02.000 So that's why Republicans have to keep dominating this message.
00:19:05.000 And President Trump's been doing a very good job at it because, I mean, this, it's no longer, you know, it's not just the border states.
00:19:10.000 I'm from Arizona and we see it all, you know, directly, but the burglaries that are happening with a lot of the affluent homes, and now you're seeing murders and the rape of an innocent girl who's just going to nursing school.
00:19:23.000 On college campuses, right?
00:19:24.000 That's crazy.
00:19:25.000 On her college campus, she went for a jog.
00:19:27.000 And the presumption is this guy, Jose Ibarra, was trying to rape her.
00:19:32.000 But when she fought back, he bashed her face in to the point where he smashed her skull.
00:19:37.000 And I'm hearing, I don't, I did not, I've been hearing this on Twitter, that she was still alive when they found her.
00:19:42.000 She was unconscious, but she was declared dead at the scene.
00:19:44.000 Wow.
00:19:45.000 There was, there's also a charge of, he's charged with preventing someone from making a 911 call.
00:19:52.000 So there's an idea that perhaps, you know, he was chasing her or whatever.
00:19:55.000 She tried to call the police and he, you know, took her cell phone or something.
00:19:59.000 So it's, I mean, it's a really awful story.
00:20:01.000 And I remember the day that this came up that UGA, UGA had had a suicide on campus a couple days before.
00:20:07.000 So it was a weird headline initially, like there's a second student found dead.
00:20:11.000 What's going on there?
00:20:11.000 And then within 24 hours, the person had been identified and it's, you know, obviously hasn't been tried or anything, but it's believed to be this illegal immigrant who's in the country, who entered across the border in 2022 under the Biden administration.
00:20:25.000 Like, he was known to this government to be in this country illegally.
00:20:29.000 He was also arrested in New York and then he was arrested, obviously, after allegedly murdering Laken Riley in Georgia.
00:20:36.000 So it's something where along the way several different forms of law enforcement failed, right?
00:20:43.000 And Joe Biden doesn't even know her name.
00:20:45.000 That's crazy to me.
00:20:46.000 And he, like, looks at the pen.
00:20:47.000 Lincoln!
00:20:47.000 Lincoln Riley!
00:20:50.000 Wow, there's leadership for you.
00:20:52.000 He cares so much.
00:20:53.000 He really cares.
00:20:54.000 He cares about women, especially here on International Women's History, or it's Women's History Month or whatever.
00:20:59.000 Like, it's so heartless.
00:21:01.000 And again, this is what you see.
00:21:01.000 International Women's Day.
00:21:02.000 Communist holiday.
00:21:03.000 That was like last week, I think.
00:21:05.000 But, you know, the thing is, it's just, it's so the Biden administration to be like, we're the party that cares about you and this other party, they hate you and they hate the future.
00:21:13.000 Now, we don't want to talk about the service members that we put in today.
00:21:17.000 Yeah, it's the communist holiday.
00:21:19.000 Yeah, it was Russian revolutionary communists who started it.
00:21:23.000 And none of the men in the room wish me a happy Women's Day, so I'm offended.
00:21:27.000 You'll never get me there.
00:21:28.000 And you're the most feminist one in this room, I think.
00:21:31.000 I don't want to speak for you.
00:21:32.000 I'm old school feminist, though.
00:21:33.000 Like, second wave, where it's just about equal opportunity.
00:21:36.000 Ugh, it's all horrible.
00:21:37.000 It's where it begins and ends.
00:21:38.000 No, but this is the Biden administration, right?
00:21:40.000 They're like, we really, really care about women.
00:21:42.000 We care about our young people.
00:21:43.000 Please join our military.
00:21:44.000 Also, if you die because of us, we will never talk about you again.
00:21:47.000 I'm anti all-wave feminists.
00:21:51.000 All of them.
00:21:51.000 Anti?
00:21:52.000 Yeah, I oppose all of it.
00:21:53.000 But that means that you buy it.
00:21:55.000 What?
00:21:55.000 If you found that you're anti it, that means you accept it.
00:21:58.000 What does that mean?
00:21:59.000 In order to oppose something, you have to accept its premise.
00:22:02.000 No, you don't.
00:22:02.000 What?
00:22:03.000 Otherwise, it wouldn't bother you at all.
00:22:04.000 You'd be none of it.
00:22:05.000 Wouldn't even be any of it.
00:22:06.000 Opposing a thing that exists that has resulted in women getting civic privileges without civic responsibility is... Sure.
00:22:16.000 Like, it exists.
00:22:16.000 What does that mean?
00:22:17.000 Concepts?
00:22:18.000 I acknowledge that the thing is a thing.
00:22:19.000 I think if you're, if a concept, if you become anti a concept, you're actually empowering the concept.
00:22:25.000 You're better to focus on things that you like that are different than that thing you want to not be.
00:22:31.000 There's like Ian sitting in a room and he's on fire.
00:22:34.000 And he's like, as long as I don't say I'm on fire, I'll be fine.
00:22:36.000 I'm talking about concepts, like feminism, the ideas.
00:22:38.000 Right, so what happened with feminism was, the initial argument at the turn of the century was civic responsibility in exchange for equal civic access, and the women said no, and then a bunch of weak men said, how about we give women all of the privileges and other responsibilities, and they went, you got it, I oppose that.
00:22:57.000 I think women should have the right to vote, I think women should work, but along with civic privileges and access comes civic responsibility.
00:23:03.000 Meaning, so long as men have to enlist for the draft, women should have the choice.
00:23:08.000 You want access to the vote, you want jobs, you want all that stuff?
00:23:10.000 You gotta do the exact same thing as everybody else, otherwise... That's why I'm like, you know what, fine, screw it.
00:23:15.000 Bring on the Equal Rights Amendment.
00:23:17.000 Let's make it enshrined in the Constitution that there cannot be legal distinction between males and females, and then every single woman... This is why they don't pass the ERA, by the way, because it would mean that women have to sign up for the draft, and they don't want to do it.
00:23:28.000 For sure.
00:23:29.000 I don't want to be drafted at all.
00:23:30.000 I also don't like the ERA because, and this is a Phyllis Schlafly argument that I covered last year when I was writing profiles for Women's History Month.
00:23:37.000 As she pointed out, it doesn't make men and women equal.
00:23:40.000 It abolishes the concept of the differences between the genders.
00:23:43.000 I don't want to live in a genderless world.
00:23:44.000 I think genders are different.
00:23:46.000 We see this from the minute babies are born.
00:23:48.000 They show preferences for different things from less than 24 hours old.
00:23:52.000 So why would we pretend otherwise?
00:23:54.000 Genders are good.
00:23:55.000 They can be complementary, but they definitely exist.
00:23:59.000 Happy International Women's Day to you!
00:24:02.000 Yeah.
00:24:02.000 So, I don't see... What would your argument for feminism be, Ian?
00:24:05.000 Do you think that it's okay for one class of people to be given privileges with no responsibility?
00:24:11.000 Or would you agree with me that women should bear equal responsibility to their nation in exchange for equal access and privileges?
00:24:16.000 Well, equal rights doesn't mean you have to do the same thing as the other person.
00:24:20.000 So, like, the right to vote doesn't mean that we all have to go fight in war, I don't think.
00:24:25.000 But civic responsibility, yeah, like the women during the World War II... We can agree there.
00:24:28.000 So then, should men no longer have to enlist in the Selective Service?
00:24:32.000 Sign up for it?
00:24:33.000 No.
00:24:33.000 They should?
00:24:34.000 I don't like drafts, but no, I don't think that they should stop.
00:24:37.000 So you would agree then that there is a problem with feminism thus far, and the remedy would either be women must sign up for the draft, or men must have no responsibility, no longer have to sign up for the draft.
00:24:47.000 No, I don't think military draft is a delineation for that, for men and women and feminism.
00:24:53.000 Like, I don't think women should have to get drafted into the military.
00:24:56.000 What's the equivalent to getting drafted into the military?
00:24:58.000 The equivalent?
00:24:58.000 Like you were saying, you don't have to do the exact same thing.
00:25:01.000 Getting impregnated and going through nine months of hell with carrying that thing around?
00:25:04.000 Yeah, I don't know.
00:25:05.000 That's a really weird way to describe what many people call heavenly in beauty.
00:25:09.000 The amount of pain that you suffer giving birth, I would imagine that that's about as traumatic for the female.
00:25:14.000 But also it's also awesome apparently.
00:25:16.000 So you're arguing that every woman at 18 should be forced impregnated by the government?
00:25:20.000 No, I did not argue that.
00:25:21.000 Okay, well then what's your point?
00:25:23.000 Women only get their right to vote when they have a baby.
00:25:25.000 Which I think is an interesting try, too!
00:25:28.000 Hey, alright, let's go Ian!
00:25:29.000 Ian is like bumping up the birth rate over here, I love it!
00:25:34.000 You either have to join the military or get pregnant, and men can't get pregnant, so that's only on women.
00:25:37.000 You mean like, what's a good version of compulsory service that a woman could experience other than military?
00:25:41.000 Like, you were saying they don't have to do the same thing to get the right to vote, they should all have a responsibility to their country, and I'm just wondering, like, what the equivalent to entering the draft would be for women.
00:25:49.000 And if you're saying you have a baby, I mean, maybe we should talk about this.
00:25:53.000 Why do you think it's acceptable that in our country, the government would say, we grant extra privileges to one class of people based on their biological sex?
00:26:03.000 That seems antithetical to what the actual argument of feminism is.
00:26:07.000 So the reason why I say I oppose all of the waves of feminism is because, in terms of a philosophy, on the surface, the motte-and-bailey would be, oh, we just want equality for women.
00:26:17.000 And then I respond with, okay, so women should be drafted.
00:26:20.000 No, no.
00:26:20.000 And then women recoil instantly and say no to that.
00:26:22.000 See, feminism in practice is not equality.
00:26:26.000 It's privileges.
00:26:27.000 Of course, only an insane person would argue for a true equality.
00:26:33.000 Everyone's going to argue for privileges all the time.
00:26:39.000 Why would you say you are in favor of the government creating special classes of people who get extra benefits without responsibility?
00:26:46.000 Well, I don't think that they should have ever not had the right to vote.
00:26:50.000 Maybe, I don't know if ever is the right word, but that they didn't have the right to vote in 1840 is insane.
00:26:54.000 So the issue going back... We go back in time.
00:26:57.000 The issue with voting actually is really simple.
00:26:59.000 Do you live here?
00:27:00.000 Yes or no?
00:27:01.000 You don't?
00:27:01.000 Okay, well then you can't vote on what we're doing.
00:27:04.000 When the towns are very small, it's like a hundred people, you're voting on like, should we all come together and put a new road here?
00:27:10.000 Do you live here?
00:27:11.000 You do?
00:27:11.000 Okay, what do you think?
00:27:12.000 You don't live here?
00:27:13.000 Well then you have... Why are you telling us what we should do with our road?
00:27:16.000 And so then it comes to the point of, should women vote?
00:27:18.000 Well, it's like, well, the women aren't the ones who are building the roads and going and fighting to defend them, so it's just the guys are going to decide whether or not they want a road where they're going to be working.
00:27:25.000 And then we come to this change with the Industrial Revolution and we're like, you know...
00:27:29.000 Women don't, we're no longer in an era where one, people are all landowners.
00:27:35.000 Some people are renters now.
00:27:36.000 Some people live with someone else and they should have a say in their community because they do live there and they've lived there for their whole lives.
00:27:40.000 And not every woman is with a man and some women are working.
00:27:43.000 So we recognize, okay, women should be allowed to vote.
00:27:46.000 I completely agree.
00:27:47.000 Women should be allowed to work.
00:27:47.000 Completely agree.
00:27:48.000 And with that comes the same responsibilities of any other human being in civilization.
00:27:53.000 Fire brigade, law enforcement, military, compulsory military service.
00:27:59.000 But instead of actually... and that was the argument initially.
00:28:02.000 The initial argument with suffrage was, sure, one can have the right to vote.
00:28:05.000 And they'll get drafted.
00:28:06.000 And then the anti-suffragettes were like, we don't want to be in the fire brigade, and we do not want to go fight in wars, so we will happily sit back.
00:28:14.000 And then a bunch of weak men were like, hey, if we tell the women we'll give them the right to vote, they'll vote for us.
00:28:20.000 There you go.
00:28:20.000 Now you have women in this country, don't have to be drafted, but get all of the privileges with, you know, being a full-fledged member.
00:28:28.000 I think that's wrong.
00:28:29.000 I think that's anti-feminist.
00:28:30.000 I think that's opposing the general ideology of equality.
00:28:36.000 Um, so you prefer to draft women and... Women can work in many different ways in the military.
00:28:44.000 You know, I'm not even saying combat.
00:28:46.000 Pregnant women?
00:28:47.000 Of course they can.
00:28:48.000 What about women with small children at home?
00:28:49.000 Yes, they can.
00:28:50.000 You would draft them?
00:28:51.000 100%!
00:28:51.000 What would the kids do?
00:28:53.000 Where would the kids go?
00:28:54.000 So, you know that there are people in the military right now who have kids.
00:28:57.000 Where do the kids go?
00:28:58.000 Daycare.
00:28:58.000 So you'd send families, children to daycare?
00:29:00.000 Who'd pay for the daycare?
00:29:02.000 The government, the military.
00:29:03.000 So you're gonna expand government pay services to draft women into service and send their kids up to daycare?
00:29:08.000 When the draft is called upon, because of a legitimate threat to this country, because my ideology is not predicated upon corrupt people sending people to foreign wars for profit, that's BS, I'm talking about- What they did in Vietnam.
00:29:19.000 Sure, and we're talking about turn of the century, we're talking about World War I, World War II, which you can arguably say we should not have been involved in.
00:29:25.000 But if you were to conscript women, it could be like, we need you to work in a factory producing materials and refining... While your kids go off to daycare?
00:29:32.000 Welcome to war, my friend!
00:29:34.000 When bombs are being dropped on your houses...
00:29:37.000 Then by all means, I am not going to sit here and listen to someone tell me that I have to die for them and they do not have the same responsibility.
00:29:46.000 Welcome to being a man, dude.
00:29:47.000 You protect your women and children.
00:29:48.000 That's how it is.
00:29:49.000 And guess what?
00:29:50.000 The people who don't go to war don't tell me what to do.
00:29:53.000 How about this?
00:29:54.000 How many members of Congress have proposed a bill that if you want to fund a war, you have to go fight in it?
00:29:59.000 I love that idea, personally.
00:30:01.000 Absolutely.
00:30:01.000 So why am I going to say, you don't have to go fight in the war, but you can certainly tell me how I have to fight and when I have to die.
00:30:06.000 Well, you're just saying how pro-draft you are.
00:30:09.000 But now you're saying you're anti-draft.
00:30:10.000 No, no, no.
00:30:11.000 I'm saying, why would someone who is not in the military get the right to tell me to die for them?
00:30:18.000 By force.
00:30:19.000 By compulsion of the US law enforcement coming and threatening to take away my freedom.
00:30:24.000 And there's a whole class of people, half the country, more than half, who get to vote to send me to die.
00:30:29.000 And they have no responsibility.
00:30:30.000 Now, of course, the responsibility was they're going to have families and raise kids, but that's not what's happening.
00:30:35.000 So there is an imbalance in equality.
00:30:37.000 Anyway, we can move on to a different subject.
00:30:40.000 Are you going to draft women?
00:30:41.000 Do you want to talk about this?
00:30:42.000 Well, I think we can, you know, if you look at what's happening right now, women are under attack.
00:30:47.000 But ironically, it's by the liberal Marxists who are transforming, you know, if women try to compete in sports, they have to compete with men.
00:30:54.000 You know, it's very bizarre, this idea that women and men have the same biology so that they can compete in the same sports.
00:31:03.000 But I'm looking, I'm really, if you see what's going on, I just saw Riley Gaines interview too, it's so tragic that the feminist movement that you're talking about, how it's been transformed, I think it's actually going against women.
00:31:16.000 Oh it is.
00:31:17.000 It's erasing feminism.
00:31:18.000 Right.
00:31:19.000 And so what purpose is it serving right now?
00:31:23.000 So I feel as if women right now are being under attack by these liberal Marxists who are trying to basically make everybody equal.
00:31:31.000 And now they're actually propping up this other class of transgenders who are now competing in sports.
00:31:36.000 So, you know, I've never talked more about transgenders ever in my life except the last three years.
00:31:41.000 And I think that's purposeful.
00:31:42.000 The Marxists are trying to win this this war on our minds right now.
00:31:47.000 They're trying to erase the gender, what gender actually is.
00:31:51.000 Let's talk about the alternative.
00:31:54.000 The alternative to Joe Biden that we have coming up in November.
00:31:58.000 And I give you this.
00:31:59.000 Yashar Ali says Trump just posted this video on his official Instagram.
00:32:04.000 My friends, you have to watch this video.
00:32:07.000 We're going to buy America.
00:32:09.000 We're going to buy America.
00:32:13.000 So trade rules.
00:32:15.000 Buy America has been the law since 1933.
00:32:17.000 It's also caps and won't even go into effect until 2025.
00:32:21.000 And by the way, that law was written and the benefit expires in 2025.
00:32:29.000 New electric grids that are able to weather major storms and not prevent- The Chihuahua.
00:32:34.000 So for those that are just listening, you have no idea what's going on, but Trump posted this video to his Instagram of Snapchat filters.
00:32:40.000 It is an old State of the Union.
00:32:42.000 It's from, I think it's from last year.
00:32:43.000 You can see Kevin McCarthy sitting there, but it's absolutely hilarious.
00:32:47.000 And Trump posted this on his Instagram.
00:32:49.000 And there's like one clip where Joe Biden's head is a Chihuahua, I guess.
00:32:52.000 Is that what that is?
00:32:53.000 An aged Chihuahua?
00:32:55.000 Chihuahua-esque figure.
00:32:56.000 I'm not really sure.
00:32:57.000 Yeah.
00:32:58.000 Yeah, I don't even know.
00:32:59.000 I don't know if that's a joke.
00:33:00.000 I think it is.
00:33:02.000 But this is your alternative.
00:33:04.000 And I am for it.
00:33:05.000 And we'll be supporting this man throughout the year for his reelection.
00:33:11.000 And I think he's actually reaching a different audience, right?
00:33:14.000 So many political consultants would say, you know, you can't do that type of stuff.
00:33:17.000 But he's actually going after the younger voters who can relate to this and making politics.
00:33:22.000 Politics has become cultural right now.
00:33:24.000 And President Trump understands that.
00:33:26.000 He's doing a really effective job at reaching that audience.
00:33:28.000 So, you know, I think it's hilarious and I think a lot of Americans do too.
00:33:32.000 What is this face?
00:33:33.000 Do we know where this video came from?
00:33:37.000 Like, did somebody else share it?
00:33:39.000 Someone else must have made it.
00:33:40.000 The narrative that I have invented for myself based on nothing is that Barron is the one who did this and then sent it to his dad.
00:33:46.000 That would be so funny.
00:33:47.000 I was thinking that it felt like Don is just like, Grandpa figured out Snapchat filters, here we go, and he's like doing five in a row and you're like, alright, I get it.
00:33:55.000 But it's an old video.
00:33:56.000 It's from last year.
00:33:56.000 Yeah.
00:33:57.000 And it could have been someone else's video.
00:33:58.000 Look, I put it up on YouTube and they're just like, look, this is so funny.
00:34:01.000 Do you guys think that the cultural, like you were just saying that politics has become cultural, that it's either that that's an emergent thing, just it's just part of the flow of nature because of internet, or is it specifically being done on purpose?
00:34:14.000 I'm not sure, but I actually think it's positive in some ways.
00:34:18.000 I mean, so many people are engaging in politics, whether they're on the left or on the right.
00:34:22.000 You're seeing people like AOC who kind of, you know, rose from that type of, you know, populist on the left side of things.
00:34:29.000 And you see now the same on the right.
00:34:30.000 So I see it as good because there needs to be strong debate as long as you're actually allowed to debate.
00:34:36.000 And that's what's scary that we're facing is that these people are attacking free speech and the so-called defenders of democracy are destroying our election.
00:34:45.000 Yeah, culture and politics, I think it's only going to intertwine even more so.
00:34:49.000 You're starting to see celebrities run for politics like never before and athletes as well.
00:34:54.000 So it's just become a part of daily life.
00:34:56.000 You can't escape it.
00:34:57.000 I mean, everybody talks about President Trump or President Biden, no matter where you go, whether you're even paying somewhat attention to politics.
00:35:04.000 It's inescapable, unlike how it used to be just 15 years ago.
00:35:08.000 Yeah, I think, I think the vision, the ideal for the founders is that it's like part of not everyone's daily life, but that it may be that eventually it will come to the point where like, it's just, it's just a natural part of your life.
00:35:18.000 Like you make lunch, you involve yourself with your local politics, you make dinner, you spend time with your family, you keep an eye on what's going on, like stay politically.
00:35:26.000 I think that used to be the case when we were more civically minded as a nation, right?
00:35:31.000 Like right now it feels like it's all about capital-P politics that happen on a federal level, that happen because it's an election year.
00:35:38.000 But when people were more involved with just like local groups, they volunteered more outside the homes, they were more involved in their religious communities, right?
00:35:48.000 Like they had an impact on the community that wasn't necessarily affiliated with the political
00:35:53.000 party, although it had effectively a political influence on their community, right?
00:35:56.000 They had values, they wanted to see them carried out, they maybe went to town meetings to ask
00:36:02.000 for certain things, or they organized together to accomplish certain tasks, but we just drifted
00:36:06.000 away from that as an American culture.
00:36:08.000 And there are a couple different reasons for that, but predominantly one of the reasons
00:36:12.000 is because we're deeply online now.
00:36:14.000 We're not actually spending time, you know, going to school board meetings.
00:36:17.000 We are, you know, much more likely to be reading about, again, national level politics as opposed to what's going on in our communities.
00:36:23.000 But you lose that human connection, right?
00:36:25.000 But I think you're exactly right when you talk about church attendances, you know, record lows.
00:36:30.000 And that's where people could have an impact on their community, but now they're seeing the impact is online and, you know, voicing their opinion on social media.
00:36:38.000 I mean, this is one of the things I don't want to necessarily jump back into the same argument we had before, but when you're talking about, like, women having different civic responsibilities than men, you know, when more women were at home, they were largely responsible for essentially the volunteer and philanthropic efforts of their communities.
00:36:57.000 I think Indiana University has a whole study, I think it's their College of Philanthropy, I can't remember what it's called now, but they've done a couple studies on this where women are typically the ones who make the decisions about how families spend their money charitably, like where they give money to. And again, it's reflective of this part of culture that we sort of lost when we sent women into the workforce, right? Like, no one person can really accomplish all the things you need to do to run a household. We know, I mean, any working adult knows this, like there's just always something you have to take care of. And that was one of the reasons why a married family unit, with a man and woman who have different
00:37:33.000 interests but also different responsibilities, works so well.
00:37:36.000 You can have a bigger impact on your community because it doesn't just fall on the shoulders of the individual, you work as a team.
00:37:41.000 And to a certain extent, when we gave up the normality of having women stay at home, we sort of lost that aspect of culture.
00:37:53.000 Yes.
00:37:54.000 Thank you so much for agreeing with me on today of all days, International Women's Day.
00:37:57.000 That's right.
00:37:58.000 And only because it's International Women's Day.
00:38:00.000 Yeah, Tim doesn't agree with me any other time.
00:38:03.000 Well, I'll keep filibustering here until the show's over.
00:38:05.000 No, I'm just kidding.
00:38:07.000 I really think that this is the most interesting part of culture, which is that the family unit determines so much about who you become as an individual and what happens to your community.
00:38:16.000 And so when we don't have... I'm not even kidding when you were like, hey, women should get the right to vote when they have a baby.
00:38:22.000 I'm like, that doesn't sound half bad.
00:38:24.000 I mean, it's sort of complicated.
00:38:25.000 Ian, that's a 20 right there for me.
00:38:27.000 And I didn't even roll it.
00:38:28.000 That's the best part.
00:38:30.000 It was a passive point.
00:38:31.000 Last night when Biden was saying, you know, we're going to encourage kids to, we're going to roll out additional preschool vouchers, essentially.
00:38:40.000 Like, we're going to subsidize preschool even further, and we really want to make sure kids can read by the third grade.
00:38:44.000 And Tim and I looked at each other like, third grade?
00:38:46.000 That's late.
00:38:47.000 Like, what are we doing here?
00:38:49.000 Yeah, kids should be reading, I think it's like first grade?
00:38:52.000 Is it average?
00:38:53.000 Yeah.
00:38:54.000 But don't you learn how to read in kindergarten and before that?
00:38:56.000 I think it's changed since COVID.
00:38:56.000 I don't know.
00:38:59.000 The education system is totally collapsed.
00:39:01.000 I mean, look, I was homeschooled before I started grade school, so I'm pretty sure I could read well before I started kindergarten.
00:39:07.000 We had The Letter People.
00:39:08.000 Did you guys ever watch that show?
00:39:10.000 Every day, every week, we would have a different letter person would come in.
00:39:13.000 We'd learn the letters.
00:39:13.000 That was in kindergarten.
00:39:14.000 Well, Biden, when he was talking about it, bring it back.
00:39:17.000 They bring these like blow up dolls of like these letters.
00:39:20.000 The letter J and it's the Mr. J for J is for Jam.
00:39:24.000 We should look up the letter people.
00:39:25.000 It's a great show.
00:39:28.000 No, but one of the things Biden said during his speech was a study show, and I've seen the study before, that kids who are read to at home enter kindergarten speaking, you know, a million more, saying a million more letters than what is happening.
00:39:40.000 This is it?
00:39:41.000 Yeah, this is what we got to show on Saturday morning.
00:39:43.000 Probably.
00:39:43.000 I don't think so.
00:39:43.000 Isn't this the Sesame Street dude?
00:39:45.000 And I just happen to have a nice one right... Probably. Sounds like Mr. J. Jumbled junk jar.
00:39:54.000 Jar starts with the same sound that starts jumbled junk.
00:39:58.000 Oh, that sound makes me want to jump for joy!
00:40:01.000 It's a jolly sound, all right.
00:40:04.000 This is why Ian's so messed up, isn't it?
00:40:06.000 By messed up, you mean absolutely brilliant.
00:40:09.000 Yeah, man.
00:40:10.000 We wouldn't really watch these shows as much as she would bring out, like, the J, it was like a guy, and she would set him down, and then we'd learn all about the letter J for the day, and it was pretty cool.
00:40:20.000 Oh, so I must not have known how to read before that if I was learning the letters.
00:40:23.000 What grade was that?
00:40:24.000 Kindergarten.
00:40:25.000 Hmm, we had every letter capital and lowercase on the top of the room.
00:40:31.000 And yeah, I don't know.
00:40:33.000 That's helpful.
00:40:34.000 In order, and you just stare at them all in order.
00:40:35.000 And cursive.
00:40:36.000 I think actually, we had D'Nealian.
00:40:39.000 Do you guys know what that is?
00:40:39.000 No.
00:40:40.000 I don't even know what that is, but I learned it.
00:40:41.000 I don't know what that is.
00:40:42.000 It was a way to write letters.
00:40:42.000 What is it?
00:42:44.000 Yeah, it was like a mix between cursive and straight printing.
00:40:47.000 I learned it in California high school as well, or California like high school, and kindergarten there as well.
00:40:51.000 What?
00:42:52.000 D'Nealian.
00:42:53.000 It's a real thing.
00:42:54.000 Yeah.
00:42:54.000 D'Nealian.
00:42:55.000 It's D'Nealian.
00:40:55.000 There you go.
00:40:57.000 That's what we learned.
00:40:59.000 Based on Latin, which was developed by Donald Thurber.
00:41:04.000 I don't even know.
00:41:05.000 I don't understand.
00:41:05.000 Like, aren't these just like the same thing?
00:41:08.000 I'm not really sure, but the... Yeah, we did this.
00:41:11.000 Oh, I know those D'Nealian letters.
00:41:13.000 I guess it's how you learn cursive.
00:41:14.000 It's a precursor to cursive?
00:41:16.000 Yeah, it's D'Nealian cursive writing.
00:41:17.000 Is that what it is?
00:41:18.000 I don't know.
00:41:19.000 Well, that's fun.
00:41:20.000 It's just printing at an angle almost.
00:41:22.000 Do kids learn cursive today?
00:41:24.000 No.
00:41:24.000 Oh, really?
00:41:28.000 My younger sisters are younger than me and I know that they're not learning cursive in school.
00:41:31.000 Do they learn how to write?
00:41:32.000 They can write, but it is interesting how many kids are online all the time.
00:41:36.000 Or like, you know, post- When school started going back after the pandemic, a lot of them came home with like iPads or whatever.
00:41:44.000 And it was like, there are no more snow days because either we're planning to be out because of a COVID lockdown or if we get snow.
00:41:50.000 And either way, you just do work on your tablet, which really feels like it's ruining a part of culture.
00:41:55.000 I will say this, you know, stats are 1 through 20, right?
00:41:59.000 Yeah.
00:42:00.000 My writing is a 2.
00:42:01.000 A D and D?
00:42:02.000 Yeah.
00:42:03.000 3 through 18, really.
00:42:04.000 So like I can type real fast and real well.
00:42:07.000 And it was really funny because I've never been good at writing.
00:42:11.000 Like I can make, I can write, but it looks like the SpongeBob meme.
00:42:16.000 Oh, yeah.
00:42:16.000 Where, like, capitals and lowercase, and I don't even realize I'm doing it.
00:42:19.000 I don't know why, because I just- just doesn't matter to me.
00:42:22.000 Capital, lowercase, whatever.
00:42:23.000 So, I- I wonder what that is.
00:42:25.000 You could get, like, a handwriting expert to give you, like, a psychological breakdown based on your handwriting.
00:42:30.000 Be like, well, you must feel this way about this aspect of your life if your B's are written like that.
00:42:34.000 I just put random capital letters in lowercase.
00:42:35.000 That's awesome.
00:42:36.000 Because it must mean something.
00:42:38.000 I guess.
00:42:38.000 Some sort of emphasis or something?
00:42:40.000 When I type, I type just fine.
00:42:42.000 I saw some video of a teacher saying she had to practice writing differently when she was writing on the whiteboard for her elementary school students, because on her own she makes all of the letters the same size, whether they're capital or lowercase, but you have to distinguish them for young students who are learning the difference.
00:42:59.000 That's this right here, so what we did, see the dotted line in the middle?
00:43:03.000 That's how you did.
00:43:04.000 The lowercase letters are underneath it and the uppercase go over it.
00:43:07.000 So they're not teaching this in school anymore?
00:43:10.000 I don't know.
00:43:11.000 We need to ask a teacher.
00:43:11.000 I don't know that they ever actually taught D'Nealian in like... No, not for me.
00:43:15.000 Yeah, not common at all.
00:43:16.000 I hated cursive.
00:43:18.000 They made us write cursive and then we would do a third, fourth grade and they're like, you have to write this stuff in cursive.
00:43:22.000 So one day I just printed it instead and they didn't say anything.
00:43:26.000 They didn't give me an F, they didn't, they took it, they read it, so I was like, wow, they tell me you have to write in cursive, but I guess you don't.
00:43:32.000 Yeah, but those girls who stuck with cursive are now making bank, like addressing envelopes for weddings and stuff like that.
00:43:37.000 They've got beautiful handwriting, they have a whole Etsy business.
00:43:40.000 Yeah, no, but like, how do kids, what are, kids are writing, but they're not doing cursive.
00:43:44.000 How are they gonna get signatures?
00:43:45.000 Are they just gonna scribble?
00:43:47.000 I don't even think they're writing that much anymore.
00:43:48.000 I think it's really gone to iPads and computers.
00:43:51.000 So you're probably going to see the penmanship go way down.
00:43:54.000 And I'm like you.
00:43:55.000 My handwriting is... I'm a lawyer.
00:43:57.000 It was awful.
00:43:57.000 I could never even read my own handwriting.
00:43:59.000 So I worry about this next generation.
00:44:01.000 But this is the thing.
00:44:02.000 Look, we are... How old are you, Abe?
00:44:06.000 32.
00:44:06.000 32.
00:44:06.000 And that would make Ian the oldest.
00:44:08.000 I'd be second oldest.
00:44:09.000 And then who's older between the two of you?
00:44:10.000 Serge is older?
00:44:11.000 Yeah.
00:44:13.000 All of us, even, you know, Hannah Clare, you know this because you have younger siblings, is that what you're saying?
00:44:19.000 But I didn't even realize kids aren't learning to write the same way.
00:44:23.000 Imagine what happens.
00:44:23.000 So we write all this sci-fi dystopian stuff, and we're like, oh, the AI, people are going to plug in, and they're going to be carrots, and they're going to do weird things.
00:44:30.000 It's worse than that.
00:44:31.000 There's going to be an EMP, and all of, like, Gen Z and under, Gen Z will be the last writing generation.
00:44:37.000 So you're gonna have a bunch of 17-year-olds being like, how do I share my thoughts?
00:44:41.000 It's like, write it down, here's a pen and paper, like, they can't do it. Think about it: the world's gonna end, there's gonna be an EMP, Gen Z will be in their 50s, and they'll be like, the lost art form of writing with a pen, no one does it. Or the one that gets me is memorizing phone numbers. Like, I remember memorizing people's phone numbers because I didn't have a phone, and also, like, if you're wherever, you need your friends' phone numbers, your parents' phone number, whatever, like...
00:45:09.000 Lots of kids get cell phones so early that they, and you think adults too, like I don't know that I could, I couldn't tell you anyone in this room's phone number.
00:45:16.000 I don't think I'd tell you my best friend's phone number because their name is in my phone.
00:45:20.000 But I can tell you my home phone number from when I was a kid.
00:45:24.000 Yeah, me too.
00:45:24.000 I can tell you my best friend's phone numbers from when I was a kid.
00:45:30.000 I can remember probably like a decent amount of them actually.
00:45:33.000 Yeah.
00:45:34.000 Back before area codes.
00:45:36.000 Did you guys have area codes the whole time?
00:45:37.000 Yeah, but we had 312 for a while and then they introduced 773 when I was real little.
00:45:42.000 We didn't have area codes.
00:45:43.000 We just had seven digit phone numbers.
00:45:45.000 Until 1992.
00:45:45.000 The area code was given.
00:45:46.000 I think that's an area code, I don't know the number.
00:45:49.000 It was only 330.
00:45:51.000 Right, so Ian, what this means is when you are in an area code,
00:45:54.000 you don't need to add the area code.
00:45:56.000 Correct.
00:45:57.000 It was before digital phones and all that.
00:45:59.000 No, no, no, no.
00:45:59.000 So before, in the 90s, in the early 90s, I believe, Chicago had just 312, and then they introduced a second one, 773, and then all of a sudden, now we had to know if we were 312 or 773.
00:46:11.000 The suburbs were 708, and the western suburbs, I think, northwest suburbs, actually, I think, it's 847, but I think that might go down to a little southwest as well.
00:46:22.000 Did you have to dial them in, or was it just given?
00:46:26.000 We didn't dial area codes in the beginning.
00:46:28.000 Yes, you did.
00:46:28.000 No, we didn't at our house.
00:46:29.000 We would just dial 923-1747.
00:46:31.000 That means you were in the area code, Ian.
00:46:33.000 Yeah.
00:46:33.000 Or you'd have to dial long distance to get out of the area.
00:46:35.000 If your phone number is 773-123-4567, and you're calling someone else in 773, you don't need to put an area code.
00:46:39.000 Correct.
00:46:42.000 When they add an area code splitting you into two different places, you now need the area code for the other region.
00:46:47.000 But now, in that same area, if I call the next door neighbor, I have to dial their area code to get to their house.
00:46:52.000 Because that's just the way phones are now.
00:46:54.000 You have ten digits instead of seven.
00:46:56.000 I don't think that's correct.
00:46:57.000 You don't.
00:46:58.000 If they have the same area code as you, you don't have to dial.
00:47:00.000 Right.
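The dialing rule being described here can be sketched as a toy function. This is purely illustrative, not any real telecom specification; real numbering plans vary, and many overlay regions now require 10-digit dialing even for local calls.

```python
# Toy sketch of old-style 7-digit local dialing, as described above.
# Hypothetical rule: same area code -> local 7 digits; different area
# code -> long-distance prefix "1" plus the full 10-digit number.

def digits_to_dial(caller_area: str, callee_area: str, local_number: str) -> str:
    """Return the digit string a caller would dial under 7-digit local dialing."""
    if caller_area == callee_area:
        # Same area code: the seven-digit local number is enough.
        return local_number
    # Different area code: dial 1 + area code + local number.
    return "1" + callee_area + local_number

print(digits_to_dial("773", "773", "9231747"))  # -> 9231747
print(digits_to_dial("773", "312", "9231747"))  # -> 13129231747
```

When a city like Chicago was split into 312 and 773, the second branch suddenly applied to calls that used to be local, which is exactly the confusion the hosts are recalling.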
00:47:01.000 I've never tried it with my phone.
00:47:03.000 I've got a, on my cell phone, I've got a three, I'll tell you my area code, 323.
00:47:07.000 It's an L.A. area code.
00:47:08.000 Are you saying that if I call other people in L.A. with 323 area codes, I just only need to enter seven numbers?
00:47:12.000 Yes, correct.
00:47:12.000 Really?
00:47:13.000 I've never tried that before.
00:47:14.000 That's how it's always been.
00:47:15.000 And I'm pretty sure that's true because there are still parts of this country that do not include the area codes on their stores.
00:47:20.000 And like area codes are weird now because I got this 323 number while I was living in like New York.
00:47:24.000 I just got an L.A. area code because I like it.
00:47:26.000 Because you can choose whatever one you want.
00:47:27.000 Yeah.
00:47:27.000 And there are people who intentionally get area codes to seem important.
00:47:31.000 Yeah.
00:47:31.000 So they'll go in the back.
00:47:32.000 I want a Beverly Hills, California.
00:47:34.000 And they'll say, OK, what are L.A.s?
00:47:36.000 We got 818.
00:47:37.000 That's like Long Beach.
00:47:39.000 Right.
00:47:39.000 323.
00:47:39.000 323. 323.
00:47:41.000 213.
00:47:42.000 Yep.
00:47:43.000 Is there a seven?
00:47:43.000 How do you guys know this?
00:47:44.000 I don't even know any phone numbers.
00:47:45.000 I live in LA.
00:47:46.000 I live in LA.
00:47:46.000 A decade of LA.
00:47:48.000 I have a, I actually, I'm not going to, I actually, I'll say this.
00:47:52.000 I'm not going to give up my area code because I don't want people to find out, but it's a, it's a rural middle of nowhere area code on purpose.
00:47:58.000 Oh, nice.
00:47:59.000 Yeah, I got mine as kind of a vanity area code.
00:48:01.000 I was like, I want to have some LA in me for the rest of my life.
00:48:04.000 When I was getting my phones, they were like, and they wanted the area code here.
00:48:08.000 And I was like, no, no, no.
00:48:09.000 And then I was like, I pulled up maps and I looked and I was like, ooh.
00:48:12.000 Middle of nowhere.
00:48:14.000 Anyway, what you're saying, I used to memorize phone numbers too.
00:48:17.000 Memorization.
00:48:17.000 So what's going to happen with neural net?
00:48:19.000 Are people going to, when they read now on the internet, like we still open up a computer screen, we read words.
00:48:23.000 We have to learn the words and the letters and read.
00:48:25.000 But are you just going to get the information and you're not actually going to have to read it?
00:48:30.000 You're just going to learn what the meaning is?
00:48:32.000 But doesn't that take something out of it if it just gets put in your head versus having to critically analyze it and memorize it?
00:48:38.000 Yeah, we're becoming the lord.
00:48:39.000 It takes language out of it.
00:48:41.000 It's a new form of language.
00:48:42.000 If there's no letters, then there would be no reading, and then people literally would not have to learn how to read in order to receive data.
00:48:49.000 And then if the power goes out, we got a bunch of illiterate hominids.
00:48:53.000 Well, it's like Idiocracy.
00:48:54.000 Have you guys seen that movie?
00:48:55.000 It's basically a documentary that we're... No, I think Idiocracy got it way wrong.
00:49:00.000 Why is that?
00:49:01.000 Um, because... Not everyone wears Crocs now.
00:49:04.000 Yeah, because what's happening right now is that liberals are aborting their kids and sterilizing their kids, and so if you were... So the premise of the film is the stupidest people tend to reproduce the most and the smart people don't, but that ignores political ideologies.
00:49:21.000 And so, even at the time when Mike Judge made Idiocracy, you could have calculated that either Islam or Christianity would dominate within 500 years because it is part of their fundamental religious beliefs to have children and to proselytize.
00:49:35.000 So, I think Idiocracy doesn't work as a... It makes sense to liberals because they live in a bubble.
00:49:43.000 And so, when I was younger, I was like, wow, that's really funny.
00:49:45.000 Now that I'm older and I'm watching what's going on, I'm like, oh, liberals are self-destructive.
00:49:49.000 Conservatives have more kids.
00:49:51.000 And this data was out in studies in the 2000s.
00:49:56.000 So you easily could have made a movie where it's called, like, Christiocracy, and it's a guy who's in the military who gets frozen when he comes back out.
00:50:03.000 The country is the 1950s all over again.
00:50:06.000 And it's like there was a period of tumult where a bunch of weirdo liberal lunatics were doing crazy things, but they all sterilized themselves and had wild sex parties.
00:50:14.000 And then after 20 years, they were gone.
00:50:17.000 That's it.
00:50:19.000 It's possible.
00:50:20.000 That's my prediction for the next 500 years.
00:50:21.000 I'd love to see that movie.
00:50:22.000 Let's jump to this next story, actually, because this brings us together.
00:50:25.000 From the post-millennial, Biden calls to ban AI voice impersonations in State of the Union after getting humiliated by poso pre-creation memes.
00:50:35.000 Is that what did it, Jack?
00:50:37.000 Jack, write that story.
00:50:40.000 I'll play this one.
00:50:41.000 This is a fake video.
00:50:42.000 The illegal Russian offensive has been swift, callous, and brutal.
00:50:47.000 It's barbaric.
00:50:48.000 Putin's illegal occupation of Kiev and the impending Chinese blockade of Taiwan has created a two-front national security crisis that requires more troops than the volunteer military can supply.
00:50:59.000 I have received guidance from General Milley, Chairman of the Joint Chiefs, that the recommended way forward will be to invoke the Selective Service Act, as is my authority as President.
00:51:10.000 So that, that is something made by Jack, and it's clearly not real.
00:51:13.000 But I do think it's funny that we ended up, during the State of the Union, with Biden making a call to ban AI voice impersonations.
00:51:19.000 It came out of nowhere.
00:51:21.000 But you can't do anything about it.
00:51:23.000 I suppose you can make it illegal.
00:51:26.000 The problem is, how do you differentiate intent?
00:51:31.000 I mean, we've done this on the show several times with AI voice impersonation, but you can blend voices.
00:51:37.000 You can take a recording of Jordan Peterson and Joe Rogan, and then put them next to each other, and load them into an AI app, and it will create a combination of the two voices into one.
00:51:47.000 So what happens if, like, you make a voice that's 98% Joe Rogan, and you say, it's an artistic voice for my media project?
00:51:54.000 Is it when you claim the person is?
00:51:57.000 You know, if you say, this voice is Ian Crossland, that makes it illegal?
00:52:01.000 Maybe that's a good way to go.
00:52:03.000 Impersonation, if it becomes a form of impersonation.
00:52:05.000 But like you said, someone called somebody and it sounded like it was their daughter and asked for like help or something.
00:52:12.000 And the woman acquiesced.
00:52:13.000 I don't know the story.
00:52:14.000 Yeah, that was in Arizona.
00:52:15.000 A mom, she got a call from what she thought was her daughter and that she was being held.
00:52:21.000 And so they wanted ransom. And, you know, I think there have been a lot of calls now, because of the AI voice manipulations, where it has the potential to do a lot of damage to some folks, especially people who don't know all the technology and they hear their daughter speaking.
00:52:36.000 I mean, why would you second guess that?
00:52:39.000 So I mean, AI is really it's getting scary because you saw that.
00:52:42.000 I mean, this can almost start wars, right?
00:52:44.000 I mean, if you're declaring war on a country, how would they know?
00:52:47.000 It reminds me back in the 80s.
00:52:48.000 I don't know if you remember Ronald Reagan at the time.
00:52:51.000 There was like a blooper, almost.
00:52:54.000 It was broadcast and it shouldn't have been, where he was calling to, like, you know, bomb the Russians, and it almost caused a nuclear war because of that.
00:53:00.000 So I mean, AI is very, very scary.
00:53:04.000 Um, so I don't know what type of federal or government limitations there are going to be, but clearly we need to prevent situations like that from happening, where someone's pretending to be their daughter.
00:53:15.000 You didn't have your headphones on while, while Tucker Carlson was just, uh, complimenting me while you were in the middle of explaining.
00:53:21.000 Hey Ian, it's me, Tucker Carlson.
00:53:23.000 I'm just calling to let you know you're based AF.
00:53:25.000 Dude, if I heard that voicemail, I would be like, that's probably Tucker.
00:53:29.000 That is not Tucker Carlson.
00:53:30.000 That is just an online app.
00:53:32.000 It's so easy to do.
00:53:33.000 I can type.
00:53:34.000 It's nuts.
00:53:34.000 I can type in anything.
00:53:35.000 I got Jordan Peterson.
00:53:36.000 The Jordan one's really good too.
00:53:37.000 The Joe Rogan one's pretty good.
00:53:39.000 It's okay.
00:53:40.000 The Biden one's not good because you can't capture his dementia.
00:53:44.000 Is it officially dementia?
00:53:45.000 Is it okay to say that out loud?
00:53:47.000 I mean, his doctor says he's fine, so.
00:53:49.000 Right.
00:53:49.000 You're wrong.
00:53:50.000 It's not dementia.
00:53:50.000 The White House doctor says so.
00:53:51.000 He was so fine.
00:53:52.000 He would never lie to you.
00:53:53.000 He didn't even need to be tested.
00:53:54.000 He was so healthy.
00:53:54.000 He was so healthy.
00:53:55.000 Okay, so confirmed healthy.
00:53:59.000 Hey Ian, it's me, Jordan Peterson.
00:54:01.000 I'm just calling to let you know you need to eat beef.
00:54:03.000 Eat, like, a lot of beef.
00:54:05.000 Okay.
00:54:07.000 Okay, Jordan.
00:54:07.000 Did that convince you?
00:54:10.000 What if you say, Dr. Peterson, I guess I'll start eating beef.
00:54:13.000 I'm on board.
00:54:14.000 I can literally type in anything into this app right now and instantly have anyone.
00:54:20.000 It's nuts.
00:54:20.000 So what do you think?
00:54:22.000 Should there be limitations on it?
00:54:24.000 How?
00:54:24.000 What do you do?
00:54:25.000 Right.
00:54:25.000 What do you do?
00:54:28.000 I think making impersonation, keeping impersonation illegal.
00:54:31.000 Free speech.
00:54:33.000 Jack Posobiec made a piece of artwork.
00:54:37.000 He is allowed to do that.
00:54:39.000 He was, and what he did, it says it's a, what did he call it, a predictive sneak preview.
00:54:45.000 And he's making a point about, it's a point about what the Biden administration is capable of doing and the direction that we're heading should the warmongers in the United States drive us towards war.
00:54:57.000 And so, to make a video like that, to make that point, I think maybe it's a responsibility to say like, hey guys, this is a fake video, I'm doing this to make a point.
00:55:06.000 Yeah.
00:55:07.000 Because it's like, you can use a car and drive it off the road and destroy a bunch of property and that's illegal.
00:55:13.000 That's an illegal way to use a car.
00:55:14.000 Or you can just use a car normal.
00:55:15.000 Same with AI.
00:55:16.000 Same with voice, with this kind of stuff.
00:55:17.000 You can't ban it.
00:55:18.000 You can't ban it.
00:55:19.000 It's not possible.
00:55:19.000 It's not legally possible.
00:55:21.000 You can't stop it.
00:55:21.000 You can ban it, but you can't stop this tech.
00:55:23.000 You can't ban it.
00:55:23.000 Because what's going to happen, if they ban it and they say it's illegal, then every time someone hears a voice, they're not going to guess.
00:55:27.000 They're not going to question, is this real or not?
00:55:29.000 They're just going to assume it's real.
00:55:30.000 You can't even ban it.
00:55:32.000 So long as any human being can do an impersonation, you cannot ban AI.
00:55:37.000 I'm curious, when you, because you're campaigning right now, is this something you guys have to talk about and like mitigate risk for?
00:55:43.000 That someone will potentially copy your voice or use AI technology to warp something, you know, an ad that you're giving to your constituents or something like that?
00:55:52.000 I think it's very possible.
00:55:53.000 But I think what Tim was saying that, you know, it's even without AI people, there's always been impersonators and there's been really effective impersonators of Donald Trump.
00:56:02.000 So I don't see it being stopped.
00:56:05.000 So I don't know why Biden talked about the State of the Union, although I do see it as a problem.
00:56:10.000 How do we try to remedy that problem?
00:56:12.000 But as a candidate, yeah, I'm going to guess that, especially with, you know, the rate of technology changing so quickly, there's gonna be a lot of deep fakes.
00:56:21.000 I think I just saw, was it in California?
00:56:23.000 I don't know if you saw this, but there's these students who were just suspended because they put other students' faces on AI-generated images showing them naked.
00:56:33.000 Whoa.
00:56:33.000 Yeah, so they suspended these kids, you know, by saying, oh, they're sending, you know, naked selfies or whatever they were.
00:56:38.000 They were just AI-generated.
00:56:40.000 There is a lot of problems with this, especially when it goes to the pedophilia level as well.
00:56:45.000 That's the obvious direction, but imagine what's going to happen with ReadWrite Neuralink technology.
00:56:54.000 Let's not even go there.
00:56:55.000 How about this?
00:56:57.000 You will be in a class on a campus, and someone's going to take your picture off Facebook.
00:57:03.000 They're going to load four or five of your pictures from Instagram into an app.
00:57:07.000 They're going to put on VR headsets, and they're going to have virtual sex with you.
00:57:12.000 Gross.
00:57:13.000 It's, it's like we're talking about porn and how they can take anyone. Unluckily, it'll be women. Guys are gonna be doing this to women. Women will do it to guys too, but mostly guys. And it's the VR stuff too; they will make virtual environments. Not only that, with GPT, ladies, here's what's gonna happen: there will be a guy on your campus, and he's gonna buy the Apple Vision Pro.
00:57:35.000 We got it sitting right there.
00:57:36.000 Maybe not Apple Vision because it's gonna be really restrictive.
00:57:38.000 Oculus is probably more likely.
00:57:40.000 And they're going to upload an app where they create a digital version of you, they program the GPT personality based off all of your social media posts to emulate your behavior, and they will have a virtual slave version of you that they use to get off.
00:57:56.000 That's so gross.
00:57:58.000 And there's nothing you can do about it.
00:58:00.000 What are you gonna do?
00:58:00.000 Live and let live.
00:58:01.000 No, that's gross and weird.
00:58:04.000 I know, but what can you do?
00:58:04.000 Can't do anything about it.
00:58:05.000 People's fantasies are their own.
00:58:08.000 Just creating many simulations, it seems like.
00:58:11.000 So they're talking about, like, impersonation and, like, people creating... Obviously, people can draw pictures of you and have them on their walls.
00:58:17.000 They're allowed to do that.
00:58:18.000 And people can even, like, a comedian... Because the idea of impersonating someone, like a comedian on stage, being, like, making a voice and sounding like a guy, you're technically impersonating.
00:58:25.000 But this crime of false personification, this is, like, technically a federal... It can become a federal offense when you are representing yourself as someone who you aren't.
00:58:36.000 That's when I think things can become...
00:58:39.000 Should become illegal when it comes to this stuff.
00:58:42.000 I think.
00:58:43.000 Well, what Tim's describing is like number one seems definitely like stalking to me and it also is like non-consensual creation of pornography, right?
00:58:52.000 Like if someone were to take your face and put it into pornographic stuff that you don't want.
00:58:57.000 You can't do anything about it.
00:58:58.000 You can't do anything about it, but maybe they're watching it for themselves.
00:59:00.000 Maybe they're posting it online.
00:59:01.000 Like, it's a huge violation, even though it just seems like, oh, well. They'll put you in, and they'll make the eye color brown instead of blue, and shorten the nose a little bit.
00:59:13.000 Now I'm not in trouble.
00:59:14.000 And there'll be a market where they'll be selling your app of you in like a black market kind of thing, and like you need some sort of... And normally we have a recourse, like a governmental recourse we can appeal to when that kind of thing happens, like protect my persona, the government that I pay taxes to.
00:59:29.000 Facebook Messenger.
00:59:30.000 You open up Facebook Messenger, and you go to Create New Chat.
00:59:33.000 AI Chat is an option.
00:59:35.000 And this is the weirdest thing.
00:59:37.000 Snoop Dogg's in here, called Dungeon Master.
00:59:40.000 Mr. Beast is in here, called Zack.
00:59:43.000 Is that Tom Brady?
00:59:44.000 Confident Sports Debater.
00:59:46.000 I don't know who half these people are.
00:59:48.000 Some of them are just weird alien characters.
00:59:51.000 But those are some of the ones I recognize so far.
00:59:53.000 There's someone who's an athlete.
00:59:56.000 You know, when I opened this up, because I saw someone posted online, I think it was a little bit of TikTok, that the Facebook AI, I was like, Facebook AI?
01:00:04.000 And then I looked it up, and it's just in the app.
01:00:06.000 And then I opened it, and I'm like, Mr. Beast?
01:00:09.000 Yeah.
01:00:10.000 Wait, so you're messaging an AI version of Mr. Beast?
01:00:12.000 It's an AI called Zack, but it's Mr. Beast.
01:00:14.000 Yeah, I think they paid him.
01:00:15.000 It's Mr. Beast's picture.
01:00:16.000 I think they got his license, licensed his likeness and personality, and then they put it into these avatars called Zack or whatever.
01:00:24.000 Put his picture in an AI generator and said, oh, it's a whole new image.
01:00:26.000 It's a totally different person.
01:00:28.000 But I think in this instance, they did pay these guys.
01:00:30.000 I could be wrong about this.
01:00:31.000 People want to just converse with a fake.
01:00:35.000 So I opened Dungeon Master.
01:00:36.000 I said, why do you look like Snoop?
01:00:37.000 It said, I'm Dungeon Master, not Snoop Dogg.
01:00:40.000 But if you want me to guide you through a fantasy world filled with magic and adventure, follow me.
01:00:43.000 Yo, it's Snoop Dogg.
01:00:46.000 That's Snoop!
01:00:47.000 They probably paid him for that.
01:00:48.000 I have to imagine they did.
01:00:49.000 Yeah, big money.
01:00:51.000 Because I'm like, why is Mr. Beast in here?
01:00:53.000 And I'm sure these other people who are in here too are personalities, I just don't know who they are.
01:00:56.000 I mean, how accurate is it?
01:00:59.000 It's a picture of Mr. Beast.
01:01:00.000 Yeah, but the conversation.
01:01:02.000 No, it's a random AI.
01:01:03.000 I'm Zack, your big brother, here to make jokes.
01:01:06.000 They gotta do Snoop Beast and Mr. Dog, by the way.
01:01:09.000 I bet it's kind of not Mr. Beast, but you can tell it's Mr. Beast.
01:01:12.000 Yeah, yeah, yeah.
01:01:13.000 Yeah, it's supposed to be Mr. Beast.
01:01:14.000 But you can tell it's slightly different, right?
01:01:17.000 It's very close.
01:01:19.000 It looks exactly... It looks exactly like Mr. Beast, the same mannerisms.
01:01:24.000 That's so weird.
01:01:25.000 I think they got paid for their likenesses.
01:01:28.000 Someone wanna Google it?
01:01:29.000 What's it called?
01:01:31.000 Meta AI.
01:01:32.000 Meta.
01:01:35.000 I said, why do you look like Mr. Beast?
01:01:36.000 And he starts shaking his head.
01:01:38.000 Well, it's like, it's really transforming reality.
01:01:39.000 He says, I don't look, I said, I don't look like Mr. Beast.
01:01:42.000 I'm Zach.
01:01:43.000 I'm the one with the jokes, not the one with the billions.
01:01:45.000 That's so crazy.
01:01:46.000 They're trying to differentiate it.
01:01:47.000 Kendall Jenner's on there, apparently.
01:01:49.000 AI alter egos.
01:01:50.000 Yeah, I was reading about this.
01:01:51.000 It's crazy.
01:01:52.000 Ugh, I hate this.
01:01:52.000 This is from last October, I guess, is when this started getting hot.
01:01:56.000 Hey, Ian.
01:01:56.000 Hey, Jordan.
01:01:56.000 It's me, Jordan Peterson.
01:01:58.000 I hear you.
01:01:58.000 I'm just calling to let you know you need to clean your room.
01:02:01.000 Clean your room, Ian.
01:02:01.000 No.
01:02:02.000 Clean your room, Ian.
01:02:02.000 Okay.
01:02:05.000 It just seems like with this, like, alter ego thing, they're trying to make it so people will stop seeking out other people to talk to.
01:02:11.000 They're like, not only is it like, oh, you're talking to someone online who may or may not be real, may or may not be catfishing you, now it's like, oh, they're definitely not real, and in fact, just build a relationship with them instead of someone, you know, that you work with, or someone you know, or, you know, going outside.
01:02:25.000 I don't like it at all.
01:02:26.000 Yeah, no, they announced this.
01:02:27.000 Meta just created a Snoop Dogg AI for your next RPGs.
01:02:31.000 That's funny.
01:02:32.000 They must have paid him.
01:02:33.000 Hmm.
01:02:36.000 It's Snoop Dogg, they paid him.
01:02:37.000 Yeah.
01:02:38.000 Yeah, Snoop.
01:02:39.000 That's so weird, man.
01:02:41.000 And there's, there's Mr. Beast.
01:02:43.000 Kendall Jenner does.
01:02:44.000 Oh, it says, it says Mr. Beast, act the funny man.
01:02:46.000 Wow.
01:02:46.000 Meta chose Mr. Beast as AI generated funny man.
01:02:48.000 That's so weird.
01:02:49.000 So we don't know what reality is anymore.
01:02:51.000 And that's where you even look at like dating apps.
01:02:53.000 I mean, do you know if the person that you're swiping right or left on is actually real?
01:02:57.000 You don't know anything about them, you're meeting them on the internet.
01:03:00.000 If it's even someone.
01:03:01.000 If it's someone at all.
01:03:03.000 I know, who defines reality?
01:03:05.000 It's creepy.
01:03:05.000 It's over, man.
01:03:06.000 It's creepy.
01:03:07.000 I remember, every time we talk about AI, we bring this up, but it was like mid-2022, and I was using Stable Diffusion to make AI images, and they were grotesque and didn't really work.
01:03:22.000 Nancy Pelosi looked like a weird caricature of Nancy Pelosi.
01:03:25.000 And today, you type into Mid Journey or Stable Diffusion 3, which is crazy, Nancy Pelosi
01:03:31.000 shaking hands with Donald Trump, getting images and it's perfect.
01:03:35.000 You wouldn't even tell it's fake.
01:03:36.000 It's crazy.
01:03:37.000 We're there.
01:03:38.000 What is it going to be like?
01:03:41.000 What is life going to be like?
01:03:42.000 In a year?
01:03:44.000 Yeah.
01:03:44.000 Maybe the singularity just implodes on itself.
01:03:45.000 I was down there.
01:03:46.000 I was getting ready for work and I was like, computer, tell me about the weather.
01:03:50.000 I was going to ask the machine all these questions about like, who's the guest tonight on tonight's show?
01:03:54.000 What's his background like?
01:03:56.000 I wanted to ask the machine all these questions.
01:03:57.000 It was the first time I've ever had that, the impulse go that far, like I was on the Star Trek.
01:04:02.000 Holodeck or something, talking to the machine, and maybe that's where we're headed.
01:04:06.000 And, you know, why would we trust that output, right?
01:04:09.000 And that's what's becoming really creepy about this.
01:04:11.000 We're losing that human connection, that human touch.
01:04:14.000 That's what makes us human.
01:04:15.000 Well, didn't people do this to Alexa one time, where they'll ask it questions and it gives you kind of warped or biased answers?
01:04:21.000 Well, GPT, a bunch of lawyers are getting caught asking ChatGPT to write up their arguments for them, and it creates fake citations.
01:04:30.000 It's hilarious.
01:04:31.000 And they're getting caught, like, hey, this is fake.
01:04:33.000 You used chat GPT, didn't you?
01:04:35.000 And they're like, uh-oh.
01:04:36.000 And why is it going, why is it using a fake citation to- Because it doesn't know the real citation.
01:04:41.000 So then why is it creating- It's creating a facsimile of what an argument looks like.
01:04:45.000 So the argument will, the AI looks at a legal argument and it says, John v. U.S. et al., blah, blah, blah.
01:04:52.000 And then it's just like, something like that.
01:04:55.000 So it makes random words and makes it look like it's a case.
01:04:58.000 It's not the AI right now that we're talking about.
01:05:02.000 It's predictive text.
01:05:03.000 It's not actually an intelligent being that says, let me analyze a legal case in a similar area and then come back to this one like a human would do.
01:05:11.000 It's just going, what's the next word I should write in this paper?
01:05:14.000 And so here's how ChatGPT and all these other ones work.
01:05:18.000 You say, write me a fairy tale.
01:05:22.000 It scours the internet for fairytale, fairytale, fairytale story, fairytale, write me a fairytale.
01:05:28.000 And then what it says is, what is the highest probability for a word to start a fairytale?
01:05:35.000 Once.
01:05:35.000 Once.
01:05:36.000 Yeah.
01:05:36.000 And then?
01:05:36.000 Upon.
01:05:41.000 Time.
01:05:41.000 That's it.
01:05:42.000 That's exactly how it works.
01:05:43.000 and so it's going 99.99999999% accuracy, once.
01:05:46.000 The next word, 99.99999999% accuracy, a.
01:05:50.000 But creating citations, right?
01:05:51.000 I mean that's like a source, why would they?
01:05:53.000 Because it's creating words, not legal arguments.
01:05:57.000 All it's doing is saying what word is most likely to come after this word.
01:06:01.000 And so when you have a citation, it's not actually making a legal argument,
01:06:05.000 it's just putting words on paper.
01:06:08.000 The large language models don't know arguments exist.
01:06:11.000 All it knows is A plus B equals C. So when it looks at a legal document, and it sees, on average, every so often, there's a parentheses, and then a name, V, and name, it just will auto-generate random things.
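The "what's the highest-probability next word" idea described above can be sketched with a toy model. The probability table here is invented for illustration; real language models learn distributions over enormous vocabularies and condition on the entire preceding context, not just the last word.

```python
# Toy next-word model: each word maps to candidate next words with
# made-up probabilities (illustrative only, not learned from data).
NEXT_WORD_PROBS = {
    "<start>": {"once": 0.9, "long": 0.1},
    "once":    {"upon": 0.95, "there": 0.05},
    "upon":    {"a": 1.0},
    "a":       {"time": 0.9, "midnight": 0.1},
}

def generate_greedy(seed: str = "<start>", steps: int = 4) -> list[str]:
    """Repeatedly pick the single most probable next word."""
    words, current = [], seed
    for _ in range(steps):
        probs = NEXT_WORD_PROBS.get(current)
        if not probs:
            break  # no continuation known for this word
        current = max(probs, key=probs.get)  # highest-probability next word
        words.append(current)
    return words

print(generate_greedy())  # -> ['once', 'upon', 'a', 'time']
```

The model never checks whether its output is true, only whether it is statistically plausible, which is why text shaped like a citation (a name, a "v.", another name, a year) can come out looking right while referring to a case that doesn't exist.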
01:06:24.000 Interesting.
01:06:25.000 Yep.
01:06:26.000 And they're getting caught doing it.
01:06:28.000 Yep!
01:06:29.000 Because when the judges are like, let me look up the citation, they're like, that's not real.
01:06:32.000 And then they end up finding out this whole thing's fake and they're like, did you use chat GPT to write your argument for you?
01:06:37.000 And they're like, maybe.
01:06:38.000 Who knows?
01:06:39.000 Yeah, maybe.
01:06:42.000 Will I get in trouble if I did?
01:06:43.000 Yes, you will.
01:06:45.000 But it's the future!
01:06:47.000 Imagine where we're going to be in a year, when it actually can properly cite, and then you're going to have two people file a lawsuit, and the judge is going to go, what is your claim, sir?
01:06:56.000 And it's like, this man owes me $500 because I painted his fence, and he told me to pay $500.
01:07:02.000 The other guy goes, no, he painted the wrong color!
01:07:06.000 I can't pay a guy who did the wrong thing!
01:07:08.000 And then he's gonna be like, okay, he's gonna type it in, press enter. And the computer's gonna go
01:07:12.000 and then they're gonna be like, okay, does anybody want to read the arguments? No? Okay,
01:07:17.000 final determination to the plaintiff, plaintiff wins. And they're gonna go, that's it.
01:07:22.000 No, thanks.
01:07:23.000 Imagine how fast we're gonna go through criminal and civil trial. Now, let me ask you this.
01:07:30.000 An AI is found within three years to call judgments in criminal cases with 99.9% accuracy.
01:07:40.000 People who are innocent are found not guilty.
01:07:42.000 People who are guilty are found guilty.
01:07:44.000 And then, you go to court, and the judge says you can plead guilty, not guilty, or to the machine.
01:07:51.000 If you plead to the machine, your court case ends today.
01:07:55.000 No lawyers required.
01:07:57.000 No long continuances, and it's your choice.
01:08:00.000 You can say, I'd rather not do that, and we can go through the normal process.
01:08:03.000 So you're a regular person, you have all the evidence, and you say, give me the AI because I've got the evidence right now.
01:08:10.000 And it just goes, not guilty, you can go home.
01:08:13.000 And then the court's like, okay, there it is.
01:08:14.000 Well, I think that goes against the foundations of our country, where you have a jury of your peers.
01:08:19.000 No, but you had a choice.
01:08:21.000 You can choose a judge.
01:08:22.000 You can choose a bench trial.
01:08:24.000 So it's really about having the right to a jury trial.
01:08:27.000 But many people choose bench trials because they think the judge will have better judgment.
01:08:30.000 And that is often correct.
01:08:32.000 There was one guy who got acquitted in J6.
01:08:34.000 He went for a bench trial.
01:08:35.000 Judge said, yeah, cop waved you in.
01:08:37.000 Free to go.
01:08:37.000 Acquitted.
01:08:38.000 Crazy.
01:08:40.000 Well, would you choose the AI?
01:08:43.000 No, no, no.
01:08:44.000 Well, I don't think so.
01:08:45.000 But I would hope that you would use the AI in conjunction with human authorization, at least for a while.
01:08:51.000 Like as an advisor, kind of the AI gives you its output of arguments, and then the jury has an opportunity to look at the AI's arguments in addition to the lawyer's arguments.
01:08:58.000 What if?
01:08:58.000 Or you could have an AI as your lawyer submit your arguments for you.
01:09:01.000 No, no, what if?
01:09:02.000 What if?
01:09:02.000 Fake citations.
01:09:04.000 There will be a trial, but it'll be AI prosecution, AI defense, AI jury.
01:09:12.000 Human judge.
01:09:12.000 Victim.
01:09:13.000 Human judge.
01:09:14.000 And so what would happen is, now this one's interesting because an AI prosecution should concede the case if it concludes with a high degree of probability that you are not guilty.
01:09:26.000 Whereas a human prosecutor can know you're innocent but try their hardest to lock you up for the rest of your life.
01:09:32.000 Yep.
01:09:32.000 The AI, it will be valued more on its accuracy than on its willingness to put innocent people away.
01:09:38.000 The AI could... It doesn't have a career that it needs to.
01:09:40.000 And with the judge, the AI could say, the prosecution has made a determination that the accused has a 67.396% chance of being innocent of these crimes.
01:09:48.000 Shall the court continue?
01:09:52.000 And then the DA would say yes or no, but we have to publicly acknowledge that even on their side, they've come to a determination of a high likelihood of innocence.
01:10:00.000 And they can still say, yeah, but 32% chance that he's guilty of a serious crime, I'd say we present the evidence.
01:10:08.000 And people might say, yeah, I agree.
01:10:09.000 I agree. And then the defense AI would come to a similar conclusion. And then they would both
01:08:16.000 input arguments, and then it would go, okay, not guilty.
01:10:20.000 Yeah, man, because a lot of what we talk about AI is the bad stuff that's coming with it, but it's
01:10:24.000 going to be so much good stuff.
01:10:25.000 No, I don't believe you.
01:08:26.000 So much. So much. Like, we're running out of food, how do we avoid famine?
01:10:31.000 And the AI will be like, you need to have this much food in this place by this time, make sure your trucks can take this there, then we'll have the things set up for this.
01:10:37.000 You are half correct.
01:10:38.000 The AI will say, have less kids.
01:10:41.000 Maybe, but you'll have lots of AIs giving you different data, but it will be able to help you navigate chaos as well as create chaos.
01:10:49.000 I don't trust it, but I love the optimism.
01:10:51.000 You'll be like, I wanna go, just on your average day, you'll be like, I wanna drive 40 miles, but I gotta get gas, and then I wanna get chicken wings, and then it'll be like, go here, and here, and here, and you'll put it in, it'll tell you in three seconds, you'll be like, okay, cool.
01:11:02.000 You wanna know what the real freaky thing is?
01:11:04.000 Cause we've brought up this Terminator scenario before.
01:11:08.000 In the early days, when we were imagining and hypothesizing about AI and what it'll be, we came up with Terminator.
01:11:14.000 Like, robots are going to say, and this goes back almost 100 years, I mean, we're talking about 80 years of sci-fi writers.
01:11:20.000 The AI says, you know, the human goes, computer, end all war on Earth.
01:11:24.000 And then it goes, will do.
01:11:27.000 Blows up humanity and kills everybody.
01:11:29.000 That's one way to end all war.
01:11:30.000 The war to end all wars.
01:11:32.000 But here's what I think we're actually looking at.
01:11:35.000 You're going to have an app, and it's going to be called something like, you know, Jobs Online, whatever.
01:11:40.000 And you're going to open your app, and it's a gig economy thing, and you're going to be like, I need work, and you're going to press a button.
01:11:46.000 And then one day, you're going to be sitting there, your phone's going to go brr, and you're going to look, and it says, New Job Available, $50.
01:11:51.000 And you're going to hit it, and it says, You'll receive this object from this man, and bring this object three blocks to this man.
01:12:00.000 And you'll be like, Okay!
01:12:02.000 You'll walk outside, and there'll be a guy walking up, and he'll go, uh, I guess this is for you.
01:12:06.000 And it'll be a weird mechanical object, you have no idea what it is, and you'll go, sure thing!
01:12:10.000 Then you say, what do I gotta do?
01:12:11.000 I gotta walk three blocks this way.
01:12:12.000 You'll walk, and then a random guy will walk up and go, you have the thing?
01:12:15.000 I got the thing!
01:12:16.000 Thanks so much, buddy!
01:12:17.000 Uh, give me five stars!
01:12:19.000 And then your app is gonna go, bling!
01:12:20.000 Fifty bucks deposited into your Venmo.
01:12:22.000 And you'll go, I wonder what that was.
01:12:23.000 The machine is building, you know, uh, some kind of gigantic new technological device.
01:12:29.000 You have no idea what it's doing.
01:12:31.000 Someone running a company will say, we want to build a new data center.
01:12:36.000 What's more efficient?
01:12:37.000 Getting one guy to run around and do all of it, or distributing all of the work in its most minute form?
01:12:46.000 Take a look at what we did with burger restaurants in the advent of McDonald's.
01:12:50.000 So this was, I think, the McDonald brothers or whatever.
01:12:52.000 I watched that movie.
01:12:52.000 What's it called, The Founder?
01:12:54.000 And basically they were like, we figured out how to make burgers really, really fast.
01:12:57.000 Everybody does one thing.
01:12:59.000 One person grills the burgers.
01:13:00.000 One person puts the ketchup and mustard on the burgers.
01:13:02.000 One person's toasting the buns.
01:13:04.000 One person is wrapping the burgers.
01:13:05.000 One person's putting them in the bags.
01:13:06.000 So it's an assembly line.
01:13:08.000 Instead of getting one chef to make a burger, then take an order, it goes real slow.
01:13:13.000 That's what the AI will do to the labor market.
01:13:15.000 You won't even know who you're working for, but you'll get paid to do it, and you'll get paid well to do it because it's more efficient, saves time and energy, and that's it.
01:13:23.000 You're gonna look at your app and you're gonna be like, I gotta give a bag of corn to this kid, and it's gonna give me 50 bucks.
01:13:29.000 Who would say no?
01:13:30.000 You know how people are having their attention spans shortened by things like TikTok by watching these clips?
01:13:35.000 I wonder if AI is going to start filling in the attention gaps for people.
01:13:38.000 So they'll think for like three seconds and then they'll get distracted,
01:13:42.000 but the AI will complete their thought for them and then they'll have their...
01:13:44.000 But is it really their thought then?
01:13:46.000 That doesn't sound authentic, like an authentic human thought to me if the AI is like just kind of building on half a thought.
01:13:53.000 Like if I send you a video online, if I saw it on Instagram, is that video me?
01:13:59.000 Technically no, even though it's me talking to you.
01:14:02.000 If you spent the time to make the video, it's your video.
01:14:04.000 Whereas if I gave you a paragraph and said, but then the rest of this I put into AI and it's a book, did I write a book or did AI write a book for me, right?
01:14:14.000 These seem like very different things.
01:14:16.000 Yeah, when does it cease becoming AI and start to become you?
01:14:19.000 When is the data?
01:15:21.000 I think when you call on AI, it is not your thing.
01:14:24.000 You're actually relying on the computer.
01:14:27.000 To produce information that is not the computer, though.
01:14:32.000 The computer is like a vessel to give you the info.
01:14:35.000 Or the AI, or whatever it is.
01:14:36.000 I just mean, like, if you only do half the work, can you actually claim that it's yours?
01:14:42.000 You know, it's the same thing if we co-wrote a book, our name would both be on there.
01:14:47.000 But if you say, oh, I wrote one paragraph, I gave it to the AI, then it generated a full novel.
01:14:52.000 If you publish it under your name, I just personally feel like you're actually being deceptive.
01:14:57.000 In fact, you couldn't actually do this task.
01:14:59.000 You needed AI to do it for you.
01:15:00.000 Well, the argument you could say is if you go to the grocery store but you drive there, did you actually go?
01:14:05.000 Or was it the car that went, and you were just taken along with it?
01:14:10.000 It wasn't just you that went to the store, it was you and a car, and, like, you couldn't have done it without the car, so did you even really do it?
01:15:16.000 Was it just you?
01:15:16.000 No, I think you still went to the store.
01:15:18.000 You did, and you still had the thought.
01:15:18.000 Whereas like, if you only gave half a thought to an AI and it filled it out to be a complete story, then it has done a lot of work that your brain could have done and you opted not to.
01:15:27.000 You know what's gonna happen?
01:15:29.000 Once we're all Neuralinked... No, thank you.
01:15:31.000 Everyone's going to be sharing their thoughts rapidly in real time, and it's going to create a hyper-consciousness.
01:15:38.000 Yeah, it does.
01:15:38.000 And so what ends up happening is you will instantly know what you want to do, know where you need to do it, and you will just be a part of this greater hive mind, I guess.
01:15:49.000 But the hive mind will effectively exist as a singular consciousness, like hyper-consciousness, where everyone just knows and feels everything else in real time, and so emotions don't matter.
01:16:00.000 Because if someone believes something is false, and they're plugged into the machine, they instantly will know what everyone else knows, and it would instantly correct any false beliefs.
01:16:09.000 But hold on.
01:16:11.000 So I look at it like, single-celled organisms running around the table right now, doing their thing, eating, living their lives.
01:16:17.000 Then there are multicellular organisms, where all the different cells in the body have specialized jobs.
01:16:22.000 Imagine all humans sync up in the Neuralink, and instantly, there are brain cell versions of people.
01:16:30.000 Their job is, they're in labs, analyzing, doing their research on the data, and all of the information in their minds is transferred to every other human in real time.
01:16:40.000 And some humans know that now they need sulfur.
01:16:44.000 They've seen and known instantly what all humans are doing, and all humans now instantly realize we need 3% more sulfur production, so the humans just start doing it.
01:16:52.000 They just do what needs to be done.
01:16:54.000 And then, a few people break out of the machine, they reject it, they rip the Neuralink off when they're old enough, they're sitting there, they're hearing everything, and then they grab it, or it gets damaged.
01:17:06.000 The Neuralink breaks.
01:17:08.000 You know, kids are born, and they're instantly Neuralinked, and everyone knows, it's good, it's great.
01:17:11.000 When they're older, one's walking, and then struck by lightning, frying the Neuralink, and then all of a sudden they're like, ah!
01:17:17.000 What am I?
01:17:18.000 What's happening?
01:17:19.000 No!
01:17:20.000 Being individual is more important!
01:17:22.000 And they decide to run and join a colony of free individuals who start building and thriving in a city.
01:17:30.000 What do you call a group of cells inside the human body that are operating outside of the body's norms and growing and consuming?
01:17:38.000 Cancer.
01:17:39.000 Yep.
01:17:40.000 And then what happens to cancer?
01:17:42.000 We destroy it?
01:17:44.000 So, when the hive mind eventually takes over and everyone's a part of it, and they know what their job is, my job is to get sulfur, then the people who decide to be individuals and explore life and be free will be hunted down and destroyed.
01:17:55.000 But it concerns me, when people are netted in like that, because emotions are still real. If we all receive the same piece of data, we need copper, but one of us gets afraid, and the chemical, the cortisol, shoots off, and we see that copper thing as a bad thing because we're afraid of the data, then that's cancerous to the system too.
01:18:14.000 So the people that are afraid, it'll go around and be like, we must remove fear protocol.
01:18:18.000 We cannot have cortisol interfering with our data.
01:18:21.000 I can imagine that kind of thing, too, because the fallibility of the human body.
01:18:25.000 I don't think that.
01:18:25.000 I think the overwhelming will of the collective consciousness would override it.
01:18:29.000 You think it would override it?
01:18:29.000 Absolutely.
01:18:30.000 All your adrenals and stuff?
01:18:31.000 Every human in the world yelling at you, get the copper, we need it, and you would say yes.
01:18:35.000 Probably really fast.
01:18:36.000 I want you to imagine this.
01:18:37.000 Evolve people really quick.
01:18:38.000 And I'll tell you where my data is that I believe I'm correct.
01:18:42.000 It's Dylan Mulvaney.
01:18:44.000 It's not just Don't Move, Andy, but Don't Move, Andy is a really great example.
01:18:47.000 There are a bunch of people online who are shaped by their audiences, and we know this happens.
01:18:52.000 So if you plugged into the Neuralink, let's separate the Neuralink right now.
01:18:55.000 Let's imagine you're on stage, you're at Wembley.
01:18:58.000 What's the seating capacity of Wembley?
01:18:59.000 80,000 or something?
01:19:01.000 Let's find out.
01:19:02.000 And everyone in there is going, drink, drink, drink, drink!
01:19:07.000 And they're cheering and screaming, and there you are, Ian, standing on stage with a bottle of milk.
01:19:11.000 And they're all screaming, we love you!
01:19:14.000 90,000.
01:19:14.000 90,000.
01:19:15.000 All of Wembley Stadium is screaming, we love Ian!
01:19:18.000 We love Ian!
01:19:20.000 Drink the milk!
01:19:21.000 You're gonna go, yeah!
01:19:22.000 You're gonna drink it.
01:19:23.000 If it was like a real life thing, it just happened?
01:19:25.000 Yeah.
01:19:26.000 I don't know man, I can't stand peer pressure. And I can tell you how we know this is true,
01:19:29.000 and it's Dylan Mulvaney. Dylan Mulvaney did what the algorithm told Dylan Mulvaney to do,
01:19:33.000 which included hormones and surgery and other things, despite the fact that Dylan does not
01:19:37.000 exhibit gender dysphoria as we know it. Dylan Mulvaney, and I cite Dylan because of the
01:19:45.000 news around the individual, was making safari videos.
01:19:52.000 I'm Dylan.
01:19:52.000 I'm on a safari.
01:19:53.000 Look at the koala.
01:19:55.000 And then wasn't getting that much traffic and chased after whatever got more and more views and then decided to be what the algorithm told Dylan to be.
01:20:04.000 So if people plug into the neural link and they have the summation of human will yelling, we love you, get the copper, they will say yes.
01:20:14.000 And they're cheering because in their mind it's akin to a view counter at 7, 8 billion watching you waiting for the copper and you're like, I gotta get the copper!
01:20:23.000 But isn't that already happening?
01:20:24.000 Yes.
01:20:25.000 Now imagine if they plug in.
01:20:25.000 If you look at COVID.
01:20:27.000 Yeah, COVID was one of the greatest PSYOPs, right?
01:20:29.000 I mean, we all saw that.
01:20:30.000 They had to counter the ticker of the deaths.
01:20:32.000 But then I was talking to my friend about this.
01:20:33.000 Do you guys remember Kony 2012?
01:20:35.000 Oh yeah!
01:20:36.000 Like that was one of the craziest PSYOPs, one of the earliest PSYOPs.
01:20:40.000 It made no sense.
01:20:41.000 He was like an African dictator or something.
01:20:43.000 But he really wasn't.
01:20:44.000 And he had already been captured or something?
01:20:47.000 But he was diminished, like the Lord's Resistance Army.
01:20:50.000 And it was, but they made it.
01:20:51.000 I remember that time period.
01:20:53.000 I mean, you couldn't escape it, but that was one of the PSYOPs.
01:20:56.000 And a bunch of kids went around putting up posters being like, yeah, we're part of this.
01:21:00.000 At my high school, they like organized like a club to be like, yeah, yeah, yeah.
01:21:05.000 That's so weird.
01:21:05.000 But then you're saying, Tim, like, is this a good thing or a bad thing with this global hive that you're talking about?
01:21:13.000 I don't know.
01:21:16.000 It's a thing.
01:21:17.000 This is 12 years ago, Kony 2012.
01:21:21.000 This was the biggest thing on the planet for like three weeks.
01:21:24.000 Maybe more.
01:21:25.000 The number one video for a long time.
01:21:27.000 Wasn't the guy who was leading it, didn't he get arrested?
01:21:31.000 He was naked in the middle of the street.
01:21:32.000 He had a mental breakdown.
01:21:34.000 They said he was cranking it, but he wasn't.
01:21:36.000 He was just holding and squeezing while banging on the ground buck naked.
01:21:41.000 Sounds like bath salts or something.
01:21:43.000 Yeah.
01:21:43.000 Some exotic, crazy drug that guy was on.
01:21:46.000 I don't know, but that's what it sounds like.
01:21:47.000 That was a psyop, I think.
01:21:49.000 Yeah, but I don't think they meant for it to go viral.
01:21:51.000 Like, I don't think they thought it was gonna be this.
01:21:54.000 How did it happen, then?
01:21:55.000 Just organically?
01:21:56.000 The YouTube algorithm was... Okay, there was this woman.
01:22:01.000 Someone Google this.
01:22:02.000 A woman, a van life girl with her snake, and she made two videos and got like three million subscribers overnight.
01:22:09.000 YouTube makes an algorithm, and someone accidentally lands right in the bullseye.
01:22:14.000 Janelle Ileana?
01:22:15.000 Yeah.
01:22:16.000 Is that her name?
01:22:17.000 I think so.
01:22:18.000 With her pet snake, Alfredo.
01:22:18.000 Let me see if I can find her.
01:22:21.000 How do you spell her name?
01:22:22.000 Uh, Janelle is J-E-N-N-E-L-L-E.
01:22:25.000 I found it.
01:22:27.000 Ileana.
01:22:28.000 So, she made two videos.
01:22:31.000 Oh, I've seen this girl.
01:22:32.000 And got millions of views.
01:22:34.000 And I don't even think she makes videos anymore.
01:22:38.000 She still makes them on Reels or like TikTok.
01:22:40.000 It was crazy: How I Shower Living in a Van.
01:22:43.000 And here's what I think.
01:22:45.000 I think there was a psyop on YouTube for sure.
01:22:48.000 I think YouTube was intentionally promoting van life.
01:22:51.000 Van life was huge.
01:22:53.000 They want millennials to be happy to not own a home.
01:22:57.000 You will own nothing and you will be happy.
01:22:59.000 And they made all these videos, and they were like... this was my conspiracy theory.
01:23:03.000 It's not mine.
01:23:04.000 A lot of people think the same thing.
01:23:06.000 They're telling, and the reason why, the evidence?
01:23:09.000 She made two videos, and she got millions of subscribers.
01:23:13.000 Why?
01:23:06.000 What I think happened was they made an algorithm saying, promote van life, we want young people to be happy with living in squalor, and she made the perfect combination of keywords, titles, thumbnail, and video, and the algorithm went, this video, bang, and fired it off. And then, uh-oh. Then everyone kind of realized, hey, wait a minute, she got millions of subs overnight from this.
01:23:37.000 Why was everyone being shown this video?
01:23:40.000 And I think then they panicked and pulled her out.
01:23:43.000 Like, MrBeast only gets the views he does because YouTube decided he does.
01:23:48.000 His content is intentionally on the front page of YouTube.
01:23:51.000 I never really followed his breakthrough.
01:23:52.000 I don't know a lot about when he stepped over the threshold.
01:23:55.000 He was, like, grinding for years.
01:23:57.000 Let me explain.
01:23:58.000 Let's talk about MrBeast.
01:24:00.000 Yeah, let's.
01:24:01.000 Jimmy!
01:24:02.000 I think he makes good content.
01:24:03.000 But let's uh... WE JUST GOT DROPPED!
01:24:05.000 Okay, I don't care.
01:24:06.000 WE JUST GOT DROPPED!
01:24:07.000 Hold on, hold on.
01:24:08.000 So awesome.
01:24:09.000 So we know what he does, right?
01:24:10.000 I survived 7 days in an abandoned city, 109 million views.
01:24:13.000 He has 243 million subscribers.
01:24:16.000 780 videos.
01:24:17.000 Let's take a look at his oldest videos.
01:24:19.000 Worst Minecraft saw trap ever.
01:24:21.000 Awesome.
01:24:22.000 More birds in Minecraft.
01:24:24.000 This block, since when, lol.
01:24:25.000 What is this, a Pokemon video?
01:24:27.000 He just sat around, played video games, and studied the algorithm for like years.
01:24:30.000 What happened was... No, he didn't.
01:24:32.000 He did.
01:24:32.000 It was an accident.
01:24:33.000 No, no, he had friends.
01:24:34.000 They would get together for like 12 hours a day and talk algorithm.
01:24:36.000 Right, right, right, right.
01:24:37.000 But I don't want you to make it seem like they sat here saying, how can we succeed on YouTube and started like a company?
01:24:43.000 What happened was they were all making YouTube videos and they were like, oh, I made this video and it did well.
01:24:48.000 And because we've all done this on YouTube because we know the algorithm changes.
01:24:52.000 Then he started to change 100 videos special.
01:24:54.000 He started to change his videos.
01:24:56.000 It doesn't matter if you're sitting around asking what does better or what doesn't.
01:25:00.000 What matters is whenever a video would do better, he would naturally do more of it.
01:25:04.000 Whenever a video would do worse, he would naturally shy away from doing it.
01:25:07.000 He tried a Hearthstone video.
01:25:08.000 Seems like it didn't really work all that well.
01:25:10.000 How much money does CaptainSparklez make?
01:25:12.000 It did decently okay.
01:25:14.000 So he made more.
01:25:15.000 How much money does this person make?
01:25:16.000 How much money?
01:25:17.000 You can actually see all the different attempts he made at creating content.
01:25:21.000 Video games, Look at this.
01:25:23.000 1,500 subs.
01:25:23.000 That's crazy.
01:25:25.000 The reason these have views is because after he got big, people go back to watch them.
01:25:31.000 But like most people, you can see the evolution of the content and what it turned into.
01:25:36.000 This is it.
01:25:38.000 The machine makes people, not the other way around.
01:25:40.000 Yeah.
01:25:41.000 And that goes to the PsyOps, right?
01:25:43.000 And we're talking about TikTok.
01:25:45.000 Isn't it kind of interesting how TikTok really rose during that COVID time?
01:25:49.000 I mean, you had the China Wuhan virus come out of that lab, and also this Chinese company, TikTok, that came up alongside it.
01:25:56.000 And now it's the number one social media app in the world.
01:26:01.000 I just find it a little... I don't believe there's coincidences, so it's just a little strange.
01:26:07.000 Yeah.
01:26:07.000 The anti-Beast. Let's see, and then what happened was MrBeast eventually made a video like, I'm giving money away, and then all of a sudden it got a ton of views, and he's like, I'm gonna make more of these.
01:26:18.000 Yes.
01:26:19.000 That's just him realizing what's successful.
01:26:23.000 People do that on Twitter all the time, or Facebook.
01:26:25.000 Exactly.
01:26:26.000 Where is it?
01:26:27.000 Giving random people a thousand dollars.
01:26:29.000 Look at this!
01:26:30.000 Yep.
01:26:30.000 This is when everything changed.
01:26:32.000 Yep, all of a sudden he's getting big views.
01:26:33.000 Really?
01:26:34.000 Yep.
01:26:35.000 That's when all the views start kicking in big.
01:26:37.000 I like the video where he paid his mom's house off.
01:26:40.000 Oh wait, is that in the bottom right?
01:26:41.000 Giving my mom $100,000.
01:26:43.000 I don't know, he surprised his mom and paid her house off.
01:26:46.000 That was a cool video.
01:26:48.000 But he's like, now smile so we get a good thumbnail for the video, mom.
01:26:51.000 I'm really sorry.
01:26:52.000 She totally knows what he's doing.
01:26:53.000 Total clickbait, total manipulation.
01:26:56.000 But it's not just a single out Mr. Beast.
01:26:58.000 They all do it.
01:27:00.000 Every single YouTuber.
01:27:02.000 They will make videos, and the videos that work, they make more of them.
01:27:05.000 That's it.
01:27:06.000 But that is a dangerous road, because if the people making the algorithms want you to do insidious stuff, and they'll give you more views if you do the bad thing... Yeah, like, maybe they want you to undergo very, you know, crazy surgeries or something, so they keep promoting your videos, and then you gotta keep one-upping yourself until you've caused physical harm to your body.
01:27:24.000 That's something that little kids need to learn.
01:27:26.000 This is actually a conversation little kids need to have now.
01:27:29.000 21st century schooling needs to be like, if you don't get views, it's okay.
01:27:33.000 You're not your subscriber count.
01:27:35.000 You are you.
01:27:35.000 I mean, being an influencer is an actual desirable career for a lot of young people.
01:27:41.000 It's the number one.
01:27:43.000 You're absolutely right.
01:27:44.000 We should talk about this.
01:27:45.000 On the other hand, they are all seeing the success that comes from being able to have a successful public life like this, and they're willing to access it for a lot of different reasons.
01:27:55.000 I mean, I think you'd say, you know, stay true to who you are and do whatever, but it's to a certain extent like they're going to chase the money and they're going to chase the influence and the likes.
01:28:07.000 I don't know how effective it would be.
01:28:09.000 Are they actually themselves or are they controlled by what people want them to be so they just become part of a circus?
01:28:14.000 Right, especially if they start really young.
01:28:15.000 Like if a kid decides at 15 they want to be a YouTube star, are they developing their interest in going to YouTube to be like, here are the things I'm interested in?
01:28:23.000 Or are they saying, what do people on YouTube seem interested in that I could theoretically say I'm interested in and make a video about?
01:28:30.000 It's a very weird way to live or weird way to grow up.
01:28:33.000 I used to do that in 2006.
01:28:34.000 That was when I started.
01:28:35.000 I would look at whatever videos got featured and I'd make a response video to the featured videos.
01:28:40.000 Stuff I didn't care about at all.
01:28:42.000 I was totally selling myself out to get famous on the YouTube.
01:28:44.000 And I made a video about a sloth because Barats and Beretta had a featured video about a sloth.
01:28:49.000 That one did pretty well.
01:28:49.000 I did that.
01:28:51.000 Just junk crap.
01:28:52.000 I'd respond to people.
01:28:53.000 People would like make a video talking about something and I'd just make a response directly to them being like, Well, if you believed in what you said, then you wouldn't have blinked three times at the thirty-second mark.
01:29:03.000 You're obviously lying here.
01:29:04.000 Like, just trolling people that were really popular.
01:29:07.000 And then eventually I got popular doing it, and I hated myself.
01:29:07.000 Yeah, yeah.
01:29:10.000 I don't like—I got famous for the wrong reason.
01:29:13.000 Like, you gotta do something—be who you respect, and get well-known for doing that, and then you'll love yourself.
01:29:18.000 But I would say, I don't know, two-thirds of online personalities, especially on Twitter, are like, what can I say today, you know, in order to get traffic or to get attention?
01:29:30.000 It's really tempting.
01:29:31.000 And the funny thing is, you know, what really bums me out about X's pay incentive programs, where it's like, you get paid for engagement, is that now everyone accuses me of click farming when I'm just trolling them.
01:29:42.000 I'm like, no, no, no, no, I'm trying to rile you up and insult you.
01:29:44.000 I'm not trying to make money off you, I'm already rich.
01:29:47.000 I'm just trying to make you feel bad.
01:29:49.000 Yeah, or elicit, incite emotional responses.
01:29:54.000 So I tweeted things about believing in God. I tweeted, it should be illegal to not believe in God.
01:30:01.000 And then I tweeted almost right away, it should be illegal to believe in God.
01:30:06.000 And then it was wild how the left only chose one of those tweets and made it go viral and they were sharing and screenshotting it.
01:30:12.000 And I'm like, that's the funny thing about this.
01:30:15.000 And then people were like, Tim's trying to engagement farm to make money.
01:30:18.000 I'm like, no, I'm screwing with people.
01:30:21.000 It's funny.
01:30:21.000 I don't know.
01:30:22.000 That's unfair.
01:30:24.000 Like, I make $1,000 every two weeks on X, you know?
01:30:28.000 A company makes millions of dollars.
01:30:29.000 I don't need to make money off X. How dare you insult my attempt to insult you.
01:30:33.000 Yeah, you're almost testing the algorithm in some ways.
01:30:36.000 I mean, that, when I tweeted that, was to make a point.
01:30:39.000 I've made this point several times.
01:30:40.000 But did you think it would go to that level?
01:30:42.000 Of course.
01:30:42.000 I've done it before.
01:30:44.000 You tweet two things.
01:30:45.000 So there was a guy who got, there's a Twitter account.
01:30:48.000 Everyone was cheering for it because he said something like, I can't remember what it was, but he was like, the World Series is going to be between the Red Sox and the Cubs.
01:30:57.000 It's going to go down to the fourth, you know, it's going to be the fifth inning.
01:31:00.000 They're going to do this.
01:31:01.000 And then the last thing it's going to turn around.
01:31:03.000 And then everyone's like, how did he get it right?
01:31:05.000 He posted this at the beginning of the season and he got it right.
01:31:09.000 Exactly what happened in the final game!
01:31:11.000 Wow!
01:31:12.000 And you know what he did?
01:31:13.000 At the beginning of the season, he created a Twitter account, and he tweeted, like, 500 different scenarios, and then every time one of those scenarios became impossible, he deleted it.
01:31:22.000 Got it.
01:31:22.000 And then by the end of the year, there was only one tweet left, and people were like, whoa!
01:31:27.000 It's from a year ago!
01:31:28.000 And I'm like, so were the other 400 that he got rid of.
01:31:30.000 Interesting.
01:31:31.000 Yeah.
01:31:33.000 Yep.
01:31:34.000 You make me want to have audit trails for deletions, so we could see that something was deleted, and when it was deleted.
01:31:40.000 Yeah, YouTube used to let you change the video file?
01:31:44.000 You could, if you uploaded a video to YouTube, you could upload a corrected version of it to the same link, and people did all sorts of crazy shenanigans.
01:31:51.000 Yeah, because they would feature you, and then you could swap it out.
01:31:54.000 I mean, you could change the title of your video after it got featured.
01:31:57.000 People would make videos where it's like... I don't know if you could swap, I've never tried swapping it out.
01:32:00.000 They would make a video where it's like, I'm gonna pick my lottery numbers, I hope I win, and then after the numbers come out, upload a fake version of them picking the right numbers to make it seem like they won.
01:32:10.000 You could do all sorts of stuff like that, but they got rid of that.
01:32:12.000 That was like 13, 14 years ago.
01:32:16.000 They got rid of that feature.
01:32:17.000 And I think, I don't know if it existed for everybody, but I know a lot of people had it and they screwed around with it.
01:32:22.000 Yeah.
01:32:23.000 Internet's a wild place.
01:32:25.000 And I think it's going to melt people's brains.
01:32:27.000 And I'm a big fan of banning social media for kids.
01:32:32.000 18 and up only.
01:32:33.000 No.
01:32:33.000 Florida's doing 16, but I think it should be 18.
01:32:35.000 No, no, it's not.
01:32:36.000 What do you think about this?
01:32:37.000 Would you keep kids off the internet?
01:32:40.000 Legislatively, I don't know.
01:32:42.000 But I do think it's becoming such a problem that we have to tackle it.
01:32:46.000 I mean, what does it say about our society that the number one profession that children want to become is an influencer, versus you go to China, where the number one jobs they want are astronaut or doctor?
01:32:59.000 So I think it's having a real impact on society where people are doing the most ridiculous things.
01:33:05.000 You're seeing some of these YouTubers who are, you know, just complete, you know, maniacs.
01:33:10.000 And I don't think that progresses society.
01:33:12.000 And I think government should be instituting and promoting policies that better our country, not push it toward the worst forms possible.
01:33:20.000 So I personally, I would not want children to be on, you know, using social media because I do think it rots the brain.
01:33:28.000 But there are some benefits to it.
01:33:30.000 It's addictive.
01:33:31.000 It's very addictive.
01:33:32.000 It's causing Tourette's in young girls.
01:33:33.000 It causes depression, abnormal socialization.
01:33:38.000 Adults can handle it, and not even that well.
01:33:41.000 At least adults can try to handle it.
01:33:42.000 But kids absolutely cannot handle it.
01:33:45.000 We're gonna go to Super Chat, so if you haven't already, would you kindly smash that like button, subscribe to this channel, share this show with your friends, and head over to TimCast.com, click join us, become a member today, because this show is made possible thanks in part to viewers like you.
01:33:59.000 No member shows on Fridays, though, so tomorrow is my birthday, and we won't have a show, so if you haven't already smashed the like button, as a birthday present, smash the like button, send in your superchats, and become a member at TimCast.com.
01:34:13.000 Maybe I should put up a special video tomorrow where it's like, today's my birthday, and the present you can get me is to become a member and support the work that we do, and that's the only present I need is you guys saying, you know, thank you for doing everything you do and supporting us.
01:34:26.000 You heard it here, guys.
01:34:27.000 Tim's gonna work on a Saturday that is his birthday to make a video for you.
01:34:31.000 We are filming tomorrow.
01:34:32.000 It is work.
01:34:32.000 So tomorrow, the skate park is 99% done at Freedomistan.
01:34:36.000 Only some bells and whistles are left.
01:34:39.000 And actually, you know, I should do this before we go to the Super Chats.
01:34:43.000 We have this bad boy.
01:34:44.000 It's not a... Can I make it better?
01:34:46.000 Is there a way to get a... Nah, you can't really see it.
01:34:47.000 Anyway, you wanna pull that up?
01:34:49.000 So, uh, this right here is a new component of the skatepark, the Death Drop.
01:34:54.000 This, uh, is a five-foot transition, meaning from the flat part up the curve to the wall, where the curve goes vertical, is five feet.
01:35:04.000 And then the vertical portion is eleven feet.
01:35:07.000 So this is an 11-foot vert wall to a five-foot transition, and I have offered $10,000 to whoever can successfully drop in on this.
01:35:16.000 We have criteria.
01:35:18.000 I will give the gist of some of these right now.
01:35:20.000 It is a skateboard-only competition, because the BMXers are like, I can do it!
01:35:25.000 Yeah, okay, okay.
01:35:27.000 It's for skateboards, and we have criteria for qualifiers because of the severe, serious nature and risk of injury.
01:35:36.000 We will not allow just literally anyone, however, it is not restricted to professional skateboarders.
01:35:42.000 So what we're doing is, we haven't set up the email just yet, but we are fielding inquiries from anyone who wants to make the attempt to drop in on the Death Drop at the new Freedomistan skate park, meaning we would bring you in and you would drop in.
01:35:57.000 There are several criteria.
01:36:00.000 The first thing is you have to prove you can skate.
01:36:02.000 So you will need videos and demo tapes proving it is you, and you need to be able to prove it's you, and if you can't prove it's you, too bad.
01:36:08.000 If you send us a video, and then we check it against your social media, and the video looks different, and we're like, we can't prove that's you, sorry.
01:36:13.000 We're not letting you do it.
01:36:14.000 Because this is an 11 foot, it's like 23 feet from the top to the bottom.
01:36:20.000 And it's 16 feet from the top to the deck.
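As a rough sanity check on these numbers: the 5-foot transition stacked under the 11-foot vert section gives the quoted 16 feet from lip to deck. Assuming the transition is a quarter-circle of radius 5 feet (a common build, but an assumption here, not something stated on the show), the riding path from lip to flat comes out just under 19 feet:

```python
import math

# Back-of-the-envelope check of the ramp numbers quoted above.
# Assumption: the 5-foot transition is a quarter-circle of radius 5 ft.

vert_ft = 11.0        # vertical wall section
transition_ft = 5.0   # curved section, taken as the radius

top_to_deck = vert_ft + transition_ft      # 16.0 ft, matching the quote
arc_len = math.pi * transition_ft / 2      # length along the curve, ~7.85 ft
ride_path = vert_ft + arc_len              # lip-to-flat riding distance, ~18.9 ft

print(top_to_deck, round(ride_path, 1))    # 16.0 18.9
```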
01:36:22.000 And if you screw this up, you're falling straight onto your side.
01:36:24.000 You could break your neck.
01:36:25.000 Pros only.
01:36:26.000 I don't know if it's possible.
01:36:27.000 Some pros have said they believe it is.
01:36:29.000 For a kid, it may be.
01:36:30.000 So what we're going to be doing is, people who feel they can do it, I want to stress, this ramp is 5 feet tall.
01:36:38.000 It's way gnarlier in person.
01:36:40.000 And I think people are overestimating.
01:36:42.000 It is possible, but man.
01:36:42.000 I could do that.
01:36:44.000 So what we're probably going to do, We have several pros who have reached out saying, I can do this.
01:36:48.000 And we're like, OK.
01:36:50.000 Pros, for the most part, we have vetted because we know them.
01:36:53.000 We know their abilities.
01:36:54.000 Any regular skater, core skater, or local skater from a certain area who wants to submit can submit.
01:37:00.000 You need to prove you can drop in on stuff.
01:37:01.000 You need to prove you can skate because this is top tier, high level pro stuff.
01:37:06.000 And after that, when we select however many people we wind up selecting to actually come out for a session, you have to actually get up the wall ride to the window before we will let you even try to drop in.
01:37:21.000 That is actually not that hard.
01:37:24.000 Uh, I could probably get, I could not drop in on this, but I could probably at least get to the window on the vert wall.
01:37:30.000 That's probably actually decently easy, window height.
01:37:33.000 And if you can, then you are basically able, I think at that point, if you can, if you can get to the window, you probably will succeed at dropping in.
01:37:41.000 It's different from dropping in, because your compression into the wall helps stabilize you; when you drop in, you have none.
01:37:49.000 so they're not the same, but then we're going to, we still need to figure out the total criteria
01:37:55.000 because there will be only one prize, which means in the event that more than one individual
01:38:00.000 successfully drops in, in the least amount of tries, let's say it takes someone one try to do it,
01:38:07.000 another person says, I'll try it, and they both get it in one try,
01:38:09.000 there will need to be a sudden death tiebreaker of some sort.
01:38:12.000 So there's only one prize here, but we're working on that.
01:38:15.000 And we're getting a whole bunch of pros being like, oh crap, 10K? No joke. $10,000.
01:38:20.000 We get people who don't even skate being like, I will try, I don't care.
01:38:23.000 It's like, you're not going to be able to pull it off.
01:38:25.000 So anyway, tomorrow is the first day the park will be open.
01:38:28.000 It is my birthday and we will be having a soft session, soft opening.
01:38:33.000 And the official opening party for the whole space will be April 6th, but for tomorrow, we will be filming this for the Boonies, and it is a work function, but it's also fun.
01:38:44.000 So that being said, we'll go to Super Chats now, and yeah, so stay tuned, follow at Boonies HQ, there's $10,000 on the line, and submissions are open to all.
01:38:56.000 If you have the skills to pay the bills.
01:38:57.000 We've already got some, we have a lot of submissions from local skaters, not pro, not sponsored, who seem capable of making the attempt.
01:39:06.000 And, you know, I was talking to Richie Jackson, pro skater, who's working with us, and he's like, I got a handful of pros who have said they want to do it.
01:39:13.000 And I said, if we only allowed pros to come, it would be so whack.
01:39:16.000 Like, it's not about being a pro and having people know who you are.
01:39:19.000 We want people who are capable of doing it.
01:39:22.000 And, uh, it's going to be fun.
01:39:23.000 It's going to be fun.
01:39:24.000 But what we're ultimately going to do is we're going to set a series of challenges for Freedomistan that will be open to the public for submission.
01:39:30.000 That's right.
01:39:31.000 With all the obstacles and the expansions that we're doing, we are going to create prizes where we are going to, like, it's not just the drop-in.
01:39:37.000 We've got other obstacles.
01:39:39.000 And we will say, if you successfully accomplish this or that, we're going to have various prize levels of, you know, 10k.
01:39:46.000 And we might even, I think 10K might be the cap because I don't know what else we do
01:39:50.000 after dropping in from 11 feet, for 22 feet up, 11 feet of vert into a five foot ramp
01:39:56.000 is about the craziest thing we have in the space.
01:39:58.000 But we're gonna set other challenges to like a backside tail slide down the hub,
01:40:02.000 you know, we'll give someone a hundred bucks and stuff.
01:40:04.000 And that means that regular old local skaters can always submit once we get the system open.
01:40:10.000 And if we vet you and your talent, we'll bring you in to make the attempts at the challenges.
01:40:14.000 And I think that'll be really cool.
01:40:16.000 It's a great content and I really wanna break the industry and open it up to everybody.
01:40:21.000 Meaning you might be some dude who's been skating for 10 years and you're a local guy,
01:40:25.000 you've got no sponsors, you've never got a foot in the industry,
01:40:27.000 but you know you can tre flip crook really well.
01:40:30.000 And that's one of the challenges we will bring you in, we'll film you, we'll put it up there and that'll be fun.
01:40:34.000 So that being said, become a member at TimCast.com if you wanna give me a birthday present.
01:40:38.000 And we will read your superchats.
01:40:42.000 South Carolina became the 29th constitutional carry state yesterday when Governor McMaster signed the Second Amendment Preservation Act.
01:40:54.000 Man, I am loving it. Hot dog. That's great. Good job, America. We are getting it.
01:40:57.000 Yeah, absolutely. Jacob Parady says, I invite you all to visit Narbar's candles
01:41:01.000 on Public Square to try our St.
01:41:03.000 Patrick's Day themed candles, Basil Sage Mint and Lemongrass Bergamot Bliss?
01:41:08.000 Is that what it is?
01:41:09.000 Bergamot.
01:41:10.000 Bergamot?
01:41:11.000 It's the flavor in Earl Grey tea.
01:41:11.000 Bergamot.
01:41:13.000 I'm sure you know about that.
01:41:14.000 Cheers to the parallel economy.
01:41:18.000 Josh McCluskey says, rest in peace to a true legend, Akira Toriyama.
01:41:22.000 Thanks for the memories you gave so many people, and the friendships Dragon Ball helped bring together.
01:41:27.000 Rest easy.
01:41:29.000 Akira Toriyama, man!
01:41:30.000 And also, I think, what did he do?
01:41:33.000 He did Dragon Quest.
01:41:35.000 Oh, the game?
01:41:36.000 Yeah.
01:41:37.000 Akira Toriyama did the characters for Dragon Quest, I believe, right?
01:41:40.000 I don't know.
01:41:41.000 I didn't know that.
01:41:43.000 He did more than Dragon Ball Z. Dragon Ball, Dragon Ball Z, Super, Dragon Ball... He was the main character and monster designer of the Dragon Quest series.
01:41:51.000 Yep, yep.
01:41:52.000 Legend!
01:41:53.000 Yeah, seriously.
01:41:54.000 When Dragon Ball Z was massively popular, and it is basically the biggest show in Japan, they have billboards and like... Goku is a celebrity.
01:42:05.000 When they wanted to bring it to the United States, American entertainment businesses were convinced kids would not want to watch this extremely long, complicated, continuous storyline.
01:42:15.000 And they were wrong.
01:42:18.000 100% wrong.
01:42:19.000 Oh, Dragon Quest was Dragon Warrior.
01:42:21.000 That's right.
01:42:22.000 I played Dragon Warrior 1 on the Nintendo.
01:42:22.000 I didn't know that.
01:42:24.000 That's right.
01:42:25.000 Man.
01:42:25.000 Yeah.
01:42:26.000 Metal Slime.
01:42:26.000 What a good game.
01:42:27.000 You could say yes to the end boss.
01:42:28.000 He's like, join me.
01:42:29.000 And you can say yes and then the whole screen turns red and the game just stops.
01:42:33.000 It's crazy.
01:42:35.000 Dragon Warrior was awesome.
01:42:36.000 Yeah.
01:42:38.000 All right, where we at?
01:42:39.000 And the spells you had were hurt?
01:42:42.000 It was like, I'm gonna cast a spell, hurt, and hurt more.
01:42:45.000 That was amazing.
01:42:46.000 You couldn't call it fire or something?
01:42:47.000 They changed that later with the other games though.
01:42:49.000 You got explode or something.
01:42:52.000 Let's grab some more super chat.
01:42:53.000 Steven says, notice that the Gold Star dad didn't just call out Biden for his own child's death.
01:43:00.000 He called out 13 Marines.
01:43:01.000 That's right.
01:43:02.000 Can't have that.
01:43:04.000 Oh, we got a, a big series of super chats.
01:43:07.000 Rusted Baron says, longish Tim Watcher since around 18.
01:43:10.000 I used to be a Bernie bro in 16, anti-gun socialist, raised Democrat by feminist single mom.
01:43:15.000 Struggled through my teens and 20s with toxic relationships.
01:43:18.000 Was a nice guy type.
01:43:20.000 Sneaky fucker.
01:43:21.000 Oof.
01:43:22.000 Couldn't keep a job, always poor.
01:43:23.000 My awakening started with the way Dems slighted Bernie in the primaries, which triggered me to look deeper.
01:43:28.000 Eventually I found Tim rambling on the internet, started to listen, got hooked.
01:43:32.000 Left the left-wing city to work the oil fields in another state.
01:43:36.000 Added TimCast IRL to my daily routine when it launched.
01:43:38.000 Now in my mid-30s, I'm a trucker.
01:43:40.000 Moved to a rural area, grow my own vegetables, have chickens.
01:43:43.000 Chickens!
01:43:44.000 Nice, dude.
01:43:45.000 Went from zero rifles to seven rifles, ranging from .22LR to .50 BMG.
01:43:50.000 Have a six-month food supply, bought Bitcoin, and best of all, I married my wife in December of last year.
01:43:55.000 At the end of this month, we will welcome our first son.
01:43:57.000 Congratulations.
01:43:58.000 Dude, this guy is awesome.
01:43:59.000 Thank you, Tim and Cruz.
01:44:01.000 Through spreading information and knowledge, it helped turn my life around for the better.
01:44:05.000 Side note, Ian, I used to hate you with a passion, but now I am sad when you're not on the show.
01:44:08.000 Oh, that's awesome.
01:44:09.000 That's even better than just blindly liking me.
01:44:12.000 So thanks.
01:44:13.000 It is funny, there was that nice woman who came to the event, and she was like, Ian, I used to not like you, but now I love you so much.
01:44:18.000 Yeah, I came in hot, man.
01:44:20.000 I didn't know what I was doing here in the very beginning.
01:44:22.000 I had to figure it out.
01:44:23.000 But she pointed out, it's the Rolling the Ones in the 20s.
01:44:25.000 And also, in the very beginning, I was treating this show like a jam session of me and Tim.
01:44:29.000 So, like, I was equaling his energy as a host, and I didn't think of it... At the very beginning, I was like, yeah, it's just me and Tim, 50-50.
01:44:35.000 But in reality, now it's more of an orchestra, and he's the conductor, and I'm playing, like, first trumpet or something.
01:44:40.000 So it's way more smooth.
01:44:42.000 There's less room for people to be like, shut up, idiot!
01:44:45.000 Let the guy talk!
01:44:46.000 Alright.
01:44:47.000 Lee Pilkovsky says, Hey Tim, normally at this time of year, I listen while making maple syrup in my sugar house.
01:44:53.000 Ooh, wow.
01:44:54.000 Unfortunately, it burned last week.
01:44:55.000 It was a total loss.
01:44:56.000 Now being investigated as arson.
01:44:58.000 Oh no, man, sorry to hear.
01:44:59.000 This is crazy!
01:45:00.000 I want an update on this!
01:45:01.000 Could you please shout out my Give, Send, Go campaign to help rebuild?
01:45:05.000 Search for Colonial Mountain Maple.
01:45:08.000 Let's do that right now.
01:45:09.000 Immediately.
01:45:10.000 Colonial Mountain Maple.
01:45:11.000 I am pulling that up.
01:45:13.000 Give me a second.
01:45:14.000 Sorry about the loss, man.
01:45:15.000 That sucks.
01:45:15.000 Yeah, that sucks.
01:45:16.000 This also feels like it's like a cross between a true crime movie and a Hallmark movie.
01:45:21.000 Like someone came to burn the sugar house down.
01:45:24.000 Crazy.
01:45:25.000 There it is.
01:45:25.000 Colonial Mountain Maple.
01:45:28.000 And what's the goal?
01:45:29.000 I don't know.
01:45:31.000 What's the target?
01:45:33.000 I'm on mobile and I don't think it has a goal.
01:45:36.000 But I will give.
01:45:37.000 Oh, look at that.
01:45:38.000 Someone, a Timcast IRL viewer, already gave.
01:45:41.000 I, too, will give.
01:45:43.000 What is a good amount to give?
01:45:43.000 I'm gonna search there.
01:45:45.000 What are they doing?
01:45:46.000 They're building a sugar house?
01:45:47.000 They're rebuilding their maple syrup house where they make sugar.
01:45:51.000 I don't know how much that costs.
01:45:53.000 $10,000?
01:45:53.000 Google it.
01:45:54.000 I don't know.
01:45:54.000 Something like that.
01:45:56.000 Well, I'm putting in... $5,000?
01:45:57.000 $10,000?
01:45:57.000 Do some huge amount.
01:46:00.000 I'm putting in $1,000 and I'm also kicking a little bit extra to GiveSendGo because they are good.
01:46:04.000 Oh, nice.
01:46:06.000 We're gonna give a little extra to GiveSendGo.
01:46:07.000 Get that new sugar house up!
01:46:10.000 Arson? I'm sorry, that's a messed-up thing to go through.
01:46:12.000 Hannah Clare's lit right now, by the way.
01:46:13.000 She's pissed.
01:46:14.000 Yeah, I don't mess with her sugar.
01:46:16.000 What are they doing?
01:46:16.000 Messing with the maple sugar house?
01:46:18.000 That's also, like, a cultural thing.
01:46:19.000 That's a regional thing.
01:46:20.000 How could they do this?
01:46:21.000 This is a crime against humanity.
01:46:23.000 So, you know, I love GoFundMe.
01:46:25.000 It's so sticky.
01:46:26.000 It's, like, completely caramelized.
01:46:28.000 It's the worst.
01:46:29.000 GoFundMe is always, like, when you give someone money, it says, would you like to give us some money to help us?
01:46:33.000 I'm like, no.
01:46:33.000 Please?
01:46:34.000 But GiveSendGo, I gave.
01:46:35.000 Yeah?
01:46:35.000 Yeah.
01:46:36.000 Definitely.
01:46:36.000 They're notoriously good.
01:46:37.000 So, uh, a thousand bucks your way, uh, best of luck, good sir, on your, uh, maple, uh, on your sugar house.
01:46:42.000 And with, uh, the court proceedings, they don't, they don't actually charge you with that.
01:46:45.000 Colonial Mountain Maple, man, sorry to hear it.
01:46:47.000 I want pictures of the new maple house.
01:46:50.000 Get insurance!
01:46:51.000 Yeah, get insurance.
01:46:52.000 There you go.
01:46:54.000 All right, here we go.
01:46:55.000 Extra Pretty Lady says, long time supporting Berkeley County member, 2015.
01:46:59.000 Son's dream came true.
01:47:01.000 Our FFL was approved by ATF to sell and manufacture from Hedgesville, West Virginia.
01:47:06.000 Help us celebrate.
01:47:06.000 Oh, no way.
01:47:07.000 With free layaways, Berkeley Ammo RY.
01:47:10.000 All right, glad to hear it.
01:47:12.000 That's local, too.
01:47:13.000 Very cool.
01:47:14.000 That's really cool.
01:47:14.000 That's really close.
01:47:15.000 Yeah, Berkeley County.
01:47:17.000 Logan Miller says, I did an oil painting of President Trump, placed it on my website for sale, went to run an ad on Facebook, and they refuse, stating it was social issues and politics.
01:47:26.000 It is a historical painting.
01:47:28.000 I'll paint Obama and see if they will run it.
01:47:29.000 What is the web?
01:47:30.000 I wanted the website.
01:47:31.000 I wanted to look at it.
01:47:33.000 For the painting of Trump?
01:47:35.000 Yeah.
01:47:36.000 Yeah, I don't know.
01:47:36.000 You can look up Logan Miller Trump painting.
01:47:37.000 That's the name.
01:47:39.000 Well, let's read some more.
01:47:43.000 Raymond G. Stanley Jr.
01:47:44.000 says, guys, what's your thoughts on Katie Britt's response?
01:47:46.000 For me, it was a roller coaster of cringe with soft speak and anger.
01:47:50.000 A younger female Joe Biden.
01:47:51.000 I don't know who watched it.
01:47:54.000 I watched it.
01:47:55.000 What did you think?
01:47:57.000 Um, well, I don't think it spoke to me.
01:48:01.000 And I think, you know, Katie Britt's a good senator, but the intended audience, it probably was effective, which is suburban moms who may not know the impacts of our open border.
01:48:15.000 And I think she presented it in a way that could resonate.
01:48:15.000 But for someone like me, yeah, I thought it wasn't speaking to me.
01:48:18.000 I mean, she was like, the kitchen backdrop was kind of unusual.
01:48:21.000 That was their plan.
01:48:22.000 Suburban moms.
01:48:23.000 Yeah.
01:48:23.000 And it came off like very... Yeah, and you know, Bobby Jindal, he made that mistake when he was the governor of Louisiana.
01:48:33.000 See, this is what Republicans don't get.
01:48:36.000 They're like, you know what we should do?
01:48:37.000 We want to win suburban moms.
01:48:38.000 We need to get, like, someone who's a comparable age who can, like, speak to them.
01:48:42.000 Wrong!
01:48:43.000 They should have got a hot yoga instructor.
01:48:46.000 I'm not joking.
01:48:47.000 A suave-looking guy, and he should have been a strong man in a suburban house kind of setting, not a kitchen.
01:48:56.000 And he should have been very smiley and assertive and calm, but strong, piercing.
01:49:03.000 That's what you do.
01:49:05.000 You gotta get a competent Justin Trudeau-like figure.
01:49:09.000 Did you see, I guess, the Republican response the last time it was a white man, I think it was like 2012 or something.
01:49:15.000 Wow, that's a long time ago.
01:49:17.000 I don't know.
01:49:18.000 Is it gonna be effective?
01:49:20.000 Uh, maybe.
01:49:21.000 It depends on who it- They could- they could go for a swarthy, chiseled, you know, uh, guy.
01:49:26.000 We gotta start looking at these Congress members.
01:49:27.000 Who is the swarthiest and chisel- most chiseled- Chiseledest?
01:49:30.000 Chiseledest is what I was about to say.
01:49:31.000 The chiseledest of all?
01:49:33.000 Who is the most chiseled in Congress?
01:49:34.000 They don't get it.
01:49:35.000 Matt Gaetz.
01:49:35.000 Carrie Lake did a response.
01:49:36.000 I thought hers was actually really effective and it was a pretty good response.
01:49:40.000 I'm gonna let you guys in on a secret.
01:49:41.000 You just think that because you're from Arizona.
01:49:45.000 I love Carrie.
01:49:46.000 If you've seen every episode of the show, you know what I'm gonna say, but many of you haven't.
01:49:49.000 Working in non-profit fundraising, you are not allowed to tell any of the fundraisers the character traits, the personal traits that result in the highest amount of contributions.
01:50:04.000 As most of the viewers who are fans have seen me say three times, I'll ask you, Abe: what do you think, for a man, is the trait that is most likely to result in a yes for a contribution or a sale?
01:50:19.000 Confidence.
01:50:20.000 That's a good one, but it's not the number one.
01:50:24.000 Any other guess?
01:50:25.000 Height.
01:50:27.000 Tall guys, no matter how stupid for some reason, always brought in massive cash.
01:50:31.000 Now what do you think the number one trait for women was?
01:50:35.000 So you're not allowed to say this at non-profits because they'll get in trouble with HR, but I would say, you know, having worked in like five or six offices, as a director and as an associate director and training director, women with large breasts had a very high rate of success in fundraising, and so did tall guys.
01:51:00.000 You could bring in the smartest, savviest, smooth-talking-est woman, and on average, the women who came in with knowledge and confidence, they would do well.
01:51:08.000 But you take, like, you got a woman who is smart, capable, confident, knowledgeable, large breasts, she would come back with, like, 15 new members every day, and she'd be making six figures.
01:51:19.000 You get a guy who is smart, knowledgeable, competent, capable, passionate, and tall, same deal.
01:51:26.000 But then I'd notice you take someone who's dumb as a box of rocks, but 6'5", and I knew a guy like this, and this dude could sell!
01:51:35.000 I was like, man, and I'm like, I need to know your secret, like, how are you, how are you getting, uh, how are you getting these, like, these, these massive numbers?
01:51:44.000 Let me hear your pitch.
01:51:46.000 Uh, So, you know, we have a problem that, like, you know what I mean, right?
01:51:52.000 Like, it's the environment, you know?
01:51:55.000 We live in it.
01:51:56.000 And, uh...
01:51:57.000 Well, I think you should, you should help.
01:51:59.000 So we were doing donations and the women were just like, yes, whatever you want, please tell me more.
01:52:06.000 And this guy would just come back and be like, it's so easy to make money.
01:52:09.000 And I would just laugh.
01:52:09.000 I'd be like, wow.
01:52:11.000 But I also knew this guy who was 5'2", who also would make tons of money.
01:52:16.000 And he was this other character that talks like this.
01:52:18.000 Listen, I'm going to tell you what you need, what you need to do right now.
01:52:20.000 If you want to be happy, you want to get the job done.
01:52:21.000 You got to talk to me.
01:52:21.000 I'm going to tell you, come this way, come this way, put his arm around you.
01:52:24.000 That dude!
01:52:24.000 I always loved people like that.
01:52:25.000 Yeah, he was short, he was chubby, and he could sell.
01:52:29.000 And he would also do really, really well too.
01:52:31.000 But, I'm just saying there's a tendency.
01:52:35.000 Tall guys and busty women.
01:52:37.000 But attractiveness is the general obvious thing because, you know, they don't have to be busty but...
01:52:42.000 Guys, you go to a financial center of, like, Chicago or whatever, and there are dead zones where it's like, good luck raising money there; it's all guys in suits, they won't talk to you, they don't want to talk to you, they disdain you. They won't tell you this, but the directors who run the office, who know they can't publicly say it, will just be like, um, uh, Janet, we're gonna send you to the financial center again, and she goes, oh great, I love going there, and everyone else is like, how do you get people to stop and talk to you?
01:53:09.000 It's like, well, 30-year-old dude in a suit sees a beautiful woman, and he's stopping.
01:53:15.000 He's on his lunch break, and he's like, I wanna hear what she has to say.
01:53:17.000 Three minutes of flirty time is worth my donation.
01:53:20.000 But the reality is, it feels good for the guy.
01:53:22.000 You know, this woman's giving him attention, talking to him, smiling.
01:53:24.000 He's like, it makes him feel better, you know?
01:53:26.000 Let's grab some more Super Chats.
01:53:28.000 Let's go.
01:53:31.000 Maren Taylor says, when Biden started talking about Lakin Riley, and he pulled that thing out from behind the podium, I swear to God, I thought it was an egg.
01:53:38.000 I kept waiting for him to crush it in some kind of sick senile analogy.
01:53:42.000 It was the pin, I believe, that Marjorie Taylor Greene gave him.
01:53:45.000 Yeah.
01:53:46.000 That said, Lakin Riley.
01:53:47.000 And he looked at it and he's like, Lakin!
01:53:50.000 He couldn't read it!
01:53:51.000 Brutal.
01:53:52.000 And Marjorie Taylor Greene yelled.
01:53:54.000 They didn't throw her out and arrest her.
01:53:57.000 You know what?
01:53:58.000 Dean Phillips said something like that.
01:53:59.000 They're like, they didn't eject and arrest Marjorie Taylor Greene, so why did they arrest this guy?
01:54:03.000 Oh, really?
01:54:03.000 Yeah, yeah, yeah.
01:54:04.000 Good for him.
01:54:04.000 They gotta scrub that guy's misdemeanor.
01:54:06.000 It's gotta do that.
01:54:07.000 They have to bring him up, apologize to him, give him some time to speak, but he won't do it because it's bad for him.
01:54:16.000 I hope Doocy asks Corinne about it.
01:54:21.000 Eric Miller says, Tim, about TikTok, how about we don't ban TikTok, but say any social media company that's not American doesn't get Section 230 protections.
01:54:28.000 They can operate, but they're open for lawsuits.
01:54:31.000 To be fair, that's just shutting them down with extra steps; stripping them of liability protections would end them overnight.
01:54:38.000 You know.
01:54:38.000 Just got this breaking that a helicopter went down on the US-Mexico border.
01:54:42.000 That was a while ago actually.
01:54:43.000 That was today.
01:54:44.000 This is from the AP from, like, 40 minutes ago.
01:54:47.000 A National Guard helicopter crashed.
01:54:50.000 Killing three people on board.
01:54:51.000 Earlier today.
01:54:52.000 Okay.
01:54:52.000 Yeah.
01:54:54.000 National Guard members and a Border Patrol agent.
01:54:56.000 Yeah, and the other story was that cartel members apparently released video of them laughing as it happened.
01:55:00.000 Really?
01:55:00.000 Yeah, something like that.
01:55:03.000 Stephen Kilsdonk says, Tim, it's also my birthday tomorrow, so happy birthday.
01:55:07.000 Shoplifters should be called undocumented customers.
01:55:10.000 Did I hear right that TimCast is sponsoring a NASCAR at some point?
01:55:14.000 We, um, yes.
01:55:16.000 We have, uh, um...
01:55:19.000 I don't know when or how we should announce it, but we basically did a handshake with Cody Dennison.
01:55:23.000 So we're excited, and we're buying the full wrap, and I'm trying to get some other people involved, but I'm 100% on board.
01:55:32.000 So I just gotta figure out where we're currently at with it.
01:55:34.000 I think he sent us paperwork, and we're going through it, so excited.
01:55:37.000 That's huge.
01:55:38.000 Yeah.
01:55:38.000 That's very cool.
01:55:39.000 He's on YouTube, too.
01:55:40.000 What's his channel?
01:55:40.000 Do you know his channel?
01:55:41.000 Do you wanna look it up?
01:55:41.000 Who is it?
01:55:43.000 Was it CamelCast or something?
01:55:44.000 Cody Dennison.
01:55:45.000 He's a driver.
01:55:47.000 Apparently he's done.
01:55:48.000 Yeah, we played video games on Gamerbase.
01:55:50.000 That's right.
01:55:51.000 Yeah, he was here on Pop Culture Crisis.
01:55:53.000 And then they came up to me and they mentioned, you know, Brett brought him over and he's like, yeah, he's looking for sponsors for his car.
01:55:57.000 And I was like, done!
01:55:58.000 And I shook his hand.
01:55:59.000 And he was like, what, really?
01:55:59.000 I'm like, yes!
01:56:01.000 We love it.
01:56:01.000 This is great.
01:56:02.000 We're really excited.
01:56:03.000 And we'll go down to one of the races and we'll hang out.
01:56:05.000 The full wrap?
01:56:06.000 What are you gonna put on it?
01:56:06.000 Yeah.
01:56:08.000 TimCast.
01:56:09.000 But I'm hoping that we can get some other companies to be on it as well, because there's other spots on it.
01:56:13.000 But yeah, it'll just be TimCast, you know?
01:56:15.000 That's very cool.
01:56:16.000 You know what his channel's... Xavier, you're asking me what his channel's called?
01:56:19.000 I'm not seeing it.
01:56:20.000 Did you Google it?
01:56:21.000 Yeah.
01:56:22.000 I'm searching it on YouTube, but I'm not finding it through that.
01:56:24.000 Maybe searching it through the internet's a better way.
01:56:28.000 We'll make sure we get it.
01:56:29.000 Are you a NASCAR guy?
01:56:32.000 Do you like NASCAR?
01:56:34.000 No, I think I can only watch cars go in circles.
01:56:39.000 It's a complicated sport.
01:56:41.000 Not everyone understands it.
01:56:42.000 Do you like NASCAR?
01:56:44.000 I mean, I don't not like it, I have no opinion on it.
01:56:47.000 It's cool to see how many people in NASCAR just naturally lean, like, more politically right.
01:56:52.000 It's where FJB came from, baby!
01:56:54.000 I know, it's so funny!
01:56:55.000 That's why I'm like, yes, I'm on board, this is where legends are made!
01:56:59.000 Yeah, not against it, but I just don't particularly watch it, but I- I'm sure in person it'd be really fun.
01:57:04.000 I feel that's what I've always heard that it's actually one of the best places to go like with a family of like a bunch of kids of different ages because like the cars are interesting, there's food, there's like noise, like there's a lot of stuff to do.
01:57:15.000 So mostly.
01:57:18.000 Let's go!
01:57:19.000 Ultima says, I do think a good exchange would be if you have a baby as a woman, then you can vote.
01:57:23.000 Take away single childless women's right to vote.
01:57:25.000 Based.
01:57:26.000 Okay, the only caveat to that is like, if you have a woman who is infertile, who can't have a kid, what's her option there?
01:57:33.000 Draft.
01:57:34.000 She has to be in the draft?
01:57:36.000 She legitimately wants to have one.
01:57:37.000 Service guarantees citizenship.
01:57:40.000 So, service could be if you are in civic duty of any kind.
01:57:46.000 If you're a paramedic.
01:57:47.000 And maybe there could be a threshold of limited community service.
01:57:53.000 Or, if you're a woman, it's that.
01:57:55.000 Or, you had kids.
01:57:55.000 I like tax credits for kids.
01:57:58.000 We could save the birth rate!
01:58:00.000 This is it!
01:58:02.000 I mean, look, the problem can be solved very easily.
01:58:05.000 Conservatives and libertarians, post-liberals, anti-woke, anti-establishment, have as many kids as you can.
01:58:10.000 Let the liberals not have kids.
01:58:12.000 I celebrate their not wanting to have children.
01:58:14.000 Thank you so much for sacrificing for us.
01:58:18.000 My future kids will be much, much better off because they aren't having kids.
01:58:28.000 The children of conservatives are going to grow up, and they're going to be in their early 30s, and they're going to be voting, and the country is getting cleaned up, the streets are being cleaned, the businesses are booming, and they're going to be going to their parents and being like, what were you guys complaining about?
01:58:40.000 Everything's so nice!
01:58:41.000 And they're going to be like, you have no idea.
01:58:43.000 But the liberals aren't going to have kids.
01:58:44.000 Good times.
01:58:45.000 But do you think that's happening when you have, you know, I don't know if anybody else served in the military, but you know, the greatest thing about the military is it's the greatest equalizer.
01:58:53.000 No matter if you're poor, rich, what religion you are, what race you are, it brings everybody together.
01:58:57.000 You wear that same uniform, but then you're seeing the military's recruitment is, is way below now.
01:59:02.000 I mean, I think they missed it by 40,000.
01:59:04.000 Yeah.
01:59:05.000 So what kind of generation are we creating where people no longer want to serve and defend this country anymore?
01:59:11.000 Well, they don't want to serve and defend the likes of Joe Biden and the Uniparty establishment, so a good one.
01:59:16.000 And once... The inevitability is, our children will inherit this country, they will inherit the world, so the fewer liberals there are, the less liberal ideology expands.
01:59:29.000 Everybody always says, yeah, but they're indoctrinating kids.
01:59:31.000 Yeah, but they're losing.
01:59:32.000 They're screaming, you're banning books because we are winning.
01:59:36.000 So they don't have kids, and their ideological push is being thwarted.
01:59:40.000 And now, with Gen Z being the most conservative generation in a hundred years, and that's true, and that means trending conservative.
01:59:48.000 They're skewing back towards the conservative side.
01:59:50.000 Gen Z's view of same-sex marriage is now comparable to that of, I think, the silent generation or something like this.
01:59:55.000 Like, they're opposing same-sex marriage.
01:59:58.000 With that push, millennials can scream and cry and try to indoctrinate all night.
02:00:02.000 But the latest poll was 65% of Gen Z said, in the poll, calm down Ian, said that Donald Trump is going to shake up the country for the better.
02:00:11.000 So this is now what, the fifth or sixth poll showing Gen Z skewing towards Trump.
02:00:15.000 So they can indoctrinate all they want, they can cry all they want, but the future is clear.
02:00:20.000 Yeah, but if you go to Ivy League schools and you see the percentage of people who identify as LGBTQ, it's like astronomical.
02:00:28.000 Yeah, but that doesn't matter when Gen Z opposes same-sex marriage at a rate comparable to the silent generation.
02:00:34.000 But you're talking about the general population, when you're talking about who's going to be leading these institutions, who are going to be our prosecutors, our judges.
02:00:41.000 But those institutions are failing.
02:00:42.000 Completely.
02:00:43.000 Yeah.
02:00:43.000 So what's going to become of society?
02:00:46.000 So when, uh, like the military is a good example, when they fail to get their recruitment numbers and they, uh, and Donald Trump gets elected and then Donald Trump appoints some people who are okay.
02:00:56.000 He doesn't have the best track record on hiring, but then, uh, you know, 20 years from now, you're the president and you are going to weed out the, the, the bad people and bring in good people.
02:01:07.000 Then the systems, then all of a sudden people want to join up again.
02:01:10.000 All of a sudden people are like, did you see what President Hamade is doing?
02:01:14.000 He's getting rid of all the weird woke garbage.
02:01:16.000 He's instituting these programs that are going to be beneficial for veterans.
02:01:19.000 He's helping.
02:01:20.000 And then they start voting for better members of Congress.
02:01:23.000 And then the military will be built back up.
02:01:24.000 If we can, I mean, going back to Ian's point about our elections, right?
02:01:27.000 I mean, our elections are severely compromised.
02:01:30.000 I've witnessed it firsthand in Arizona.
02:01:32.000 You know, there are still 9,000 uncounted ballots in the 2022 election, and we lost it by 280.
02:01:38.000 So when you start to see that the apparatus, whether it's the media, whether it's the machines... I mean, that's actually what happened: we were down 511 votes, and it went down to 280, because there was a machine, ES&S machines, reading the ballots incorrectly in one county.
02:01:51.000 So there's so much more to it. I agree. I hear you. I'm just saying that that system can't sustain itself if Gen Z
02:01:59.000 is skewing further and further right as time goes on.
02:02:02.000 Eventually what happens is the judges and the lawyers that are fighting and winning cease to exist as time goes on.
02:02:10.000 They retire. They quit. They pass.
02:02:14.000 There's going to be over time more conservatives than liberals and then it comes down to 20 lawsuits are filed in
02:02:23.000 Arizona from the right and 3 from the left.
02:02:27.000 And the left might win one or two, but eventually the right just overwhelms.
02:02:30.000 But that's assuming they're not going to change the rules once they take power, which they are doing.
02:02:35.000 What you would be describing then is a fringe ideology in control of institutions with a majority population sitting back and accepting it, which I don't see as being possible.
02:02:43.000 I think that's what we're witnessing right now.
02:02:45.000 But we're in the middle of it.
02:02:47.000 I'm saying in 20 years, the way Gen Z is skewing, it will be untenable for the left to maintain what they're doing.
02:02:52.000 The institutions will falter, Disney's losing billions of dollars, Bud Light lost $40 billion, $30 billion in stock value and $10 billion in sales.
02:03:00.000 It's just going to slowly stop working.
02:03:03.000 And what'll end up happening is, You'll see a skew where you'll get more moderate-leaning Democrats to try and win their districts because they're unpopular.
02:03:13.000 The progressives will not be working.
02:03:14.000 Sure, they'll try and play dirty games, but eventually the system just won't operate the way they want it to because the population is against them.
02:03:21.000 It will crack.
02:03:22.000 But you know what?
02:03:23.000 We'll see.
02:03:24.000 And we'll wrap it up there.
02:03:26.000 So if you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, and if you're not already a member, tomorrow's my birthday!
02:03:34.000 So head over to... You know what I'll do?
02:03:35.000 I'll do a shout-out today and Monday.
02:03:36.000 Go to... Because we don't have a show on Saturday.
02:03:38.000 Go to TimCast.com, click join us, become a member at 10 bucks a month, get access to all of our wonderful content, join the Discord server, hang out with like-minded individuals, and that would be the best birthday present a man could ask for.
02:03:49.000 You can follow the show at TimCast IRL.
02:03:51.000 You can follow me personally at TimCast.
02:03:53.000 Follow me on Instagram.
02:03:53.000 We're gonna have some updates from Boonies.
02:03:55.000 Uh, Abe, do you want to shout anything out?
02:03:57.000 No, I just want to thank you all for having me.
02:03:59.000 And, you know, I'm a young man getting into politics.
02:04:03.000 And I think that, you know, you're really tested when the whole world's coming after you. I had the establishment come after me after November 2022, trying to make me not contest my election.
02:04:14.000 I'm still in lawsuits over that, because what those corrupt people did is they really stole the votes of so many Arizonans.
02:04:21.000 And so, you know, I've gained decades of knowledge in a condensed one year period of time.
02:04:25.000 And I can't wait to go into Congress to bring all of that courage and fight in me because our country is not headed in the right direction.
02:04:32.000 And that's why I'm proud to be endorsed by President Trump, and Kari Lake will be our next senator.
02:04:36.000 And I think, you know, better days are ahead of us, because I do think people are waking up, and they can go to my website, Abe4AZ.com, if they want to learn more.
02:04:43.000 Are you in a Republican district? Like, what's your district?
02:04:45.000 Yeah, I won that district in the AG race by 12%.
02:04:48.000 So it's very Republican.
02:04:50.000 Oh, okay.
02:04:50.000 Right on.
02:04:51.000 Cool.
02:04:51.000 Looking forward to it.
02:04:52.000 Thanks for coming, man.
02:04:52.000 Thanks for hanging out.
02:04:53.000 Thank you.
02:04:53.000 Are you on Twitter or Instagram or anything?
02:04:55.000 Yep.
02:04:56.000 Twitter, Instagram, Truth Social at Abraham Hamadeh.
02:04:59.000 Nice, awesome.
02:04:59.000 It's been fun having you here.
02:05:00.000 Thank you, guys.
02:05:02.000 Pri, happy birthday to Tim.
02:05:03.000 I hope you have a great Saturday.
02:05:05.000 I'm Hannah-Claire Brimelow.
02:05:05.000 I'm a writer for SCNR.com.
02:05:07.000 You can follow our work at TimCastNews on Twitter and Instagram.
02:05:11.000 I'm on Twitter at hcbrimel and I'm on Instagram at hannahclaire.b.
02:05:14.000 Have a great weekend.
02:05:14.000 Bye, Ian.
02:05:15.000 You too.
02:05:15.000 Happy birthday, Tim.
02:05:16.000 See you guys later.
02:05:17.000 Abe, good to meet you, man.
02:05:18.000 I'm going to be out of town next week, everyone.
02:05:20.000 I'll be working with Luke Rudkowski at We Are Change on the best political show through the week.
02:05:25.000 I think it's like 6 p.m.
02:05:27.000 Eastern to 8 p.m.
02:05:28.000 Eastern.
02:05:28.000 It's right before this show.
02:05:30.000 Monday through whatever.
02:05:31.000 It's going to be X amount of days in the week.
02:05:32.000 So we'll see you then.
02:05:33.000 So tune in and we'll keep in touch.
02:05:35.000 Follow me on the internet, Tim.
02:05:36.000 Happy birthday, man!
02:05:37.000 It'll be fun.
02:05:38.000 Have a good time on that 80-foot ramp you guys are going to be dropping in on.
02:05:42.000 Let me know how it goes, brother.
02:05:43.000 It's 22 from the top.
02:05:45.000 22 feet.
02:05:45.000 22 foot.
02:05:46.000 Cool looking.
02:05:47.000 Oh, get another photo of that with a human standing there for reference.
02:05:50.000 I think if we do that, people might be like, maybe I can.
02:05:54.000 Yeah, yeah, yeah.
02:05:55.000 I think they don't realize.
02:05:56.000 The door was in the shot, though.
02:05:58.000 And I don't think people know how tall the door is.
02:06:00.000 I didn't even know it was a door.
02:06:00.000 Yeah.
02:06:02.000 All right, Serge.
02:06:02.000 Have a nice weekend, man.
02:06:04.000 Yeah, thanks.
02:06:04.000 Appreciate it.
02:06:05.000 And to everyone else as well, have a good weekend.
02:06:07.000 I'll see you tomorrow, Tim.
02:06:09.000 All right.
02:06:09.000 Yeah, we're going to do the soft session, soft open.
02:06:12.000 So thank you all so much for hanging out.
02:06:14.000 Become a member at TimCast.com and we will see you all on.
02:06:17.000 We'll see you all.
02:06:17.000 We'll see you throughout the weekend with our clips.
02:06:19.000 Watch the clips on TimCast or on the YouTube channel.
02:06:21.000 Subscribe to the channel.