Timcast IRL - Tim Pool - February 02, 2023


Timcast IRL - Biden's Home RAIDED By FBI, Feds Trying To COVER UP SCANDAL w/ Sameera Khan


Episode Stats

Length

2 hours and 2 minutes

Words per Minute

215.49

Word Count

26,491

Sentence Count

2,106

Misogynist Sentences

33

Hate Speech Sentences

42


Summary

After the midterms, the FBI raided the home of Joe Biden, and it turns out they found some classified documents. Plus, the deep state is trying to get rid of him, and Sameera Khan joins us to talk about all that and more.


Transcript

00:00:00.000 The FBI has searched the home of Joe Biden.
00:00:22.000 Now, I call it a raid because that's what they said when they searched the home of Donald Trump. And it's surfaced that it looks like Hunter Biden had access to classified information somehow.
00:00:29.000 Very strange indeed.
00:00:30.000 Well, I don't know exactly what's going on or why.
00:00:32.000 Some people think it's that the deep state's trying to remove Joe Biden because he is going to be running in 2024.
00:00:39.000 So his team says administration.
00:00:41.000 That's what it's looking like.
00:00:43.000 We got a story about a crying chief of staff saying he's going to be with them when he runs or something like that.
00:00:48.000 So maybe the deep state really does want to get rid of him.
00:00:50.000 It's the only way they can do it.
00:00:51.000 Or I actually think this may be a cover-up.
00:00:54.000 We're learning now that the National Archives were barred from telling the world, telling the American people, that they were searching the home of Biden for these classified documents and in fact found some.
00:01:03.000 Now who would tell them to do it?
00:01:04.000 It had to be Merrick Garland or Joe Biden.
00:01:07.000 So it looks like the reason they searched his house is because they're working at his behest, trying to cover it up, collect the documents and stop the story from getting out.
00:01:15.000 And that's exactly what they did.
00:01:17.000 And now the midterms are over.
00:01:18.000 The story has gotten out only because CBS News reported on it.
00:01:22.000 So we'll talk about that.
00:01:22.000 Plus, we'll talk about whether or not he's going to run.
00:01:24.000 We got Donald Trump declaring war on the culture war.
00:01:27.000 Apparently, Trump's going to make his key issues all about culture war stuff.
00:01:31.000 And we've already started to see it.
00:01:32.000 So this should be pretty interesting.
00:01:33.000 And then we got Taiwan on high alert.
00:01:36.000 Power goes out at LAX, LA airport.
00:01:39.000 Some people are concerned, maybe cyber attack.
00:01:41.000 And then we have this really crazy story.
00:01:43.000 It's kind of e-drama celebrity gossipy, but it's interesting because it's about AI deepfake porn.
00:01:50.000 And how like some Twitch streamers are crying because they've been deepfaked or whatever.
00:01:55.000 So, uh, let's talk about how the future's gonna get crazy.
00:01:58.000 Before we get started, my friends, head over to TimCast.com, become a member to support our work directly.
00:02:03.000 Click that Join Us button and you can support not only this show, the videos I make over at YouTube.com slash TimCast and TimCast News, the website.
00:02:12.000 But you're helping with our cultural endeavors.
00:02:14.000 So we're going to be setting up a physical space.
00:02:16.000 It's the coffee shop.
00:02:17.000 We've already got the building.
00:02:18.000 The contractor's coming in.
00:02:19.000 We're going to start doing the construction.
00:02:21.000 Plus, we have Freedomistan.
00:02:22.000 It is nearing completion.
00:02:23.000 It's going to be really exciting.
00:02:24.000 We're going to be launching a morning show, a skate show, tons of really awesome stuff.
00:02:28.000 And it's all thanks to you.
00:02:29.000 These things take time.
00:02:30.000 Plus, of course, we have the fact-checking non-profit, which is currently going through the filing process.
00:02:35.000 And it's difficult to do these things because we have to file in every single state.
00:02:39.000 It's ridiculous.
00:02:40.000 But we're working on it.
00:02:41.000 We're getting there.
00:02:42.000 And with your support, we're going to get these things done.
00:02:44.000 So thank you so much for being a member.
00:02:45.000 Don't forget to smash that like button.
00:02:47.000 Subscribe to this channel.
00:02:48.000 Share the show with your friends.
00:02:49.000 Joining us tonight to talk about all of this and more is Sameera Khan.
00:02:53.000 Hi.
00:02:54.000 Thank you so much for having me.
00:02:55.000 Very excited to be here.
00:02:56.000 So who are you?
00:02:57.000 What do you do?
00:02:58.000 I'm an independent journalist as of right now but I used to work for RT in DC.
00:03:02.000 I was their Washington correspondent for a little bit and yeah I've been in the game for a little while and I find it really interesting.
00:03:09.000 Now I used to be in the like progressive left.
00:03:13.000 I was very involved with the Bernie Sanders campaign, and then recently, with the country moving, I guess, left, I've taken more of like a centrist position, and I've gotten pretty anti-left. So yeah, that's my background.
00:03:29.000 Anti-woke journalist, is that it?
00:03:31.000 Pretty much, pretty much anti-woke journalist, but I used to focus more so on foreign policy, but I'm also, like Trump, getting more so into the culture war.
00:03:39.000 I find that really interesting.
00:03:41.000 And it's the same thing, right?
00:03:42.000 Culture war, they're also using in foreign policy, so on and so forth.
00:03:46.000 Seems fitting for this show, so thanks for joining us.
00:03:48.000 We've got Hannah-Claire Brimlow.
00:03:49.000 Hi, I'm Hannah-Claire Brimlow.
00:03:51.000 I'm a writer for TimCast.com.
00:03:53.000 Oh, okay.
00:03:54.000 Sorry, I need a longer intro.
00:03:55.000 Yeah, develop something.
00:03:56.000 I'm kind of in your boat, Hannah-Claire.
00:03:58.000 I'm Ian Crossland.
00:03:59.000 Got nothing to say too much, Sameera.
00:04:00.000 We talked a little bit about your last name, Khan.
00:04:02.000 Yeah.
00:04:02.000 Maybe you come from Genghis.
00:04:03.000 Yeah, I mean, if he's my ancestor, that would be super based.
00:04:06.000 That's hardcore.
00:04:07.000 Yeah.
00:04:07.000 Well, good to see you.
00:04:09.000 People are saying that when I was talking in the intro, it glitched and, like, deleted the part where I said Donald Trump was running or something like this and going after the culture war.
00:04:18.000 Like, a bunch, like four different superchats, came in and said some weird glitch happened.
00:04:21.000 Just to clarify, Donald Trump is running for president, is what you said?
00:04:24.000 declaring war on the culture war. That's funny. War. I'm declaring war on war.
00:04:27.000 Or I should like... It'll never happen again. He didn't say that. Yeah. And I'm saying that
00:04:31.000 he's going to be directly addressing culture war issues because he gets the biggest applause for
00:04:36.000 it when he talks about it. The culture war is something I feel like I can impact. Yeah. Like
00:04:40.000 politics and stuff I feel like an outsider.
00:04:42.000 I can offer advice and things like that from the outside, but the culture where I'm in it, you know, we're in it.
00:04:46.000 Yeah, I mean, it's worked well for DeSantis and Youngkin, so it's a smart decision on Trump's part, because I feel like one of the criticisms that I've seen from his supporters, from MAGA people, is that he hasn't gone hard enough on the culture war stuff.
00:04:59.000 But now that he's changing, I think that that's probably a good move for him.
00:05:03.000 Yeah.
00:05:03.000 It's worked for DeSantis, so.
00:05:04.000 What do you think, Serge?
00:05:06.000 Yo, I'm at Serge.com.
00:05:08.000 Just trying to get over this stupid sinus infection, so my voice sounds super weird.
00:05:12.000 Yeah, but almost there.
00:05:14.000 Done soon.
00:05:14.000 Well, let's jump into this first story.
00:05:16.000 We got this from the Daily Caller.
00:05:17.000 You may have seen the news earlier today.
00:05:18.000 DOJ searches Biden's Delaware home for a second time.
00:05:22.000 Searches!
00:05:23.000 Just a search.
00:05:24.000 Not a raid?
00:05:24.000 Not a raid, no.
00:05:26.000 It's only a raid when you're going after your political enemies.
00:05:28.000 Right.
00:05:28.000 Because Joe Biden basically is the DOJ.
00:05:31.000 They're just searching his house.
00:05:33.000 So here's what I want to point out.
00:05:35.000 We titled this, His Home's Raided, because that's what they say of Donald Trump.
00:05:39.000 They say his home was raided.
00:05:40.000 Right.
00:05:41.000 And it's like Donald Trump was cooperating with them, letting them come in, go through everything.
00:05:44.000 They went there twice.
00:05:45.000 They went there the first time, said put a lock on it, said okay, then they come back, break the lock off, take the documents, said you didn't cooperate.
00:05:49.000 And it's like, he was, but the media does that thing.
00:05:53.000 Yeah.
00:05:53.000 Where if Trump does it, it's evil.
00:05:55.000 If Biden does it, it's no big deal.
00:05:57.000 No big deal.
00:05:58.000 Is this the Daily Caller?
00:05:58.000 Yes.
00:05:58.000 And they're still calling it a search?
00:06:00.000 And that's my point.
00:06:01.000 When I'm reading the Post Millennial, and even the Post Millennial called Antifa domestic terrorists "protesters," I'm like, come on, guys.
00:06:08.000 Libby was here and I told her, I'm like, Libby, what are you doing?
00:06:11.000 She's like, I know, I know.
00:06:12.000 And then there was even on TimCast.com, an article was put up where it referred to them as protesters.
00:06:17.000 I'm like, they're not, they're being charged with domestic terrorism.
00:06:20.000 So Joe Biden has the FBI go to his house and say it was a planned search.
00:06:24.000 Why is the FBI searching the home of the sitting president for illegally held classified documents?
00:06:31.000 What is this, the fourth search of his properties?
00:06:35.000 To be fair, you know, they've had a 100% success rate so far.
00:06:39.000 They raided his first home, definitely found documents.
00:06:42.000 Now they have to check the beach house and see if he left anything there.
00:06:44.000 They didn't find anything.
00:06:45.000 Yeah, I know.
00:06:45.000 But, like, it's worked for them in the past.
00:06:47.000 We can't trust that he hasn't.
00:06:48.000 I mean, Biden spends an enormous amount of time at these two properties.
00:06:53.000 It's not surprising.
00:06:54.000 I am surprised that they didn't check it sooner, right?
00:06:57.000 And I would only object to the term search.
00:06:59.000 I think possibly you could say it's a search because I'm sure the Biden administration knew about it beforehand, whereas I don't think the same courtesy was extended to Trump when they raided Mar-a-Lago, right?
00:07:07.000 A raid is an act of aggression.
00:07:09.000 The search is like, hey, just so you know, we're coming.
00:07:11.000 Don't forget to move that stuff in your garage.
00:07:13.000 Yeah, and also before this, before the entire Trump debacle, the FBI had very low trust with the American people, so they're probably trying to make the FBI look more credible again, possibly.
00:07:25.000 That's a good point.
00:07:26.000 Could be one of their motives for sure.
00:07:28.000 What was their reason for going to his house anyway, the Delaware house?
00:07:32.000 Did they have like a reasonable cause or probable cause for something being there?
00:07:37.000 Or was it just like... They're calling it a search.
00:07:39.000 Yeah, I think it's because he owns two properties in Delaware, one at Rehoboth Beach and one in Wilmington.
00:07:45.000 The Wilmington one is where they found the documents, so now they need to check the other place where he spends a lot of time.
00:07:50.000 One of the weekends where they were searching his house, he just went to his other house in Delaware, right?
00:07:53.000 Like, these are places that he spends an enormous amount of time, so if he is doing something he's not supposed to do with documents.
00:08:00.000 It makes sense that you would check both properties.
00:08:02.000 It'd be weird to leave one off the list.
00:08:04.000 I feel like every president would have classified documents in their homes.
00:08:08.000 I mean, and like, what is it?
00:08:09.000 We talked about this one time.
00:08:10.000 Like, what is a classified document?
00:08:11.000 Is it like a menu from a dinner?
00:08:13.000 Because at one point those were classified.
00:08:15.000 I don't know.
00:08:16.000 I mean, it is weird.
00:08:16.000 They were in his garage and whereas Trump had his like stored in the appropriate way.
00:08:21.000 But I mean, I think it's cover up.
00:08:23.000 Yeah.
00:08:24.000 I mean, the spin is interesting.
00:08:25.000 I think that says a lot.
00:08:27.000 And I feel like this headline in particular, if you're not aware, and I keep saying this, that he has two homes in Delaware, they're like, they checked his Delaware home, there's nothing there.
00:08:35.000 It sort of canceled out the headlines from a couple weeks ago, where it's like, well, we checked his Delaware home and there were documents.
00:08:42.000 Here's what I'm thinking.
00:08:44.000 Why are they going after Biden and his documents?
00:08:47.000 But Biden's legal team, I think his legal aides are the ones who actually informed the FBI they had the documents.
00:08:52.000 Whereas with Trump, they raid, it's a legal issue.
00:08:55.000 Imagine this, you're Joe Biden, and you want to dig up dirt on a political opponent, your principal political opponent. Or you're Merrick Garland.
00:09:02.000 Basically, Biden says, oh, won't someone rid me of this Trump?
00:09:05.000 And Merrick Garland goes, I know exactly what you're saying.
00:09:08.000 He then goes and says, let's go raid Trump's house.
00:09:12.000 We're going to use the documents, you know... what was the law?
00:09:15.000 What is the National Archives Act or whatever that law is?
00:09:18.000 Record Keepings Act or something.
00:09:19.000 And they're going to use that as the pretext to stop Trump from running in 2024.
00:09:24.000 And then Merrick Garland says, but hold on, if this is going to work, we got to make sure Biden doesn't have the same problem.
00:09:29.000 Yeah.
00:09:29.000 He's got to be clean. So then he orders the search of Biden's property, not because he's
00:09:34.000 targeting Biden, because he's helping Biden. I see. So trying to make Trump look worse.
00:09:39.000 And if Biden gets out of office, and then they did the raids, it would annihilate the Biden name.
00:09:45.000 They'd be like, look, you've got all the same things, you hypocrite.
00:09:47.000 So they're just getting it out of the way.
00:09:49.000 Is that what you're saying?
00:09:51.000 When they raided his house the first time, no one knew.
00:09:54.000 This was in November.
00:09:55.000 They covered it up.
00:09:58.000 No one knew it happened, okay?
00:10:00.000 So I'm thinking, you know, we were talking about this last week, like, why would they go after Biden?
00:10:04.000 Is the deep state trying to remove him?
00:10:06.000 Like, ending his 2024 chances?
00:10:08.000 Maybe.
00:10:09.000 It could be both.
00:10:10.000 It could be this.
00:10:11.000 They know Joe Biden can't run in 2024.
00:10:13.000 He's not going to win.
00:10:14.000 Yeah.
00:10:14.000 They also don't want Trump to run and he will win.
00:10:17.000 What do you do?
00:10:19.000 Oh, oh, geez.
00:10:19.000 Oh, Biden, you broke the record keeping.
00:10:21.000 You can't run for president now.
00:10:23.000 Oh, man, that means Trump can't run either.
00:10:25.000 And then if Trump tries to run, hey, look, we even stopped Biden from running because we're being fair.
00:10:30.000 Interesting.
00:10:30.000 So he's like their sacrificial lamb.
00:10:32.000 Yeah.
00:10:33.000 Well, I mean, there's also like a lot of infighting going on with the Democrats regarding Biden, whether he should be the one to run in 2024 or should they replace him.
00:10:41.000 But after the midterms, it seemed like Biden's approval rating went up.
00:10:45.000 So they stopped with the, you know, anti-Biden stuff for a while.
00:10:48.000 But I don't know.
00:10:48.000 Maybe this will change things.
00:10:49.000 I mean, he said he would announce after Christmas, right?
00:10:51.000 We're after Christmas.
00:10:52.000 He has not announced.
00:10:53.000 And I think that can only tell you that he wants to run.
00:10:56.000 Yeah.
00:10:56.000 And they don't want him to.
00:10:57.000 It's going to be a progressive.
00:10:59.000 It might not be a liberal guy.
00:11:00.000 Who?
00:11:01.000 Who could?
00:11:02.000 I think that's their biggest issue, they don't have a clear frontrunner.
00:11:04.000 I know people have said Newsom, I know people say Kamala, but like- Kamala's not likable.
00:11:08.000 Kamala's not likable, her approval ratings have always been lower than Biden's, and that's saying something.
00:11:13.000 Over at PredictIt, you can buy... there's a... it's the prediction
00:11:18.000 market thing. For those that don't know, you buy
00:11:20.000 shares in a concept. So it's like, who will be the Democrats'
00:11:25.000 2024 nominee? And you can buy one share of Kamala Harris?
00:11:29.000 Yes, for nine cents. That means if you buy a share for nine
00:11:34.000 cents, and she does become the nominee, you get $1, or like $1.10
00:11:37.000 or something like that. That's a tremendous, that's a 10-to-one
00:11:40.000 return, right? But come on, no sane person thinks she's going
00:11:45.000 Yeah.
00:11:46.000 I don't know why people are buying shares in Kamala Harris, but here's the best part.
00:11:49.000 You can buy no, she won't be for 91 cents.
00:11:54.000 Now that means for every 91 cents you spend when she invariably is not the nominee, you get nine cents.
00:12:01.000 That sounds like free money to me.
00:12:02.000 I'm not giving financial advice.
00:12:03.000 I'm just saying.
00:12:05.000 It's kind of crazy where there's no way.
00:12:09.000 It's gonna be really funny in 2024, or 2023, who knows, when they're like, Kamala Harris is the nominee, and I'm just like, wow, I did not see that coming.
00:12:16.000 I gotta tell you, man, if someone asked me to make a large wager, a large sum of money, on whether or not Kamala Harris would be the nominee, I'd say no.
00:12:27.000 It's not gonna happen.
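The share-payout arithmetic being tossed around in this exchange can be sketched as follows. This is a hypothetical illustration of how binary prediction-market contracts generally settle, assuming each contract pays $1.00 on the correct outcome and $0.00 otherwise and ignoring fees (and the "$1.10" aside); it is not an actual PredictIt API and, as the host says, not financial advice:

```python
# Sketch of the binary prediction-market math discussed above.
# Assumption (not from the transcript): a contract settles at $1.00 if its
# outcome occurs and $0.00 otherwise; exchange fees are ignored.

def settlement_profit(price: float, outcome_occurs: bool) -> float:
    """Per-contract profit, in dollars, for a contract bought at `price`."""
    payout = 1.0 if outcome_occurs else 0.0
    return round(payout - price, 2)

# A "Yes" share bought at 9 cents earns 91 cents of profit if it hits:
# roughly the 10-to-one payout described in the conversation.
print(settlement_profit(0.09, True))   # 0.91

# A "No" share bought at 91 cents earns 9 cents of profit when the "no"
# side resolves, which is the "free money" framing above (about a 9.9%
# return on the 91 cents staked, not 9 cents free and clear per dollar).
print(settlement_profit(0.91, True))   # 0.09
```

The asymmetry is the point of the banter: the long-shot "Yes" buyer risks a small stake for a large payout, while the "No" buyer risks a large stake for a small, likely payout.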
00:12:27.000 Do you guys think that in 2036 there will be a United States?
00:12:32.000 Or that we will have a president?
00:12:34.000 That we'll be able to have a president?
00:12:34.000 You think that we'll be structurally sound enough for it?
00:12:37.000 Yeah, it'd be like four elections from now or something.
00:12:40.000 Depends if, you know, there's a civil war coming or not.
00:12:43.000 I wonder, because we talk about, if we just sit here and wait, is it going to be calm?
00:12:48.000 Dude, if we just sit here and wait, the world's going to blow itself up.
00:12:51.000 We have to actively change the system.
00:12:54.000 I can't wait for idiot A or idiot B to become the next leader anymore.
00:12:59.000 I'm concerned that if we just play games and watch it like a TV show, that the inevitable demise of the American way is around the corner.
00:13:08.000 I hate to make a bet, but I do think that in 13 years... that feels far away right now, but it's really not, in the span of time. Anyone who's in their 60s will probably tell you that, like, 13 years can go by really quickly. I feel like there will probably be a country. I would not be surprised if the party system looks different, and that might be wishful thinking on my part, but I think it might be, a little bit, yeah. But I think that there's so much... like, the criticism Democrats and the left-leaning media always levy at Republicans is that they're infighting and they don't get along, but the same thing is true for the left, right?
00:13:38.000 And I think as the youngest generation ages into being a solid voter bloc, you know, they're the most likely to cross lines on tickets, right?
00:13:45.000 If you ask them how they feel about different issues, it doesn't all fall in one camp.
00:13:49.000 So again, wishful thinking on my part, but I don't think the parties as we know them today will necessarily be the same in 13 years.
00:13:57.000 I do think it's gonna get worse.
00:13:58.000 It's gonna get more progressive on the Democrat side.
00:14:01.000 On the Republican side too.
00:14:02.000 But they'll leave people behind.
00:14:04.000 They both will leave people behind.
00:14:05.000 Like we talked about this with like the shrinking middle class.
00:14:07.000 Like, what if politics gives way to a solid middle-class bloc of voters, and I mean ideologically, not financially, right?
00:14:14.000 The people who don't feel like they have a home in the Republican Party or I don't think the two-party system is going to change anytime soon.
00:14:22.000 I mean, the Republicans and Democrats, they've collaborated and colluded to make sure that never happens.
00:14:27.000 They've pretty much stopped independent candidates from getting on the debate stage.
00:14:31.000 I mean, the last time was Ross Perot.
00:14:32.000 That's never going to happen again.
00:14:34.000 Unless, you know, the American people wake up.
00:14:36.000 I don't see that happening anytime soon.
00:14:38.000 Unless there is some sort of civil war, so.
00:14:41.000 I think the clearest sign would be if we had more independent governors.
00:14:44.000 I haven't thought that for a while.
00:14:45.000 Because Oregon ran an independent who got a fair amount of votes.
00:14:47.000 She didn't win.
00:14:48.000 They let the Democrat win.
00:14:51.000 But I do think that, like, yes, you're right.
00:14:54.000 Administratively, it seems really hard to imagine a non-two-party system.
00:14:57.000 But I just don't think that's what... I mean, it's not in our Constitution or anything.
00:15:00.000 It's possible.
00:15:00.000 No, it's just the way things have worked out, right?
00:15:02.000 But I just think that young voters are more likely to leave the idea of, like, you have to be one or the other behind.
00:15:09.000 It's hard, though, because there are states where you have to register with a party to vote in that primary, right?
00:15:14.000 And that will be a difficult system to change.
00:15:15.000 Well, according to polling, I think 48 to 50 percent of Americans are independent, but then those independents end up voting either Republican or Democrat when push comes to shove.
00:15:25.000 Because you have to.
00:15:26.000 To participate in the primaries, you have to be in the party.
00:15:28.000 Yeah, and in the general, I'm talking... But as that gets left behind, like, if we were to reverse states, like, there are so many states that you have to do that and there are other states that don't want that.
00:15:37.000 If we didn't have to identify in the primaries, I think you would see a shift on the national platform.
00:15:41.000 Again, wishful thinking on my part, but... Or maybe it has to be a top-down approach.
00:15:45.000 Maybe.
00:15:46.000 Maybe it needs to start on the national level.
00:15:47.000 It needs to start in the presidential election for it to trickle down.
00:15:51.000 Yeah.
00:15:52.000 We'll see.
00:15:52.000 It'd be really interesting.
00:15:53.000 Yeah.
00:15:53.000 Let me pull up this story right here from the Daily Mail, and this is why I think that the Joe Biden FBI raid is actually a cover-up.
00:16:00.000 Daily Mail reports, What else are they hiding?
00:16:02.000 White House claims of transparency face even more scrutiny as it's revealed National Archives was blocked from releasing statement on classified documents found at Biden's think tank.
00:16:15.000 The revelation came during questioning of Archives General Counsel Gary Stern by top Republican James Comer.
00:16:21.000 It raises questions over who stopped the release from going public.
00:16:25.000 That is to say, just before the midterms, when they raided Biden's home and they knew he had these documents, the National Archives were barred by someone.
00:16:37.000 It'd have to be either Biden or Merrick Garland.
00:16:40.000 They said, do not make a statement.
00:16:42.000 You are barred from doing so.
00:16:43.000 That means when the midterms came, nobody had the opportunity to learn this.
00:16:50.000 CBS News eventually reports it, and that's how we find out.
00:16:53.000 And now we're actually getting some information as to what's going on.
00:16:57.000 Now there's more raids, there's more searches.
00:17:00.000 When it can't affect the Democrats.
00:17:02.000 Exactly.
00:17:03.000 And this says to me, if it's Garland or if it's Biden, They're covering it up.
00:17:07.000 Well, if someone asked the National Archives to cover it up.
00:17:11.000 For sure, 100%.
00:17:12.000 If the Archives were asked not to report, they were asked to cover it up.
00:17:14.000 And that means it was Biden or Garland.
00:17:16.000 So maybe the deep state is going after Biden and he said, no, no, no, don't let them, don't let anyone find out they're coming after me.
00:17:22.000 I don't believe it.
00:17:23.000 Cause they're all deep state.
00:17:24.000 They're all establishment.
00:17:25.000 What likely happened is they said, we're going to use this against Trump, the documents, but we got to make sure you're clean before we make that move.
00:17:32.000 So we're going to have the FBI come in and search anything and find anything.
00:17:34.000 And then we'll move forward with Trump.
00:17:36.000 But then they inadvertently found stuff.
00:17:39.000 That's how the story went.
00:17:40.000 Yeah.
00:17:41.000 When did they raid Trump's house?
00:17:43.000 When was that fall?
00:17:44.000 Was it November ago?
00:17:45.000 Yeah.
00:17:46.000 Was it November?
00:17:47.000 This is an important question.
00:17:49.000 Was it before or after?
00:17:50.000 August 8th.
00:17:51.000 August 8th is when they raided Trump's house.
00:17:53.000 Over the summer.
00:17:54.000 Yeah, so I kind of think they were like, hey, look, this is a big story.
00:17:56.000 And they're trying to say, maybe they do want Biden to run.
00:17:59.000 And they were like, if they go hard on this Trump can't run for president because of these documents, we got to make sure Biden doesn't have the same problem.
00:18:07.000 Lo and behold, he does.
00:18:09.000 Interesting.
00:18:10.000 I don't know.
00:18:10.000 What do you think?
00:18:11.000 Do you think it's a malicious cover-up, a beneficial cover-up?
00:18:14.000 Going back to the deep state thing, what other motives would the deep state have in getting rid of Biden?
00:18:19.000 Aside from, you know, he's not good for the Democrats.
00:18:22.000 I can't think of any other motives because he's doing exactly what they want him to do.
00:18:25.000 And they don't need to get rid of him.
00:18:27.000 They just need to be like, OK, Joe, we're going to get somebody else.
00:18:30.000 OK.
00:18:30.000 Yeah, I just don't buy the whole, like, yeah, the deep state is going after him.
00:18:35.000 I mean, I think it's his health, right?
00:18:36.000 Like, if they could have run him again, they would have, but, like, again, this sounds terrible, but, like, he doesn't seem healthy.
00:18:43.000 This is a continued issue.
00:18:44.000 Like, the other day, he just announced he's extending COVID orders, and his office said May 11th.
00:18:51.000 Everyone knew it, and then he told reporters May 15th.
00:18:53.000 Like, he can't keep Basic, important dates in mind, right?
00:18:57.000 He's a busy guy, I guess he's got a lot going on, but like, I think it's that, personally, I think it's that he is not healthy enough to run again, and therefore, they have to replace him.
00:19:07.000 He doesn't want to.
00:19:07.000 But wouldn't that happen internally within the Democratic Party?
00:19:10.000 Would they need this, you know, drama?
00:19:13.000 Not if he's gonna say, I'm running, I'm running.
00:19:15.000 You cannot make me go on that platform and say I'm not running.
00:19:17.000 Okay, well, we'll see.
00:19:18.000 Like if he digs his heels in, what are they gonna say?
00:19:20.000 I mean, I think this has more to do with making Trump look bad, to be honest.
00:19:25.000 Making Trump look bad?
00:19:27.000 I mean, sorry, yeah.
00:19:28.000 Making him look better in comparison to Trump.
00:19:31.000 Oh, he got raided too, but his wasn't that bad.
00:19:33.000 Exactly.
00:19:34.000 His wasn't that bad.
00:19:35.000 Precisely.
00:19:36.000 I think that it has.
00:19:37.000 Yeah.
00:19:37.000 Interesting.
00:19:38.000 I don't know, man.
00:19:39.000 It's also confusing because like... We don't have the details.
00:19:42.000 Maybe they just screwed up.
00:19:43.000 They were trying to cover this up.
00:19:44.000 They told the National Archives not to say anything, but then CBS News got the story out and now they're scrambling.
00:19:48.000 Yeah.
00:19:49.000 I don't know.
00:19:50.000 Maybe that's it.
00:19:50.000 Maybe Joe Biden can't run.
00:19:51.000 There was that former Clinton aide who said this is it for him.
00:19:54.000 It's going to be the end.
00:19:55.000 And I'm sorry, I just don't feel that way.
00:19:57.000 I feel like Joe Biden could walk on Fifth Avenue and shoot somebody in the face.
00:20:01.000 Not kidding.
00:20:02.000 Not kidding.
00:20:03.000 Trump was right when he was talking about that.
00:20:04.000 Yeah, for real.
00:20:05.000 The current state of American politics is, if you're in the cult, you're in the cult.
00:20:10.000 There was a really great tweet I saw, and it said, if you are on the left, you are allowed to deviate from leftist economic policy without reprisal, but you cannot deviate on gender ideology, race ideology, and that explains exactly what the left is.
00:20:25.000 And it's like, that's an interesting point.
00:20:27.000 If you're woke, but you say something like, I don't know if universal healthcare could work, nobody cares.
00:20:32.000 They're just like, oh, that's interesting.
00:20:34.000 But if you are pro-universal healthcare and you say, hey, that woke stuff's nonsense, they call you right wing.
00:20:38.000 Yeah, I mean, it happens with minorities, too.
00:20:40.000 You saw when all of the Muslims in Dearborn protested against the sexualization of children, they called them terrorists, you know.
00:20:48.000 These are the same Muslims that the left pretended to defend during the war on terror, etc.
00:20:52.000 Now they're calling them literal terrorists because, you know, they're against woke.
00:20:56.000 I heard a story where it was like apparently some woman was at a diversity training and they said, you know, say your name and pronouns.
00:21:03.000 She said, oh, no, thank you.
00:21:04.000 I'm not religious.
00:21:05.000 And then people started chuckling and then everyone refused to give pronouns.
00:21:08.000 And I'm like, yeah, maybe that happened.
00:21:10.000 It does kind of sound like an "and then everyone clapped" kind of story.
00:21:13.000 You know, because I don't see people as willing to stand up.
00:21:17.000 Like I said, Joe Biden could walk on Fifth Avenue and shoot somebody and they'd still
00:21:20.000 vote for him.
00:21:21.000 Yeah.
00:21:22.000 So I'm not convinced that the average person who's aware of what's going on with
00:21:25.000 the corruption, the communism, whatever you want to call it, is willing to actually stand
00:21:30.000 up and say anything.
00:21:31.000 Because the story that breaks my heart, we heard the other day from Matt Strickland,
00:21:35.000 is that he defies these lockdown orders.
00:21:38.000 He wins in court.
00:21:39.000 He wins the political battle.
00:21:40.000 He was right the whole time.
00:21:42.000 And he said, people call him and say, what you did was great.
00:21:44.000 How can I help you?
00:21:45.000 And he said, and he says, do what I did.
00:21:46.000 Oh, no, no, no, I couldn't do that.
00:21:48.000 I know they'll come after me.
00:21:50.000 And that's how it feels.
00:21:51.000 Too many people are like, no, no, no, no, don't look at me.
00:21:53.000 I'm going to stay right here where it's easy.
00:21:55.000 They think it'll pass eventually, but it's only going to get worse.
00:21:59.000 We've seen this throughout history.
00:22:01.000 I mean, the gender pronoun thing.
00:22:04.000 I've made jokes saying, like on Twitter, saying, yeah, there's going to be Republicans with pronouns in their bios in 10 years.
00:22:10.000 Oh, yeah, yeah, yeah, of course.
00:22:10.000 We'll see.
00:22:12.000 The joke I made is that the Democratic Party is going to be hive mind singularity pod people versus transgender communists.
00:22:20.000 And that'll be the Republican Party and the Democrats will be the hive mind AI people.
00:22:23.000 Yeah.
00:22:24.000 And they'll be like, hive mind rights.
00:22:26.000 And they'll start arguing that once you join the collective, you know, you have a right.
00:22:30.000 Like if a non-citizen becomes part of the collective, they retain their voting power
00:22:34.000 because the collective is one unit or something like that.
00:22:37.000 Then they're gonna be like, we need a new system of voting
00:22:39.000 because ranked choice voting doesn't work anymore.
00:22:41.000 They're gonna be on like double bypass ranked choice inversion voting,
00:22:45.000 where you know, who you vote for the third time gets counted against you and some weird,
00:22:50.000 you know, anybody can vote.
00:22:51.000 And then they're going to be like, well, he's in the hive now.
00:22:53.000 So he's, we are one.
00:22:54.000 And that means, you know, he can, he can vote.
00:22:57.000 And there's one vote and we vote for Joe Biden again.
00:23:00.000 You vote with your feelings in those situations.
00:23:04.000 Whoever has the most feeling is the one that would decide what the hive does.
00:23:07.000 Yeah.
00:23:08.000 In order to vote in the future, you walk into a room, close your eyes and think real hard.
00:23:11.000 And then we record it.
00:23:12.000 Trust us.
00:23:13.000 We're getting the count right.
00:23:14.000 We know.
00:23:15.000 Yeah, we got your vote.
00:23:16.000 And then they leave and it's like, ooh, I wonder, you know, who's going to win?
00:23:16.000 Yeah.
00:23:19.000 And they're like, 99.9% Joe Biden.
00:23:21.000 You know, I don't think that the gender stuff is just going to get worse.
00:23:23.000 You said earlier, like Chloe Cole gives me a lot of inspiration.
00:23:26.000 I'm not sure if you're familiar with her story.
00:23:27.000 She's like, she might be 19 at this point, but she had undergone surgery, like transgender surgery, and had a double mastectomy, her breasts removed, when she was like 13 or 14.
00:23:38.000 I don't want to get the numbers wrong, but it's right around that age.
00:23:41.000 I think it was 15.
00:23:42.000 15.
00:23:42.000 15?
00:23:42.000 And then realized what the pharmaceutical companies were doing to her or enabling her to do to herself for profit and came out and started speaking out against it.
00:23:52.000 And she's immensely popular right now with all sorts of people from all ages.
00:23:56.000 So I think we went through a horrible period in the last six years of pharmaceutical overreach, in my opinion.
00:24:04.000 Yeah.
00:24:05.000 And digging into these kids for money.
00:24:07.000 But if you look at Europe and you see how overly woke they are, you can tell that the trajectory of the U.S.
00:24:14.000 is going to follow that of Europe, right?
00:24:16.000 You know, they've all had these debates, the bathroom debates and everything, and that their solution, for example, for the bathroom thing is that, you know, we're going to have unisex bathrooms.
00:24:25.000 So, I mean, that's just one example, but if you want to see the future of the U.S.
00:24:29.000 in terms of wokeism and leftist ideology, you look to Scandinavia and Western Europe.
00:24:36.000 In California.
00:24:37.000 Yeah, true.
00:24:38.000 Apparently, California is always five years ahead of the rest of the United States.
00:24:43.000 I covered this story a few years ago.
00:24:45.000 It was talking about something related to their weird policies, and it said, historically, the policies implemented there make their way to the rest of the country within five years.
00:24:55.000 So if you want to figure out where things are going to be in five years, look at California.
00:24:58.000 And if you do, well then any sane person is going to get out of the city, go to the middle of nowhere, and get some chickens.
00:25:04.000 Because otherwise you're going to be walking around New York, you're going to be walking around D.C., and there's going to be human feces all over the streets.
00:25:09.000 And there probably already are, it's just that in San Francisco it's way, way worse.
00:25:12.000 In Sacramento it's worse.
00:25:13.000 But that's coming to a neighborhood near you.
00:25:16.000 Well, and I would assume all the migration that happened during the COVID lockdowns is going to influence that, right?
00:25:22.000 Have you ever seen these maps of, like, how people migrated and a lot of them left California and went to states you wouldn't have predicted?
00:25:28.000 Like, we always talk about this.
00:25:30.000 Texas.
00:25:31.000 They bring their ideology and their policies with them, right?
00:25:31.000 Right.
00:25:34.000 They're frustrated by what's happening in their state, but not by everything.
00:25:37.000 They, like, dislike one policy, but they're going to keep most of the others.
00:25:41.000 Yep.
00:25:42.000 Naturally the country is just going to move more left in terms of sociocultural values.
00:25:47.000 I'm not so convinced.
00:25:48.000 Really?
00:25:49.000 Yeah, I think it's going to be... So we started seeing this shift with Gen Z because leftists don't have kids.
00:25:55.000 But that only goes so far when you start getting a pushback in the culture war.
00:25:55.000 They have yours.
00:26:00.000 And I think mathematically it doesn't matter whether or not they indoctrinate kids.
00:26:03.000 I don't.
00:26:04.000 I think that all that matters is the birth rate.
00:26:07.000 If conservatives have even one kid per family, and Democrats have no kids per family, then the future, even though it will be a population reduction, will be way more conservative.
00:26:17.000 And then people argue, yeah, but the woke are trying to indoctrinate those kids.
00:26:21.000 Yes, but they don't exist in large enough numbers.
00:26:25.000 So if there's two leftist teachers trying to indoctrinate your child, then as they get older, they're dying off.
00:26:33.000 Maybe they convert 10%.
00:26:35.000 Maybe, let's say they convert 40% of all conservative children into woke.
00:26:42.000 That still means they are not replacing their own ideological selves.
00:26:48.000 What's the conversion rate for a conservative kid into a woke kid?
00:26:52.000 It's probably very low.
00:26:54.000 You grow up in a religious conservative family, you probably become slightly libertarian as you get older, but you're still going to be somewhat conservative.
00:27:02.000 People go through phases, too.
00:27:03.000 I went through a real post-modernist, you-can-be-whatever-you-believe-you-are phase in my 20s, and then reality starts to set in, and I realized, oh, people will starve if we don't get resources from point A to point B. The whole "I believe" is not good enough.
00:27:19.000 Reality will slap you in the face, metaphorically, if you just play the I am what I think I am game.
00:27:25.000 Yeah, I think you were making a point about generations, right?
00:27:27.000 So if Generation Z, if you're seeing like a conservative turn, usually the next generation will be more woke, right?
00:27:34.000 So if you see backlash in a certain generation... No, no, no, no, no.
00:27:36.000 You don't think so?
00:27:37.000 Pew Research shows that every generation starts skewing further and further left, but Gen Z was the first time in a hundred years it actually ticked back towards conservative.
00:27:46.000 Because something else I read, baby boomers, they were, for that time period, they were woke, you know, they were against the Vietnam War and et cetera, but then Generation X, that was the backlash generation.
00:27:57.000 But, yeah.
00:27:59.000 Well, they're all more progressive on everything.
00:28:03.000 Boomers were resistant to gay marriage.
00:28:05.000 Gen Xers were more accepting of it.
00:28:06.000 Millennials were completely accepting of it, resulting in a favorable cultural environment to legalize, or I should say to codify, not even codify, to rule in the Supreme Court that
00:28:18.000 it just is.
00:28:19.000 And I think it went from 20% in the 90s to like 80% now.
00:28:23.000 I mean, support for gay marriage.
00:28:24.000 It's considered one of the fastest shifting cultural issues of all time.
00:28:27.000 Crazy.
00:28:28.000 I wonder how Alpha is going to turn out.
00:28:29.000 I mean, if Zoomers are this way, then...
00:28:32.000 It's not about how they turn out.
00:28:34.000 It's about who has the kids.
00:28:35.000 And conservatives have more kids than liberals, and liberals are now more likely to sterilize and more likely to abort their children.
00:28:43.000 So it's only a matter of time.
00:28:45.000 People talk about all the politics and they're like, yeah, but they're in schools and they're indoctrinating and I'm just like...
00:28:50.000 Doesn't matter.
00:28:51.000 Math, you're not going to change it.
00:28:53.000 Like, over a long enough period of time, it's like saying, you know, you go to a casino, because I was making the, I said if I was going to make a bet on, you know, Kamala Harris, people were like, Tim's got a gambling problem.
00:29:06.000 Because I always do the gambling analogy.
00:29:08.000 But if you go to a casino, the house has an edge.
00:29:10.000 Like, how is it that you can play all these games?
00:29:11.000 The house always wins because in Blackjack, they have a 0.5% higher chance of winning.
00:29:17.000 That means you can win a million dollars, it doesn't matter.
00:29:20.000 Over the year, they win 0.5% of all of those bets statistically, and that's all that matters.
00:29:26.000 And that's what I'm talking about with Wokeness.
00:29:28.000 They can do whatever they want.
00:29:30.000 They can indoctrinate this kid and this kid.
00:29:32.000 They can get Greta Thunberg up on the big TV.
00:29:34.000 But over a long enough period of time, their attrition rate is just too high.
00:29:38.000 If they want to abort their kids and stay... Look, aborting their kids was one thing, right?
00:29:44.000 Leftists, liberals are substantially more likely to get abortions than Christians, conservatives, etc.
00:29:50.000 And that right there is a hard mathematical fact, which is why many people were saying for a while that the future is Muslim, because Muslims have even more kids than Christians, and that's the number, that's all that matters.
00:30:01.000 However, now liberals are sterilizing their kids, so that means it's almost like a retroactive removal from future gene pools.
00:30:11.000 Basically, if you go back to the 1990s and check abortion rates, you're gonna be like, we can do the math.
00:30:17.000 Liberals were aborting at this rate, conservatives at this rate.
00:30:20.000 You're gonna see more conservatives.
00:30:21.000 Guess what?
00:30:22.000 We are.
00:30:23.000 Conservatives are having more kids, liberals are having less kids.
00:30:25.000 Why?
00:30:25.000 Because liberals are aborting their babies.
00:30:27.000 So in the early 2000s, conservatives were having, I think it was like 2.05 kids on average, and liberals were having like 1.73.
00:30:34.000 Now what do we see?
00:30:35.000 Gen Z, slightly more conservative in some areas, but fairly comparable to millennials.
00:30:40.000 Now, here's what they didn't track for.
00:30:44.000 When they were talking about abortion, they weren't talking about abortion, they said, do liberals have kids, do conservatives have kids?
00:30:50.000 And the answer was, conservatives have more.
00:30:51.000 Why?
00:30:52.000 They were probably banging at the same rate or similar rates, but liberals were having abortions, reducing that number.
00:30:58.000 Now add the fact that those kids from the 2000s who are 23 today have a higher likelihood
00:31:05.000 of being sterilized, being chemically castrated, or transitioning, or just outright not wanting
00:31:11.000 to have kids, as per leftist ideology.
00:31:12.000 So you add that and now those people will not have children so that the birth rate on
00:31:18.000 the left is going to go down further.
00:31:20.000 Personally, I'm not a fan of any of that stuff.
00:31:22.000 I think people should have families and they'll enjoy it.
00:31:24.000 But if the left doesn't want to have kids, that's the future that they've built.
00:31:28.000 There it is.
00:31:28.000 Makes sense.
00:31:29.000 Makes sense.
00:31:30.000 I didn't know any of that, actually.
00:31:31.000 It assumes, you know, a stable system, peaceful system.
00:31:36.000 It doesn't.
00:31:37.000 Well, because slavery is real, too, like human slavery.
00:31:40.000 If a liberal society were to go full militant and just start killing and enslaving children, then it would be like, that would have won.
00:31:50.000 You're telling me that you think that if the circumstances in the United States become substantially more arduous, then liberals will do better?
00:32:04.000 No, I think the ideology is destined for failure.
00:32:06.000 That personal ideology of cutting up kids and making them take their penis off when they're 12, that's not going to work.
00:32:11.000 And I'm not saying that's in massive numbers in the millions.
00:32:17.000 It's in the thousands, maybe even tens of thousands.
00:32:19.000 And it may be reversing because of what we're seeing with people like Chloe Cole.
00:32:22.000 But what I'm saying is, if you make an easy, comfortable system, you will start seeing
00:32:28.000 more liberals and leftists, but they want gluttony more than conservatives want gluttony,
00:32:35.000 but many conservatives do, so they abort and sterilize their kids.
00:32:39.000 But then you give them hardship, and conservatives are substantially more likely to survive that
00:32:44.000 hardship because of meritocracy, personal responsibility, and things like that.
00:32:48.000 So if it does become a dictatorship, all that means is more liberals will struggle to succeed.
00:32:53.000 Maybe.
00:32:53.000 Conservatives will succeed.
00:32:54.000 I was thinking of the Soviet Union because that was a real liberal uprising.
00:32:57.000 They were all like, wacky radical leftists and they just killed everyone else
00:33:02.000 and took control physically, so enforced the ideology, if not for other
00:33:07.000 countries around Earth. And what won out in the end was... I wouldn't call it... Yeah, it
00:33:14.000 was oligarchy. Yeah, but then that period of the 90s where, you know, oligarchs
00:33:19.000 ruled everything, you know, they liked Western liberalism, but now
00:33:22.000 there's backlash. And now they want to revive the Soviet Union in Russia.
00:33:27.000 I mean, Stalin is the most popular figure in Russia.
00:33:30.000 And they, yeah, literally want to go back to the Soviet Union.
00:33:33.000 Yeah.
00:33:33.000 Really?
00:33:34.000 How is that happening?
00:33:35.000 Who wants that?
00:33:35.000 And what do they want exactly?
00:33:36.000 Well, I mean, if you look at polling data, Western polling data, Stalin is the number one, you know, most popular figure.
00:33:42.000 And if you talk to Russians these days, you know, they want, they want to go back to the days of the Soviet Union because that period of the 90s was just so horrible for the Russian people.
00:33:52.000 And then Putin changed everything when he came in, in 2000, you know, Russia was declining, economy was really bad.
00:33:58.000 And then, you know, slowly he re-industrialized and then changed the economy for the better.
00:34:02.000 And now, you know, Putin has been very good for the Russian people.
00:34:05.000 Well, and maybe you can speak to this, but don't most Russian people not want what modern Western values are, right?
00:34:12.000 No, they don't.
00:34:13.000 I mean, they admire the West because of, you know, Hollywood and everything.
00:34:17.000 Of Westerns?
00:34:18.000 Yeah, because of the cultural power that the West has, but they are completely against Western liberal values.
00:34:24.000 In fact, there was a bill that was just passed where they expanded the anti-LGBT law that they had.
00:34:31.000 And this was, you know, across the board, every party agreed with that.
00:34:34.000 They were like, no, we're not going to have this LGBT gender ideology propaganda in Russia.
00:34:40.000 You know, that's for the West.
00:34:42.000 So it's hardening them even more.
00:34:44.000 So I saw an interview with Putin, a bunch of college kids, and they were like, you could see they wanted him to be like, yes, we are now a liberal democracy.
00:34:51.000 But he wasn't.
00:34:52.000 He was staying hard in there.
00:34:53.000 Yeah.
00:34:53.000 Then they were like, what should we, what, Senor, or whatever he called him, Putin, what should I do as a man in Russia?
00:35:00.000 And Putin was like, you should learn to cook.
00:35:02.000 And he was like, ha ha ha, everyone laughs.
00:35:03.000 He's like, no, as a real man, a Russian man, there's like this misogynist energy behind it all.
00:35:08.000 And that, that stood out to me that there's this intense like misogyny in the Russian, in that, in that guy, in that guy.
00:35:14.000 And Putin still was like, no, you should learn to cook, because the economy is about to go to crap.
00:35:17.000 He didn't say that part.
00:35:18.000 But that's what he was saying.
00:35:20.000 It's a very macho culture.
00:35:21.000 I mean, I've been to Moscow.
00:35:23.000 And then you see it in the UFC as well.
00:35:25.000 Look at all of the Russian fighters and everything.
00:35:28.000 They're doing really, really well.
00:35:29.000 They're dominating the MMA world.
00:35:31.000 And that's part of their culture.
00:35:33.000 Men should be men and women should be women.
00:35:36.000 Women embrace their femininity.
00:35:37.000 They don't reject it.
00:35:38.000 And they don't try to be more masculine.
00:35:40.000 It is here.
00:35:43.000 When you were saying that people want to bring back the Soviet Union, do they know what they want, or is it just that they want something better than what they have?
00:35:49.000 Well, it's actually older Russians that experience the Soviet Union that want it back the most.
00:35:57.000 It's interesting.
00:35:58.000 Because it's the opposite, right?
00:36:00.000 Because in the US, you have young millennials that are more socialist.
00:36:04.000 And then the older people here, they're like anti-socialist, anti-communist, but in Russia, younger people, they're more into, you know, liberal values, but then the older people who experience communism, they want the Soviet Union back.
00:36:16.000 So it's like the complete opposite.
00:36:18.000 Is this derived through polls and things?
00:36:20.000 Polls, yeah.
00:36:21.000 Mostly through polls.
00:36:21.000 And then, you know, I don't necessarily go with like lived experiences, but that was my experience in Russia as well.
00:36:27.000 How long did you live in Russia?
00:36:28.000 I was there for a month.
00:36:30.000 You know, I was there for work, but it was a...
00:36:34.000 Enlightening experience because it killed whatever progressivism and leftism I had in me.
00:36:39.000 Because it was a totally different culture.
00:36:40.000 I'd never really gone elsewhere, aside from Western Europe.
00:36:46.000 It's a very anti-progressive, anti-woke culture.
00:36:49.000 It's pretty much built in.
00:36:51.000 It's very interesting going to Russia.
00:36:52.000 You said it killed the wokeism out of you?
00:36:55.000 Whatever progressivism I had inside of me, from the Bernie days, it was just completely gone.
00:37:03.000 I became less... I became more politically incorrect, you know.
00:37:06.000 But see, the thing with Bernie, he wasn't, uh... He wasn't woke!
00:37:10.000 Yeah, it wasn't the same thing.
00:37:11.000 Yeah!
00:37:12.000 He did that interview, that famous interview where he was like, you gotta secure the borders!
00:37:15.000 Yeah.
00:37:21.000 Yeah, because he wanted to shut the borders down. And these people are global socialists.
00:37:26.000 Yeah. And then he changed his whole shtick right in 2020.
00:37:30.000 And that's why I think he lost the primary because, you know,
00:37:33.000 he went full on the idpol and the gender ideology and all
00:37:36.000 that stuff. So it's like, yeah, that's why he destroyed
00:37:40.000 himself. But yeah, started sounding like a weirdo. Yeah. And regular
00:37:43.000 people were like, I don't know what that's all about. This guy
00:37:45.000 is crazy.
00:37:45.000 It went from unite the working class to identity politics.
00:37:48.000 Yeah. In 2020.
00:37:50.000 That's how they do it. That's how they do it with Occupy Wall Street. They do it with Bernie Sanders.
00:37:54.000 I feel like we didn't mention it, but it would be remiss not to point out that universities are also not quite as popular as they were.
00:38:00.000 Like, you hear this all the time.
00:38:02.000 People say you send your kids to university and they move more left, even if they were raised in a conservative family.
00:38:06.000 But studies are showing, I mean, especially with COVID, we saw so many people opt not to return to college and feel like it wasn't worth it.
00:38:13.000 People are aware of how much of a financial burden it is.
00:38:17.000 Well, even Elon Musk is saying college is useless.
00:38:19.000 And, you know, if you want to work at Tesla, you don't need to have a bachelor's degree.
00:38:22.000 Right, and this is more and more the thought.
00:38:24.000 Also, like, if I'm going to take on however much amount of debt to not make that much money versus someone who goes into a trade or any other thing who gets a head start who is making more money than you are even after your four-year degree, it doesn't make any sense.
00:38:36.000 The National Clearinghouse put out this study on enrollment from last year, and they found that enrollment was down across the board, master's degrees, any
00:38:47.000 program, definitely down among undergraduates, but it's significantly down among women for
00:38:51.000 the first time in a really long time, which I find interesting.
00:38:54.000 We've known for a while that men are declining, fewer men are enrolling in college than women,
00:38:59.000 but now, just in the last year, I can't say it's a full-on trend, but you saw twice as
00:39:04.000 many women didn't return to college, didn't enroll for the first time, I think, ever.
00:39:08.000 I wonder whether it is, you know, with like Elon Musk and many others, I think Peter Thiel talking about how college is useless.
00:39:16.000 We often said, or the belief was, the reason men weren't going to college was because they were lazy or they were living in their parents' basements and things like that.
00:39:24.000 I'm wondering if dudes just figured it out.
00:39:25.000 Yeah, I think they did.
00:39:26.000 And women are continually under this social pressure where they're like, you've got to go to school and get a degree so you can be a CEO because of patriarchy.
00:39:32.000 And you have to prove yourself.
00:39:33.000 You have to prove that you are just as good as all the men in your class by getting the degree.
00:39:36.000 Whereas men don't necessarily feel that same kind of pressure, and they are capable of doing all kinds of things.
00:39:41.000 Not that women aren't, but they're, well, I'll just pursue this thing over here.
00:39:45.000 I also think traditional learning environments aren't suited for most men, right?
00:39:49.000 Like, it's not typical.
00:39:51.000 I mean, we didn't do this for a long time, to have... Also, employers are saying that college graduates are not adequately prepared for, you know, working and everything.
00:40:01.000 It's just a holding pattern.
00:40:01.000 It's useless.
00:40:03.000 I mean, at one point, academics, like universities and schools, they really were sort of interesting places.
00:40:08.000 They prepared you to work.
00:40:09.000 Right, but I don't think that's true anymore.
00:40:11.000 I mean, I went to a four-year university and I was grateful for the experience, but I knew so many people who enrolled in master's programs because they didn't want to be adults yet.
00:40:19.000 Yeah, and they also say that college graduates lack critical thinking skills because they're taught to memorize.
00:40:25.000 I mean, that's the American education system, especially in college.
00:40:28.000 You're not rewarded for thinking critically and analytically.
00:40:30.000 I want to talk to you guys about that TV show, The Last of Us.
00:40:34.000 Have you guys seen it?
00:40:35.000 Have you guys ever played the video game?
00:40:35.000 No, never heard of it.
00:40:37.000 Negative.
00:40:38.000 So it's a video game that came out, I think it's like 10 years old.
00:40:41.000 The show is huge right now.
00:40:43.000 It's on episode three.
00:40:44.000 They've already renewed it for second season.
00:40:45.000 It's got Pedro Pascal in it, I think his name is.
00:40:47.000 And it's really good.
00:40:49.000 Three episodes in.
00:40:50.000 What's it about?
00:40:51.000 Cordyceps, the fungus that takes over insects' bodies and turns them into zombies, mutates to infect humans, society collapses... fungus. It's this thing that looks like a mushroom or whatever?
00:41:01.000 Make people look like mushrooms.
00:41:02.000 Yeah.
00:41:03.000 Cordyceps is a type of mushroom.
00:41:04.000 According to Paul Stamets, the world-leading mycologist, cordyceps do not eat humans.
00:41:08.000 Right.
00:41:09.000 I'm glad we got a statement.
00:41:09.000 I believe.
00:41:10.000 That's the point of the show.
00:41:11.000 The show is that it mutates and does.
00:41:12.000 And then basically the world collapses.
00:41:14.000 But the reason I want to talk to you about this third episode, it's particularly culturally relevant as it pertains to medical assistance in death, in dying or whatever they call it.
00:41:22.000 And the show is about a post-apocalyptic gay relationship.
00:41:27.000 And so, the reason I brought it up is we were just talking about the differences between men and women in college, and it got me thinking about this.
00:41:34.000 There's an interesting dynamic that I was talking about earlier, where you've got all these people saying this is one of the best TV shows ever made, because it depicts a prepper, the world ends, three years later, he's all alone for three years, he meets a guy, and then becomes gay, I guess, because he wasn't before.
00:41:51.000 And then it depicts, spoiler alerts, I'll try to avoid most heavy spoilers if you haven't seen the show, but it depicts this, like, story of them living together in the apocalypse, and then in the end, the controversy being generated around it is that one guy's, like, sick, and they're old, and there's no doctors.
00:42:07.000 So he's like, I want you to poison me and I'll die.
00:42:10.000 And then the other guy is like, I won't do it.
00:42:13.000 That's wrong.
00:42:14.000 And he's like, it's my life.
00:42:15.000 It's my choice.
00:42:15.000 And this is how I'm going to go out.
00:42:17.000 And then he's like, okay, fine.
00:42:19.000 And then I already warned you guys, major, major spoiler alert.
00:42:22.000 Like it's in all over the news.
00:42:24.000 And I'm going to spoil it because, you know, it's, and I've already spoiled part of it, but it's because like the medical assistance and dying thing going on in Canada.
00:42:33.000 The dude then basically crushes up a bunch of pills, puts it in a bottle of wine, and then tricks the guy and poisons himself along with the guy.
00:42:42.000 And then he says, objectively, it's the most romantic thing ever done.
00:42:47.000 And so you have these two guys committing suicide together.
00:42:49.000 And they're glorifying it?
00:42:51.000 Glorifying it.
00:42:52.000 And then people are coming out now saying it's like the greatest episode of TV ever made and stuff like that.
00:42:57.000 And I'm like, I gotta be honest, it was good.
00:43:00.000 I mean... It's a modern take on Romeo and Juliet.
00:43:02.000 Well, the one thing I wanted to bring up is the, yeah, sorry, like, the gay relationship elicits no emotion in me at all.
00:43:12.000 It just, it just really doesn't.
00:43:13.000 And I'm not trying to be a dick to gay people, like, by all means, if you like dudes, do your thing.
00:43:17.000 I got no, I got no beef.
But what I'm hearing in the media is that it's like, it's so romantic, these two guys are like, they love each other so much, the guy decided, if you die, I'm gonna die with you, and they both poison themselves. And I'm just like, the glorification of suicide, that's scary. And then calling it romantic, and I'm just like, I wouldn't find it romantic. It's tragic. In Romeo and Juliet, it's tragedy, it's not romance. But what we're seeing now is culturally
00:43:43.000 Medical assistance in dying is being, they're pushing it.
00:43:47.000 In Europe, they're doing it.
00:43:47.000 It's more favorable.
00:43:49.000 In Canada, they're doing it.
00:43:50.000 We had those stories where like the veteran calls and he's like, I need help.
00:43:53.000 And they're like, have you considered dying?
00:43:55.000 And now we get a TV show that I'm not, it's not like it's a young girl, you know, like a 20 year old being like, I've decided to die and then dying or like a suicide.
00:44:03.000 Cause that's, that's depicted in shows all the time.
00:44:05.000 It's like two guys who are alive and one guy saying, you know what?
00:44:07.000 Today's my last day because I've chosen it.
00:44:10.000 And then he's like, let's go for it, and then they call it romantic, and it's all heartwarming.
00:44:15.000 I don't know, man, I just feel like, whether it's intentional or not, this is what we're going to start seeing in the future.
00:44:21.000 We're already seeing it in Canada, we're already seeing it in Europe.
00:44:23.000 How long until you think it comes to the United States?
00:44:25.000 Suicide booths, like in Futurama.
00:44:25.000 Ten years.
00:44:27.000 I mean, assisted suicide is legal in Washington and Oregon already.
00:44:31.000 It's already here.
00:44:32.000 Well, there you go.
00:44:33.000 There it is.
00:44:33.000 It's not a question.
00:44:34.000 I mean, there's this documentary called How to Die in Oregon, and I've probably referenced it a million times, but this is one of the things that happens.
00:44:40.000 A man is on, you know, state healthcare, and he has a tumor, and the state won't pay for his treatment.
00:44:46.000 It's too expensive, it's beyond what he's permitted, but they will pay for his assisted suicide.
00:44:50.000 That's what they're doing in Canada. Yeah, I mean, but this movie came out easily 10 years ago.
00:44:54.000 I mean, this is something that we already had, and we're just seeing the effects play out all the time.
00:44:59.000 I think what's weird about this show, having never seen it, is like
00:45:02.000 It goes against what we might say is like the glorification of like the West, right?
00:45:08.000 So I've been watching the show Yellowstone and all the other ones, and there is a fight to survive, right?
00:45:14.000 If you're in a post-apocalyptic world, I don't really know, maybe you don't feel like there's anything left living for, but at our core, I think people want stories where it's like, but I pushed through anyways, because it was worth it, because ultimately we are glorious in our victory.
00:45:30.000 So this was a component of the episode, like basically, and again, spoiler alerts, in the end, the main character goes to the house because they were friends and then he's like, where is he?
00:45:40.000 And there's a note.
00:45:41.000 And the guy writes this letter saying, I used to hate everybody and I hated everything.
00:45:45.000 But then I realized, I learned that there actually was one person worth saving.
00:45:49.000 And that's what I did.
00:45:50.000 I saved him.
00:45:51.000 I give to you all my equipment, use it to save those you love, blah, blah, blah.
00:45:54.000 And I'm just like, yo, this show, I'm wondering if it's like, they're trying, it's wokeness, that they're having two... they're two bearded, burly-looking dudes in a relationship, and I felt like it just doesn't resonate.
00:46:12.000 I do not feel this man's love.
00:46:14.000 I do not, I felt nothing from it.
00:46:17.000 In the previous episode, Joel's significant other, Tess, sacrifices herself, she gets bit, to save them and blows herself up.
00:46:25.000 That I felt.
00:46:26.000 And then what I did feel in the end of the episode is when he writes, the guy's like, use my equipment to save Tess, or to protect her, but she's already dead.
00:46:33.000 And so that was like, brutal.
00:46:34.000 Like, all of a sudden, you're in this post-apocalyptic world, you have nothing, no gas, no cars.
00:46:39.000 They're showing a plane crash, and it's like, we used to be able to fly, we can't anymore.
00:46:43.000 And then this guy, who was a prepper, has all of this amazing stuff, and he's like, it's yours now.
00:46:47.000 Use it to protect Tess, and she's already dead.
00:46:49.000 That was brutal.
00:46:50.000 I feel like this wokeness, I don't know.
00:46:53.000 Is this like Democrat, predominantly females going, oh, they love each other.
00:46:53.000 What is this?
00:47:00.000 And then it's like moderate dudes just being like, yeah, sure, I guess, you know?
00:47:04.000 I don't know.
00:47:05.000 What do you guys think?
00:47:06.000 I didn't see it, but I think when two actors are doing a love scene, if they actually love each other as people, it reads in the movie like in The Notebook, Ryan Gosling and the girl, I can't remember that girl's name.
00:47:19.000 But it was like just, it's a gut-wrenching movie.
00:47:21.000 I don't know if you guys have seen it before.
00:47:22.000 I cried when I saw the movie.
00:47:23.000 But they actually fell in love while they were shooting that.
00:47:25.000 If these dudes in The Last of Us don't actually have emotions for each other, but they're just playing the character of a gay lover, it's gonna be, you're gonna hear about it, the lizard brain's gonna be like...
00:47:36.000 The seal in you is gonna clap, but it's not real.
00:47:39.000 It's the idea of it that they love.
00:47:41.000 But if there's no love there, then there's no love there.
00:47:43.000 I mean, I just felt like I was watching this show, and you're supposed to be sad that these two people are in love with each other and they're dying, and I'm like, I don't see it.
00:47:51.000 I don't see it.
00:47:51.000 I don't feel it.
00:47:52.000 If it was a woman, I would have understood it.
00:47:54.000 Yeah, or what if they were like father and son?
00:47:56.000 Yes.
00:47:56.000 I was talking about this earlier.
00:47:59.000 If it was like a guy and his kid and his kid gets bit and he has to kill him, you'd be like, oh man, like so brutal.
00:48:05.000 Or it's the dad who like throws the cat out of the way and gets bit.
00:48:08.000 That was literally the second episode.
00:48:11.000 Tess, who is the woman, significant other for Joel, They're fighting these things, and then she's like, let's go, and then they go to this place, and the zombies find out or something, and then the kid's like, she's infected, and then he's like, show me, and he sees the bite, and he's like, no, and then she's like, get out of here, and she sacrifices herself, and it's brutal to watch.
00:48:32.000 But yeah, I was like, if this episode was a guy, and he meets some teenage kid, and then over the next five, six years, he's teaching the kid how to shoot, and he's teaching the kid how to prep and survive, and then it ended with him throwing the kid out of the way, and then getting bit on the arm, and the kid has to shoot him, and he's crying.
00:48:47.000 It would've been the most brutal episode ever.
00:48:49.000 But two adult, burly, bearded men.
00:48:51.000 I am not trying to be a dick.
00:48:53.000 I don't care if that's your thing.
00:48:54.000 I don't feel like that resonates.
00:48:55.000 I don't know, it's just me.
00:48:56.000 Did you see Brokeback Mountain?
00:48:57.000 I did not.
00:48:58.000 I didn't either.
00:48:58.000 Did you guys see that?
00:48:59.000 Yeah.
00:48:59.000 Was it good?
00:49:00.000 I heard it was phenomenal.
00:49:02.000 I mean, the one from 2005?
00:49:04.000 Yeah, with Jake Gyllenhaal and Heath Ledger.
00:49:06.000 I mean, I didn't think it was that great.
00:49:07.000 I thought it was just a cultural message, right?
00:49:10.000 I think a guess that I had is that they actually started to have emotional feelings.
00:49:14.000 I'd like to ask Jake if that happened.
00:49:15.000 Really?
00:49:16.000 But Heath killed himself not that long after.
00:49:19.000 I think he was doing a lot of drugs, really was like, am I gay?
00:49:22.000 What's wrong with me?
00:49:23.000 I know an actor in college that did a play where he was gay, and the other actor was gay and was just loving making out with him on stage, and then my buddy hung himself.
00:49:35.000 Twenty years later, but he hung himself.
00:49:38.000 He told me at one point, that play still sticks with me in the back of my mind.
00:49:42.000 As a formerly homeless person, he felt prostituted into it for the role, because it was a great role.
00:49:48.000 Anyway, it's a little off-base, yeah, but it's not fake.
00:49:51.000 Like, when you're doing these scenes, you actually do begin to feel these things, and if these actors were on guard from that, it won't play.
00:49:57.000 You'll know that in the scene, you won't feel it, because they're guarded from it.
00:50:01.000 It's the danger of being an actor.
00:50:02.000 One of the dangers of being an actor.
00:50:03.000 God, you know, love Heath Ledger, what that guy went through.
00:50:07.000 I don't know, man.
00:50:07.000 The thing about the medically assisted suicide, to go back to that point, is that if we're talking about somebody who's bedridden and has no way to pay their bills, and the doctor's like, you've got three weeks left, and they're just like, well, you know, I'm in agony and pain and the morphine's not cutting it anymore.
00:50:24.000 What do you guys think?
00:50:25.000 So my question is, do you think that depression eventually will be used as like an acceptable, I guess, justification for doing this, for going through this process?
00:50:35.000 I think that just happened.
00:50:37.000 I should pull the story really quickly, but there was a girl in the Netherlands.
00:50:41.000 So if you can prove you're depressed, then you can kill yourself?
00:50:45.000 It's in Europe, and I'll look it up in a second, but there was a girl who survived... Belgium, you're right.
00:50:52.000 She was going on a class trip, she was in an airport, and the airport was bombed.
00:50:56.000 And she apparently developed really serious post-traumatic stress syndrome and had all kinds of issues, hospitalized a lot, attempted suicide, and eventually was granted permission to go through assisted suicide because she said her depression was so crippling she could never get over it.
00:51:10.000 And this girl was like, what, 18, 19?
00:51:12.000 I mean, she was young.
00:51:13.000 She was young teens, for sure.
00:51:14.000 So, you know, what's the limit, you know?
00:51:16.000 Are they gonna go for... I'm not saying I agree with that.
00:51:17.000 No, I'm just saying, in general, like, what's the limit?
00:51:20.000 ADHD?
00:51:21.000 Is that gonna be accepted?
00:51:23.000 I mean, it's like a weird form of eugenics, I feel like, right?
00:51:25.000 We're just eliminating them after you can't hack it, I guess?
00:51:28.000 Acne?
00:51:29.000 I mean, I don't know.
00:51:30.000 I kind of am.
00:51:33.000 I don't really care about the assisted suicide stuff.
00:51:35.000 I don't know.
00:51:35.000 Am I crazy?
00:51:36.000 I never have.
00:51:36.000 If you want to kill yourself, kill yourself.
00:51:38.000 Who cares?
00:51:39.000 There's so many humans.
00:51:40.000 Because everybody always regrets it.
00:51:42.000 They don't actually want to do it.
00:51:44.000 That's what we were talking about the other day.
00:51:45.000 Everybody who jumped off the Golden Gate Bridge and survived said the first thing that happened was they regretted doing it.
00:51:50.000 So you have these people who are feeling fear or panic.
00:51:54.000 They're scared.
00:51:54.000 They're in pain.
00:51:55.000 And they think it's the only way to make it stop.
00:51:57.000 And they regret it.
00:51:59.000 And so the goal is to help them not do that because it's the wrong thing.
00:52:02.000 Like I knew a guy, like the guy I was just telling you guys about, he wanted to die so he killed himself.
00:52:06.000 No one could stop him.
00:52:08.000 But then in another instance, I know a girl that tried to kill herself a couple times and then she didn't and now she's got a family.
00:52:15.000 And I assure you, your friend, if he was alive today, he'd be like, I'm sure glad that didn't happen.
00:52:21.000 Yeah, I think so.
00:52:22.000 He was like, something horrible's coming, man.
00:52:24.000 He was like, really?
00:52:25.000 No, it was 2019 when he killed himself.
00:52:25.000 It was like, 2017?
00:52:27.000 It was like, 2015.
00:52:28.000 He told me, like, something bad's happening.
00:52:29.000 He left New York City.
00:52:30.000 He's like, something bad's coming, dude.
00:52:32.000 He was, like, the best actor I knew, too.
00:52:34.000 But do you feel like because he felt like something bad was happening, he should have been able to kill himself, right?
00:52:40.000 Like, if you were in a position to have influence over it, which of course you weren't, it's a horrible thing to ask, but like, with assisted suicide, or what they often refer to as euthanasia, right?
00:52:49.000 Like, euthanasia, you feel like you're putting someone out of their misery.
00:52:52.000 And that's why it starts with these conversations about people who are in very serious organ failure.
00:52:57.000 They have very serious cancers.
00:52:58.000 They will not survive.
00:53:00.000 And so, is it a higher quality of life to let them choose when to end?
00:53:05.000 The problem is, if you open the door to medically assisted suicide, it will expand.
00:53:10.000 Exactly.
00:53:11.000 I'm concerned that some people, if they want to die and you don't let them, that they will lash out and start ruining other people's lives and making the world a worse place.
00:53:20.000 But that's on the individual level.
00:53:22.000 I think that the collective consequences of, you know, starting this whole, like, medically assisted suicide trend is much worse than, you know, the individual consequences of, you know, one person lashing out in their own community versus, you know, starting this global trend.
00:53:38.000 Right.
00:53:39.000 It also reduces the friction in doing so.
00:53:40.000 It's very difficult to do.
00:53:41.000 It's not as easy as everyone thinks it is to just go ahead and do it.
00:53:44.000 People get drunk.
00:53:45.000 People do a bunch of drugs before they do it.
00:53:48.000 If you make it a nice hospital setting with doctors helping you do it... And your family surrounding you holding your hand saying, it's a good thing.
00:53:56.000 That's creepy AF.
00:53:58.000 Exactly.
00:53:58.000 It makes it way easier.
00:54:00.000 Or romantic, right?
00:54:01.000 Yeah, exactly.
00:54:01.000 It sure does seem like they don't want a lot of people.
00:54:05.000 I'm thinking of cellular apoptosis.
00:54:07.000 When you've got cells in your body, they pre-program themselves to die.
00:54:10.000 Like some cells, when they're no longer needed or they're causing damage to the system because they're taking too many resources, they will kill themselves on purpose to make the body healthier.
00:54:17.000 And I think humans are fractally doing that similar thing.
00:54:22.000 I mean, there are cultures, right, where that's the expectation for people who are too old or too sick.
00:54:27.000 They're supposed to, like, remove themselves from the society.
00:54:29.000 But, like, I don't know that that is something that I would personally be okay with.
00:54:34.000 It would be really difficult for me to accept that, you know?
00:54:37.000 People are pointing out that in the beginning of the show, people commented that a portion got dropped or deleted.
00:54:44.000 Like, it just... I was talking about Biden getting raided and it was gone.
00:54:48.000 And we've had so many of these kind of things happen that I don't believe they can be coincidences.
00:54:53.000 Because it's just it happens a lot.
00:54:54.000 So someone just super chatted that they just started watching.
00:54:58.000 And in the beginning, the intro was cut, like you can't actually hear me talk about what happened with Biden.
00:55:02.000 But we talked about him again and again and again throughout this show.
00:55:06.000 I want to tell everybody what happened with Turning Point USA.
00:55:08.000 I'm not sure we ever actually talked about it.
00:55:10.000 But when we went live with Turning Point USA, you may have noticed we were late.
00:55:14.000 We weren't late, actually.
00:55:16.000 We were on time, the stream started right on time, and then what happened, Serge, remember?
00:55:20.000 You guys were like, it says we're live, but we're not coming up, nothing's happening.
00:55:24.000 Right, right, yeah, it was live to us, and it was live in my feeds.
00:55:27.000 But nobody could see it, they were like, where's the show, the show's delayed.
00:55:30.000 And then I got messages and they were like, people were telling me after the show, they were like, hey, the show didn't go up for like 10 minutes.
00:55:35.000 But we were live streaming, on our end, everything was going through, and I'm wondering if, you know, we had Charlie Kirk, we had Bannon, we had James Lindsay, We had Luke Rutkowski, you and me.
00:55:45.000 I'm wondering if the people at YouTube were like, boss, what do we do?
00:55:49.000 Tim Pool, Bannon, Kirk, Luke Rutkowski, Ian, James Lindsay are on stage in a massive stadium with thousands of seats, and they're gonna talk at this massive convention.
00:56:01.000 What do we do with this?
00:56:03.000 And then they were like holding pattern.
00:56:04.000 And then they watched us and they were like, okay, they're gonna hold a pattern. Let me get confirmation from the boss as to what to do.
00:56:08.000 And they're like, let it go. Let it go.
00:56:11.000 I'm wondering if anyone else did something like this.
00:56:15.000 Like, were that exact conversation to happen, I'd imagine they would have gotten a strike and been banned.
00:56:19.000 But I wonder if YouTube was scared, like, we cannot defend against a lawsuit from Turning Point, from Steve Bannon, and from Tim Pool.
00:56:30.000 All at the same time over this.
00:56:31.000 And because they're on stage at this massive convention center, the PR damage from taking the show down would be so massive.
00:56:41.000 So that's a possibility.
00:56:42.000 Elon Musk needs to buy YouTube.
00:56:44.000 Yeah, good luck.
00:56:45.000 But the reason I think it's possible that they've got their thumb on the scale is because You guys who watch the show consistently know this.
00:56:52.000 There was one episode where Luke started talking about the CIA and then the stream cut off.
00:56:55.000 Wow.
00:56:56.000 Dropped off like Luke made some specific points and then stream just stops and then kicks back in a few seconds later and people are like, what did Luke just say?
00:56:56.000 Yep.
00:57:03.000 What was he saying?
00:57:04.000 And that's literally what happens now and it happens a lot.
00:57:07.000 Maybe, maybe it's just like a random glitch that happens.
00:57:11.000 Yeah.
00:57:12.000 Perhaps.
00:57:12.000 Perhaps.
00:57:13.000 It could be.
00:57:13.000 But it happens a lot, you said.
00:57:15.000 But it doesn't happen during, like, we're talking about a TV show.
00:57:17.000 It doesn't happen when we're talking about Quantumania or Thanos or anything like that.
00:57:21.000 It's always specific topics.
00:57:22.000 Only when we're, like, the intelligence agencies are doing this thing and drop.
00:57:25.000 But it could happen at other times.
00:57:27.000 We just wouldn't know because no one would say anything because we weren't talking about anything.
00:57:30.000 Right.
00:57:30.000 And I'd also see it, the dropped frames would tell me, like, if we're dropping frames in a huge number.
00:57:34.000 It would tell me if it's doing that, but it's not.
00:57:36.000 Behind the scenes at Mines, I would tell you that don't assume malfeasance.
00:57:41.000 If there's technical glitches, it's very easily like AWS is acting up, your servers are transferring data from one server to another.
00:57:50.000 I'm done believing it's all just a mistake.
00:57:52.000 We've been talking about censorship and big tech for seven, eight years now.
00:57:57.000 Or longer, I mean, longer actually, maybe even nine years.
00:58:00.000 Luke was getting demonetized before anyone knew what the word was.
00:58:03.000 Someone was manually going, check this out.
00:58:05.000 We all know what it means to be demonetized on YouTube these days.
00:58:08.000 You got a little dollar sign icon on your video.
00:58:11.000 You go in one day, it's yellow.
00:58:12.000 And it says, you know, ads limited.
00:58:16.000 When Luke from We Are Change was making videos, this is back in like 2011, I'm hanging out with him, and then one day he's like, dude, my ads are turned off again.
00:58:25.000 And in his account, the little green icon, it was a circle back then, is grayed out.
00:58:30.000 He'd have to go back in the video and activate it again.
00:58:32.000 Then he'd come back later and they'd all be gray again.
00:58:35.000 And he'd go in and turn them on, turn them on, turn them on, turn them on.
00:58:37.000 Because there was no formal demonetization process.
00:58:40.000 Someone at Google or YouTube was manually axing his account and taking ads off of each individual video.
00:58:47.000 So we know that was happening then.
00:58:49.000 Then they created the system to automate it.
00:58:52.000 Now we have this.
00:58:55.000 Look, I went and hung out at the offices of Ustream back in the day.
00:58:58.000 Do you guys remember Ustream?
00:58:59.000 Yeah, we used to stream on that.
00:59:01.000 With the letter U. That's where Joe Rogan started streaming, wasn't it?
00:59:04.000 He used to do Joe Rogan Experience on Ustream.
00:59:05.000 I believe so.
00:59:06.000 And they had guys sitting at a table with a bunch of computers, and on the monitors there would be like 50 live streams all going at once.
00:59:15.000 And I'm like, what is this?
00:59:16.000 And they're like, we basically just watch all of it.
00:59:19.000 And if someone looks like they're becoming a problem, we move them over into this panel on the left, and then basically they look for people who are about to break the rules, or they think might break the rules, take them from the main grid and put them in the select box, and then if that person takes their shirt off or does something, they nuke it right away.
00:59:37.000 Take the stream right down.
00:59:39.000 I know YouTube has that.
00:59:41.000 Right.
00:59:42.000 YouTube has to have way more sophisticated tools than that.
00:59:45.000 Yeah, they probably have AI watching language.
00:59:47.000 I wouldn't even be surprised if they have AI reading voice cadence.
00:59:50.000 So if you start to get agitated, it can tell.
00:59:52.000 Yep.
00:59:53.000 So the trick is to just deliver horrible messages in a monotone?
00:59:56.000 In a super cool calm way.
00:59:57.000 Don't move your eyebrows.
00:59:58.000 Yeah.
00:59:59.000 Well, I think they have humans who watch.
01:00:01.000 And for a show like this, as big as it is, I'm willing to bet that there's, like, ten shows and they have, like, one guy who's watching all at the same time.
01:00:09.000 Or, more importantly, I'm willing to bet for a show this big they have one guy who's hired to watch the show.
01:00:14.000 At least one guy, probably.
01:00:15.000 One person, and they're, like, because... Your show, Rogan's show, and a few others.
01:00:20.000 Yeah, any big live streams would probably have groups of people.
01:00:22.000 Joe's not on YouTube anymore.
01:00:23.000 Oh, right, true.
01:00:25.000 But all the big live streams, for sure.
01:00:26.000 I suppose I can only say, I agree with you, don't believe that nothing is going on, but just don't believe anything about it.
01:00:32.000 Don't believe that it's people there twisting knobs.
01:00:35.000 Don't believe that it's the data getting corrupted.
01:00:37.000 There's no way to know, so there's no point in coming in with a strong belief on it.
01:00:41.000 But I would like to know.
01:00:43.000 I would say it seems highly probable.
01:00:46.000 That is the case.
01:00:47.000 Yeah.
01:00:48.000 Honestly, it does.
01:00:50.000 When I'm watching this, it seems fine all the time, and then I will see comments about, you know, not getting it recommended.
01:00:55.000 I have a couple accounts on my phone, and I check and make sure I'm still getting the notifications, and sometimes I don't get notifications on some of those accounts, and sometimes I do get notifications the show is live on some accounts.
01:01:05.000 I mean, the Turning Point USA thing was really weird.
01:01:08.000 Yeah.
01:01:08.000 Because I get off the stage, and then I'm told immediately, the show didn't go live for like 10 minutes.
01:01:14.000 It didn't go live.
01:01:15.000 Very stressful for the rest of us backstage being like, what is happening?
01:01:19.000 Cause we could see it in the monitors, but we, you know, like the first thing I did was pull up on YouTube and, and you think maybe there'll be a 30 second delay, right?
01:01:26.000 Like you think it's a big show.
01:01:28.000 I don't remember how long our lag is anyways, whatever.
01:01:30.000 Like it doesn't kick in and we all sort of turned to each other like, when was this?
01:01:34.000 This was, uh, how long ago was that?
01:01:37.000 Two months?
01:01:38.000 A couple of months.
01:01:38.000 Oh yeah.
01:01:39.000 Right.
01:01:39.000 December, November 15.
01:01:40.000 Yeah.
01:01:40.000 So fairly recent.
01:01:44.000 As a quick aside, the poll says people think the FBI raid on Biden, 55% say it was a cover up, 45% say deep state removal.
01:01:52.000 But again, we had on more than one occasion, Luke would start talking about the intelligence agencies and things they've done.
01:02:00.000 And then all of a sudden, the show dropped for like 20 seconds or something.
01:02:02.000 We could double blind test on Rumble.
01:02:05.000 I mean, that's some advice is we could try a different platform and see if it happens again.
01:02:08.000 But the thing is, it happens spottily.
01:02:10.000 And it could be not even Alphabet, the company, it could be like, CIA.
01:02:14.000 Well, we can test it out now and continue talking about the CIA and the FBI and see if it drops off.
01:02:20.000 Yeah.
01:02:21.000 I don't know.
01:02:22.000 It's definitely certain topics.
01:02:24.000 It's definitely topic-based.
01:02:26.000 I'm loathe to go that direction, to start assuming for sure that it's because of the topic.
01:02:31.000 Yeah, I can understand why it would be.
01:02:33.000 I get it.
01:02:34.000 And you honestly have more experience with this than I would.
01:02:38.000 But in my experience, it seems like it's topic-based.
01:02:40.000 And what I've seen in the past, I've watched the show for years, it definitely seems like there's... Can you guys listening, adminning at YouTube right now, super chat us and let us know if there are technologies you're using to mute and push things around?
01:02:53.000 They have to have it.
01:02:54.000 Just let us know.
01:02:55.000 You know how? I guarantee you, I would say there is no less than a 100% chance they have that ability.
01:03:03.000 You know why?
01:03:04.000 Because a crackpot psychopath livestreamed himself going into a church in New Zealand.
01:03:10.000 That kind of stuff has happened before.
01:03:12.000 It's happened more than once.
01:03:14.000 And I'm sure when that happened, every big tech platform said, we need the ability to instantly shut down any stream.
01:03:21.000 We need robot AI, whatever, tracking noises.
01:03:24.000 YouTube has a rule that you cannot do anything with guns live.
01:03:30.000 So if you're on YouTube and you have a gun channel, you can go out to your range and do tutorials.
01:03:38.000 What's that gun channel?
01:03:39.000 Hickok45, I think, is a good one.
01:03:41.000 Have you ever seen it?
01:03:42.000 So that's all fine.
01:03:44.000 And they're like, yeah, you can do that.
01:03:45.000 It's totally fine.
01:03:46.000 Not live, though.
01:03:48.000 And it has to be in the proper setting.
01:03:49.000 So it can only be at a gun shop to display the weapon, or it can be on the range where it's safe.
01:03:55.000 So I'm willing to bet they've got workers and AI.
01:03:58.000 And I bet the workers get alerted when the AI flags something.
01:04:01.000 Yeah, and there's also a lot of pressure from all of these different organizations with a lot of money on social media platforms to censor content to, you know, go after discrimination, hate speech, violence, etc.
01:04:13.000 So, you know... What about, remember all those Facebook censors? Who were not necessarily censors, but I think they would be considered censors.
01:04:21.000 They were watching Facebook material, and so they uploaded it to Facebook to check it.
01:04:24.000 Wasn't that a whole big story, and that kind of disappeared?
01:04:26.000 They were talking about how it was detrimental to their health, how they were watching so much negative content all the time, and it was damaging them.
01:04:32.000 Remember that?
01:04:33.000 Well, Ian did it too!
01:04:34.000 Yeah, I did a video about that in 2018.
01:04:37.000 That was right after that idea broke, 2017, when people started coming out.
01:04:40.000 They'll hire people to look at the darkest dregs of humanity.
01:04:43.000 I mean, stuff will go up on YouTube.
01:04:44.000 Are you talking about the TikTok lawsuit?
01:04:46.000 I'm so sorry.
01:04:46.000 Oh, no problem.
01:04:47.000 No, no, no.
01:04:47.000 I'm talking about, like, yeah.
01:04:48.000 Because there was a TikTok censor lawsuit.
01:04:49.000 On Facebook, yeah.
01:04:50.000 The censors are seeing so much murder and child abuse.
01:04:55.000 The burnout rate's incredible.
01:04:57.000 Yeah, you know, TikTok contracts it to a third party, and they filed a class action lawsuit for a similar reason.
01:05:03.000 I'm sure your video has more.
01:05:04.000 The third party filed one against TikTok?
01:05:06.000 Yeah, for damages, because they say they just see terrible, awful things and they have to, like, they'll work, part of it is, like, they'll work really long shifts, right?
01:05:15.000 And then, like, without a break from anything, they're just taking in all this terrible content.
01:05:19.000 Yeah, it's not, it was 24-7 for me, so I'd go in, I'd look at 300 things, I'd go out, an hour later I'd go back in, look at another hundred things, stuff like that.
01:05:26.000 So it was all day of my life for five years.
01:05:28.000 It was, that was my life.
01:05:30.000 For those that are wondering why Ian is the way he is, he used to be normal.
01:05:32.000 And then he looked at these things and he became more and more twisted.
01:05:35.000 I did used to be normal.
01:05:36.000 I did used to be normal.
01:05:38.000 I used to watch bangedup.com.
01:05:39.000 You ever watch those old, like, Faces of Death shows and stuff?
01:05:43.000 Like in 2001, when the internet was finally on and I could finally watch video, I'd go... On America Online?
01:05:49.000 Yeah, I think I had America Online at the time.
01:05:52.000 And that was hard to watch, but I was like, if it happens, I should be able to see it.
01:05:57.000 If it's going to happen in real life, I should see it.
01:05:58.000 But that doesn't mean I need to see every blown open body.
01:06:01.000 It gets to the point where you need to protect the mind, and I don't know how we're going to handle... So, administering these things, right?
01:06:08.000 This is a big challenge because there are evil people posting evil stuff on all of these platforms.
01:06:12.000 And you know, Elon of course came in and that was a big thing he was trying to do
01:06:15.000 is get rid of this stuff because Twitter certainly wasn't for whatever reason.
01:06:18.000 Let me pull up this story.
01:06:19.000 This is interesting.
01:06:20.000 This is a crazy story.
01:06:21.000 This is from our good friends over at Jezebel.
01:06:24.000 You know Jezebel, you know I'm gonna love her.
01:06:26.000 Twitch's AI porn controversy is a creepy sign of things to come.
01:06:31.000 You know, I completely agree with Jezebel for once.
01:06:33.000 Popular streamer Atrioc sobbed through his apology after viewers spotted that he was looking at AI-generated porn of other streamers.
01:06:42.000 This is the weirdest thing ever.
01:06:44.000 So the dude, I guess, like, what's the story?
01:06:46.000 He was watching, he had porn pulled up on his... No, he, like, so Pop Culture Crisis, shout out to them, talked about this the other day, he, like, was doing a stream and he was clicking through, like, the tabs and I guess, I don't know if he opened it accidentally or just skipped over it, but, like, they could see he was on Pornhub and I don't know all the details of, like, how they identified that it was what it is.
01:07:06.000 It wasn't Pornhub from my knowledge.
01:07:07.000 It was a site that does this.
01:07:09.000 Oh, okay.
01:07:09.000 You can, like, upload stuff.
01:07:10.000 Well, in his apology, he said, I was on Pornhub and I got directed to an ad.
01:07:15.000 And then it sent me to whoever, so maybe that's how they figured it out.
01:07:17.000 But they noticed a tab in his browser that was like... It was crying about it?
01:07:21.000 And his wife sat in the background.
01:07:24.000 His wife was crying too.
01:07:25.000 He was like, I was on Pornhub, a totally normal site, and then an ad popped up.
01:07:29.000 The most famous line.
01:07:31.000 An ad popped up for this AI Deepfake porn so he clicked on it and then he went the extra step to pay money to get an account for the thing to watch.
01:07:38.000 Can I just give some advice to any young streamers out there?
01:07:42.000 You have a work computer and you have a personal computer.
01:07:45.000 It's not hard to understand but this guy apparently was like, I run my business off this machine.
01:07:50.000 Let's pull up adult content on the browser I use for children.
01:07:54.000 Come on, bro.
01:07:55.000 And he's been a streamer for a long time, right?
01:07:59.000 I don't know anything about him.
01:08:00.000 Obviously, he was live.
01:08:00.000 What am I saying?
01:08:01.000 Think.
01:08:02.000 Yeah.
01:08:02.000 But I just mean, like, you think he'd know to close his tabs?
01:08:05.000 Yeah, seriously, bro.
01:08:06.000 Just don't do that on your work computer.
01:08:06.000 Come on.
01:08:07.000 Or maybe just don't do it at all, in my opinion.
01:08:09.000 True, true.
01:08:10.000 As this story was breaking, what's this girl's name?
01:08:13.000 Out of a morbid curiosity, he paid for a subscription.
01:08:17.000 I'm so curious, please have my money and show me more.
01:08:20.000 Yeah, I think he's in damage control at this point.
01:08:22.000 No, but hold on, like...
01:08:25.000 I mean, it is creepy, right?
01:08:28.000 Super creepy.
01:08:29.000 It's creepy, but then what do you do, right?
01:08:32.000 If there's some sort of initiative to censor this sort of content, that's worse.
01:08:36.000 I mean, I don't know.
01:08:38.000 Someone tweeted this, and I saw it, Scott.
01:08:40.000 It's 11,600 quote tweets.
01:08:42.000 This tweet has 36 million views.
01:08:45.000 This guy, Ayan Ramaru, says, millionaire internet streamer's reaction to AI porn of herself You won't find more fragile people than popular internet personalities, especially women.
01:08:56.000 I don't know if I agree.
01:08:58.000 I think you're allowed to be freaked out that someone took a picture of your face and put it on porn and is, like, watching it.
01:09:04.000 She says people keep sending screenshots of it to her.
01:09:06.000 Like, all these people have seen it and it's got her face on it and then, like... I mean, it happens to celebrities all the time, doesn't it?
01:09:12.000 You know, their faces are put on... Yo, Futurama did this.
01:09:15.000 Yeah.
01:07:15.000 Remember when Fry was dating the Lucy Liu-bot?
01:09:19.000 Oh, yeah.
01:07:19.000 Yeah, he had Lucy Liu downloaded into a robot so he could date her.
01:09:22.000 Well, I already used the example of pop culture, but Japan has this big issue with importing sex dolls, and they've agreed to release some, but they won't release any that look like a public figure or like a child.
01:09:33.000 Oh, man, what the...
01:09:35.000 Yeah, because they're like, that's disgusting.
01:09:37.000 Okay, so what's a public figure?
01:09:38.000 Like, is this the end of days?
01:09:39.000 Well, I know, like... How do you define what a public figure is?
01:09:42.000 Exactly.
01:09:42.000 Do you have to be in a magazine?
01:09:43.000 Do you have to be in a movie?
01:09:44.000 Also, how are, like, all of the people... So you can do it to regular people?
01:09:48.000 Or, like, 2,000 followers on it?
01:09:50.000 Like, what point... Yeah, what's the cutoff?
01:09:52.000 Exactly.
01:09:52.000 It's terrifying.
01:09:53.000 And like, there have been a lot of people saying, oh, they shouldn't be upset about this.
01:09:57.000 But like, I think people will compare it to like, oh, it's like a sex tape getting leaked.
01:10:01.000 But like, this girl who's crying about it, like, didn't didn't do that.
01:10:05.000 Like, she didn't do this at all.
01:10:08.000 She's also not making money off of it.
01:10:10.000 And I'm sure that's not her point.
01:10:11.000 But I'll put it out there.
01:10:11.000 That's part of her point.
01:10:12.000 Yeah.
01:10:13.000 But she should get all the money from it.
01:10:14.000 It's her likeness.
01:10:15.000 Like, you can't take a picture of Ian and put it on, like, a commercial for your soda pop.
01:10:19.000 You gotta pay Ian for that.
01:10:20.000 Well, you can do it, but you won't pay me for it if you do.
01:10:22.000 No, no, look at this.
01:10:23.000 This tweet from Sweet Anita.
01:10:23.000 Look at this.
01:10:25.000 She said, This story was how I found out that I'm on this website.
01:10:28.000 I literally choose to pass up millions by not going into sex work and some random Cheeto-encrusted porn addict solicits my body without my consent instead.
01:10:36.000 Don't know whether to cry, break stuff, or laugh at this point.
01:10:39.000 I just don't think this should be the price for wanting to entertain people.
01:10:42.000 This just made me realize something.
01:10:44.000 There's a lot of women who try to do OnlyFans.
01:10:46.000 They could effectively have, like, porn star surrogates.
01:10:51.000 You know?
01:10:51.000 They don't have to actually do it themselves.
01:10:54.000 But here's the problem.
01:10:56.000 That's the thing I'm saying, like, there could be women who are like, get a different woman to do the work for you, just put your face on it.
01:11:00.000 But here's the crazy thing.
01:11:02.000 You can't do anything about it.
01:11:04.000 So we're talking about, we're working on music, and we were talking with some industry guys, and they were like, you guys should do a cover of a song.
01:11:11.000 People like covers, and then you do maybe just one cover of something as a song you really like and wanna do.
01:11:17.000 And I'm like, well, how do you do that?
01:11:18.000 I mean, we gotta buy the rights?
01:11:19.000 They're like, no, you just do it.
01:11:21.000 And I'm like, you don't need to buy the rights to cover a song?
01:11:23.000 They're like, no, the money just goes to the person who wrote the song.
01:11:26.000 I was like, oh, wow.
01:11:28.000 Now think about what that means for AI deepfake porn.
01:11:31.000 That means any woman anywhere can't stop someone from making it.
01:11:34.000 The only thing they can do is accept the money.
01:11:37.000 This basically means, like, you cannot stop someone from forcing a woman to be in AI porn.
01:11:45.000 The only thing that happens is like, well, they used your face, so here's money.
01:11:45.000 Yeah.
01:11:49.000 So how do you deal with this?
01:11:50.000 That's the question.
01:11:50.000 I don't know, that's crazy.
01:11:51.000 The girl in the video, and I, I'm, apologies, I don't know her name.
01:11:54.000 QT Cinderella.
01:11:54.000 I mean, like, I feel bad for her that this is happening to her, but how do you, how do you deal with this situation?
01:11:58.000 She says one of her frustrations is now she's taking, she's spending her own money to fight it, like, in court and stuff, and I guess, like, you issue a cease and desist, like, but the other part is- Good luck.
01:12:07.000 Also, it's already out there, like, even if you get this one video of her taken down, it doesn't end the surface, and also, like, Those screenshots are forever!
01:12:14.000 It's going to continue to happen to other people, too.
01:12:16.000 Yeah.
01:12:17.000 Streisand effect, too.
01:12:18.000 Yeah.
01:12:18.000 This is going to be more and more and more.
01:12:20.000 AI is going to be able to write it on the spot.
01:12:22.000 You'll go, I want to watch person A have sex with animal L. And then they watch a donkey and the woman you love getting it on.
01:12:31.000 And then you're like, okay, stop.
01:12:33.000 I want to watch myself having sex with fill-in-the-blank.
01:12:36.000 And that's it.
01:12:38.000 And everyone in the world is fair game.
01:12:40.000 I want to look at a blend of person A and B having sex with person Y. Dude, people are going to put the Hulk's face on a woman.
01:12:49.000 Remember we were talking about that weird YouTube thing where they put Hitler's head on a woman's body to do Tai Chi and sing nursery rhymes to kids?
01:12:55.000 Man, the world is getting... This is technology.
01:12:58.000 It feels like... Thank you, AI.
01:13:00.000 It feels like chaos is taking over.
01:13:02.000 It's just everything's becoming random nonsense and chaotic.
01:13:06.000 But then eventually it'll become normalized and then people, you know... That's why I asked if we'll be around in 14 years, if this country can still... It's like we used to be able to defend Earth because of land borders.
01:13:17.000 They had artillery, they rolled up to the border, we could protect it with troops, they couldn't get any closer in so they couldn't hit the target.
01:13:22.000 Now we have air power.
01:13:23.000 You can't protect the Earth when everyone has air power.
01:13:26.000 There's no way to defend.
01:13:27.000 The American process of defending Earth is not functional now because you have orbital strikes and stuff.
01:13:32.000 Climate change is supposed to kill us by 2025.
01:13:36.000 So wait, these websites allow you to like, you upload someone else's picture and then it'll generate it for you?
01:13:41.000 Yeah.
01:13:41.000 I know they do those.
01:13:42.000 We did the thing with the music video Genocide.
01:13:45.000 If you haven't seen, heard the song Genocide, check it out because Ian's in it.
01:13:47.000 And we animated news personality faces to sing the song about, it's basically criticizing the media for being liars.
01:13:54.000 Yeah.
01:13:54.000 So I know that that sounds like, you know, kind of wholesome.
01:13:58.000 It's silly to like make Don Lemon sing or whatever.
01:14:02.000 But people are actually going totally dark with it.
01:14:04.000 It's crazy.
01:14:05.000 I wanted to get an account and check it out earlier today, but I stopped myself for better or worse.
01:14:10.000 Not enough morbid curiosity?
01:14:11.000 You know what I would actually rather do?
01:14:13.000 I would rather just use it, use like the AI technology to put Tom Cruise in just whatever movie.
01:14:20.000 So like I want to see Tom Cruise as Frodo Baggins and then do a voice AI and a face AI and then it's literally just Tom Cruise as Frodo Baggins the whole time.
01:14:29.000 Tell me that would not be like the funnest thing ever.
01:14:32.000 It would be fun to have technology if everyone had good intentions, right?
01:14:35.000 But you can't introduce that.
01:14:36.000 You have to assume everyone's going to do the worst thing, which is AI-generated porn.
01:14:43.000 I'm sorry, but that's the reality.
01:14:44.000 People can already do that with Adobe programs and everything, put your face on video and et cetera.
01:14:52.000 Sure.
01:14:52.000 I don't like any technology.
01:14:53.000 It's not specific to AI.
01:14:54.000 They're just making it easier with AI to do that.
01:14:56.000 Yeah, of course.
01:14:59.000 Like, that's why so many parents are deciding not to put their kids' faces on the internet, right?
01:15:04.000 That's why you saw this pushback against, like, family vlog channels.
01:15:07.000 I think that, like, we had a nice honeymoon here with technology and the internet.
01:15:11.000 We thought it was cool, we thought it was a fun time, but actually it's extremely dangerous.
01:15:15.000 It is.
01:15:15.000 And there's no putting it back in the box, so should you just...
01:15:19.000 I'm just worried about what politicians are gonna propose to stop this from happening.
01:15:25.000 We should AI Biden's face on a Trump's body, and then start showing it to people and be like, do you agree with what Joe Biden says here?
01:15:33.000 Yep, totally.
01:15:34.000 And it'll be Joe Biden like, come on, man, you know, I'm the greatest president.
01:15:37.000 Everybody agrees.
01:15:38.000 Yeah, that's a good idea.
01:15:39.000 We could honestly do like a Nancy Pelosi or a Fauci one, if you had the voices down.
01:15:44.000 I feel like it's inevitable that everyone's going to be deepfaked onto porn of all styles.
01:15:50.000 Yeah.
01:15:51.000 So I've just kind of already accepted it, like get ready for the whirlwind.
01:15:54.000 But the problem is when people don't know it's a deepfake when they see it.
01:15:57.000 Yeah, that's the problem.
01:15:58.000 And even law enforcement doesn't know it's a deepfake.
01:16:00.000 When it happens so often and people are going to just be like, oh, it's probably fake.
01:16:04.000 And then no one's gonna believe any video ever.
01:16:06.000 No one's gonna believe anything.
01:16:07.000 The perception of what happened, or what actually happened?
01:16:07.000 Like what is real?
01:16:08.000 What if you're faking child porn and you go to the cops and they're like, nah, everything's fake, it doesn't matter, but it's a real case.
01:16:13.000 Like, what do we do then?
01:16:14.000 Well, they would have the means to, you know, sort of investigate it and look into it.
01:16:17.000 But you just said that everyone's gonna assume it's all fake.
01:16:19.000 Everyone as a general public, but I feel like- Yeah, forensics will dig through it.
01:16:22.000 They'll be like, here's the artifacts.
01:16:23.000 But if they're motivated enough to, right?
01:16:24.000 If we assume enough of it is fake, I'm just saying, like, it opens this terrible door where people are gonna be like, the reality is most of the stuff is not real.
01:16:32.000 But like, what's real, that's the question.
01:16:34.000 Is it the perception of the act, or is it the act itself?
01:16:37.000 Because when you watch porn, a video, you're not actually watching the porn, you're watching a digital representation of it.
01:16:42.000 So you're not watching porn, for real.
01:16:43.000 You're watching a fantasy... That's a semantic argument.
01:16:48.000 But we think it's real, because we feel it.
01:16:51.000 Because it happened, because people filmed it.
01:16:52.000 We're asking the question of whether or not it was filmed with real humans, or was made by a computer and it never happened to a person.
01:16:56.000 If it looks identical, there's no difference.
01:16:59.000 Was a person in this act or not?
01:16:59.000 There is.
01:17:01.000 And if we're talking about children, was a child harmed and raped?
01:17:04.000 Or is it an adult who consented and got paid?
01:17:06.000 I mean, those are questions you can't answer.
01:17:08.000 Is it a video evidence of a crime or not?
01:17:10.000 That's, you know, very simple.
01:17:12.000 But I don't think those questions can be answered. If it looks realistic enough, there's no way to know.
01:17:16.000 Well, this is... Well, I mean, cops would be able to answer it, like you said.
01:17:19.000 I'm not entirely convinced.
01:17:19.000 I hope.
01:17:20.000 I'm not saying, hey, authority's going to fix it.
01:17:22.000 I don't know.
01:17:22.000 I'm not entirely convinced, because this is something we've been talking about for several years, and it's going to come to a point where what you do is you create a... you generate a fake video, then you run it through a compressor, or you convert it, and now it wipes out the artifacts.
01:17:39.000 Yeah.
01:17:39.000 It's going to be hard to tell at a certain point.
01:17:42.000 Interesting.
01:17:42.000 You did a really good job.
01:17:43.000 It's creepy.
01:17:43.000 Yeah.
01:17:43.000 Yep.
01:17:44.000 I was watching...
01:17:46.000 I was watching some YouTube video.
01:17:47.000 It was like some Blizzard thing or something.
01:17:49.000 I can't remember what it was.
01:17:50.000 No, no, no, no, no.
01:17:50.000 Uh, yeah, yeah, yeah.
01:17:51.000 I can't remember what it was.
01:17:52.000 And it was like... It was kind of really obvious it was CGI, but just the lighting was so good.
01:17:58.000 I was like, weird.
01:18:00.000 It was like they got past the uncanny valley.
01:18:02.000 That's what it felt like.
01:18:03.000 It felt like you knew it was CGI, but it looked so good and real.
01:18:08.000 And it was like a person walking across a bridge or something, and I was like, damn, this is creepy.
01:18:12.000 That's crazy stuff, man.
01:18:13.000 Yeah.
01:18:15.000 To go back to the crying girl for just a second, because it's on my mind, the other thing I think she could do legally is go after Pornhub, because if they're accepting advertising money from this site, I don't really, I'm not a lawyer, I don't know.
01:18:26.000 No, I don't think she could, no.
01:18:27.000 You go after the company that did it, not the advertiser.
01:18:31.000 She can maybe do a cease and desist if they're showing her picture on the site or something.
01:18:34.000 I feel like there is, that's what's weirding me out.
01:18:37.000 And then it's gonna start a new wave of lawsuits.
01:18:40.000 I know, but that's the thing about legal fights, everyone involved has to have the money to go through it and even then they're not always effective.
01:18:45.000 There's really no justice for this.
01:18:47.000 And you're also buying his story about that's what happened.
01:18:49.000 Oh yeah, for sure, for sure.
01:18:50.000 But like, I feel like, and I don't know, I'm not gonna investigate it, but like, him being like, these are advertisements I see, there are enough people who check out Pornhub who could verify, oh yeah, they run those advertisements.
01:19:01.000 Which is like, to me, creepy.
01:19:02.000 I'm like, I got a weird- Future's gross, man.
01:19:04.000 Yeah, it's terrible.
01:19:04.000 I'm at like a crossroads in my mind, because I'm like a pretty extreme techno-libertarian.
01:19:09.000 Like, I think if the technology can exist, it's gonna exist, and it will get used.
01:19:13.000 No amount of legalities are gonna stop that technology from being real.
01:19:19.000 But then I'm like, well, wait a minute, I'm thinking about protecting children.
01:19:23.000 And like, how do you, how do, you know, you can't just let everyone see everything all the time.
01:19:29.000 Where we're going, remember Lenza?
01:19:31.000 It was a big thing.
01:19:32.000 I guess people are still doing it.
01:19:33.000 The AI pictures, the cartoon pictures and everything.
01:19:36.000 You take a bunch of pictures of yourself and it generates a perfect AI version of you or whatever.
01:19:41.000 That's the future.
01:19:41.000 We're going to live in the pods.
01:19:43.000 And you're going to be in a digital world.
01:19:45.000 And you know the funny thing?
01:19:46.000 For all we know, we're already in it.
01:19:47.000 And we are the beautiful versions of ourselves.
01:19:49.000 Dude, I just pulled up this story from last year on Yahoo.
01:19:51.000 Would you eat lab-grown human salami cultured from celebrity cells?
01:19:55.000 They're making meat from celebrity cells, or they're about to, and then you can eat your favorite celebrity.
01:20:00.000 Wait, what is the title of that one?
01:20:01.000 Would You Eat Lab-Grown Human Salami?
01:20:03.000 Why would you want to do that?
01:20:04.000 Yes, what is the motivation?
01:20:05.000 Because you want to live in the pod and be your best fantasy.
01:20:08.000 But you think it brings you closer to them?
01:20:10.000 I think so.
01:20:10.000 Alright, let's look at this story.
01:20:12.000 Would you eat human meat grown in a lab?
01:20:13.000 Would you eat human meat grown in a lab?
01:20:17.000 Well, this is like the animal rights movement, really, because they don't want you to eat Animal meat.
01:20:23.000 They don't want to eat meat.
01:20:23.000 The answer is no.
01:20:24.000 I won't eat human.
01:20:25.000 What is wrong with you people?
01:20:26.000 Did you find the one about lab-grown salami cultured from celebrities?
01:20:29.000 Lab-grown salami?
01:20:31.000 I've seen that human meat article before.
01:20:33.000 It's like an animal rights thing.
01:20:35.000 Would you eat lab-grown human salami cultured from celebrity cells?
01:20:39.000 No!
01:20:39.000 Why salami?
01:20:39.000 James Franco's salami?
01:20:41.000 They're hoping to get James Franco and Ellen DeGeneres, Jennifer Lawrence, and Kanye West on board to donate tissue samples that will become salami.
01:20:48.000 I don't think Kanye would be down.
01:20:50.000 Knowing Kanye, I don't think he'd be down with that.
01:20:52.000 You make a lot.
01:20:53.000 Could you imagine the amount of money that someone could make?
01:20:54.000 Probably so much.
01:20:55.000 It's only legal in Idaho, though.
01:20:56.000 Idaho is the only place that allows... This is like some of the most disgusting stuff I've ever seen, yo.
01:21:01.000 We need to go back.
01:21:02.000 There was also an article that I... Go back!
01:21:04.000 We've gone too far!
01:21:05.000 Too far, too far.
01:21:05.000 There was also an article I read.
01:21:07.000 They said, would you eat human meat to save the planet?
01:21:11.000 What a wonderful, vague question.
01:21:13.000 Here's what I'm saying.
01:21:16.000 How about... I think I have this video somewhere.
01:21:18.000 Here we go.
01:21:18.000 Check out this video.
01:21:20.000 I saw this from History Defined: a high school fitness program in 1962.
01:21:23.000 Look at this.
01:21:24.000 This was normal back then for all of the kids.
01:21:25.000 Somebody film La Sierra now.
01:21:26.000 school has developed a program that assures every student of physical...
01:21:33.000 And now you can't even play dodgeball in gym class anymore.
01:21:39.000 Is it all boys school or are they separate by gender for these?
01:21:42.000 I'm pretty sure they all did.
01:21:44.000 So I'm kind of saying, here's what I'm saying.
01:21:46.000 Can we like go back to when we, you know, exercised and ate right but, you know, keep the getting rid of racism part and then just have like the not racism but still be exercising and eating right and not having human grown meat for food.
01:22:01.000 This is toxic masculinity now.
01:22:03.000 This is just a bunch of teenagers exercising as they should be.
01:22:06.000 But nowadays it's just like the kids sit around doing nothing.
01:22:10.000 Well, if you make them exercise and they can't do it, then you make them feel bad and that's bad because that's bullying.
01:22:16.000 Oh man, it's like social Darwinism or whatever.
01:22:20.000 Strong must survive.
01:22:21.000 Let the people who want to be doughy be doughy, and the people who want to be fit be fit.
01:22:24.000 Do you want to cater to, like, the lowest denominator?
01:22:27.000 Or should we all, like, try to encourage everyone to reach their highest potential, right?
01:22:30.000 That's why we're going to lower the standards.
01:22:32.000 Oh, good.
01:22:33.000 That's part of, like, the assisted suicide conversation.
01:22:35.000 Like, I'm very much just cater to the best of the best and let everyone else fall away.
01:22:39.000 If they can't keep up, then they're not supposed to, evolutionarily.
01:22:43.000 See, with this gym class example, I feel like it's way worse to be like, you couldn't do a push-up, so you just don't have to do them.
01:22:49.000 Like, we know you're not capable.
01:22:50.000 That's a terrible thing to communicate to a teenager, right?
01:22:53.000 Being like, hey, you can't do push-up, but by the end of the semester, maybe you'll be able to do five.
01:22:57.000 And that's awesome.
01:22:58.000 That's an improvement.
01:22:59.000 Like, that's way better.
01:22:59.000 They're gonna call it push-up shaming or something.
01:23:01.000 Exactly!
01:23:03.000 I guess if we are going into the techno-verse, that your emotions are going to be a key part in your ability to navigate.
01:23:09.000 So angry people could make bad things, and you want to make sure people are happy and healthy.
01:23:13.000 But the problem is, man, giving someone everything they want when they're little is not what makes people happy.
01:23:19.000 People need adversity.
01:23:21.000 Competition.
01:23:23.000 They need to lose, and they need to learn how to enjoy it.
01:23:26.000 The future is going to be humans are going to be short and gangly and gaunt, super thin, wearing jumpsuits, and they're going to live in pods.
01:23:36.000 But in the pods, they're going to be six feet tall, super ripped, flying around, you know, with, with spaceships and all that stuff.
01:23:44.000 This is the worst bedtime story ever.
01:23:47.000 So I think the future is androgyny because we see all of this gender transitioning happening.
01:23:51.000 And I think that the ideal will eventually be, you know, an androgynous model.
01:23:56.000 I was thinking that we're going to be raising kids, gestating them in, like, pods, and then adults won't have sex organs.
01:24:04.000 We'll be super tall because we'll be in low gravity, have really long arms and legs.
01:24:08.000 Have you guys watched SG1?
01:24:10.000 This is basically what happens to one of the aliens, like the Nordics, what do they call them?
01:24:17.000 I forgot what they call them in the show.
01:24:20.000 But they're like greys, and they can't reproduce anymore.
01:24:22.000 They can only clone themselves.
01:24:23.000 And they're degrading, so they can't do it much longer.
01:24:27.000 And they eventually just get wiped out because of it.
01:24:28.000 What were you about to say?
01:24:29.000 Oh, I was just gonna say, in the early 2000s, I remember reading some fashion magazine do a cover on, like, the top model on the street.
01:24:37.000 Asgard.
01:24:38.000 Sorry, guys.
01:24:38.000 Asgard.
01:24:38.000 Asgard.
01:24:39.000 The highest, like, most in demand, and it was a man who was known for his androgynous look, and he's very thin, and, like, Tim's saying, like, long arms, stuff like that.
01:24:47.000 Yeah, the fashion industry to this day is doing the same thing.
01:24:50.000 Well, and this was, like, I think I read this article in 2010.
01:24:52.000 So at that time they already were like, this is the ideal for both men and women
01:24:57.000 because they were saying this, like he's able to walk runway for men's lines,
01:25:02.000 but also for women's lines.
01:25:03.000 But he's not shaped like we would think a traditional, toxically masculine male ideal is,
01:25:09.000 and he's not shaped like a woman.
01:25:11.000 How do we get to that ideal of androgyny?
01:25:12.000 By flipping shit.
01:25:15.000 And making the body unhealthy.
01:25:16.000 Right, right.
01:25:17.000 So the fashion industry, they're making men wear dresses and then they're making women, you know, wear more masculine clothing.
01:25:22.000 So that is the first step to reaching that ideal of androgyny.
01:25:26.000 So I think that definitely.
01:25:27.000 And they're 10 years ahead.
01:25:28.000 Like if we're just talking about it now, they've already known it for a while.
01:25:30.000 I was thinking like how all wars are fought for sex.
01:25:34.000 I don't know, some people have told me that over time, like all wars are fought over women or something like that.
01:25:41.000 That's like the one thing.
01:25:42.000 I'm pretty sure the biggest wars in the world were fought over black peppercorn.
01:25:46.000 Wasn't it salt?
01:25:48.000 All that stuff, food for sure, but the aggression over women, over either guys not getting laid so he gets angry, or a guy wants the woman.
01:25:57.000 Maybe the sexuality is what has caused us to be so psychotic as a species, is our obsession with finding a mate and taking other people's mates.
01:26:05.000 I disagree.
01:26:06.000 As I explained, the East India Trade Company and the Spice Wars.
01:26:10.000 Food is a big part of it too, just survivalism.
01:26:15.000 Black pepper.
01:26:16.000 They were like, I will kill thousands to make my eggs taste better.
01:26:20.000 It's insane.
01:26:21.000 Also the profit off the black pepper, the amount of money they could make selling it.
01:26:24.000 That's my point.
01:26:24.000 What about female leaders that go to war?
01:26:27.000 Like Boudicca?
01:26:28.000 I mean, Margaret Thatcher, for example?
01:26:30.000 Not true.
01:26:31.000 Yeah, not all war is for sex, obviously, that's ridiculous.
01:26:34.000 It's for pepper!
01:26:35.000 A lot of male aggression is because of it.
01:26:37.000 I think in nature, male aggression is often because of female stuff.
01:26:41.000 When you think about the, I don't remember what they are, but the rams that get interlocked because they're trying to beat each other to seduce a female.
01:26:47.000 How do you define male aggression?
01:26:50.000 Just displays of aggression, right?
01:26:51.000 So, if you're talking about these rams, they fight each other.
01:26:52.000 Isn't that just being a man?
01:26:54.000 I mean... I think aggression is a naturally masculine tendency.
01:26:57.000 That's what I'm saying.
01:26:57.000 But, like, there are times when it becomes fatal, right?
01:27:02.000 If you're rams who lock together and die because you were trying to seduce or show the girl that, like, you were the better or stronger of the two.
01:27:08.000 Like, it can be, at times, not the best.
01:27:12.000 You need a certain amount of aggression.
01:27:13.000 You don't need to dominate.
01:27:14.000 But, like, I think you're right.
01:27:16.000 We link—in nature, sex and aggression are often linked, but I don't think it's necessarily the only cause of war.
01:27:22.000 I wonder if it's being bred out of humanity, if people are attempting to lower the aggression and the sexuality to an androgynous state.
01:27:30.000 It's crossed my mind like five, ten years ago, and I wonder if it's a technocratic state of mind thing.
01:27:34.000 Like, we need these people not testosterone-filled, because they're too much fistfights, too much lying.
01:27:41.000 So that's physical aggression, but then on the female side, there's emotional aggression, right?
01:27:46.000 Because, I mean, they're influenced by estrogen and everything, so... Yeah, and aggression for women looks different.
01:27:52.000 Yeah, I feel like it's worse.
01:27:54.000 I would too, but... I think it's more toxic.
01:27:56.000 Well, they have that thing where they say, like, men, if there's a problem, like, eventually can just hit each other, right?
01:28:02.000 Yeah.
01:28:02.000 But with women, like... And then they'll get over it.
01:28:04.000 Yeah, yeah.
01:28:05.000 But the hitting each other never solves it, I don't think.
01:28:09.000 I don't think so.
01:28:09.000 But they assert dominance, and that's sort of the natural order of, like, the difference between men and women.
01:28:14.000 I mean, it does solve it, though, right?
01:28:15.000 The beef is over.
01:28:16.000 It solves it in the physical sense.
01:28:18.000 Like, the guy's not going to get knocked down.
01:28:19.000 So you think that the dudes are, like, emotionally upset with each other after they win a fight or something?
01:28:23.000 Probably.
01:28:24.000 I mean, if you're not working it out emotionally or consciously, I don't think it's getting worked out.
01:28:30.000 What kind of fight do you mean?
01:28:32.000 Like, two guys want a woman, they get into a fist fight, the guy breaks the other guy's legs, breaks his neck, and now he's like, he didn't discipline.
01:28:42.000 We're tribal now, so we have no choice, because otherwise he'll come back later.
01:28:47.000 It's kind of crazy if you think about it.
01:28:48.000 I was in Norway and this Norwegian guy, and it's funny because they're all kind of woke, he was like, you want to hear a joke?
01:28:55.000 And I was like, yeah.
01:28:56.000 And he's like, how come Britain has no beautiful women?
01:28:58.000 And I was like, why?
01:28:59.000 Because we took them all.
01:29:01.000 And then I was like, oh, yeah, that's right, because the Vikings went and raided and took all the women.
01:29:05.000 And I'm like, I don't know if it's a joke.
01:29:07.000 But the crazy thing is, imagine you're a woman in a village in the British Isles or whatever, in Great Britain or something, or Wales or Scotland.
01:29:15.000 And then all of a sudden, a bunch of burly dudes in a boat just walk up, destroy your village, and take you, and bring you back to their village, and it's like, this is your life now.
01:29:22.000 Yeah, true.
01:29:23.000 Like, that's just it.
01:29:24.000 Like, okay, I guess.
01:29:25.000 Yeah, God, you know, I talk about this horrible, horrible stuff, but it's only because I want people to think about what it could be.
01:29:33.000 You can't just call the cops, like, if we really screw this up, you know, this American thing.
01:29:39.000 I'm watching 1883 because, you know, you guys were telling me to watch it.
01:29:43.000 Yeah.
01:29:43.000 It's just like everybody dying all the time.
01:29:46.000 It's a bit ridiculous.
01:29:47.000 It's just like they're walking and all of a sudden there's a scene where they're talking about this, and a girl goes to take a leak or something and a snake bites her ass, and then she's dead.
01:29:55.000 Like her corpse is lying on the ground, like, well, snake bit her ass.
01:29:57.000 Look, I think that, what was that video game that was popular?
01:29:59.000 Oregon Trail?
01:30:00.000 Oh yeah, man.
01:30:01.000 Like, I don't think it prepared us for what the West was really like.
01:30:03.000 That's why I liked watching E2 and E3.
01:30:05.000 It's not realistic.
01:30:05.000 Yeah, but the West was not like that.
01:30:06.000 There's no way they did, like, this, where the marshal walks into the room and he's like, who did it?
01:30:10.000 Bang, bang, bang, just kills everybody.
01:30:12.000 No, I can't speak to the depictions of law enforcement and how that works, but, like, the risk to the travelers while they're going, like, the snakes and the wagon and whatever else, like, that is real.
01:30:22.000 It seems kind of crazy today because there are very rudimentary things we've learned in grade school that can really make a difference that they didn't know.
01:30:34.000 And if you think about it, to put it simply, there were people at a time who didn't know what a wheel was.
01:30:43.000 And we all know what a wheel is, because we've just seen it.
01:30:46.000 We've never had a workshop class on how to make a wheel.
01:30:48.000 We'd probably struggle to figure out how to make a perfect wheel out of wood or something, but eventually we'd figure out something.
01:30:53.000 How to make something roll.
01:30:55.000 But there was a period where people didn't even have wheels!
01:30:57.000 Nobody figured it out, nobody knew, nobody saw it.
01:30:59.000 Now it's like, if we really needed to, you could probably make a wheel with something.
01:31:03.000 You just do it because you know the concept of it.
01:31:06.000 So when it comes to like the Oregon Trail, there's a bunch of stuff you may have just seen passively watching the Discovery Channel and you're like, oh yeah, you know, like how to get water at night.
01:31:14.000 It's like you dig a little hole and you put like a leaf in it or something and you get like a cup or whatever.
01:31:18.000 Condensation forms and it drips into the cup.
01:31:20.000 There's just like stuff that you learn from watching movies that these people never had access to.
01:31:24.000 Think about this.
01:31:25.000 Well, germ theory, right?
01:31:27.000 Yeah.
01:31:27.000 Sorry to cut you off.
01:31:27.000 Oh, right.
01:31:28.000 Like washing wounds.
01:31:29.000 They used to be like, why wash your hands?
01:31:30.000 Yeah, or like you have to boil your water before you drink it.
01:31:32.000 Right.
01:31:33.000 They didn't know that because they didn't have the concept.
01:31:35.000 But in 1883 they're boiling the water because of cholera.
01:31:37.000 But they have to tell them to.
01:31:38.000 Right, but so think about this.
01:31:41.000 When you watch a movie, you get a bunch of BS information, and people think the wrong things from watching it.
01:31:46.000 People think that, like, silencers make guns go, pew, pew, pew!
01:31:49.000 They really don't.
01:31:51.000 And they really don't know what these things sound like.
01:31:53.000 But there is so much information we absorb every single day that we take for granted.
01:32:00.000 If you go back to 1883, you probably absorbed no information!
01:32:04.000 For days.
01:32:05.000 Like literally, the information you absorb is like, an animal was there.
01:32:09.000 Okay, well that's not like longevity information.
01:32:12.000 I'm talking like today, you know, I'll read an article and it'll be like a new fusion reactor.
01:32:16.000 It combines these things with these things, and I'm like, I don't know, I just learned something, I guess.
01:32:20.000 And I don't know, I want to practically apply like a fusion reactor or something.
01:32:24.000 But you go back to the 1800s, and you're walking the whole day, and you learn nothing new.
01:32:30.000 The only thing new you learn is, like, there's a tree at this point on the map.
01:32:34.000 I think, you know, we've also seen this transition from, you know, TV culture to internet culture, right?
01:32:40.000 We have all of the world's knowledge at our fingertips 24-7, you know?
01:32:46.000 30 years ago, we wouldn't be able to Google, like, some stupid, you know, piece of information that we were looking for, but now we can.
01:32:51.000 It's crazy.
01:32:52.000 That was actually, uh, there was a comic, I think it was XKCD, I'm not sure, where someone, it's like 1990, and someone said, hey, what year was Lincoln shot?
01:33:00.000 And the other person's like, I don't know, you want to go to the library and find out?
01:33:03.000 Nah.
01:33:03.000 And then it's like, now, hey, what year was Lincoln shot?
01:33:06.000 Then he's looking at his phone, he's like, 1864.
01:33:08.000 I don't know, was it 1864?
01:33:09.000 What year was it that he was shot?
01:33:11.000 1865 was like when the war ended.
01:33:13.000 Yeah, right, 65.
01:33:14.000 It was like a month after the war ended.
01:33:16.000 Yeah, 65.
01:33:17.000 That makes sense, right?
01:33:18.000 Can't go anywhere without your history.
01:33:19.000 What year was it?
01:33:20.000 April 14th, 1865.
01:33:20.000 65, you were right.
01:33:21.000 I was close, I was close.
01:33:24.000 Yeah, to this point, you know, when I had that Twitter space with the Taliban I was telling you about before, I co-hosted it with Nuance Bro, and we were discussing the detainment of Andrew Tate, and I got MAGA people together with these Taliban guys living in Afghanistan.
01:33:39.000 They actually had a very fruitful conversation about, you know, where the world was headed.
01:33:43.000 Can you imagine if we could do this back in 2001?
01:33:46.000 Maybe the war would not have happened.
01:33:47.000 That's crazy.
01:33:48.000 Yeah, that's why I'm wondering why we have this global conflict.
01:33:51.000 Like, where is Putin on a video chat with Joe Biden?
01:33:54.000 Cue YouTube taking us off air.
01:33:56.000 We're gonna go to Super Chat, so if you haven't already, would you kindly smash that like button, subscribe to this channel, share this show with your friends, and become a member over at TimCast.com to watch our uncensored members-only show, which will be up around 11 p.m.
01:34:08.000 tonight.
01:34:09.000 We do those Monday through Thursday at 11 p.m., and as a member, you're supporting our cultural endeavors and our show, so we really do appreciate it.
01:34:15.000 Smash that like button, all that good stuff.
01:34:17.000 All right.
01:34:18.000 I prefer Rumble.
01:34:19.000 Do you?
01:34:20.000 Says, Tim and Ian, please check out 2034, a novel of the next world war by Admiral James Stavridis.
01:34:29.000 It's sobering, and it clearly explains that World War 3 will touch American soil.
01:34:33.000 Technology is a double-edged blade.
01:34:35.000 Ooh, yeah, drone bombers, man.
01:34:37.000 Defiant Blackout says, yo, Tim, when you started off with that raid comment, YouTube glitched and I replayed it.
01:34:43.000 That whole part is gone about the orange man.
01:34:46.000 Let me just make sure I can reiterate this, and I'm going to give advanced warning for our censors, so you can, okay, I'm about to say it again, all right.
01:34:54.000 When Donald Trump has the FBI go to his house to look for documents, it's a raid.
01:34:59.000 When they go to Joe Biden's house, his office, it's a planned search.
01:35:04.000 It's funny how the media does that, right?
01:35:06.000 All right, did you guys, did the censors, did you get it?
01:35:08.000 Oh, no, that one went through?
01:35:10.000 Well, you tried, you tried.
01:35:12.000 Just a coincidence.
01:35:12.000 BryceE says, Tim, your stream cut out for a few seconds in the beginning when you were
01:35:15.000 talking about the Biden raid.
01:35:16.000 Did it now?
01:35:19.000 Richard Winter says, I'm pretty sure YouTube just censored you saying Donald Trump in your
01:35:23.000 opening.
01:35:24.000 Did they now?
01:35:25.000 Look at all these people saying this, huh?
01:35:28.000 Just a coincidence.
01:35:29.000 Don't question it.
01:35:30.000 Rusta says, just some feedback on an earlier vid from today.
01:35:32.000 The 4U algorithm existed before Elon bought the site, but Elon just made it easier to switch from 4U and actually following on the app.
01:35:40.000 Anyway, love you all.
01:35:41.000 Yes, there were little stars in the top right, and you had the home feed and the latest feed.
01:35:46.000 And every so often it would automatically switch you back to home, which was algorithmic.
01:35:51.000 Yeah, I think it's awful.
01:35:53.000 All right, Darius Arkin says, shortly after having a discussion with my district manager where I was told that my conservative politics are unacceptable, my transfer request was denied and I was terminated from my position.
01:36:05.000 Yikes.
01:36:06.000 Well, if that was Washington, DC, you'd have a case for a human rights violation where politics is a protected class.
01:36:10.000 Isn't that crazy?
01:36:12.000 Doesn't apply anywhere else, though, for the most part.
01:36:16.000 All right, let's see.
01:36:18.000 Matthew Reckamp says, the problem with your poll is that it doesn't have a third option for both.
01:36:22.000 That was the poll on the question of, are they trying to remove Biden or cover up the raid or cover up Biden's malfeasance?
01:36:29.000 Yep.
01:36:30.000 David Toronto says, Trump was president.
01:36:31.000 He can declassify documents.
01:36:33.000 Joe couldn't.
01:36:33.000 Either way, I'll take President DeSantis.
01:36:36.000 Well, there you go.
01:36:36.000 That's one simple answer, I guess.
01:36:38.000 What do you guys think?
01:36:39.000 Is it going to be DeSantis?
01:36:40.000 I hope so.
01:36:40.000 I prefer Trump.
01:36:42.000 You like Trump?
01:36:42.000 I prefer Trump.
01:36:43.000 It's tough, man.
01:36:44.000 I want to see him debate.
01:36:44.000 I think DeSantis is pretty much like the same GOP establishment type of guy that's using wokeism to basically capture the MAGA base.
01:36:53.000 Right.
01:36:54.000 So he's doing what they want to try and earn their favor?
01:36:57.000 Yeah.
01:36:57.000 I mean, that's kind of what he should be doing.
01:36:59.000 Yeah, but I just don't think DeSantis is the same as Trump.
01:37:02.000 Yeah, I understand that.
01:37:04.000 And that's one criticism I have of Trump.
01:37:06.000 But then he ended up firing Bolton.
01:37:09.000 He brought in a bunch of people.
01:37:11.000 He had a bunch of people surrounding him.
01:37:12.000 As Luke likes to point out, he tried to get Bill Gates to be an advisor.
01:37:16.000 So it's tough.
01:37:17.000 Trump was not perfect.
01:37:18.000 I don't know, man.
01:37:18.000 Yeah, he wasn't, and that's actually why I trust him more.
01:37:21.000 He's not a politician.
01:37:22.000 He's not a traditional politician, whereas DeSantis, you know, he's a born-and-bred politician, really.
01:37:28.000 Yeah, too polished, maybe.
01:37:29.000 Yeah.
01:37:29.000 All right.
01:37:30.000 Christina H. says, Caitlin Bennett said she'd love to come on the show.
01:37:32.000 I hope you guys get in contact soon.
01:37:34.000 I will figure that one out.
01:37:37.000 Let me write that down.
01:37:38.000 Let me write down Caitlin Bennett.
01:37:41.000 That'd be cool.
01:37:41.000 I haven't seen her in a while.
01:37:44.000 Has she been on before?
01:37:45.000 I think they were saying she took a break because she's a mom.
01:37:48.000 She took time to... I went to Kent State.
01:37:50.000 That's where Kaitlin went, Kent State.
01:37:52.000 Yeah, she used to interview people and she'd have like the gun and then... She had really curly hair too, right?
01:37:58.000 And 345 says they are going to have Hakeem Jeffries run.
01:38:02.000 Really though, it's kind of soon.
01:38:03.000 What?
01:38:04.000 But that would boost his profile.
01:38:05.000 Like Nikki Haley is going to run, and she doesn't actually expect to win.
01:38:09.000 Running for president makes you, it boosts your profile.
01:38:12.000 You sell books.
01:38:13.000 You get famous.
01:38:13.000 You get Beto O'Rourke, yeah.
01:38:15.000 And that's the only reason they do it.
01:38:16.000 They do it so that they can get a book contract and make a bunch of money.
01:38:19.000 Right, say they ran.
01:38:22.000 All right, Really Rick says, Hey Samira, it's gays for Trump.
01:38:26.000 When can we expect the next Taliban Twitter space?
01:38:30.000 Very soon, hopefully.
01:38:31.000 Pick a topic and then, you know, maybe you can join.
01:38:34.000 You guys should join.
01:38:35.000 That's super interesting.
01:38:36.000 Yeah.
01:38:36.000 So how many people from the Taliban come on normally?
01:38:39.000 Um, I mean, that was a very specific Twitter space, but then, you know, they'll show up in our spaces sometimes, you know.
01:38:45.000 Nuance Pro said they, like, asked, they were asking for dating advice or something.
01:38:48.000 Yeah.
01:38:49.000 Oh, my gosh.
01:38:49.000 We don't know what you're talking about.
01:38:51.000 No, they have good advice.
01:38:52.000 And, you know, we've asked them about pop culture references and everything.
01:38:55.000 So, yeah, I mean, I'm hoping to, you know, get these groups together, it would be good conversation.
01:39:02.000 All right, Wyatt Caldenberg says, Tim, I have been involved in politics since the 60s.
01:39:06.000 One thing I learned, never trust people who spread gossip and create drama within the movement.
01:39:11.000 They always turn out to be paid rats, crackpot or people with dark secrets.
01:39:16.000 Well, the one thing I will say is, we talked about reputation management firms a while back.
01:39:20.000 These are companies that their whole thing is how to create personas, how to manufacture identities, and how to do the inverse, how to destroy and character assassinate.
01:39:33.000 And so the one thing I'll warn you about is like with Julian Assange, this is where the future of assassination is.
01:39:40.000 Back in the day, and let's just leave all the conspiracy theories aside, several prominent people were killed in big news stories.
01:39:48.000 What happens when a prominent individual has their life taken?
01:39:51.000 Well, their work stops, but their ideas become immortal.
01:39:54.000 They become martyrs.
01:39:55.000 What we see now is Julian Assange gets accused of some impropriety or nonsense that turns out to be totally fake.
01:40:02.000 They used it as pretext to shut down his work or at least impede it to the best of their abilities and try and destroy his legacy so that his ideas die forever.
01:40:10.000 The man gets to live locked in the Ecuadorian embassy for 10 years, but they made sure that all across the internet people were spamming comments about how Julian Assange was doing this, that, or otherwise to women.
01:40:20.000 The media narratives came out and started doing all the same thing, and it was all a lie.
01:40:24.000 And then, as it turns out, they dropped the case against Julian Assange, and then Donald Trump moved to have him indicted and extradited to the United States on espionage charges when he's not even an American citizen.
01:40:37.000 You gotta watch out for this intel stuff, man.
01:40:40.000 It's weird stuff.
01:40:41.000 It's weird, creepy stuff.
01:40:43.000 But you can usually tell when it's inorganic.
01:40:46.000 Like, the story about Julian Assange, if you actually looked into it, you knew, like, oh, okay, this is not real.
01:40:51.000 But for some reason, the New York Times or whoever else, all these big news outlets were writing overtly fake things.
01:40:56.000 The same thing I can say for Donald Trump.
01:40:58.000 Like, all these people spamming on Twitter, screaming about how Trump said Nazis were very fine people, and it's like, yo, he never did that.
01:41:04.000 So who are these people doing this?
01:41:07.000 Well, you know, I think it is.
01:41:08.000 I think it's a mix of prominent high-profile accounts that the intelligence agencies or the contractors will reach out to and say, hey, this is the go.
01:41:17.000 Here's your rate.
01:41:17.000 Like, here's your pay.
01:41:18.000 Do this story.
01:41:19.000 Or they will spam prominent personalities on the left and say, like, Hey look, like all of a sudden 50 comments will appear on someone's page being like Donald Trump said this, Donald Trump said this.
01:41:29.000 Troll farms.
01:41:30.000 And then these prominent personalities will start talking about it and regurgitating these ideas and then saying Donald Trump's a Nazi and then it creates that moment.
01:41:38.000 The reputation management firms should really look into it.
01:41:41.000 That's what Wikipedia is run by.
01:41:43.000 Basically, a handful of different agencies and individuals who do what's called reputation management.
01:41:49.000 And then what's really dirty is when campaigns target an individual for some reason that you can't really understand, and then the person goes to them and says, hey, we can fix this for you.
01:41:59.000 Maybe you just pay us and we'll make it all go away.
01:42:01.000 We'll handle your Wikipedia page and get those articles removed.
01:42:04.000 Crazy stuff, right?
01:42:05.000 All right, let's see what we got.
01:42:07.000 Cabo Rojo says the Dems can run a decrepit, senile, alleged pedo and make him get 81 million ballots.
01:42:13.000 It doesn't matter who the establishment decides to run.
01:42:15.000 That's what I'm saying.
01:42:16.000 Like, these, look, people are going to get a universal mail-in ballot, and then someone's going to knock on their door and say, vote for Biden.
01:42:24.000 Oh, whatever, I guess.
01:42:25.000 They're not going to know anything about him.
01:42:27.000 Joe Biden could walk on Fifth Avenue.
01:42:29.000 And the point of that statement is, it's what Trump said.
01:42:31.000 Trump said he'd go out on Fifth Avenue and he wouldn't lose a single vote.
01:42:34.000 He was right.
01:42:34.000 I mean, it's not a good thing, you know, obviously, but he's not wrong in the zealotry of a lot of these voters.
01:42:40.000 It's not exclusive to Trump.
01:42:41.000 I think that's the important thing.
01:42:42.000 Yeah.
01:42:44.000 All right.
01:42:44.000 What do we got?
01:42:45.000 Grofty says the UFO might spin at some point.
01:42:47.000 Buck?
01:42:48.000 Well, you know, here you go.
01:42:49.000 The UFO might spin.
01:42:50.000 There it is.
01:42:53.000 Air duster to spin it.
01:42:55.000 Spinning, the UFO is.
01:42:57.000 That's it.
01:42:57.000 All right.
01:42:58.000 That's dangerous.
01:42:59.000 Wobbling, the UFO might.
01:43:00.000 Yeah, you have to make sure it's perfectly flat.
01:43:03.000 Level, yeah.
01:43:03.000 Otherwise, the air is pushing it down.
01:43:05.000 I think, hypothetically, if something spins horizontally fast enough, it creates antigravity.
01:43:11.000 It diminishes its vertical momentum to zero.
01:43:14.000 How does that make sense?
01:43:16.000 I'll talk, I'll bring on Jeremy Rist, the alien scientist, to explain it at some point in the future.
01:43:20.000 So the challenge with this UFO is that it's not symmetrical in every direction.
01:43:25.000 So when the blower is on it, it's pushing it slightly downward, and so it's going down, but then bouncing up, so it starts wobbling.
01:43:34.000 So what you have to do is you have to get it perfectly flat and aligned.
01:43:36.000 I've got this thing to go really, really fast where the whole thing starts shaking like crazy because it can't handle the energy.
01:43:42.000 But I kind of feel like there's got to be a way to make it spin perfectly.
01:43:47.000 You know, like, what's the maximum RPM for this UFO?
01:43:51.000 Oh my gosh.
01:43:52.000 In a vacuum, I don't know if there would be a maximum.
01:43:55.000 I mean, it eventually would break apart.
01:43:58.000 Right, it would throw itself apart.
01:43:59.000 I watched this really funny video of a skateboard wheel where they put like a power tool on it so that it would spin, and it spins so fast.
01:44:08.000 This is urethane, it's hard.
01:44:09.000 It melts basically, it just wobbles out like Laffy Taffy.
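The "maximum RPM" question from a moment ago does have a rough engineering answer: a spinning wheel fails when the hoop stress at its rim exceeds the material's strength. A minimal sketch, using assumed (not measured) urethane properties and an assumed skateboard-wheel radius; in practice friction heating softens the wheel well before this limit, which is the Laffy Taffy effect just described:

```python
import math

# Rough, illustrative numbers -- NOT measured properties of any real wheel.
TENSILE_STRENGTH_PA = 30e6   # assumed urethane tensile strength, ~30 MPa
DENSITY_KG_M3 = 1200.0       # assumed urethane density
WHEEL_RADIUS_M = 0.027       # assumed ~54 mm diameter skateboard wheel

# For a thin spinning ring, hoop stress ~= density * (rim speed)^2,
# so the rim speed at which the material starts to fail is:
burst_rim_speed = math.sqrt(TENSILE_STRENGTH_PA / DENSITY_KG_M3)  # m/s

# Convert rim speed to revolutions per minute: omega = v / r (rad/s),
# then RPM = omega * 60 / (2 * pi).
burst_rpm = burst_rim_speed / WHEEL_RADIUS_M * 60 / (2 * math.pi)

print(f"rim speed at failure: ~{burst_rim_speed:.0f} m/s")
print(f"approximate burst speed: ~{burst_rpm:,.0f} RPM")
```

The thin-ring approximation is the crudest possible model; a real wheel bonded to a hub fails differently, so treat the result as an order-of-magnitude guess, not a spec.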
01:44:14.000 You guys ever watch the Hydraulic Press YouTube channel where he just smashes different things?
01:44:19.000 I love that channel.
01:44:21.000 Therapeutic.
01:44:23.000 All right, what do we got here?
01:44:25.000 Bryant Laws says, question, what if this is the setup for Civil War II?
01:44:29.000 Think about it.
01:44:30.000 Lincoln wasn't a perfect choice for POTUS, so would Newsom be in the same boat?
01:44:34.000 I don't know.
01:44:35.000 Is Joe Biden our Buchanan?
01:44:38.000 Maybe.
01:44:39.000 Lincoln came out as a fourth party.
01:44:40.000 There were four parties running for president that year, and he was kind of a nobody, wasn't a politician, and it was just a radical time in history.
01:44:50.000 So it is possible that we're about to embark on, like we were just talking earlier, we need, at least I was saying, I think more political parties.
01:44:56.000 The difference is now we have bot farms, troll farms, run by these parties and intelligence agencies.
01:45:03.000 They've been doing it for a while.
01:45:05.000 I actually talked to, I'll just say, a political party guy, and this was probably seven or eight years ago, who was talking about basically this.
01:45:16.000 They're not so overt where they don't come out and say, we want to make trolls to target individuals or anything like that.
01:45:20.000 They're just like, we want to maximize user outreach.
01:45:22.000 They want to construct narratives.
01:45:24.000 Yeah, and what was the ShareBlue?
01:45:26.000 Was that the company that hired people to go on Reddit and post comments or something?
01:45:31.000 If you ever wonder why it is that you're online and you're getting inundated with a bunch of comments that, like, come out of nowhere, you gotta watch out for these companies like ShareBlue or whatever.
01:45:40.000 Is that what it's called?
01:45:42.000 This is from Daily Beast.
01:45:43.000 The Hillary Clinton PAC spent a million dollars to, quote, correct commenters on Reddit and Facebook.
01:45:48.000 Exactly.
01:45:49.000 ShareBlue.
01:45:49.000 Yep.
01:45:50.000 That's right.
01:45:52.000 The internet is fake, dead internet theory.
01:45:57.000 So Reddit was what made videos go viral.
01:46:03.000 People would find a video, find it funny, put it on Reddit, and then it would go viral because the engine works that way.
01:46:07.000 If you liked it, you give it an upvote.
01:46:08.000 If you didn't, it went down.
01:46:09.000 So if the video had that X factor, more people are upvoting it than downvoting it, it starts skyrocketing.
01:46:14.000 Then people figured out how to game the algorithm.
01:46:17.000 It wasn't just getting upvotes, it was downvoting everything else.
01:46:20.000 So they started automating it.
01:46:21.000 They started using troll farms.
01:46:22.000 You see those videos where people have like a hundred cell phones on the wall and they're walking over and they're posting messages?
01:46:28.000 That's how they do it.
01:46:29.000 And so they would load up Reddit and they would go downvote, downvote, downvote, downvote everything in competition with it on each of their accounts from different phones.
01:46:36.000 And then all of a sudden, not only is this video getting a hundred upvotes, everything else got downvotes and boom, your video made the front page.
01:46:43.000 Crazy, right?
01:46:44.000 And this was like news like 10 years ago.
01:46:45.000 People were talking about this.
01:46:47.000 And technology has advanced so much.
01:46:49.000 Can you imagine?
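The vote-gaming mechanic described above is easy to sketch: because ranking is relative, a brigade moves its post up both by adding upvotes and by pushing every competitor down. A toy model with made-up post names and vote counts; real Reddit ranking also factors in time decay and vote fuzzing:

```python
# Toy simulation of front-page brigading. All numbers are invented.
posts = {
    "our_video":    {"up": 40,  "down": 5},
    "competitor_a": {"up": 120, "down": 10},
    "competitor_b": {"up": 90,  "down": 8},
}

def front_page(posts):
    """Rank posts by net score (upvotes minus downvotes), highest first."""
    return sorted(posts, key=lambda p: posts[p]["up"] - posts[p]["down"], reverse=True)

print("before brigade:", front_page(posts))  # our_video starts in last place

# A troll farm with 100 accounts: each one upvotes our post
# and downvotes everything competing with it.
BOT_ACCOUNTS = 100
posts["our_video"]["up"] += BOT_ACCOUNTS
for name in ("competitor_a", "competitor_b"):
    posts[name]["down"] += BOT_ACCOUNTS

print("after brigade: ", front_page(posts))  # our_video jumps to the top
```

The point of the sketch is that the downvotes do half the work: our post gains 100 net score, but every rival also loses 100, doubling the relative swing.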
01:46:51.000 Ritual Studio says Universe 25 is becoming a reality.
01:47:00.000 Paul Lam says I'm 24 looking at all this woke crap and all I can think about is preparing to move to the middle of the woods and building a home for my future family.
01:47:08.000 I'm telling you, man, if you got out of the city and you got chickens and got a little house, you're probably sitting pretty in your rocking chair on the porch, looking at your glorious chicken wealth.
01:47:19.000 Just purchased a solar power battery.
01:47:21.000 I purchased a thousand, what is it?
01:47:23.000 A thousand watt generator.
01:47:24.000 Solar generator.
01:47:25.000 Solar generator.
01:47:25.000 And then two big solar panels.
01:47:27.000 200 watt solar panels.
01:47:28.000 Well, we have a whole bunch of these big batteries with solar panels.
01:47:31.000 Yeah, I got kind of one of those.
01:47:32.000 You lay out the panels and it was really cool.
01:47:35.000 When we first bought them, this was a couple years ago, we laid out like 12 panels and we watched the thing charge up and the battery was like charged to full in three hours.
01:47:43.000 And that's actually pretty crazy.
01:47:45.000 That's a lot of electricity being generated.
01:47:47.000 And so we actually, I mean, we had a lot of panels.
01:47:49.000 It was a huge space being covered, but you could run an air conditioner off that.
01:47:53.000 Wow.
01:47:53.000 These little batteries up in the ground.
01:47:54.000 We actually did, one day the power went out here and it was super hot in the studio
01:47:59.000 and we ran, the air conditioner ran, I think for like 45 minutes
01:48:02.000 off of one fully charged battery.
01:48:04.000 So, you know, that's pretty good.
01:48:06.000 Get one.
01:48:07.000 It was $1,000.
01:48:07.000 If the apocalypse happens, I'm not going to waste my electricity on air conditioning.
01:48:12.000 Unless I absolutely have to for some reason.
01:48:14.000 Also, your phone is a battery.
01:48:15.000 Keep that in mind.
01:48:16.000 Like, my solar panels, they have USB-C ports on them, so you can just plug your phone right into the panel.
01:48:21.000 And it is a battery that you can use.
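The setup described above checks out on the back of an envelope. The 1,000 Wh generator and 200 W panels are the capacities mentioned on the show; the panel derating factor and the air conditioner's wattage are assumptions for illustration:

```python
# Back-of-envelope solar generator math. Capacities are as stated on the
# show; DERATE and AC_W are assumed values, not measurements.
BATTERY_WH = 1000   # ~1 kWh portable solar generator
PANEL_W = 200       # rated output per panel
DERATE = 0.75       # assume ~75% of rated panel output in real-world sun

def charge_hours(n_panels):
    """Hours to charge the battery from empty with n panels in full sun."""
    return BATTERY_WH / (n_panels * PANEL_W * DERATE)

print(f"two panels: ~{charge_hours(2):.1f} h to full charge")

# How long could one full battery run a window air conditioner?
AC_W = 1300  # assumed draw for a mid-size window unit
runtime_min = BATTERY_WH / AC_W * 60
print(f"air conditioner runtime: ~{runtime_min:.0f} min")
```

With the assumed 1,300 W window unit, one full battery gives roughly 46 minutes of runtime, which lines up with the "about 45 minutes" anecdote from the studio outage.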
01:48:23.000 You said move out of the country, but where would you move to?
01:48:26.000 I don't know.
01:48:26.000 I'm thinking about it.
01:48:28.000 If it gets really bad, then I might consider it.
01:48:30.000 What's on your short list?
01:48:33.000 I really liked Russia when I went there.
01:48:35.000 El Salvador.
01:48:37.000 Bitcoin.
01:48:38.000 Yeah.
01:48:39.000 Crime's dropping.
01:48:40.000 The standard of living is increasing.
01:48:41.000 It's sounding pretty good.
01:48:43.000 Max Keiser and Stacey Herbert are like superstars there right now.
01:48:46.000 That's right.
01:48:47.000 So if I had to choose any other country, I'd just choose that.
01:48:48.000 I'd hit up Max and Stacey and be like, what's going on?
01:48:51.000 Bitcoin, yo!
01:48:51.000 I'm your new neighbor.
01:48:52.000 Yeah, Bitcoin.
01:48:54.000 What about you?
01:48:55.000 I don't know.
01:48:58.000 Serge and I, I think, both have a couple different citizenships, and so I grew up with this idea that if one of the countries was really bad, I could jump to the next one.
01:49:06.000 But are they both woke?
01:49:07.000 Well, I'm British Canadian and American, so it's not looking good on all fronts.
01:49:13.000 I'm not totally sure where I'll go after this, but that idea that, like, you could leave or, like, that this country wasn't forever is, like, definitely something I grew up with, you know?
01:49:22.000 Because that's why my parents left the countries they were in.
01:49:24.000 Yeah.
01:49:25.000 All right.
01:49:25.000 RadioactiveRat says, Tim, for the reverse shark tank, I work for a tiny animal shelter in Northern California, one of the poorest counties in the state.
01:50:33.000 As a rep of the shelter, could I request specific donations we desperately need?
01:49:37.000 Right now, I don't, I mean, I don't know what to say.
01:49:40.000 The idea I had was, people are ragging on Mr. Beast.
01:49:42.000 They were calling it demonic.
01:49:44.000 Yeah, literally.
01:49:44.000 What he did, and I'm like... It's insane.
01:49:46.000 We can do MILF Manor, or we can do Mr. Beast Cures Blindness.
01:49:51.000 I'll take Mr. Beast any day of the week!
01:49:53.000 Yeah.
01:49:54.000 So I said on the segment, we should do a reverse Shark Tank.
01:50:00.000 Reverse... Altruistic Shark Tank?
01:50:02.000 Is that what it's called?
01:50:03.000 The Shark Tank?
01:50:04.000 What's the show?
01:50:05.000 Dragon's Den and Shark Tank.
01:50:07.000 Dragon's Den's the British version of it.
01:50:08.000 Reverse Shark Tank, where it's a bunch of philanthropists, and people go in to express
01:50:14.000 why they need charitable donations.
01:50:15.000 I think we'd want to refrain from people who are like, my dad's dying of kidney failure needs- because that's like
01:50:20.000 really horrible to make someone come and beg to save someone's life.
01:50:23.000 But we want to keep it as a possibility.
01:50:25.000 I think it's more so like it's foundations and billionaires and nonprofits come in and talk about the work they're
01:50:30.000 doing.
01:50:30.000 So it's basically like I run a charity that saves animals.
01:50:34.000 Last year, we saved 3,000 animals by providing them emergency kidney dialysis, and they went on to live for five years with your contribution of X amount of dollars, and then the whales, because they're not sharks, because whales are nice, will determine how much they want to give, and it'll be very similar.
01:50:49.000 It'll be like, how much, what are your expenses every year?
01:50:52.000 How much of the money that I donate goes to the actual cause of helping these animals versus paying your administrative costs?
01:50:57.000 And then they'll say it's 20%, which is, you know, really, really, really, really great.
01:51:01.000 80% of the contributed money goes toward the cause, and I think that's the way to do it.
01:51:05.000 Yeah, because that show would function as the money source for people like Mr. Beast who don't have money.
01:51:11.000 It's basically just the same show, but with nonprofits.
01:51:13.000 They'd be like, I have a plan to cure a thousand people's blindness with this technology, these are the surgeries, and then the charity can fund the process.
01:51:19.000 And you can do it thematically, like one episode is all animal related, right?
01:51:21.000 And then you can contrast like how different services are helping this cause.
01:51:26.000 It's basically just the same show but with non-profits.
01:51:28.000 I mean Shark Tank could even do that and have an episode where it's charities that come
01:51:33.000 in and say, we run a charity that does these things.
01:51:35.000 We are looking for $500,000 to open a new building that will provide children with meals and a warm bed, blah, blah, blah.
01:51:42.000 It's impact investment.
01:51:43.000 It is the dawn of the impact investment age.
01:51:45.000 I think it's a great idea.
01:51:47.000 Great idea.
01:51:47.000 And then, I was saying, we should have a marketing movement where companies use their marketing budgets to compete for doing the best thing.
01:51:55.000 So I will pledge this.
01:51:57.000 Depending on how much revenue we generate this year, we will not be buying Times Square billboards.
01:52:03.000 Instead, we will use our marketing budget to, I don't know, make a video where we cure blindness.
01:52:09.000 Just rip off Mr. Beast, because if everyone started doing that, we'd have no blind people, or at least none with cataracts.
01:52:14.000 But we can do something like that.
01:52:15.000 We could use our marketing budget to do something really cool.
01:52:20.000 Impactful, yeah.
01:52:21.000 Yeah.
01:52:21.000 Helping injured athletes, you know, PRP, platelet-rich plasma, things like that.
01:52:25.000 All kinds of things.
01:52:25.000 I mean, that's expensive.
01:52:27.000 There's probably people more in need we could help and problems we could solve.
01:52:31.000 Well, and there are probably people you've had on the show who would partner with you to do stuff, right?
01:52:34.000 Better idea.
01:52:35.000 We buy 500,000 lottery tickets.
01:52:41.000 Uh-oh.
01:52:41.000 Yeah, and give them out.
01:52:42.000 And somebody's bound to win the lottery, you know?
01:52:45.000 And then use that money towards whatever?
01:52:47.000 No, no, they can do whatever they want, you know?
01:52:48.000 It's like, we just give out lottery tickets.
01:52:49.000 That's our philanthropy.
01:52:51.000 We're a non-profit that gives homeless people lottery tickets.
01:52:54.000 Ooh, fingers crossed.
01:52:56.000 No, but I think we'll do that.
01:52:59.000 I think instead of buying billboards, we will produce a video akin to Mr. Beast's charitable giving.
01:53:05.000 Because someone said, put your money where your mouth is, Tim.
01:53:08.000 If Mr. Beast is doing it and you're telling people they should, you should.
01:53:10.000 And I'm like, okay.
01:53:11.000 Yeah, right.
01:53:11.000 Let's figure it out.
01:53:12.000 We'll figure out what to do.
01:53:14.000 The thing about the blindness thing as well, Ian, is that it's really cheap and the surgery takes like 10 minutes.
01:53:20.000 It's not really that expensive.
01:53:22.000 Obviously it's expensive, a couple grand, but some people don't have a couple grand and it takes like 10 minutes.
01:53:26.000 So that's why they chose that to be the thing they would do for a lot of people.
01:53:29.000 Man, let's kick it into overdrive.
01:53:31.000 If it's that easy to do, and people just need a little kickstart, yes.
01:53:34.000 All right, Cobe Johnson says, Tim, long-time fan since the old studio days.
01:53:39.000 You should get Peter Zeihan on one evening.
01:53:42.000 Get great insight into geopolitics and the end of globalization.
01:53:46.000 Ian could have a field day talking about graphene.
01:53:48.000 Look him up.
01:53:48.000 Maybe we should reach out to him.
01:53:49.000 He was on Rogan a couple weeks ago, three weeks ago.
01:53:51.000 Oh, cool, cool, cool.
01:53:52.000 On Rogan.
01:53:53.000 That's so funny.
01:53:54.000 He was right on top of him.
01:53:55.000 They were running down the street.
01:53:56.000 Just making a move.
01:53:58.000 Amenthy says, Bill in The Last of Us was gay in the game, but it wasn't a focus.
01:54:02.000 And his relationship with Frank was strained.
01:54:05.000 Bill also doesn't die in the game.
01:54:06.000 I like Offerman in the role, but I hate these changes.
01:54:08.000 Yeah, I think in the game what happened, like Frank got bit, and then Bill had to kill him or something?
01:54:13.000 No idea.
01:54:13.000 Something like that.
01:54:15.000 All right.
01:54:16.000 Merle Gray says, the scenes y'all described with the dad and The Notebook get the emotion of love across because it involves a sacrifice, just like when Jesus Christ died on the cross.
01:54:27.000 I mean, that's kind of what I was saying. Like, I don't, you know, seeing two grown adult men, I don't feel anything over that loss because I view them both as self-sufficient, capable men.
01:54:41.000 And it's like... It also seems cheesy and corny.
01:54:46.000 Forced.
01:54:46.000 Yeah, forced.
01:54:47.000 Well, and I think there's nothing noble in this.
01:54:49.000 Yes, there's nothing noble, yeah.
01:54:50.000 There's nothing that we're like, man, if I were in that circumstance, I hope that I have the character to, like, he's letting someone who's apparently dying die, and then he's also killing himself so he doesn't have to deal with the post-apocalyptic world.
01:55:03.000 And the political motive behind it, you know, it's overtly woke.
01:55:07.000 Yeah, that turned people off, probably.
01:55:09.000 There's no actual love building in it.
01:55:13.000 There's nothing I see that actually shows a relationship of love.
01:55:17.000 Maybe it's because they're both men, or maybe it's because they just didn't write it well enough.
01:55:20.000 That's the problem with a lot of modern art, is they tell you that they're in love, and then you're supposed to have feelings, but they don't actually play it out over the course of the process.
01:55:27.000 I'll put it this way.
01:55:29.000 Episode 2, Tess sacrifices herself to save them, and there's this really disgusting scene where the zombie, the fungus is coming out of its mouth, and then it kisses her and it's going down her throat or whatever.
01:55:40.000 But like, I think the actress's name is Anna Torv.
01:55:43.000 Her emotion and her acting in the, I'm gonna sacrifice myself for you and she's knocking over like fuel cans or whatever.
01:55:49.000 It's like, you can feel it.
01:55:51.000 And then it's really sad when it blows up, and then Joel is like, the one thing he had in this destroyed world is now dead.
01:55:57.000 But, like, this episode with these two guys, there's, like, literally nothing there.
01:56:02.000 It shows them arguing with each other.
01:56:03.000 Were they both healthy?
01:56:05.000 Not in the end.
01:56:05.000 In the end, one guy was sick and dying.
01:56:07.000 He's, like, in a wheelchair or something.
01:56:09.000 How was Bella Ramsey in that show?
01:56:12.000 She's cool, yeah.
01:56:12.000 Is she good?
01:56:13.000 I think the show's really good.
01:56:14.000 Look, I think that episode was really, really good.
01:56:17.000 I just, I don't connect with that emotionally.
01:56:20.000 I just can't understand it.
01:56:21.000 I'm not trying to be a dick, it's just like, if it was a woman and a man, there would be a dynamic there I would relate to or understand.
01:56:28.000 And maybe that's just me, but I will say this, that movie Bros, where it's like a gay rom-com, flopped.
01:56:34.000 Billy Eichner movie.
01:56:34.000 If you're going to do a gay romance thing, The Last of Us Episode 3 is how you would do it.
01:56:40.000 Like, the guy, the Prepper guy isn't actually gay.
01:56:43.000 He's been alone for three years.
01:56:45.000 Humanity's been wiped out.
01:56:47.000 And then when this guy shows up, I guess, who is gay, he's like, I've only ever been with a girl or whatever.
01:56:51.000 And so it's like, it's a weird dynamic, I guess.
01:56:53.000 It's interesting.
01:56:54.000 But it's like, the Bros movie was just weird, lewd, kind of over-the-top gross.
01:57:01.000 I think people are just seeing it too much.
01:57:03.000 It's overexposed, right?
01:57:04.000 Because I think there was that Disney movie with the gay character or something that was supposed to be, you know, like a big hit, but then it completely flopped at the box office everywhere.
01:57:16.000 So I think people are just fed up with the, you know, overt political messages in culture and in Hollywood.
01:57:23.000 I think the issue is it's just relatable to a very, very, very small portion of the population.
01:57:29.000 Yes, yes.
01:57:30.000 If it was his brother in this and it's like, you know, one day he sees his trap is sprung and he looks out and he sees his brother's there and he's like, oh my god, John, where have you been?
01:57:40.000 And then it's like he's smiling and he's like, I've been alone for three years and you made it back.
01:57:44.000 I thought you died.
01:57:45.000 And then in the end, his brother's dying.
01:57:47.000 Like, I would relate to that and be like, oh man, that's so sad, you know?
01:57:49.000 It sounds like a gay fantasy.
01:57:53.000 If they wrote it like, gay guy meets straight guy, but straight guy falls in love with him.
01:57:58.000 Now he's gay too.
01:57:59.000 That's what I think too.
01:58:00.000 Yeah.
01:58:00.000 Anyway, anyway, let's read some more.
01:58:03.000 All right.
01:58:03.000 Curtis says, when Putin said something to the effect of, America is full of Satanists, I couldn't disagree knowing that porn is one of America's top exports.
01:58:13.000 A lot of people feel that way.
01:58:14.000 Yeah.
01:58:15.000 Yikes.
01:58:17.000 All right, we'll grab some more Super Chats.
01:58:19.000 Rod Undefined Rod says, Tim, the gun channels have been destroyed by YouTube this past week.
01:52:23.000 They even struck Ian McCollum for a suppressor video, misspelled deliberately.
01:58:30.000 We're planning on having some people on talk about this.
01:52:33.000 So, yeah, we'll see.
01:58:36.000 I don't want to say too much because we're in the process of booking everything.
01:58:38.000 I don't know when it's going to happen, but they're tweeting about it.
01:58:40.000 I just went to YouTube.
01:52:41.000 Tim Pool calls out Hasan Abi.
01:58:44.000 What is this?
01:58:45.000 Calls out?
01:58:45.000 I never called him out.
01:52:47.000 Tim Pool calls out Hasan Abi.
01:58:49.000 It's Hassan.
01:58:49.000 It's on Hassan's channel.
01:58:50.000 What?
01:58:51.000 I didn't agree with him.
01:58:51.000 Hassan, come on, man.
01:58:52.000 Wait, wait, wait.
01:58:54.000 Hassan wrote that?
01:58:54.000 Yeah, I think it's on his show.
01:58:55.000 Calls out.
01:58:56.000 I literally agree with him like four or five times.
01:58:58.000 I say he's right.
01:58:58.000 Yeah, but if you say agree with, you're not going to get as many clicks.
01:59:01.000 Come on, Tim.
01:59:02.000 He says I call him out.
01:59:04.000 Tim Pool calls out Hasan Abi, yeah.
01:59:06.000 Shout out, maybe.
01:59:07.000 I was like, Hasan complaining about how some YouTube guy's gotta make this video.
01:59:11.000 He's right.
01:59:12.000 Controversy gets clicks.
01:59:13.000 That's a good point, that if the title was Tim Pool's Pretty Cool, it might get more views.
01:59:20.000 I disagree.
01:59:21.000 I think at this point, if Hasan wrote Tim Pool actually agrees with Hasan over Mr. Beast, people would click it to be like, yo, what's this about?
01:59:28.000 And the word agree is really big.
01:59:31.000 It was the craziest thing when Hasan's like, this fills me with rage that it's up to some YouTube guy to give these people this 10-minute procedure, and I'm like, he's right.
01:59:42.000 Why is our society where we need entertainment in the form of helping people with this 10-minute procedure, why can't we figure out a way to solve the problems that are happening in this country?
01:59:51.000 So I don't know if I agree with him on the solution to these problems, but he's right to call it out as it is.
01:59:57.000 Look, I will take Mr. Beast over Milf Manor any day.
02:00:00.000 But I think we can do better.
02:00:02.000 And as long as we're giving $100 billion to Ukraine, someone pointed out, let's just give $2.5 million to every homeless veteran.
02:00:09.000 Oh, yeah, someone said, sure.
02:00:10.000 How about we give them $500,000?
02:00:12.000 Mr. Beast cured 1,000 people's blindness.
02:00:16.000 He could have bought an M1 Abrams for the Ukrainians.
02:00:19.000 What was he thinking?
02:00:21.000 That's a good one.
02:00:22.000 Yeah, aid to Ukraine isn't going to stop anytime soon.
02:00:25.000 No, that's for sure.
02:00:27.000 I gotta look at that video.
02:00:28.000 Yeah, I'm gonna watch it later.
02:00:29.000 I'll send it to you.
02:00:30.000 I think I said, like, I probably disagree with him politically on, like, how we solve it, but he's completely right about that being an issue.
02:00:36.000 And people are ragging on him.
02:00:37.000 It's like, he's not mad at Mr. Beast.
02:00:38.000 Yeah.
02:00:39.000 He's mad at, like, the system, and I agree.
02:00:41.000 He's gotten under your skin.
02:00:43.000 That's the magic of his marketing and titling here.
02:00:45.000 I'm pissed that we gave $100 billion to Ukraine before fixing our bridges, our roads, our pipes, our schools.
02:00:50.000 You got kids drinking lead and all this other garbage.
02:00:53.000 And then these leftists come out and they're like, can we fix these pipes?
02:00:56.000 And everyone's like, yes.
02:00:58.000 Yes, actually, we agree with fixing the infrastructure in this country.
02:01:01.000 Instead, for some reason, our politicians just go blow up kids overseas.
02:01:04.000 Yeah, but now the leftists are saying, we can do both.
02:01:07.000 We can fix the lead pipes and we can aid Ukraine.
02:01:11.000 Just print more money.
02:01:11.000 I don't know, maybe that's why he's saying, I didn't criticize him, but maybe he's criticizing me because I said we shouldn't be funding the war in Ukraine and we should be funding... I'm telling you, this title has really gotten you.
02:01:21.000 He's won.
02:01:22.000 Yeah, I saw this meme today saying, remember when they were saying Donald Trump was going to cause World War III, and now they're like, yeah, let's start World War III.
02:01:30.000 Yeah.
02:01:30.000 All right, everybody, if you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, become a member over at TimCast.com so you can check out our uncensored members-only show that's coming up in about one hour.
02:01:42.000 We record it right after we wrap here, and then we upload it, and it's not so family-friendly.
02:01:46.000 You can follow the show at TimCastIRL everywhere.
02:01:49.000 You can follow at TimCastNews on Twitter.
02:01:51.000 Do it.
02:01:51.000 You can follow me personally everywhere at TimCast.
02:01:54.000 Sameera, do you want to shout anything out?
02:01:56.000 Yes, follow me on Twitter, at Sameera Khan.
02:01:59.000 S-A-M-E-E-R-A-K-H-A-N.
02:02:01.000 And tune in for the after show.
02:02:04.000 Right on.
02:02:05.000 I'm Hannah Clare.
02:02:06.000 I'm a writer for TimCast.com.
02:02:08.000 You should go to TimCast.com and click on the read tab to see articles from me and the rest of our team.
02:02:12.000 You can follow me personally on Instagram at hannahclare.b.
02:02:15.000 You can follow me on Twitter at hcbrimlow.
02:02:18.000 And you should definitely follow at TimCastNews on Twitter.
02:02:22.000 It's excellent.
02:02:23.000 Go there immediately.
02:02:24.000 I'm Ian Crossland.
02:02:25.000 Follow me on the internet anywhere you can find Ian Crossland.
02:02:27.000 I'm probably that guy.
02:02:28.000 And God, prayers to the people of Dnipro, up and down the river, and Kherson.
02:02:33.000 It's rough... I wish the best for all those people, everyone there.
02:02:37.000 Healthy, calm.
02:02:39.000 Let's make it happen.
02:02:42.000 And yeah, you can find me anywhere at SergeDotCom.
02:02:45.000 I took a poll to see if you guys would like to see my music in the future.
02:02:48.000 We'll see how that poll goes and I'll abide by that.
02:02:50.000 But yeah, it was a good show.
02:02:52.000 Thanks, guys.
02:02:52.000 See you in the after.
02:02:53.000 We will see all of you over at TimCast.com.
02:02:56.000 Thanks for hanging out.