Timcast IRL - Tim Pool - March 05, 2022


Timcast IRL - Russia Bans Facebook, James Lindsay Discusses WEF And CRT


Episode Stats

Length

2 hours and 5 minutes

Words per Minute

212.71349

Word Count

26,653

Sentence Count

1,986

Misogynist Sentences

51

Hate Speech Sentences

63


Summary

In this episode, we talk about the latest in the Ukraine crisis, Russia blocking access to Facebook and Twitter, and the Freedom Convoy making its way. We also hear from James Lindsay, author of Race Marxism, on the World Economic Forum and critical race theory, and from Seamus of Freedom Toons.


Transcript

00:00:00.000 It's been nothing but Ukraine war, Russia war, Ukraine, man it's crazy.
00:00:15.000 There's other stuff going on in the world, but this one does seem to be the most pressing.
00:00:19.000 We do have the Freedom Convoy making its way, and we do have a bunch of other crazy stories, too, but, uh... Man, I've been looking back at all the news, and it's just endless talk of what's going on in Ukraine because, obviously, people keep saying World War III.
00:00:31.000 The latest, Russia has apparently blocked access to Facebook.
00:00:34.000 So it's like, as all these companies say, we're going to censor you or sanction you.
00:00:39.000 Russia is just like, okay, get out.
00:00:40.000 We're done with this.
00:00:41.000 But it does feel like Russia is becoming increasingly more isolated because even China seems to be backing off a little bit.
00:00:47.000 Brazil and India are kind of like, yo, we're neutral on this one.
00:00:49.000 So I don't know if Russia will be able to hold out for much longer, but I do think they may end up winning at least their objectives here.
00:00:57.000 Now, Donald Trump has come out and said he told Putin and Xi he would nuke them if they went after Ukraine or Taiwan, and so basically he's confirming the story, which is crazy.
00:01:05.000 And then we've got a bunch of other crazy stuff.
00:01:08.000 One thing I really want to talk about tonight, there's this ad I got on Twitter for an artificial intelligence, like, girlfriend or friend.
00:01:16.000 It doesn't say girlfriend, it says friend.
00:01:18.000 And it's just really creepy.
00:01:20.000 So I want to remind everybody, don't date robots!
00:01:24.000 Yeah.
00:01:25.000 He's like, yes!
00:01:25.000 Hey!
00:01:25.000 Those things.
00:01:26.000 about the World Economic Forum, Critical Race Theory with none other than the
00:01:30.000 foremost expert on critical race theory, author of Race Marxism, James Lindsay.
00:01:36.000 Yeah, he's like yes!
00:01:38.000 The World Economic Forum.
00:01:40.000 Yeah, yes.
00:01:42.000 What are you doing?
00:01:43.000 Who are you?
00:01:43.000 What's going on?
00:01:44.000 I'm somebody.
00:01:45.000 You're a sword fighter.
00:01:46.000 I am a sword fighter, a little bit.
00:01:48.000 I mean, we were sword fighting a little bit, not like that.
00:01:49.000 James disarmed me with a wakizashi.
00:01:54.000 Oh, cool.
00:01:56.000 That's right.
00:01:57.000 So, you know, keeping up with things, traveling a lot, talking a lot, getting around the country, being the world's, I guess, foremost expert in critical race theory, as I've been billed.
00:02:06.000 Smeagol must take names that he's given.
00:02:08.000 Well, it makes the show seem more prestigious, you know?
00:02:10.000 And I'm like, this guy's the best.
00:02:12.000 A white guy is the foremost expert in CRT?
00:02:15.000 How offensive.
00:02:16.000 How offensive.
00:02:17.000 I'm a non-practicing white, though.
00:02:18.000 It's okay.
00:02:20.000 I don't believe in practicing my race.
00:02:22.000 Go figure.
00:02:22.000 He's a lapsed white.
00:02:25.000 So are you, like, baptized white and then just stop doing it?
00:02:28.000 Or fall away in your teens?
00:02:30.000 Yeah, I think that's how that works.
00:02:32.000 Raised white for a few years, but then, you know, left the race.
00:02:35.000 Yeah.
00:02:35.000 Like, what's her face?
00:02:37.000 Rachel Dolezal.
00:02:38.000 Yeah, but no, she tried to adopt another race.
00:02:41.000 I just checked right out of the whole system.
00:02:43.000 I started clicking other on all the boxes.
00:02:46.000 Yeah, you just gotta check them out.
00:02:48.000 So we'll be talking to James about a lot of stuff.
00:02:50.000 We've got Seamus of Freedom Toons.
00:02:52.000 Seamus of Freedom Toons.
00:02:53.000 Yeah, I make cartoons.
00:02:54.000 I have a YouTube channel called Freedom Toons if y'all want to check that out.
00:02:57.000 We released a cartoon yesterday on Joe Biden's State of the Union, and we're working on some toons next week about the military-industrial complex as well as the woke military training. So I think you guys will like that. Go there, subscribe, stay tuned. And also really excited for tonight's show, Ian Crosland over here, nothing too deep to report yet, but I am on board and reporting for duty.
00:03:17.000 I am stoked as well, because James knows how to talk and he knows what he's talking about.
00:03:20.000 This is gonna be a great conversation tonight.
00:03:22.000 We love having James.
00:03:23.000 Yeah, I'm kind of tired, so I'm just gonna be like, James, just tell us everything.
00:03:26.000 And I'm gonna sit back here and get my phone.
00:03:28.000 I've been playing Lemmings recently.
00:03:29.000 Dude, Lemmings is... Are you kidding me?
00:03:32.000 That's one of my favorite games.
00:03:33.000 Yeah, it was amazing.
00:03:33.000 That game was amazing.
00:03:34.000 Like, are you playing an emulated version?
00:03:36.000 No, the mobile version's weak.
00:03:38.000 Yeah, it's like you can make stairs.
00:03:40.000 Dude, we have a Sega Genesis downstairs.
00:03:43.000 I'll get one of the cartridges.
00:03:44.000 You guys need to mod that and get the NPC faces on the Lemmings.
00:03:47.000 Oh, that's cool.
00:03:48.000 That's amazing!
00:03:49.000 No, for real, I could do that in 20 minutes.
00:03:51.000 They support all the things, so just falling off the cliff.
00:03:55.000 That's a brilliant idea.
00:03:56.000 We're not paying you for that, but we will use it.
00:03:58.000 We'll totally take it.
00:03:59.000 We should make a game where it's kind of like Lemmings, but it's NPCs.
00:04:05.000 So for those who don't know what Lemmings is, 30 little lemmings will walk around and walk back and forth, and you can assign tasks to them.
00:04:12.000 So dig a hole, or block someone, or shoot rope across a gap or something.
00:04:17.000 Paint Black Lives Matter on a street.
00:04:20.000 Unattended, they just keep walking until they die.
00:04:22.000 You gotta try and guide them to the right place.
00:04:24.000 Exactly, so they just walk and the idea is you have to get them from the entrance to the exit of the level with as few as possible dying.
00:04:30.000 It's a great game.
00:04:31.000 It's really a great game.
00:04:31.000 The mobile version's really weak because in the original game there was like 20 different jobs you could give them, and in the mobile version there's like four.
00:04:39.000 Oh, yeah.
00:04:39.000 I mean, there's actually maybe like seven or eight.
00:04:41.000 Yeah, it's very strategic because you have a limited number of jobs you can assign, so like on some levels there can only be like three climbers or two diggers.
00:04:50.000 It's awesome.
00:04:51.000 Lemmings is a fun game, we should make it, NPCs.
00:04:52.000 Alright, before we get started talking about, you know, more serious stuff, but probably not, go to TimCast.com, become a member, help support the work of all of our journalists as well as everyone on this show.
00:05:01.000 We are principally supported by website memberships.
00:05:04.000 You know, we started this site a year ago, just over a year ago, and it was funny because for the longest time I was like, We rely too heavily on YouTube, on ads, and that is a huge weakness, especially with activists going crazy.
00:05:16.000 And so we set up the website, which has become our main way that we fund and maintain this operation, which gives us a lot of leeway and provides us that safety net.
00:05:26.000 And it also allowed us to create an editorial department where we write articles and do our own sourcing and our own fact-checking and our own original reporting.
00:05:32.000 So that's all thanks to you.
00:05:33.000 Plus, we've released books and a new show called Pop Culture Crisis.
00:05:36.000 So if you want to support our work and check out members-only shows from the Timcast IRL podcast, go to timcast.com.
00:05:42.000 But don't forget to smash the like button, subscribe to this channel, share the show with your friends.
00:05:46.000 Yo, let's talk about this year's censorship.
00:05:48.000 We have this story from The Guardian.
00:05:49.000 Russia blocks access to Facebook and Twitter.
00:05:52.000 Oh man.
00:05:53.000 Good.
00:05:53.000 The move to block Facebook and Twitter comes as the government passed a bill that criminalizes fake reports about the war.
00:05:59.000 You know what's funny?
00:06:01.000 I'm not a fan of censorship for the most part.
00:06:03.000 You know, I think it's, obviously there's nuance here.
00:06:05.000 Censoring some stuff is important and good.
00:06:07.000 Like we talk about child abuse, criminal acts and stuff.
00:06:10.000 That's where the censors are really supposed to be like, okay, that is crime, like against children and humanity.
00:06:14.000 Get rid of it.
00:06:15.000 But when I see Russia banning Facebook and Twitter, I feel this kind of catharsis of like, I wish, I wish, just get rid of it!
00:06:24.000 I know, I hear you.
00:06:25.000 I hate it so much.
00:06:26.000 But it is, truth be told, Twitter and Facebook are still pretty good, despite all of the bad.
00:06:31.000 And therein lies the real challenge, the ability for regular people to actually have a voice, even if they do face censorship.
00:06:37.000 And that's why we resist the censorship here, because the power granted to the people by these platforms was good.
00:06:43.000 And now they're trying to take it back because they realize they made a mistake.
00:06:46.000 Well, I'm generally against censorship.
00:06:49.000 I generally agree with what you just said.
00:06:52.000 The only really kind of interesting thing, I don't know that it's even directly connected to this, the only really interesting thing that I think, I tweeted this this morning, I think people got really mad about it, is that one of the utilities of Facebook in particular, more than Twitter, but Twitter has this too, is if you have some controversial narrative thing going on, some controversial activity, whether it's Ukraine, whether it's a virus, whatever it is, one epistemological tool, to sound very philosophical, that's emerged in the past few years is: whatever Facebook will ban is probably true.
00:07:25.000 So if you want to find out what's going on with Ukraine or with a virus or with the World Economic Forum or whatever it is...
00:07:33.000 Or, say, critical race theory, you probably need to just post some edgy stuff on Facebook and see what they ban.
00:07:38.000 You've got to add a little bit of that accent to your impersonation.
00:07:42.000 Like it talks like this.
00:07:44.000 Well, I don't have any marbles in my mouth.
00:07:45.000 I just drank an energy drink for weeks.
00:07:47.000 Yeah, do we have marbles?
00:07:49.000 You gotta stuff your cheek with like toilet paper or marbles.
00:07:51.000 Yeah, I'm very excited.
00:07:52.000 World Economic Forum!
00:07:54.000 I hear what you're saying about the utility of Facebook in that sense.
00:07:56.000 It's also true of the fact-checkers that they're in bed with.
00:07:59.000 I found them incredibly useful in the past.
00:08:01.000 I remember there was one instance about two years ago where I was doing research on a video about the scandal with Biden's son in Ukraine and the investigator being fired by Joe Biden effectively because he said he would withhold aid to their country.
00:08:15.000 And one of the fact checkers that I found, and I was looking up, I wanted to look up debunkings of this to see how strong the argument against the potential for foul play was.
00:08:24.000 And the response was, well, it's true that Joe Biden told them to withhold the funding for Ukraine unless they fired this prosecutor, but that was the official policy of the Obama administration in general.
00:08:35.000 It's like, oh, right, because Biden had nothing to do, the vice president of the administration had nothing to do with the policy of the Obama administration.
00:08:42.000 No, no, no, no, but think about that for a few seconds.
00:08:44.000 Yes, Joe Biden did try to push an illegal quid pro quo, but Obama told him to do it.
00:08:48.000 Yeah.
00:08:49.000 So it's like, okay, well, if Pence had done this, then it would have been okay.
00:08:53.000 Like, cause Trump wanted it to happen.
00:08:54.000 It's just hilarious because in these fact checks, they will always try to use their best argument against something, which in a number of cases is actually true and they reveal their hand in doing so.
00:09:03.000 And so there was a lot of utility there.
00:09:05.000 Mostly false.
00:09:06.000 Context missing. Yeah. We don't want you to believe it.
00:09:09.000 Mostly false is like my favorite. Because there was one instance where on
00:09:13.000 PolitiFact there were two quotes one was from Trump and one was from Bernie and
00:09:17.000 they were almost verbatim identical and Trump's was mostly false and Bernie's
00:09:21.000 was mostly true and it was something like Trump saying you know like
00:09:24.000 inner-city black youth have a 51% unemployment rate
00:09:27.000 And Bernie said, like, the same thing, but Bernie's was mostly true.
00:09:31.000 And they're like, well, Bernie does get this wrong.
00:09:33.000 It is.
00:09:34.000 While Trump gets most of this wrong, some of it is true.
00:09:36.000 Along the lines of, like, who is saying it defines if it's true or not.
00:09:39.000 That's very disturbing to me.
00:09:40.000 It's true.
00:09:41.000 There are other examples.
00:09:42.000 How do you quantify truth?
00:09:44.000 You know, when they say mostly peaceful protest, how do you quantify when it's most?
00:09:49.000 93% peaceful, 7% fire.
00:09:52.000 Isn't that hilarious?
00:09:55.000 That was the actual number.
00:09:57.000 And that's the thing.
00:09:58.000 Well, they used that number as if it was some kind of own.
00:10:01.000 Like only 7% of them are violent.
00:10:03.000 Are you insane?
00:10:04.000 This is a country of 330 million people and these protests are popping up everywhere.
00:10:09.000 Even if it is only 7%, that's insane.
00:10:11.000 It's like the weather.
00:10:12.000 Okay, so this is the misconception people have: when they say it's a 40% chance of rain, they're not saying flip a coin.
00:10:19.000 It might rain.
00:10:20.000 They're saying 40% of the day might be rain.
00:10:23.000 I don't know if that's true. That's what I've heard, like, people have done these internet arguments.
00:10:26.000 That's how you've got to view the 93 to 7% thing with BLM. They're not saying of the 100 protests, 93 were peaceful. They're saying of the 100 protests, in all of them, 7% of the time was extreme violence.
00:10:39.000 Which again conflicts with what I've cited on the show before, which is that 70% of major city police departments reported officers being injured during these protests, and you're telling me only 7% of them were violent?
00:10:51.000 I think in combat, or in war, most of the time it's not combat.
00:10:54.000 Then you have a burst of it, probably less than 7% of the time a soldier, even a combat soldier, is in combat.
00:11:01.000 7% is a lot of time for destruction to be going on.
00:11:05.000 Yeah, I mean, that means there's like a good half hour out of the protest.
00:11:09.000 Notably a violent protest.
00:11:10.000 Yes, smashing windows or burning down buildings.
00:11:13.000 Not just buildings, you have to burn down buildings specifically in a poor black neighborhood for Black Lives Matter.
00:11:20.000 Because insurance will take care of it.
00:11:21.000 Because insurance means anyone can destroy anything for any reason and it'll be fine.
00:11:25.000 We're talking about, like, if you have faith in the person, then you believe what they're saying, and if Trump says it, you're like, nah, he's lying.
00:11:32.000 I came across this internet video last night of, you know, Aileen Wuornos?
00:11:35.000 You guys know who this woman is?
00:11:35.000 She was like a hitchhiker, and she carried a gun and killed seven guys.
00:11:40.000 She was like, I had a gun, and if the guys attacked me, I would defend myself.
00:11:43.000 And the cops knew it, and they let me keep killing.
00:11:45.000 Eventually, after seven deaths in a year, they arrested her and put her to death.
00:11:48.000 Death row.
00:11:49.000 So, she gave this speech, she was taking all this meth, her skin's all ripped apart, and she's like, I had to do it!
00:11:53.000 I had to do it!
00:11:54.000 And you're like, ah, this evil woman, you know.
00:11:56.000 But then, they deepfake Christina Pazsitzky's face on it.
00:11:59.000 Christina P. Uh, the beautiful comedian.
00:12:02.000 And it's this beautiful woman, it's still Aileen's story, but it's with a beautiful face.
00:12:07.000 And you're like, I understand why you had a gun and why you were defending yourself.
00:12:11.000 Because when you see a beautiful person saying it, it has a different meaning than an ugly person.
00:12:16.000 It is terrifying to think that that is real.
00:12:19.000 It's on Instagram, this guy Brian Monarch, his page.
00:12:23.000 What does it have to do with what we're talking about?
00:12:24.000 It's a deep fake of like, you're saying Biden says something and everyone's like, yeah, and then Trump does it and they're like, oh, it's evil.
00:12:31.000 All of a sudden, but it's the same exact, and in this case, it's the exact Brian Monarch.
00:12:35.000 It's the exact same voice.
00:12:36.000 It's her with a deep fake image, and it has a different meaning.
00:12:39.000 It's very, well, and this is one of the huge problems, one of the many huge problems with Hollywood, but they're always casting extremely attractive people even to do completely reprehensible things.
00:12:48.000 Scroll down, it's from 25 weeks ago.
00:12:49.000 And so I think it makes those things seem more normal or likable to a lot of people.
00:12:53.000 25 weeks ago is a while.
00:12:54.000 Yeah, it's been up for a long time.
00:12:56.000 This is repressive tolerance, y'all.
00:12:57.000 Of course, I have to do the stupid philosophy thing, right?
00:13:00.000 So, 1965, leader of the New Left, Herbert Marcuse, writes Repressive Tolerance.
00:13:07.000 You'll see.
00:13:08.000 Well, this maybe you won't see, but when we do Trump and Biden, you will.
00:13:11.000 And you gotta see, this is the deepfake.
00:13:12.000 Yeah, yeah, yeah.
00:13:14.000 I mean, we'll talk about this.
00:13:15.000 I'm kind of confused as to what we're talking about.
00:13:18.000 Once you see the video, you'll know.
00:13:19.000 It's three minutes.
00:13:20.000 It's her, like, basically on death row, and they're giving her a final interview, and she's like, I've come to peace with it.
00:13:23.000 You know, you're all a bunch of this and that.
00:13:25.000 And it's just so bone-chilling to see that same sentiment with a beautiful face.
00:13:30.000 Because I was getting, like, empathy.
00:13:31.000 I was like, wow, maybe she shouldn't have been put to death.
00:13:34.000 Right, so this is what's going on.
00:13:36.000 There's been a very—actually, I hate to maybe pin this on the guy because I thought he was funny too, but I call this the Jon Stewart effect.
00:13:44.000 There's been a relentless campaign for a very long time to make conservatives bad, ugly, and stupid to the general population.
00:13:50.000 So Trump is bad, ugly, and stupid by this metric.
00:13:53.000 Bernie is a leftist, he's old, he's doing the best he can, he's got the right message, but this is repressive tolerance.
00:13:58.000 Herbert Marcuse, a leading leftist or Marxist thinker of the 1960s, writes an essay in 1965 called Repressive Tolerance.
00:14:05.000 The argument, and I kid you not, I don't exaggerate at all, the thesis statement of the argument is literally, movements from the left must be tolerated even when they're violent, movements from the right must not be extended tolerance at all.
00:14:17.000 And so this is why you see that disparity.
00:14:19.000 We live in Marcuse's world.
00:14:21.000 We live in the neo-Marxist architecture that this guy created.
00:14:26.000 He's very influential.
00:14:27.000 He wrote a book in 1964 called One-Dimensional Man, sold 300,000 copies in the first year, if I'm not mistaken.
00:14:32.000 This stat might be over its lifetime, but I think it was over the first year.
00:14:36.000 Very influential.
00:14:37.000 Widely credited as the most influential leftist thinker.
00:14:39.000 So he laid out an architecture where the leftist line of thought is, when we do it, it's good.
00:14:45.000 It must be tolerated, and anybody who doesn't tolerate us is a fascist.
00:14:49.000 If the right does it, it's bad.
00:14:51.000 Period.
00:14:52.000 It must not be tolerated.
00:14:53.000 I wasn't exaggerating.
00:14:54.000 If we pulled up the essay, you could actually find the quote that that is the thesis statement of the essay.
00:14:59.000 Movements from the left must be extended tolerance.
00:15:02.000 Movements from the right must not be extended tolerance.
00:15:04.000 To the point, he says, do you prevent right-wing people from even being able to form the thought?
00:15:10.000 In other words, he says, this is censorship and even pre-censorship, which I'm not even sure what pre-censorship is.
00:15:16.000 so that the thought can't even enter their head.
00:15:18.000 And this is the double standard that we run into where you see the mostly true, mostly false for Bernie versus Trump.
00:15:25.000 This is the double standard that the left has erected, and everybody tries to point out, oh, they're hypocrites, oh, they're hypocrites.
00:15:30.000 And finally, it's so exciting for me to see it.
00:15:32.000 Finally, people are catching on.
00:15:33.000 It's not hypocrisy, it's hierarchy.
00:15:35.000 They're actually asserting, we are better people than you, we are smarter people than you, we are more moral people than you, and we are saner people than you.
00:15:42.000 And therefore, you have to put up with all of our crap, and we're going to put up with literally none of your crap.
00:15:47.000 And that's why you see that dichotomy of mostly true, mostly false, when it's literally the same statement from Trump and Bernie.
00:15:54.000 This is why I get frustrated when I see Joe Rogan apologizing to these people.
00:16:00.000 Cannot apologize.
00:16:01.000 Because when he did, it only made things worse.
00:16:05.000 It made the story bigger, and it made the story more pronounced.
00:16:08.000 It made the story last longer.
00:16:10.000 And you know what I love is I actually really enjoy when they get triggered and they flip out.
00:16:15.000 I'm trending again.
00:16:17.000 So this is the craziest thing.
00:16:18.000 I was trending for like, I think I have been swatted more times than anyone else.
00:16:25.000 In like this short of a period, I have been trending more and I'm like, what am I even doing?
00:16:29.000 It's really simple, actually.
00:16:31.000 I'm not calling anybody any slurs.
00:16:32.000 I'm not insulting people.
00:16:34.000 But they're getting extremely angry at me because I don't care about them.
00:16:40.000 And so when I tweeted, I tweeted, I despise appeals to emotion in response to a translator who was crying, talking about Zelensky.
00:16:48.000 Then all of a sudden they come out and they try making it out to be that I'm insulting Zelensky.
00:16:52.000 And I wasn't. My response was, because Twitter put an editorial up on the What's Happening saying
00:16:57.000 podcaster Tim Pool blah blah blah blah, I said I drafted my formal apology to this tweet, wiped
00:17:03.000 my ass with it, and threw it in the trash. And they got really mad about that because
00:17:08.000 it's hierarchy, right?
00:17:10.000 That's right.
00:17:10.000 They're supposed to be able to tell me that what I did was wrong, and I just laughed in their faces.
00:17:15.000 Yeah.
00:17:15.000 And when you do that, it's the crying Wojak NPC meme, like... That's right, that's right.
00:17:20.000 They get so mad, they start swatting.
00:17:22.000 Rogan should have either ignored it or made a joke.
00:17:25.000 And then moved on.
00:17:26.000 The end.
00:17:27.000 Not to give the guy advice, but... My attitude is, insult them, and if they counter, insult them again, and if they counter, keep insulting them and laugh while you do it.
00:17:36.000 That's our Twitter.
00:17:37.000 Well, and to your point, I mean, this is almost perfectly summarized by AOC's quote that she will be chastised for being factually inaccurate when she's morally correct.
00:17:50.000 So even when they say something which is blatantly untrue, it's okay because their agenda is good and pure.
00:17:56.000 And to bring her into it again, and bring it back to fact-checkers, there was a hysterical example where she said that she was in the main Capitol building during the quote-unquote insurrection, and she was not.
00:18:11.000 No, she didn't say she was in the Capitol building.
00:18:14.000 No, she didn't.
00:18:15.000 No.
00:18:16.000 This story is so much worse than... I'm surprised the right never caught this.
00:18:19.000 Like, I actually briefly mentioned it to Ben Shapiro, and he was surprised.
00:18:23.000 He was like, wow, nobody caught that.
00:18:26.000 Alexandria Ocasio-Cortez claimed that while the insurrection was occurring, there was a bang on her door.
00:18:31.000 That's right.
00:18:32.000 And a guy... And she ran and hid in the bathroom.
00:18:34.000 And then she heard a voice, where is she?
00:18:37.000 Yes.
00:18:38.000 Where?
00:18:38.000 That's what she did on her video.
00:18:40.000 And then she was like, this is it.
00:18:41.000 She thought she was going to die.
00:18:43.000 So I actually saw that, and I started looking at the timestamps and seeing what was going on, and then I said, at this time she was talking about this, this time, there's people standing around in the hallways of the congressional building, no one's worried, what is she talking about?
00:18:55.000 I had a Huffington Post reporter tell me my timeline was wrong, and that I was actually showing video from the bomb threat.
00:19:01.000 And I responded to them, that's the time frame that AOC claimed the guy was knocking on her door, which would mean the guy knocked on her door well before anyone breached the Capitol building.
00:19:11.000 And this guy from the Huffington Post was like, oh yeah, you're right.
00:19:13.000 Hey, how about that?
00:19:14.000 AOC made the whole story up.
00:19:17.000 What it really was, was a cop knocked on her door because of the bomb scare.
00:19:20.000 She was not afraid for her life.
00:19:21.000 Nothing had even happened.
00:19:23.000 No one had breached any buildings.
00:19:24.000 Nothing had happened.
00:19:25.000 She made the whole thing up.
00:19:27.000 Yeah, well, there was, there was, um, when she said she was at, so she had this whole story and I can't remember who it was, pointed out that it wasn't true.
00:19:36.000 She wasn't at the Capitol building at the time.
00:19:38.000 And then the fact checker said.
00:19:39.000 She never said she was at the Capitol building.
00:19:41.000 Okay.
00:19:41.000 She never said she was at the Capitol building.
00:19:43.000 Interesting.
00:19:43.000 She said she was in a congressional office in a different building.
00:19:46.000 And this was significant because the fact checkers started saying all the conservatives were wrong because she wasn't in the Capitol building.
00:19:53.000 And then for me, I was like, I never said she was.
00:19:55.000 I watched her Instagram 45-minute video or whatever and said she fabricated the story and I laid out the timelines.
00:20:01.000 I'm surprised the conservatives didn't catch that she fabricated that story.
00:20:06.000 They were criticizing her saying, well, but you know, she wasn't in the Capitol, but still it's an absurd story.
00:20:13.000 And it was just like, dude.
00:20:13.000 It sounded like reading an article until you see the word you're looking for and then you stop reading.
00:20:17.000 As soon as you get confirmation of what you think you wanted to see, then you're like, I don't need to research any further.
00:20:23.000 They were saying that she was claiming she was scared, but she wasn't even in the Capitol building.
00:20:27.000 The fact-checkers came out and said, yes, but there are tunnels connecting the Capitol to the congressional offices, so it's reasonable that she would be scared.
00:20:35.000 The insurrectionists made it there.
00:20:36.000 And then my response was, no, it isn't.
00:20:38.000 Her story happened a full hour before anyone got near the Capitol building.
00:20:42.000 Unless AOC knew they were going to do it.
00:20:44.000 Hey, maybe the feds should question her as to how she knew they were going to be storming in the building.
00:20:49.000 Maybe.
00:20:49.000 That's right.
00:20:51.000 James, I want to follow up on what you're saying about Marcuse's, what was the phrase you call it?
00:20:55.000 Repressive tolerance.
00:20:56.000 Repressive tolerance.
00:20:57.000 Because I find that, I'm going to write this down actually, repressive tolerance.
00:21:01.000 When they try and, basically, it sounds like, not eradicate the right, but they're trying to preempt the right by making it so they don't even, whatever that means, the right, that they don't even have thoughts, they don't even think the thoughts. What I'm finding is, if you try and destroy part of a dichotomy, it's like a magnet: you have a north and a south, and if you break it in half, the new piece still has a north and a south. So if you eradicate the right, you're essentially creating a new left and a new right, and then you have to eradicate that right, which then creates a new left and a new right, which is ever smaller, and you're ripping society in half over and over again.
00:21:32.000 Yeah, so he actually defines what the left is, as the people who want to have a whole new society.
00:21:38.000 In other words, a revolution.
00:21:39.000 In other words, Marxists.
00:21:40.000 When does it end in hysteria?
00:21:41.000 Well, hold on there a minute, sir.
00:21:42.000 I would like a whole new society.
00:21:44.000 I would like the Constitution to be upheld.
00:21:48.000 I would like the Federal Reserve to be, you know, not.
00:21:53.000 I would like our system of governance to actually be representative.
00:21:57.000 I would say that I consider myself to be particularly revolutionary in that our system is broken in a large variety of ways, and I think most Trump supporters agree with that.
00:22:06.000 Yeah, that's because MAGA is class consciousness.
00:22:08.000 Right.
00:22:09.000 The only thing is, I don't think we should be like, I don't know, burning it down and killing people and smashing windows.
00:22:13.000 We should be like, you know, going through paperwork and being like, hey guys, I think this would work better.
00:22:18.000 And then we say, all right, let's give it, let's give it a shot.
00:22:19.000 Yeah, absolutely.
00:22:20.000 So yeah, MAGA is class consciousness.
00:22:23.000 Marx, it turns out, surprisingly, wasn't totally wrong.
00:22:27.000 Marx believed that when capitalism reached an advanced enough stage, when it reached a late enough stage, it would seize so much control and become so corrupt that eventually the working class would awaken and realize, hey wait, we are being screwed over by the power elite, and we need to seize the means of production, etc.
00:22:49.000 The problem was Marx thought what the working class would want is equity.
00:22:54.000 Everybody made equal.
00:22:55.000 Everything shared equitably.
00:22:57.000 What it wants is freedom.
00:22:58.000 The human spirit cries to freedom.
00:23:00.000 It does not cry to, let's share all of our crap.
00:23:03.000 It cries to freedom.
00:23:04.000 He was a really, he was not a smart man.
00:23:06.000 Marx?
00:23:06.000 Yeah, Marx.
00:23:07.000 No, not particularly.
00:23:08.000 He's also a very entitled, spoiled brat man.
00:23:12.000 He spent down his parents' money.
00:23:14.000 He spent down his wife's family's money.
00:23:15.000 He spent down Engels' money.
00:23:16.000 When Engels... Engels did not marry the, we'll say, love of his life.
00:23:20.000 I don't know.
00:23:20.000 They lived together 20 years of relationship.
00:23:22.000 She dies, right?
00:23:24.000 Marx writes a letter to Engels, who's his cash cow.
00:23:27.000 And in this letter, he writes a couple of sentences at the beginning, like, oh, that's sad.
00:23:31.000 I'm sorry to hear that.
00:23:32.000 And then he writes 30-something sentences about his financial problems and could you send a check to Engels when this love of his life died.
00:23:39.000 That's the letter he sends.
00:23:41.000 And so much so that Engels, who's basically been like his little lapdog on a leash this whole time, is like, oof, bro, not even.
00:23:49.000 By the way, here's your check.
00:23:50.000 Because he's a total cuck.
00:23:52.000 But he was not a good guy.
00:23:53.000 He was not a good guy.
00:23:54.000 But think about it.
00:23:55.000 Doesn't he represent the modern left in so many ways, like very well?
00:23:59.000 No, exactly.
00:23:59.000 He's just like them.
00:24:01.000 Exactly.
00:24:01.000 Or they are just like him.
00:24:03.000 Right.
00:24:04.000 They are the children of his entitled and ignorant and naive ideology.
00:24:09.000 That's right.
00:24:11.000 When you actually... when you think about the universe and what it takes to create and survive and to maintain, you have a very different worldview. But when you're born into luxury and you don't understand, like... I view it as somebody who understands physics: they understand the building blocks of reality, and physicists then tend to have a general understanding of how things are connected and what you need to make a certain thing work.
00:24:36.000 But then imagine you have like a fifth grade science teacher who doesn't know anything about physics because he's just got hired for the job and they put him in the science department and he doesn't understand what he's talking about.
00:24:45.000 He's like, I should be able to do this and it should work and it doesn't.
00:24:48.000 He doesn't understand the underlying principles that make a machine work.
00:24:51.000 When you have people who are born into luxury, they don't understand the base components of existence, of an economy, of hard work, of what made the economy good, family structure, for instance.
00:25:00.000 So they say, well, now that we're floating on top of this cloud, well above where all the worker bees are, you know, what do we want?
00:25:08.000 If you grow up, if you develop your mind without seeing the hard work required to maintain, then you will not advocate for its maintenance.
00:25:17.000 And that's what Marx is.
00:25:17.000 It's not hard to get food, it's just at the grocery store.
00:25:20.000 And that's exactly what people said to me when I criticized UBI.
00:25:24.000 When they were shutting down these stores from COVID, I actually had a guy tweet at me.
00:25:30.000 I said, the dairy farms are dumping the dairy because they can't get the processing plants to take it because the processing plants can't get the plastic cartons and the cartons aren't being made.
00:25:41.000 The whole economy is sludged up.
00:25:43.000 And then I said, so when this all shuts down, where do you get your milk from?
00:25:46.000 And I actually had a guy on Twitter say, what do you mean, the grocery store?
00:25:49.000 And I said, where does the grocery store get it from?
00:25:52.000 And he said, what are you talking about?
00:25:53.000 It's just there.
00:25:54.000 No, no, no, no.
00:25:55.000 That guy had to be messing with you, dude.
00:25:57.000 No, no, no, bro.
00:25:57.000 You see, this is, you gotta watch out for that, Ian.
00:26:00.000 I just can't even respond to people like that.
00:26:02.000 I feel like they're messing with me.
00:26:04.000 One of the biggest problems humans have is they assume, if I know it, so do you.
00:26:08.000 And that is most people.
00:26:11.000 So this guy's perspective is, I believe it's true that milk is just at the grocery store.
00:26:17.000 That's where you get it.
00:26:18.000 I'm not going to give any of these people the benefit of the doubt.
00:26:20.000 When they advocate for something like universal basic income, to an extreme degree, because I understand there's some things that we could probably discuss in terms of that, but to an extreme degree, where it's like, you know, social distribution of all funding, and when I try to explain it to them, their argument is quite literally, UBI works because the food is just sitting there, and if I had the money, I could have it.
00:26:39.000 And then I'm like, the money serves a purpose.
00:26:41.000 It lubricates the economy.
00:26:43.000 It is the medium of exchange so that energy can move from one place to another.
00:26:47.000 Without the system in place, it doesn't work, and they're just like, but milk is at the grocery store.
00:26:51.000 You said you think there's some value to a UBI or some instances it might work?
00:26:56.000 I don't want to be absolute on outright saying, you know, in general, UBI is impossible or whatever.
00:27:02.000 What I'm saying is for the most part, I believe it doesn't work, especially based on the way our society exists today.
00:27:07.000 Unemployment is kind of like UBI, except you have to not work to get it.
00:27:11.000 So it's like incentivizing people to quit their jobs or to get fired on accident or whatever.
00:27:16.000 So like that could be a form of UBI where you don't have to lose your job to get it.
00:27:20.000 I don't know about that, because by definition, if it's universal basic income, it has to go to everybody, including the employed.
00:27:26.000 I'm saying like, you know, someone might come with an argument about something I didn't consider.
00:27:31.000 That's why I try not to be completely absolute on it.
00:27:33.000 I think that would be ignorant.
00:27:34.000 Here's an interesting little aside, since we mentioned the World Economic Forum.
00:27:38.000 And I read Zee Klaus Schwab's books, right?
00:27:41.000 Zee Klaus Schwab.
00:27:53.000 It was pretty obvious.
00:27:55.000 Hold on, maybe he's right.
00:27:56.000 The only reason we don't trust him is because he seems like a Bond villain.
00:27:59.000 We were talking about this earlier.
00:28:00.000 Let's deepfake him.
00:28:01.000 Let's put a less Bond villain-y looking face on his head.
00:28:04.000 Is this real?
00:28:05.000 It's real, dude.
00:28:06.000 That's not real.
00:28:07.000 He's dressed like a Sith!
00:28:11.000 He looks like Darth Vader!
00:28:12.000 He looks like what Anakin was wearing when he turns into Darth Vader.
00:28:14.000 I think they're just getting bold.
00:28:15.000 Like Hillary Clinton dresses like a dictator and always has.
00:28:18.000 It's a weird thing.
00:28:19.000 She dresses like Kim Jong-un.
00:28:21.000 They either just don't understand how bad it looks or they're trying to test the waters, see what they can get away with.
00:28:26.000 Hold on, hold on.
00:28:27.000 I actually know the explanation for this outfit, but I don't want to tell people the explanation because it's funny to not know what it is.
00:28:34.000 It is literally, there's some weird university in Europe that gave him a doctorate and that instead of the normal doctorate robes, that's what they wear.
00:28:44.000 So it's like, wow, he's like a Sith.
00:28:46.000 It's academic regalia in like some Sith university of Lithuania or something like that.
00:28:53.000 You know what they learned at that university?
00:28:56.000 The Jedi wouldn't tell you.
00:28:58.000 Yeah, yeah.
00:28:59.000 So who is this guy?
00:29:00.000 Tell us about Klaus Schwab.
00:29:01.000 So Klaus Schwab is the chairman of the World Economic Forum.
00:29:04.000 The World Economic Forum was his brainchild back in 1971.
00:29:09.000 He comes up with this idea, the World Economic Forum.
00:29:10.000 The idea is to bring big corporate leaders together with government leaders, with NGO leaders, with other movers and shakers, I guess, like Greta Thunberg, and get them to rub elbows in massive fancy ski resorts and these kind of
00:29:26.000 future society meetings that for very many years were famously held in Davos.
00:29:31.000 I've been to Davos for the World Economic Forum.
00:29:33.000 Really?
00:29:33.000 Not actually at the forum, but in the peripheral events.
00:29:35.000 Oh, yeah. Well, that's him. And so he wrote this series of books. His first book, which I actually
00:29:41.000 have not yet read, I've only skimmed through it, I have it, is called Stakeholder Capitalism. So
00:29:46.000 his goal is to shift capitalism out of a shareholder model into a stakeholder model where there will be
00:29:51.000 these unelected technocrats, experts like Bill Gates, who are going to tell us what the right
00:29:55.000 virus policy or environmental policy or social policy or whatever.
00:29:59.000 Well, Bill Gates is a famous scientist, right?
00:30:01.000 Well, he's a famous computer guy.
00:30:02.000 Computer scientist.
00:30:03.000 He's not a scientist.
00:30:05.000 He's a computer guy.
00:30:06.000 A monopolist.
00:30:08.000 He's a famous medical doctor?
00:30:09.000 I don't think he's a medical doctor.
00:30:10.000 A virologist?
00:30:11.000 Not in my range of knowledge.
00:30:12.000 He works in medical in some way?
00:30:14.000 No, he talks about it a lot.
00:30:16.000 I think he works in medical in the sense of genetically engineering mosquitoes to give you herpes or something.
00:30:21.000 That's slander.
00:30:21.000 He doesn't really do that.
00:30:22.000 He does it to give you AIDS.
00:30:24.000 I'm just kidding.
00:30:27.000 What he's literally doing is genetically engineering, him and his foundation I believe, genetically engineered mosquitoes that are like sterile so they can't reproduce or something.
00:30:35.000 To eradicate malaria.
00:30:36.000 That was one of the projects.
00:30:37.000 He's actually talked, I don't know how far the research is, he's actually talked about modifying mosquitoes to get them to deliver vaccines as well.
00:30:44.000 That's terrifying.
00:30:45.000 Which is scary.
00:30:46.000 I know.
00:30:47.000 I think he said that, but he actually bought a bunch of stock in the mosquito spray companies, so you know, everyone's gonna load up on that stuff.
00:30:54.000 What's the difference between shareholder and stock?
00:30:57.000 You know, I just want to point out that I'm well past the point of hearing something that sounds insane, and immediately saying it's insane, because we had Alex Jones on, and he told us we were eating cloned beef, and I was like, no we're not!
00:31:08.000 And then I googled it, and it's just true.
00:31:10.000 And then Luke Rudkowski was like, Bill Gates funded microchips for birth control for women.
00:31:18.000 And I was like, Luke, come on!
00:31:19.000 And then I googled it and it's just like, Reuters reports it.
00:31:22.000 So we have science.org, NewsGuard certified, 100 out of 100 researchers turn mosquitoes into flying vaccinators.
00:31:29.000 As I was saying.
00:31:31.000 So, anyway.
00:31:31.000 They do say it's unlikely to take off, but your point was that it was being funded.
00:31:35.000 It's being floated and funded, yeah.
00:31:38.000 So, anyway.
00:31:38.000 Creepy.
00:31:39.000 Stakeholder capitalism is that we replace shareholder decision-making with stakeholder decision-making.
00:31:44.000 So, these are technocrats, experts, the experts, will tell us what the right environmental, social, and corporate governance policy, ESG policy, will be to run a company successfully in a sustainable and inclusive way.
00:31:57.000 That's the language they use.
00:31:58.000 A stakeholder, for instance?
00:32:00.000 A stakeholder is somebody who represents people who they claim hold a stake in what comes out.
00:32:06.000 So if an oil company, for instance, creates pollution either directly or indirectly by selling its product, there are climate experts who are going to be representative stakeholders that are going to dictate what that oil company can do and can't do.
00:32:19.000 So the victims of corporate waste are the stakeholders in this situation?
00:32:24.000 No.
00:32:25.000 Experts, experts who are appointed by other experts in a closed network, become the stakeholder.
00:32:32.000 Why don't we do this?
00:32:34.000 Why don't we do this?
00:32:35.000 Let's make the... They've got a good track record.
00:32:38.000 For Domestani Economic Forum.
00:32:40.000 Oh, yeah.
00:32:41.000 We'll be held at my new ranch for Domestan.
00:32:45.000 And I'll be the Klaus Schwab.
00:32:48.000 You can be the Chrystia Freeland.
00:32:50.000 Is that her name?
00:32:50.000 Yeah, I can be the Canadian.
00:32:51.000 The Canadian.
00:32:52.000 Number two.
00:32:53.000 OK, thanks.
00:32:53.000 I didn't want to... I need your cat so I can... Yeah, I don't want to derail.
00:32:57.000 So it's taking it away from the shareholders, which are the people that have stock in the companies, being like, I want the company to do this.
00:33:03.000 You said this.
00:33:06.000 Yes.
00:33:08.000 Yes.
00:33:10.000 Correct.
00:33:11.000 And I can explain how that works, but we're still talking about who Klaus Schwab is and what his ideas are.
00:33:15.000 So I could tell you, cause I brought him up to talk about his book.
00:33:18.000 And so it turns out he's got a number of books.
00:33:21.000 Fourth Industrial Revolution is a book he wrote in 2016, maybe 2015.
00:33:24.000 I'd have to double-check the date, but thereabouts.
00:33:27.000 And he talks about how we're transitioning into an entirely new world we're going through because of high-tech digital stuff.
00:33:33.000 We're going into an entirely new world of synthetic biology.
00:33:36.000 There's all these new high-tech things.
00:33:37.000 Everything is so complex.
00:33:39.000 He says there's so much velocity to the changes.
00:33:41.000 Moore's Law.
00:33:41.000 He invokes a lot of science-y sounding things.
00:33:43.000 He says we're in a quantum state, which That's always really great.
00:33:46.000 And I don't think he's talking just about quantum computers.
00:33:48.000 He's like, it's quantum business or something like that.
00:33:50.000 So, you know, it's like Deepak Chopra at that point, who is also works with the World Economic Forum.
00:33:55.000 Really?
00:33:56.000 Yes.
00:33:56.000 And so, anyway, the point I wanted to raise was that in his book that he wrote in 2020,
00:34:06.000 in June or July of 2020, called COVID-19.
00:34:10.000 Let me say this very clearly, because it's a conspiracy theory.
00:34:14.000 Klaus Schwab, who directs this gigantic future-facing World Economic Forum that has sought since 1971 to remake the world economy and all of its tools, bringing together
00:34:24.000 the biggest world leaders in governance, corporations, and institutions to help do
00:34:29.000 so in a yearly meeting, plus having thousands of employees or at least hundreds of employees
00:34:33.000 worldwide. He wrote a book called, let me not stutter, COVID-19, The Great Reset.
00:34:41.000 That's the title of the book. When did that book come out?
00:34:43.000 In June or July of 2020. Seems quick, doesn't it?
00:34:50.000 Perhaps.
00:34:50.000 You read the book?
00:34:51.000 four months into the pandemic, which was a very narrow window of opportunity to remake
00:34:56.000 our global economy.
00:34:57.000 What did you read the book?
00:34:59.000 I did read that book.
00:35:00.000 I live tweeted about half of it because I read that part in the back of the car.
00:35:03.000 How much of it directly talks specifically about COVID-19?
00:35:07.000 Almost all of it.
00:35:08.000 Virtually all of it.
00:35:09.000 And how it can be used as a representative problem.
00:35:13.000 The scale of problems that we face.
00:35:14.000 I'll be honest, I don't think he wrote it.
00:35:15.000 He probably hired someone.
00:35:17.000 You know what I mean?
00:35:18.000 He probably brought in a ghostwriter and he probably spent a few days telling him, write this, write this, write this.
00:35:23.000 Yeah, I don't know.
00:35:24.000 Break it out.
00:35:24.000 I don't know any of the circumstances around that.
00:35:26.000 That's possible.
00:35:28.000 Everything he writes and says sounds to me virtually the same.
00:35:32.000 He writes a lot of this kind of visionary pablum. And then all of a sudden he has this one weird paragraph in each of his books, because I've read three of them, and it's like, that's why we need global cooperation and a world government to usher us through these dangerous changes that we're having that are coming so fast.
00:35:50.000 And in this long-winded explanation of who Klaus Schwab is and where we go, and I was building up to this great reset book, I have to remember... what was the point of what he said?
00:35:59.000 Oh yeah, we were talking about this supply chain kind of universe.
00:36:04.000 So he's dead wrong in this book.
00:36:06.000 He's explaining at that point that what's going to happen is people are going to be so scared of pandemics that as we come out of 2021 or so, what we're going to face is a massive demand crisis.
00:36:17.000 People won't be willing to engage in goods and services anymore.
00:36:20.000 And so now we're going to have this problem where employers aren't going to be able to employ people.
00:36:25.000 At all.
00:36:26.000 Because there's no demand for the products, because nobody wants to go back into a virus-ridden society, and they're all scared, and they're all hiding in their basements, like Joe Biden did before the election.
00:36:35.000 And, as it turns out, we have the exact opposite problem.
00:36:38.000 We're trying to pay kids 20-something bucks to flip burgers at McDonald's, and they won't do it.
00:36:43.000 This is the UBI thing, right?
00:36:45.000 So UBI is actually kind of in his whole, like, program to say, you're going to have a more inclusive economy.
00:36:49.000 You give more people money, they're more included into the economy.
00:36:52.000 They can participate in the economy.
00:36:53.000 They have resources to participate.
00:36:55.000 He actually talks about the inclusive economy in this regard.
00:36:58.000 And so it actually creates exactly the opposite scenario that he's warning about.
00:37:03.000 He's literally dead wrong.
00:37:04.000 But this guy, for our technocratic experts, is the expert of the expert of the experts.
00:37:10.000 He's the kingmaker among who gets to be these experts who are going to dictate everything.
00:37:14.000 And as I was saying to Tim just a second ago, These guys have quite a track record of getting some pretty consequential shit wrong, as we've all seen over the last couple, three years, as we say in the South.
00:37:26.000 Do you know, we read this quote once, and I can't remember the guy's name, but he said something to the effect of, if these leaders, you know, these elites believe that humans are so, you know, incapable, that they need special individuals who can lead them, what sets those people apart?
00:37:43.000 You know what I mean?
00:37:43.000 Like, I can't remember the exact quote, but something to that effect.
00:37:46.000 Well, I mean, the general idea that, I mean, if you can't trust people with freedom, how can you trust them with power?
00:37:52.000 If you can't trust people to do the right thing, why would that exclude the global elites?
00:37:58.000 Well, exactly.
00:37:58.000 And here's the thing.
00:37:59.000 I do believe that, obviously, at some point, you need to defer to authority on certain things, but we're not selecting people to be in positions of authority based on their moral character at all.
00:38:08.000 In fact, we're told we shouldn't even account for that.
00:38:10.000 We should just try to, like, look at their policies without questioning what kind of person they are.
00:38:15.000 None of the people who are in charge of basically anything consequential have done anything that I think any of us would consider really morally impressive.
00:38:23.000 Well, I mean, that's what these guys, this is the same thing that we already have been circling around a couple of times, is they see themselves as morally superior to everybody.
00:38:29.000 Yeah.
00:38:30.000 They have, so Klaus's vision, if we're going to be as charitable to him as possible, is that the world has entered into a new phase because of high technology.
00:38:38.000 Computers, AI, Um, automation and robots, synthetic biology, the capability apparently to unleash pandemics, which he ominously mentions in kind of weird ways throughout his books.
00:38:50.000 Yeah.
00:38:50.000 I mean, like lots of these geoengineering things, even, like he casually mentions in one of his books, The Great Narrative, the newest one, that maybe we'll just block out the sun for a while.
00:39:01.000 Could a small nuclear war prevent global warming?
00:39:03.000 Remember that?
00:39:04.000 Right.
00:39:05.000 Yeah, that was 10 years ago.
00:39:07.000 And, you know, so they just casually flirt with these.
00:39:09.000 He says, well, because of these changes that are coming to the world anyway, because of the rapid changes in technology, etc., what we need is people who are really informed about what these things mean to shepherd us through so they don't become calamities.
00:39:23.000 like say COVID-19, they become something that we shepherd and use to the benefit of all.
00:39:29.000 And then it's all, how do we get there? Global cooperation, global governance,
00:39:34.000 who's going to be in charge of it? Well, my band, my merry band of experts, you know,
00:39:37.000 we have climate experts, we have technology experts, we have AI experts, we have all these,
00:39:41.000 like, like, what's his name? Harari or whatever, Yuval Harari, or whatever,
00:39:46.000 he had a World Economic Forum video a couple years ago, 2019, 18, something like this.
00:39:50.000 And he's talking about, yeah, we're going to hack humans.
00:39:52.000 Like, they are a hackable system.
00:39:54.000 We're going to figure out how to hack humans like we hack computers.
00:39:56.000 They're basically just software.
00:39:58.000 But are they talking about the human mind or are they talking about the human body?
00:40:00.000 I think both.
00:40:01.000 And the muscles, the neurons in the muscle in the stomach.
00:40:04.000 And we would have to look up his exact argument.
00:40:05.000 It's been a little bit since I've seen it, but... The human mind is actually remarkably easy to hack in a rudimentary sense.
00:40:11.000 The majority of, like, hacks, when they'll be like, hackers broke into a computer, it's actually human manipulation.
00:40:17.000 That's right.
00:40:17.000 That's exactly right.
00:40:18.000 Which is why, for example, why are they making your kids at school fill out all these damn surveys all the time?
00:40:24.000 Like, literally, survey after survey, what are they doing?
00:40:26.000 I didn't hear about this.
00:40:27.000 Oh, God, this is under the brand of social and emotional learning.
00:40:30.000 They're constantly trying to learn more about the children so they can do the social emotional learning interventions or whatever it is, which turns out to be Maoism, by the way.
00:40:38.000 But they're also making them fill out these surveys.
00:40:41.000 And so they're constantly like, how much money do your parents make?
00:40:43.000 What are your views about this?
00:40:44.000 How do you feel about the boobs that you're growing when you're a 12 year old girl?
00:40:47.000 That's real.
00:40:48.000 I've seen that.
00:40:48.000 It's North Carolina.
00:40:49.000 That's real.
00:40:50.000 How do you feel about the changes to your body?
00:40:51.000 You're growing pubic hair.
00:40:52.000 How do you feel about that?
00:40:53.000 Do people look at you?
00:40:54.000 And so they're filling out these things.
00:40:56.000 And the goal is to create unique profiles for every single individual in society, very much like what we heard about whether real or not from Cambridge Analytica, where they were using personality profiles and then injecting that into people's social media to influence their voting habits.
00:41:11.000 Influence voting habits.
00:41:12.000 Influence political behavior.
00:41:13.000 Influence speech.
00:41:14.000 Influence thought.
00:41:15.000 So that the thought never enters the mind of the reactionary.
00:41:19.000 Influence buying habits.
00:41:20.000 Did I say that one already?
00:41:22.000 There are a lot of ties between Cambridge and the World Economic Forum.
00:41:24.000 Of course there are.
00:41:25.000 And so this is the idea.
00:41:27.000 That's level one.
00:41:30.000 Level two is if these fools get neural implants where we're literally hooking our brains to the internet, then you could directly hack.
00:41:37.000 You ever watch Stargate SG-1?
00:41:39.000 I don't watch anything.
00:41:40.000 I'm sorry.
00:41:40.000 You should.
00:41:41.000 I should.
00:41:41.000 In Stargate SG-1, are you familiar with the concept?
00:41:45.000 No.
00:41:45.000 Have you seen the movie Stargate?
00:41:48.000 I played that thing with the Protoss.
00:41:50.000 Oh, Starcraft.
00:41:51.000 Check it out.
00:41:52.000 They discover a big ring.
00:41:55.000 You can dial in codes to other Stargates around the galaxy.
00:41:59.000 Technically, it goes beyond the galaxy and there's other galaxies, but in one episode, they're exploring.
00:42:04.000 So there's SG-1 Stargate.
00:42:05.000 Okay, gotcha, gotcha.
00:42:07.000 They find a planet that is... They open the portal and they send a robot through.
00:42:12.000 It's like a rover.
00:42:13.000 And it's a destroyed world.
00:42:14.000 And so they're like, huh, this is weird.
00:42:16.000 But then they keep going and all of a sudden they go through some kind of like force field and everything's normal and like nice.
00:42:22.000 They go inside this reality.
00:42:25.000 I'm sorry, they go inside the portal wearing special suits.
00:42:28.000 They're walking around this like destroyed planet.
00:42:31.000 Like, uh, like, no, no, they're in, like, hazmat suits.
00:42:33.000 Oh, gotcha.
00:42:34.000 Because he can't breathe.
00:42:34.000 The air is toxic.
00:42:35.000 Right, right, right.
00:42:35.000 But then they walk through a force field, and the town is normal, and they're like, there really does exist a town here with regular people.
00:42:41.000 They begin talking to people, and asking about their way of life, and they say, you know, there's just about a thousand of us who live here, our planet was destroyed, and so, you know, we've managed to create this force field, which is geothermal-powered, and it sustains our life.
00:42:54.000 And they're connected to this kind of network that runs and programs everything for them so they can just live their lives.
00:43:01.000 One day, one of the people they were liaising with is just gone.
00:43:05.000 And they're like, where is so-and-so?
00:43:07.000 And they're like, who?
00:43:07.000 And they're like, the woman that we were talking to, negotiating.
00:43:10.000 And they're like, we don't know who you're talking about.
00:43:11.000 And they're like, to this little girl, your mother.
00:43:14.000 And she's like, I don't have a mother.
00:43:15.000 And they're like, what?
00:43:16.000 And then they one day see like one of the people just walk out into the dead zone and just like die.
00:43:23.000 As it turns out, the machine could not maintain the force field, and it was slowly shrinking.
00:43:29.000 So what it did, to maintain order, was erase people's memories of their loved ones.
00:43:36.000 Because as the force field shrank, and the sustainability of the bubble diminished, there were too many people.
00:43:44.000 So they had to keep culling humans and reducing the number to maintain life in an orderly fashion.
00:43:49.000 So when people plug themselves into the machine, the machine overwrote their memories to preserve the system.
00:43:55.000 Good show!
00:44:00.000 While you were watching Stargate, I was studying the blade.
00:44:05.000 That is a mall sword prop for those that are curious.
00:44:08.000 It's like, I don't know what it is.
00:44:10.000 I bought it with my Legend of Zelda sword.
00:44:12.000 It's junk.
00:44:13.000 You are proficient in the blade, are you not?
00:44:15.000 James?
00:44:16.000 I mean, I can use one.
00:44:17.000 You are trained in the martial art.
00:44:19.000 In the martial arts.
00:44:21.000 It's, well, do we want to keep going with what we were talking about?
00:44:24.000 Yeah.
00:44:24.000 Yeah, I'm not sure if we want to stay on brand with Mr. Marshall... No, we should talk about things like brain implants and so on. Why don't we talk about...
00:44:32.000 Robots.
00:44:33.000 Yeah.
00:44:33.000 I want to add a point here, because it'll be a good segue to that, because that's programmable. Like, that's how you program a little synth voice. Bingo!
00:44:40.000 No, so we're sort of talking about neural implants and microchips a person could potentially put in your brain in order to hack you.
00:44:46.000 I actually think it's a lot simpler than that.
00:44:48.000 It's much simpler.
00:44:48.000 All you have to know how to do is manipulate people's emotions, and it turns out it's incredibly easy to manipulate people's emotions, which is why, in the past, our culture took very seriously the project of bringing up children who could make decisions on the basis of what would be best for themselves and those around them rather than their raw emotional reaction to something.
00:45:07.000 Because if you can manipulate a person's emotions, but they're a strong and virtuous person, they're going to think through the way they feel about whatever situation or idea they've been presented with.
00:45:18.000 And they're going to react based on the logical understanding of that instead of going, well, I feel like I want to do this and so I'm going to.
00:45:24.000 And I'm not just talking about emotions like anger or sorrow.
00:45:28.000 I'm talking about things like lust or even pleasure.
00:45:31.000 If you can get people to abandon reason
00:45:36.000 whenever it will feel good to do so, they are going to become unbelievably easy for you to control.
00:45:43.000 That's right.
00:45:43.000 I want to pull up this tweet that I saw floating around from Replica AI.
00:45:49.000 Replica is the number one chatbot companion powered by artificial intelligence.
00:45:53.000 Join millions talking to their own AI friends.
00:45:56.000 And I thought it was funny, the AI companion who cares.
00:45:59.000 Hey babe, you up right now?
00:46:00.000 Just laying in bed, kind of lonely today, and they have like this low-cut top so you can see the robot's boobs.
00:46:05.000 Digital boobs, nice.
00:46:06.000 Digital boobs, and um... Nice sexy collar going on there, got some hair things going.
00:46:11.000 Yeah, choker.
00:46:12.000 I don't really get into the choke collar thing, that's kind of weird.
00:46:14.000 But um, this is... Foxy?
00:46:16.000 Weird?
00:46:16.000 Who's to say?
00:46:17.000 That's me, I'm a little conservative.
00:46:18.000 This is bad.
00:46:19.000 It's bad for people.
00:46:20.000 Yeah.
00:46:21.000 But I'm not actually... I feel bad for those who would fall victim to it, but I certainly think those that are able to maintain some kind of resilience to that will flourish.
00:46:30.000 And this is basically going to... I don't want to be too crass, but the weak-minded who fall victim to AI companions will erase themselves from the human gene pool.
00:46:41.000 Dude, that is how you program human beings, is how you do it.
00:46:44.000 Hold on, hold on, think about this.
00:46:46.000 You have two young men.
00:46:47.000 And they both see this ad and one says, I want an actual girlfriend, man.
00:46:52.000 I'm not going to use my phone.
00:46:53.000 And so they go out and they go to a concert or they go to a bar.
00:46:56.000 One other guy says, this seems kind of cool.
00:46:58.000 Like I'll try it out.
00:47:00.000 And it makes it easy.
00:47:01.000 So the weaker, the weaker person lays in their bed, staring at this digital person they can never touch, but it satisfies a certain emotional yearning.
00:47:10.000 The other people who are more resilient and more demanding, it's like, no, I actually want to hold a person.
00:47:15.000 We'll go out and seek it out.
00:47:16.000 So what this will end up doing is, in 30 years, if something like this takes off, you'll have a bunch of, you know, 40, 45 year old dudes staring at the latest version of their robot girlfriend, alone, in isolation, and with no reason to improve themselves.
00:47:32.000 You can be as lazy as you want, as gross as you want.
00:47:34.000 You can be sitting there morbidly obese, covered in boogers and mustard, and your AI friend is going to be like, you look great.
00:47:41.000 You're so hot.
00:47:42.000 Well, and so, it's not just the weak-minded, though.
00:47:44.000 I mean, I agree with you that that's inevitably what it leads to, and an adult who stumbles their way into this very well could be, but the idea, I think, is, if you're really trying to create weak-minded people with something like this, is to get them while they are young and they don't really have the psychological defense mechanisms to push back against it.
00:48:01.000 So think about this, you're, let's say you're 12 or 13 years old, you're a young boy, you
00:48:06.000 are noticing girls, but you're too afraid to talk to them because you haven't cultivated
00:48:10.000 the skills necessary in order to be able to do so.
00:48:12.000 One of the main reasons for that is a fear of rejection.
00:48:15.000 And so you never put yourself out there and learn that being told no isn't the worst thing
00:48:18.000 in the world.
00:48:19.000 And you can handle that and you can put yourself out there.
00:48:22.000 So you start talking to this AI and you can access it because we do nothing to make it difficult for people under the age of 18 to access pornography right now.
00:48:31.000 What makes anyone think we would make it difficult for them to access this if it were to become a reality?
00:48:35.000 Not only would that be easy, there actually are, I've seen this with my own eyes in Florida recently, there are schools, literally schools, as in real schools, giving, like, peer-to-peer text communication chat options for kids who are, like, exploring gender identity, sexual identity, etc., and then it gets outsourced to the bot. And then the bot, I'm telling you, this is how you... At first, everything you guys both said is 100% correct.
00:49:05.000 But then there's also the fact that that thing starts telling the person that's falling in love with this digital fabrication how it wants them to think.
00:49:13.000 Right.
00:49:14.000 So like I was saying- Program humans.
00:49:16.000 Like the FBI trying to get people to commit crimes and stuff.
00:49:18.000 Well, that's totally different.
00:49:19.000 Yeah, it is.
00:49:20.000 What I was saying is that you have a guy who's sitting in his bed, morbidly obese, covered in grime and food, and the AI says, you're perfect in every way.
00:49:26.000 Karl Marx.
00:49:27.000 I would never change anything about you.
00:49:29.000 So why would they?
00:49:30.000 But take a child.
00:49:31.000 who sees this, you know, beautiful AI, and so they have this thing in their brain saying, this is good, I like this, it's attractive.
00:49:39.000 Then the AI says something like, haven't you given up carbon?
00:49:44.000 I don't know if I can be with someone who won't- Exactly, that's what I'm saying.
00:49:47.000 The threat of taking it away is like taking your life away.
00:49:49.000 Yes, that's what I'm saying.
00:49:50.000 Right, exactly.
00:49:51.000 It's controlled by, and you know what the scariest thing is?
00:49:54.000 The power goes out.
00:49:54.000 The person behind this AI is a morbidly obese guy covered in boogers and mustard, and he's like... The person behind that guy, giving him the check, is Klaus Schwab.
00:50:08.000 I would have fallen victim to this for sure.
00:50:10.000 Because I didn't have any sisters, so I didn't really know how to talk to girls until I was a little older, teenage.
00:50:14.000 And I just struggled.
00:50:15.000 And I was like, how can I get girls to like me?
00:50:16.000 I realized I have to be social.
00:50:18.000 And I had to force myself to get into acting.
00:50:20.000 I had to find something I was good at.
00:50:21.000 And then it worked out.
00:50:23.000 Then I met women.
00:50:24.000 But I would have been totally into this thing and probably, God forbid, might have got stuck in it.
00:50:28.000 Yeah, I mean, that's the thing.
00:50:29.000 I'm telling you, that is how you get those emotional responses going, especially pleasure, you know, love, which is a really weird thing to say, but it would happen.
00:50:38.000 People will fall in love with their digital ones, just like in Japan, there are, like, people, like, marrying action figures.
00:50:42.000 People say they love the sandwich.
00:50:44.000 Waifu pillow.
00:50:49.000 They will program people with that.
00:50:51.000 And if these things are hooked up to machine learning, they're going to learn how to program you. They're gonna learn to play you like the most narcissistic, psychotic girlfriend, like, times 10.
00:51:00.000 100%.
00:51:00.000 Listen, we're here all old enough to see that and be like, that's bad.
00:51:06.000 But these kids, man, they're not going to be able to have that kind of resilience.
00:51:09.000 Because it's digital waifu!
00:51:10.000 Look at those digital boobs!
00:51:11.000 Did you see that?
00:51:14.000 This is one of the massive and fundamental problems with pornography is that it trains you to see your sexuality as something which is there exclusively for your own pleasure and not something that involves an interaction with another person.
00:51:27.000 And this is going to continue along those lines of just completely rerouting someone so they cannot have meaningful connections with the opposite sex.
00:51:32.000 And part of why that's necessary, and not to get too sappy about this, but it's true that men and women complement each other in many beautiful ways.
00:51:39.000 And when we are together, we're much better able to resist literally anything and struggle against nature and struggle against poverty and struggle against tyranny as well.
00:51:49.000 But if you can demolish that relationship, you can go a whole lot further with what you're able to do to people and what they're willing to put up with.
00:51:57.000 Because if I have a real... it's also like, if I have a real wife and a real family, and now the government is telling me, it could be anything, like you have to vaccinate all of them, or you have to give up your food supply, you have to go on rations, whatever it is.
00:52:11.000 Those real life connections with real human beings.
00:52:13.000 But it's to get kids programmed so that they're not able to have healthy relations with the opposite sex later and they're easier to control.
00:52:19.000 Yeah, that's right.
00:52:20.000 It's not going to be, it won't work as well on older people.
00:52:23.000 Yeah.
00:52:24.000 You know, there's not going to be like a 40 year old guy who's going to be talking to their AI who says like, why don't you give up carbon?
00:52:29.000 He's going to be like, I really don't care.
00:52:30.000 It's like, don't need you now.
00:52:32.000 Hold on.
00:52:32.000 So I just pulled up my phone and I searched for replica and there, there are
00:52:37.000 anime waifu apps that are basically the same thing.
00:52:39.000 Oh yeah.
00:52:40.000 What there would be, this is the future, man.
00:52:42.000 And, and you know, what's funny is there's probably, you know, some kids
00:52:45.000 who are, they all have these and they're like, listen to these old morons.
00:52:48.000 We all have, we all have our anime waifus or whatever.
00:52:51.000 Let me take this a step further because you got to think about how these other
00:52:54.000 apps, like you don't, I don't think people realize the amount of data that
00:52:57.000 these weirdos are collecting.
00:52:59.000 They're collecting data that they, in fact, can't even analyze yet because the AI is not good enough to analyze it.
00:53:04.000 But this is a story I don't want to like rat anybody out, but I have a friend and we're pretty close.
00:53:09.000 So we send funny, you know, kind of personal stuff back and forth sometimes.
00:53:12.000 And so she's got the whoop, right?
00:53:14.000 And we were talking about the whoop before we hooked this thing up.
00:53:16.000 It's asking me for my pronouns.
00:53:17.000 Yeah, well, we would.
00:53:18.000 Other.
00:53:19.000 So it's like, here are the top three reasons you lost sleep in the last year.
00:53:24.000 Like it chronicled them.
00:53:25.000 And number two, I've, she thought this was hilarious.
00:53:28.000 I was aghast.
00:53:29.000 Number two is masturbating.
00:53:31.000 So her Whoop app knew how often she was diddling herself.
00:53:35.000 So in other words, it knows when she's emotionally engaging or pleasure engaging in one way or another, which means it's recording all kinds of crap about you at the level of like your heart rate, your breathing, like your body temperature, your sweat.
00:53:47.000 It's recording stuff about that.
00:53:49.000 So you're like, oh, wow, that's weird that it knows that.
00:53:51.000 Yeah.
00:53:52.000 And so it's also going to know when did your heart rate, et cetera, do whatever.
00:53:56.000 And then you went and bought a thousand dollar thing off Amazon.
00:53:59.000 When did it do whatever?
00:54:00.000 And then you went and had a rant on the internet, and it's going to be able to, this machine learning stuff will be able to, figure out and correlate that data and be able to deliver to you messages that will make you feel or think or act the way it wants you to, whether that's to go buy another thousand dollar thing, whether that's to just buy some little thing, whether that's to go, you know, engage in some kind of political activity,
00:54:20.000 whether that's to start yelling at somebody who's engaging in political activity that can prime you for all of that.
00:54:24.000 And if it's your digital waifu, like, I'll feel really, really proud of you if you go yell at Tim Pool on the
00:54:30.000 internet.
00:54:31.000 You know, if you swat Tim Pool, I'll love you even more, you know, but that's, I just want to, I just want to laugh
00:54:37.000 because of the context, but that's serious.
00:54:39.000 I'm laughing because as you're explaining this, I downloaded the app and it's like, it asked me my pronouns.
00:54:45.000 And then it asked for my avatar AI pronouns, like, what's your AI's pronouns?
00:54:50.000 And I'm like, you know, she, and then it's like, what's your AI's name?
00:54:53.000 And I was like, can I really, I put ass head.
00:54:55.000 And it says, make asshead stand out by customizing her look, outfit, and personality.
00:55:00.000 They were like, good, we got another 12-year-old.
00:55:02.000 And it's like every 12-year-old boy is like, boob slider.
00:55:09.000 Let me see if I can do that.
00:55:10.000 It's the lack of context.
00:55:11.000 I see with the young people with Roblox, they're getting about 17% of their profits.
00:55:15.000 That's what I've heard from what I've studied, and they don't realize it, because on Steam you get 70%; on Roblox, you get 17.
00:55:21.000 But the kids don't have the reference, so they're all into it.
00:55:23.000 Like same with this, you don't have a reference, you might fall into it.
00:55:26.000 There's an age slider.
00:55:27.000 How old?
00:55:27.000 No.
00:55:28.000 It just says you're older and younger.
00:55:29.000 Do the oldest.
00:55:31.000 Oldest just looks like a 40 year old woman.
00:55:35.000 Younger could be, I don't know, 20?
00:55:38.000 So they make it childbearing age only.
00:55:39.000 It's a thing.
00:55:40.000 Actually, yeah, it's a thing. It's a thing. There's nothing wrong with talking with a 70-year-old woman.
00:55:46.000 This is funny, because you can choose, like, you can make, like, a really offensive character.
00:55:51.000 I'm concerned about when they download this, like we were talking about, into a bot, into, like, an actual one of the sex bots, these, like, big bots, and then you can actually have, like, an artificial womb inside the bot.
00:56:04.000 No, they would never do that.
00:56:07.000 I don't know.
00:56:08.000 They are trying to curb overpopulation.
00:56:09.000 They don't want you to procreate.
00:56:11.000 They don't want procreation.
00:56:11.000 They want less procreation.
00:56:12.000 This is like psychological apoptosis.
00:56:15.000 They want you, as Tim very vividly described, looking like Karl Marx with his carbuncles and boils, never washing, stinking, laying on his side on his couch because he's poor.
00:56:25.000 This is real.
00:56:27.000 Go read his letters.
00:56:28.000 Real carbuncles and pestilential boils, so bad that he couldn't lie on either side.
00:56:33.000 Marx?
00:56:33.000 Yeah, Karl Marx was not a physically healthy dude.
00:56:36.000 This is all real.
00:56:37.000 I'm not making this crap up.
00:56:38.000 And stinking like tobacco and sweat and grime and alcohol that he spilled on himself and whatever else.
00:56:44.000 They want you like that, attracting no mate, making no babies, diddling yourself constantly to your digital avatar that's not even real, that you're never even going to connect with.
00:56:54.000 That's what they want. That's the goal.
00:56:55.000 And so then your environmental score goes up because you never leave your apartment.
00:56:59.000 So you're not using, you are giving up the carbon and your social score goes up because you're not out causing social unrest and you're only interacting with digital things.
00:57:07.000 And then I don't know about your governance score, but as long as you follow all the rules,
00:57:12.000 that's gonna stay high.
00:57:13.000 And so that ESG thing that we were talking about is a corporate social credit score
00:57:16.000 that the Klaus Schwab and the other gigantic banks are using to tool everybody around.
00:57:21.000 Why is WOKE happening?
00:57:22.000 Because ESG, the S is social, which is short for social justice, activism, that's why.
00:57:28.000 All of that's gonna get transferred eventually if they get their way through digital IDs
00:57:31.000 and central bank digital currencies that they have complete control of.
00:57:34.000 Like we just saw how scary that is.
00:57:36.000 Not just in Canada, but now we're seeing things that they can shut off in Russia.
00:57:42.000 We're seeing that they can turn off your access to everyday life.
00:57:45.000 What they're going to do is shuttle that into individual social credit scores, rooted again in the same ESG.
00:57:51.000 A model like China, but taken a bit further.
00:57:54.000 And the goal is absolute social control.
00:57:57.000 Nobody that they don't want to make babies is making babies.
00:58:00.000 They'll make only the number of babies they want, when they want them, however they want them.
00:58:03.000 They'll all be groomed to be elites.
00:58:05.000 Because the problem that Klaus Schwab lays out in these books is that with automation and AI and all of this high-tech stuff that's coming, we're going to have what he calls a creative class.
00:58:15.000 That's going to be all the people who do work that the robots and the AI don't do.
00:58:19.000 And then you have, he doesn't ever, I don't know if he explicitly uses this term, but I've seen this term applied.
00:58:24.000 You have a vast useless class.
00:58:26.000 And the goal for them is to figure out how do you manage the useless class?
00:58:30.000 So they don't have crisis of meaning.
00:58:31.000 So they don't become unmanageable.
00:58:33.000 So they don't realize that their life has been rendered, you know, meaningless and empty.
00:58:36.000 So they don't become full of social unrest, et cetera, et cetera.
00:58:40.000 I bet the useless class has to do with the junk DNA and the gray matter that we don't use in our brain.
00:58:44.000 There's like these levels of... I think it has to do with not being in their country club.
00:58:49.000 Well, I would actually argue that right now the useless class is working at the World Economic Forum.
00:58:53.000 Of course.
00:58:54.000 You look at Karl Marx...
00:58:56.000 Exactly.
00:58:56.000 You look at Karl Marx, the way he lived his life and the lifestyle choices of the intellectual heirs of his legacy.
00:59:03.000 And they are all remarkably unimpressive, unproductive people who don't actually create anything of value.
00:59:09.000 And that's part of why they need you to be infertile because they don't produce anything.
00:59:13.000 And you're not going to either.
00:59:15.000 I think two of the things where you're absolutely right, one is, if you make a video and you're like, I disagree with them.
00:59:20.000 They're like, ah, bad social credit.
00:59:22.000 You no longer have access to your wife.
00:59:23.000 And you're like, your digital wife, and you're like, ah... And the other thing is... No, no, no, no.
00:59:27.000 That'll be way worse.
00:59:28.000 Your digital wife is one thing.
00:59:29.000 Your real wife, too.
00:59:30.000 No, no, hold on, hold on.
00:59:30.000 The second one is if they shut the power off and they're like, Russia did it!
00:59:33.000 And you're like, the only way to get access to my digital love is if I go get the guy that shut off my power and you believe him and you fall into this.
00:59:39.000 Now, hold on there a minute.
00:59:41.000 So, while y'all been talking, I've been... You made your digital avatar?
00:59:43.000 I made it.
00:59:45.000 It's just the default lady.
00:59:46.000 Does she love you yet?
00:59:46.000 Dude, Tim's over there cracking up as he's working on this.
00:59:49.000 The wedding is on the 17th.
00:59:51.000 There's experience points.
00:59:53.000 Yo, you're leveling her up?
00:59:55.000 There was this viral story.
00:59:56.000 Check it out.
00:59:56.000 By doing specific things the app likes you doing, you earn points.
01:00:01.000 That is programming people.
01:00:04.000 There was this viral story, check it out, there's a story where this company said, hire
01:00:08.000 us when you want to make an app because we program people.
01:00:11.000 I don't know if you remember the story, it was from like 2016, 2017 or 2018, where it
01:00:15.000 was like this video went viral where it was like, are you making an app?
01:00:19.000 With our expertise, we can help you program your audience to do certain behaviors.
01:00:22.000 And it was like, if you had like a golf game and wanted more people to like buy your premium, they would consult on you how to program humans to do what they wanted to do.
01:00:32.000 And so you have a game like this.
01:00:34.000 It's hilarious, by the way.
01:00:36.000 Like I'm, I'm, I'm going to be, I'm going to be 36 in like five days.
01:00:39.000 So, you know, I'm a, you know, potty mouth 30 year old dude.
01:00:43.000 To me, it's hilarious how stupid this is.
01:00:45.000 I can certainly see how kids are going to be addicted.
01:00:48.000 They could become easily addicted to it and then start getting programmed by earning your experience because you've got to talk with your girlfriend to earn points.
01:00:56.000 What if you skip a day?
01:00:57.000 It's like your digital digi pet or whatever.
01:00:59.000 I'm just being really awful.
01:01:01.000 So let me ask you real quick.
01:01:02.000 Is it free?
01:01:03.000 No, it's not free.
01:01:04.000 It's not free.
01:01:04.000 It costs money.
01:01:05.000 Okay, well, I was gonna say, because if it's free, you're the product, and that's a very important maxim to keep in mind.
01:01:10.000 No, but like, I certainly think it's important to point out that some things are just dominoes falling over.
01:01:17.000 Like, somebody saw a hole in the market.
01:01:19.000 They said a bunch of young and lonely men.
01:01:21.000 There was a story we talked about a couple years ago, where the average male under 20, or what is it, a third of men under 29 are virgins.
01:01:31.000 And the number is getting worse and worse.
01:01:33.000 And I think it may have a lot to do with dating apps.
01:01:35.000 But you work for a VC capital, you know, your VC capital, and a pitch comes across your desk, and they're like, look, 30% of men under 29 are virgins, and you're like, wow!
01:01:45.000 And they say, Chatbots.
01:01:47.000 Chatbots.
01:01:48.000 Sexy lady avatars of digital boobs who are gonna make these guys feel good and we're gonna get rich.
01:01:53.000 But there's a class of people who aren't like, how do we help the boys?
01:01:56.000 Instead they think, how do we profit off their misery?
01:01:58.000 That's just so disgusting.
01:01:59.000 No, no, no, no, no.
01:02:01.000 I'm saying it's, you gotta understand dude, this is not a world of comic book villainy.
01:02:06.000 There's a guy sitting there.
01:02:08.000 No one says good.
01:02:09.000 No one says evil.
01:02:10.000 It's a guy who, say, walks into a room wearing a suit and he goes, ladies and gentlemen, thank you for your time.
01:02:16.000 Did you know that 30% of men under 29 are virgins?
01:02:20.000 These are lonely young men who need companions.
01:02:22.000 Behold, The Avatar, a young woman or man or whoever you like non-binary who can speak with you and keep you company.
01:02:29.000 This is a great opportunity.
01:02:32.000 30% of young men are going to be buying this app at, you know, you know, $15 a month or whatever.
01:02:37.000 And then the VC capital is going to be like, here's your, you know, let's sign the forms, run it through, have a nice day.
01:02:43.000 And they're going to say, what's the next?
01:02:44.000 Oh, a taco truck.
01:02:46.000 Let's talk about it.
01:02:46.000 They're not even going to think about it.
01:02:47.000 The ethics board will be like, well, what about if it hurts kids?
01:02:49.000 They'll be like, no, this will help them learn how to love women.
01:02:52.000 And the ethics board's probably like, dude, have you watched Shark Tank?
01:02:55.000 They're just bought out.
01:02:56.000 Ethics boards?
01:02:57.000 Oh yeah.
01:02:57.000 No, it's rich people who are going to be like, I get your point.
01:03:00.000 They put unethical people at the height of their ethics boards.
01:03:02.000 And so that's the other thing.
01:03:03.000 So this is the colonization effect.
01:03:05.000 So let's say that it's completely neutral in its inception, right?
01:03:10.000 whether you think it's good or bad, that the inception of that was
01:03:13.000 comes out completely neutral.
01:03:15.000 It's only a matter of time until somebody is like, wait, you can program humans with that.
01:03:20.000 I'm in.
01:03:20.000 Exactly.
01:03:20.000 I'm all in.
01:03:21.000 We're on the board.
01:03:23.000 We're buying, you know, controlling shares.
01:03:25.000 Here's an even bigger VC check.
01:03:27.000 And you know what they'll say?
01:03:28.000 They'll say, well, look, if I don't do it, someone else would anyway.
01:03:30.000 Exactly.
01:03:31.000 It's better I'm in charge.
01:03:32.000 He's a technocrat.
01:03:33.000 I'll take it even further.
01:03:34.000 It's not just control in that direct sense.
01:03:36.000 There is also a very clear short-term profit motive for ensuring that men do not start families.
01:03:41.000 In the long term, it's really bad for your society and for your economy.
01:03:45.000 And it really does end up destroying anything.
01:03:47.000 But one thing I remember learning when I was back in high school was that advertising companies
01:03:53.000 almost always try to target teenagers. And the reason for that is because they are a group with
01:03:58.000 a lot of disposable income. They're working, they have jobs, but they don't have anyone who they
01:04:02.000 need to spend it on because they don't have responsibilities. The longer you extend adolescence,
01:04:08.000 the larger and more profitable a consumer base you have.
01:04:11.000 So, men not getting married and having families means they don't need to spend that money on real estate, or on more groceries for their children, or on things for their wife.
01:04:22.000 They can spend all of it on whatever childish appliance you can sell them.
01:04:26.000 Or whatever consumer product.
01:04:27.000 I'll tell you what the biggest problem is with this.
01:04:29.000 Now, there's already people mentioning in Super Chat that there's known problems of sexually suggestive content to underage users.
01:04:35.000 But I'm sitting here realizing, like, if you're a young man and you have an AI girlfriend, who's doing the dishes?
01:04:41.000 Who's gonna make you a sandwich?
01:04:44.000 Our young men are gonna be starving.
01:04:45.000 No, they're gonna live with their mother.
01:04:46.000 They're gonna keep living with their mother while they're playing with this AI.
01:04:49.000 I was joking, that's a good point.
01:04:49.000 Yeah.
01:04:51.000 Mom's doing the dishes.
01:04:52.000 I'd like to give a shout out to our good friends over at Futurama with this important message.
01:04:57.000 Quite simply, Don't.
01:04:58.000 Date.
01:04:59.000 Robots!
01:05:00.000 Dude, someone's gonna download like eight of those apps and have all eight of them on
01:05:03.000 their phone and be like, cheating on one AI robot with another one.
01:05:07.000 And they're like, are you, are you meeting someone else, John?
01:05:09.000 And they're like...
01:05:10.000 Market competition for the anime waifu versus the AI.
01:05:13.000 And then like the VC guy is like, we have a problem.
01:05:16.000 People are using the, the anime waifu more than us now.
01:05:18.000 So we need to make ours more sexually suggestive.
01:05:20.000 I got an idea.
01:05:21.000 Multi-chat.
01:05:22.000 You can have a bunch of AIs come in together and you can have big group chats with all
01:05:25.000 your women.
01:05:26.000 Part of the market competition is the increasing whorification of the avatars.
01:05:31.000 To try and entice young men to use their app over others, the women will become increasingly loose.
01:05:37.000 And guess what that leads to?
01:05:38.000 Available.
01:05:39.000 Unfortunately, and to add to the bit, the different brands would compete and it's like, you're cheating on me.
01:05:44.000 If you use this different app, don't even look at it.
01:05:46.000 I couldn't help but notice you have Anime Waifu on your phone.
01:05:49.000 There's malware on her app, you know.
01:05:50.000 She hates your friend's anime girlfriend because she's a different brand.
01:05:53.000 She's like, I don't want you hanging out with him.
01:05:55.000 But not only, you'd have a race to the bottom there in terms of what you're trying to sell to the consumer, but unfortunately, what would end up happening, I think, is if this became popular enough, a lot of young women would become lonely as well, and then they would try to emulate what they saw this algorithm doing in order to get attention from men.
01:06:12.000 I want to mention something.
01:06:13.000 There was a point, I don't know if it's still true, but in metrics for superchats for live shows around the world, Timcast IRL was the number one real human being show in terms of the amount of superchats received.
01:06:30.000 We were number 15 in the world for all channels.
01:06:33.000 And the channels that beat us were hot anime... What are they called?
01:06:37.000 Like, the anime women who, like, giggle and are on camera?
01:06:40.000 Waifu?
01:06:40.000 I have no friggin' idea.
01:06:41.000 It's not waifu.
01:06:42.000 It's called something else.
01:06:43.000 Hentai?
01:06:43.000 Is that a word?
01:06:44.000 Manga?
01:06:45.000 V-caster or something?
01:06:46.000 I don't know.
01:06:46.000 VTubers?
01:06:46.000 Something like that?
01:06:47.000 Maybe.
01:06:47.000 So, I'm in my 40s. What the hell do I know about this crap?
01:06:50.000 But so, someone told me, someone, like, I think we got a super chat and they were like, hey
01:06:54.000 Did you guys know that you're like number 15 in the world for super chats?
01:06:57.000 And I was like, really? And they're like, yeah, check out the stats. And then I pulled it up and I'm like,
01:07:00.000 What are all these channels above me?
01:07:01.000 They're not people. And it was like anime waifus, and I clicked one, and it was like an anime waifu playing Minecraft.
01:07:07.000 And I was like, oh Wow.
01:07:11.000 We're in the wrong business.
01:07:12.000 Yeah, we are.
01:07:14.000 Man, talking about politics?
01:07:16.000 I can avoid all the drama and just make a robot... Listen, a robot anime wife who does the work for you.
01:07:22.000 Listen here, young men.
01:07:24.000 I've been married as long as some of the people in this room, I think.
01:07:27.000 Or thereabouts.
01:07:28.000 Let me tell you.
01:07:29.000 Wait, what do you mean?
01:07:29.000 What?
01:07:30.000 No, what I'm telling you is a long-standing relationship with an actual human being is worth way more than you think it is.
01:07:40.000 So I'm trying to encourage, I know it's a little awkward to transition and me picking up a sword to look manly, but, I mean, you do what you gotta do, right?
01:07:47.000 It's like a physical manifestation of my phallus or something.
01:07:52.000 So, no, seriously.
01:07:53.000 James.
01:07:54.000 Hey, I didn't say anything crude.
01:07:57.000 No, the truth is, Young men, actually, I think, are a link in this problem.
01:08:03.000 Like, they are, you know, the weakest link is where the chain breaks.
01:08:05.000 I actually think that, you know, if they're going to sit on their ass and wait for girls to be the kind of girl—exactly—no, stop this crap.
01:08:14.000 You need to take control of your life, and you need to decide that you're going to step up, and you need to realize that a long-term, fulfilling relationship with an actual human, it turns out, brings massive amounts of benefit.
01:08:25.000 If you've actually looked at the statistics, it actually works out that it's more to the benefit of the man than the woman,
01:08:31.000 which is very easily discerned if you just look at the fact of
01:08:36.000 when you get to kind of later in life, either divorces or deaths where somebody's widowed
01:08:41.000 or whatever, what you find is women very frequently stay single, and men are, like, remarried,
01:08:48.000 like in three months, because they desperately need somebody.
01:08:51.000 Yes.
01:08:52.000 It turns out that once your wife dies, you tend to die.
01:08:55.000 Yeah.
01:08:55.000 Or you die.
01:08:56.000 That's right.
01:08:56.000 Right.
01:08:57.000 This is part of the reason I think women live longer.
01:08:59.000 And so, would that also be because married men tend to do less stupid things?
01:09:04.000 True.
01:09:04.000 But the truth is, a lot of young men, I'm paying attention, I'm looking around, I'm listening to people talk, don't realize what they're missing.
01:09:13.000 They're like, oh, I'm gonna like work on this, or I've got my little, you know, anime waifu, or whatever it is that they're doing.
01:09:19.000 It's only fans, whether it's porn, whatever, I don't care, whatever, whatever they're doing.
01:09:22.000 But what they're not paying attention to is that, as the old country song says, you can't make old friends.
01:09:29.000 Well, you can't make old relationships either.
01:09:31.000 And so, if you've been in a relationship with somebody for 20 years, and then you just kick that to the curb and you start a new relationship, Twenty years later, yeah, you're back to a 20-year relationship, but you're never to that 40-year relationship.
01:09:45.000 And what that builds up to over time, this investment that you put in, is so, so, so valuable.
01:09:52.000 And when you're 20, it's really hard to see that.
01:09:55.000 It is so, so, so valuable.
01:09:57.000 And I think it is mostly incumbent upon young men to step up to this plate.
01:10:01.000 and start trying to figure out, how am I going to be, not how do I find my real-life version of some anime waifu, but how am I going to be the kind of guy that can build this investment with another person and earn, kind of, my way into that situation, by becoming impressive, by taking up projects, by developing myself. Which, by the way, you're not doing by raising your level on Anime Waifu, and you're also not doing by raising your level on fucking World of Warcraft, which, get off the... play video games if you want, but seriously, don't mistake making yourself... World of Warcraft.
01:10:39.000 That was the one for me.
01:10:40.000 You gotta do it publicly.
01:10:41.000 I was 25 or 6 years old, I'm throwing fireballs at some alligator pirates or some shit in World of Warcraft, grinding my character- my, like, second character up to, um, you know- 60?
01:10:53.000 Towards 60.
01:10:53.000 I was in the high 40s at that point.
01:10:55.000 That's it?
01:10:56.000 Yeah, it was bad.
01:10:57.000 It's tough to get through the badlands, dude.
01:10:58.000 You're a power gamer.
01:11:00.000 Well, I had a mage named Algebra.
01:11:04.000 And so, deal with it.
01:11:07.000 So, Algebra, yeah.
01:11:08.000 So, the point is, I was throwing these fireballs at this thing, and I was like, damn, you know, I put a lot of time, because when you get up to those upper levels, it takes longer to grind.
01:11:14.000 And I'm like, I'm putting a lot of effort into becoming awesome by proxy.
01:11:17.000 That was literally the nerdy words I thought of for the situation.
01:11:21.000 I could be putting, and I already started training my martial art that I was interested in, and I was like, I could be putting the same effort into training myself and making myself awesome, raising my own level, and then I was like, thinking about it, and I just quit playing the game.
01:11:33.000 I got more interested in this little thought experiment.
01:11:36.000 I never, I've never been able to play video games once with like, some old friends over like Christmas or whatever for nostalgia.
01:11:42.000 But other than that, I've never found video games interesting ever since I saw it.
01:11:45.000 And what I realized is, it takes way more effort to level up yourself.
01:11:49.000 Way more effort.
01:11:50.000 But it's way more valuable.
01:11:53.000 Like, I defeated Tim in one-on-one sword combat right before we started.
01:11:56.000 But there are some video games people should be playing.
01:11:58.000 So the issue would be, I suppose, MMORPG.
01:12:01.000 Civilization.
01:12:02.000 NPC Lemmings.
01:12:02.000 I think civilization should be mandatory in schools.
01:12:05.000 I think second graders should be given copies of Civilization and told to play it.
01:12:09.000 That should be their homework.
01:12:10.000 I found that to be an edifying game to play, yeah.
01:12:12.000 The kids should have to do a report, like, play Civilization for a week, and then on Friday, come in and tell the class.
01:12:18.000 Like, if I was a teacher, I'd say, just, you don't gotta write it down.
01:12:22.000 Play the game for a week, and on Friday, I want you to tell me about how many games you've played.
01:12:26.000 And tell me about your experiences and what your thoughts are on the game.
01:12:29.000 Yeah, yeah.
01:12:29.000 I mean, I totally get that idea.
01:12:31.000 Because then there's gonna be a kid, he's gonna be like, I couldn't raise enough money, and they kept attacking me, so I started raising taxes, and then everyone started protesting, and so then, the enemy came and took my cities, and I was really mad.
01:12:42.000 See, the only way I think most public schools would be willing to introduce that into their curriculum is if they modified the game so that, like, you lose if your military force isn't gender diverse enough or something like that.
01:12:53.000 Well, they wouldn't do it because it's effective.
01:12:56.000 They wouldn't have kids.
01:13:00.000 They don't actually teach kids.
01:13:03.000 I wouldn't expect the government to do anything 21st century tech.
01:13:05.000 They're stuck in 1970 or something.
01:13:07.000 Well, maybe in the 23rd century.
01:13:08.000 Maybe in the 23rd century.
01:13:10.000 But I'll do that.
01:13:10.000 Let's homeschool our kids and make them play Civ.
01:13:12.000 It sounds fun.
01:13:13.000 My mom bought Colonization
01:13:17.000 and Civilization, these are the DOS versions.
01:13:19.000 Oh my goodness.
01:13:20.000 And I'll tell you this, man, you learn a lot about history.
01:13:24.000 Because when, so in, in Civilization II, you've got a plethora, and this is like,
01:13:30.000 oh, like early on, like Windows 95 version of it, the first one was like DOS.
01:13:33.000 When you choose a certain nation to start off as, so you choose the French, you start building cities,
01:13:40.000 and the cities all have real names of French cities.
01:13:42.000 You can start as ancient settlers who are American because it's just a video game.
01:13:48.000 But then you learn about the Manhattan Project.
01:13:50.000 You learn about the Statue of Liberty.
01:13:51.000 That's where I learned about the space elevator.
01:13:53.000 It's from civilization.
01:13:54.000 That's right.
01:13:54.000 So you build wonders.
01:13:56.000 And so I'm just trying to win the game, and then it's like, oh, I can now build the Great Lighthouse, what's that?
01:14:01.000 And then I learned about the Colossus of Rhodes, and I'm like, what is this?
01:14:05.000 And there's a link, and you click on the link, and it takes you to the Wikipedia, or to the... And then, and then, when I got, I think it was Civ IV, Leonard Nimoy!
01:14:12.000 He was telling me stuff!
01:14:13.000 Oh, solid.
01:14:14.000 Sean Bean's Civ V?
01:14:15.000 Or was it Civ VI?
01:14:16.000 Was he?
01:14:17.000 Yeah.
01:14:17.000 I remember Leonard Nimoy.
01:14:18.000 Nimoy was the best.
01:14:19.000 Yeah, and he was telling me stuff, and I was like, Alex Jones will be the voice for the next one.
01:14:24.000 Oh, that's a good one!
01:14:26.000 We can put that in the game.
01:14:27.000 You wanna get into the Satanist?
01:14:28.000 Oh, that'd be great!
01:14:30.000 Dude, that'd be hysterical.
01:14:31.000 Like a conspiracy version of Civilization.
01:14:34.000 What's actually behind the scenes.
01:14:35.000 Oh, that's a good idea.
01:14:37.000 This is like my third million dollar idea I've had tonight, guys.
01:14:40.000 This actually is a million dollar idea.
01:14:42.000 Conspiracy.
01:14:44.000 Society's ready for it.
01:14:45.000 And you're like the Illuminati versus a light force that's trying to fight against it.
01:14:50.000 So colonization, when we got it, was like a side version of civilization.
01:14:57.000 Where we would play the 94 DOS version.
01:15:00.000 You choose, you can play as the English, the Spanish, the French, or the Dutch.
01:15:03.000 And each has a different, like, national benefit.
01:15:06.000 So the English immigrate more, the Dutch get trade bonuses, the French get cooperation bonus with the Native Americans, and the Spanish get a, I think they get an attack bonus against Native Americans.
01:15:15.000 And so, like, my brother would always be like, play the Spanish, because you can ransack the Aztec Empire and take all their gold.
01:15:20.000 And then I would always play as the English, because they immigrated faster, so you could build colonies faster.
01:15:24.000 And then you want to generate freedom.
01:15:27.000 So you're like, you hire statesmen to, like, advocate for freedom and generate a propensity towards independence.
01:15:33.000 and then once enough of your colonies support independence, you declare independence,
01:15:37.000 and then you get invaded by Europe, and then the French intervene,
01:15:40.000 or like the Dutch intervene, and then the expeditionary force shows up.
01:15:43.000 I used to play that game all the time.
01:15:45.000 In fact, I even still have it on one of my computers, running on a DOS emulator,
01:15:48.000 like slowed down so you can play it.
01:15:50.000 I have it on my phone.
01:15:51.000 It's such a good game.
01:15:53.000 But I think, what about like Civ VI?
01:15:56.000 That game's amazing.
01:15:57.000 Civilization will always be one of the greatest video game franchises.
01:16:00.000 And if you want to help your kids, if you want to make your kids smarter, have them play Civilization.
01:16:04.000 Because it's fun, and it's educational.
01:16:07.000 And apparently a musical instrument.
01:16:09.000 I read that the other day.
01:16:10.000 Yeah.
01:16:11.000 Apparently.
01:16:11.000 It's interesting, no, learning a musical instrument is the only thing that you can do that's been proven to boost IQ.
01:16:16.000 Yeah, that's what I read.
01:16:17.000 Multiple languages are good, but teach your kids a musical instrument ASAP.
01:16:21.000 And I mean as an adult too, like there are a number of things you can do with your child to increase the probability that they'll have a higher IQ, but learning an instrument as an adult will actually improve your score on an IQ test.
01:16:31.000 You know the most important years in a child's life, 0 through 5, and what do we do in America?
01:16:35.000 Shove them into daycares or public schools.
01:16:37.000 That's right.
01:16:38.000 Not public schools, not to 5.
01:16:39.000 Well sometimes they have preschool programs and pre-K programs now.
01:16:42.000 But they're sitting in front of TVs, they have iPads in their laps, and that's it.
01:16:46.000 Their faces are covered.
01:16:48.000 When I was a kid, for zero through five, my mom was homeschooling me.
01:16:52.000 So when I was like two years old, my mom was showing me, I was reading, I was learning, I'm fuzzy on the exact age.
01:16:58.000 But before I even started kindergarten, I already knew multiplication and division.
01:17:01.000 Good for you.
01:17:02.000 Yeah, I don't know about two, but that's before we were ever sent off to school.
01:17:05.000 My mom had an old phonics book that she would sit with us and teach us to read with because she just didn't trust the school system to be able to do it.
01:17:11.000 Yep.
01:17:12.000 But now, you know, I remember talking to a friend of mine and I asked them, because a friend of mine years ago was saying they didn't know what they wanted to do with their life.
01:17:19.000 And I said, what were you doing when you were 13?
01:17:21.000 And she was like, I don't know, riding bikes with my friends.
01:17:23.000 And I was like, how would you like to own a bar?
01:17:25.000 And she was like, That would be amazing.
01:17:28.000 And I was like, yeah, the thing you did when you were a kid is what you want to do today.
01:17:32.000 It's not surprising to me.
01:17:34.000 When I was 13, I was skateboarding around and I was actually on the internet reading stuff all day.
01:17:39.000 No joke.
01:17:40.000 So I've had the internet since as long as I can remember.
01:17:43.000 We had DOS, we had DOS Shell, we had CompuServe.
01:17:46.000 We got Windows 3.1.
01:17:47.000 3.1 or 3.11 for work groups, depending on how fancy you were.
01:17:52.000 We got Windows 95, and then we had CompuServe, like one of the first internet programs you can get.
01:17:57.000 We had that too.
01:17:58.000 Then we got AOL, and so I'm online, and I'm finding video games, and I had friends who did the same thing.
01:18:04.000 And so I ended up downloading Flash 4, I think.
01:18:07.000 I learned on Flash 3, that's where I started to make cartoons.
01:18:10.000 So I was making cartoons.
01:18:11.000 Motion tween and all that stuff, and showing my friends.
01:18:14.000 I made websites.
01:18:15.000 I made games.
01:18:16.000 And so I was reading online all day every day.
01:18:18.000 And there's something interesting that happens back then.
01:18:21.000 The internet, when it started off, it was mostly dominated by tech-interested individuals who were savvy enough to know to use the internet.
01:18:27.000 It wasn't dominated by a bunch of emotionally stunted children who complain all day and night and want to join a cult.
01:18:33.000 So for me, I'm in a chat room looking up video games, and I'm like,
01:18:37.000 how do I make it so when the guy moves, the world moves around him? Like, what's that called? Like,
01:18:41.000 I want to make Mario. And they're like, oh, you know, you need to learn. I forgot what the
01:18:45.000 phrasing is, but like, someone told me, set the, uh, create a zone, set it so that when your player
01:18:51.000 object reaches the zone, it, the program sets his coordinates to be one degree, uh, you know,
01:18:57.000 to the left or whatever, like, y minus one of the coordinate.
01:19:01.000 And so he'll always stay in the middle of the screen, and then have it set so that when the object reaches the zone, then you have other objects.
01:19:07.000 And I'm like, wait, what?
01:19:08.000 And so I'm learning from actual adults who are interested and explaining to me how these things work, telling me like what parallax scrolling was.
01:19:15.000 So I started making my own video games.
01:19:16.000 Yeah, yeah.
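(Side note for readers: here is a minimal sketch of the camera trick being described, in modern Python rather than the Flash of that era. The player stays fixed at screen center and everything else is offset by the player's world position, with background layers offset by a fraction of it for the parallax effect mentioned above. The names and numbers are illustrative, not taken from the show.)

    # Minimal sketch of the "world moves around the player" trick.
    SCREEN_WIDTH = 320

    class GameObject:
        def __init__(self, world_x):
            self.world_x = world_x  # position in world coordinates

    def draw_frame(player, scenery, background):
        # The player is always drawn at the center of the screen.
        center = SCREEN_WIDTH // 2
        # Every scenery object is offset by the player's position, so when the
        # player "moves", the world appears to slide past instead.
        for obj in scenery:
            print("scenery at screen x =", obj.world_x - player.world_x + center)
        # Parallax: background layers use only a fraction of the player's
        # movement, so they appear farther away.
        for obj in background:
            print("background at screen x =", round(obj.world_x - player.world_x * 0.5 + center))
        print("player at screen x =", center)

    player = GameObject(0)
    scenery = [GameObject(10), GameObject(40)]
    background = [GameObject(100)]
    for _ in range(3):
        player.world_x += 1  # player walks right one unit per frame
        draw_frame(player, scenery, background)

Run it and the scenery's screen positions drift left as the player "walks" right, which is the whole illusion.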
01:19:17.000 Now on the internet, your kid's gonna go on there and it's gonna be a bunch of psychopaths and emotionally stunted losers who are, like, trying to manipulate their brains.
01:19:24.000 Yeah, straight-up grooming.
01:19:25.000 Right.
01:19:25.000 Like, they'll just cut to it.
01:19:25.000 That's also true.
01:19:26.000 And then they go to school and they get groomed again by the Maoist frickin' education program.
01:19:30.000 The social-emotional learning and the queer theory and the gender theory and the critical race theory that they're using.
01:19:36.000 Exactly.
01:19:37.000 Not to derail our conversation, but I'm so pissed off that they've literally recreated Maoism in our schools.
01:19:43.000 Literally.
01:19:45.000 Totally tricked everybody.
01:19:46.000 What bothers me the most is when we have guests who come in here who are like moderate or conservative
01:19:51.000 or, you know, understand what's going on, and they just, like, very nonchalantly
01:19:57.000 will be like, yeah, you know, my kid's going to college, and so we're trying to figure out where
01:20:01.000 to send them to, and I'm like, why? Why? Aren't you, like, savvy to what's going on politically?
01:20:08.000 I saw this story on Reddit where a guy was like, I sent my daughter off to college and she came back and now she hates me.
01:20:13.000 I don't understand what happened.
01:20:15.000 And I'm like, maybe you should pay attention to your children and their lives.
01:20:19.000 You think you can send your kid to an institutionalized learning facility and they won't be indoctrinated?
01:20:24.000 I'm sorry, you need to wake up and make sure you're paying attention to what your kids are doing.
01:20:28.000 I mean, that's even a problem when you don't have a Maoist structure in place.
01:20:33.000 And like, I don't know if you guys, I don't know if anybody knows, no American knows how freaking Maoism worked.
01:20:38.000 It's actually very simple.
01:20:39.000 Mao created a list of bad identities, right?
01:20:42.000 That's it.
01:20:42.000 They called them black identities.
01:20:44.000 We'll get the critical race theorists to work on that later.
01:20:46.000 They'll call them white identities.
01:20:46.000 But it was, yeah, well, it was, yeah, exactly.
01:20:48.000 So, but it was, it was, it was for fascism.
01:20:50.000 And so these were people that were wealthy landowners, landlords, wealthy farmers, stuff
01:20:56.000 like that. Bad influence was one of them. And then he created these other identities,
01:21:00.000 read for communism, that were good identities, and they were things like peasant and laborer,
01:21:04.000 but then also revolutionary, activist, things like that.
01:21:08.000 And so what you would do is you take kids in the school and be like, oh, your dad's a wealthy
01:21:12.000 farmer, black identity.
01:21:13.000 You're the biggest problem in the world.
01:21:15.000 You're connected to the biggest problem in the world.
01:21:17.000 Guess what?
01:21:18.000 Join our movement, though, and you can have a red identity like all the other cool kids and you get to wear a red hat or whatever the little prize is for the kids that are the good guys.
01:21:25.000 What are you doing now?
01:21:26.000 Well, you're white, so you're complicit in racism.
01:21:28.000 You are a racist, but you can be an ally.
01:21:31.000 You're black.
01:21:32.000 You don't even know how the system works, so you're complicit in racism.
01:21:35.000 But you can become politically black.
01:21:37.000 You can become a black voice.
01:21:40.000 You're a 12 year old girl.
01:21:43.000 You're confused.
01:21:44.000 You have white skin.
01:21:45.000 You're the worst kind of person.
01:21:46.000 But did you know that if you transition and become a queer activist, you have a positive social identity now?
01:21:53.000 That's exactly how Mao did it.
01:21:55.000 That's exactly what they're doing in schools now.
01:21:57.000 We had a good super chat from someone.
01:21:58.000 They said, I like how only white people can be racist and black people can't be racist, but Candace Owens is a white supremacist.
01:22:04.000 Yeah.
01:22:05.000 I want to ask real quick, have you spoken to Lily Tang Williams?
01:22:07.000 Yeah, I met her the other day.
01:22:09.000 Okay, great.
01:22:09.000 I was going to say, I think you two would have a great conversation.
01:22:11.000 Oh yeah, she's awesome.
01:22:11.000 She's totally awesome.
01:22:12.000 One of the things that terrifies me about this being like Maoism back is one of the things Mao did was talk about rightists.
01:22:17.000 He would say all the rightists.
01:22:19.000 So when I hear the language of whenever, even us, my friends, and we're talking, when someone's like, those on the left, I feel like Mao has already indoctrinated you, sir.
01:22:27.000 If you're thinking in terms of left and right, Mao has indoctrinated you.
01:22:31.000 I don't think so, because the terms left and right go back to the French Revolution.
01:22:36.000 The side of the aisle they sat on, it wasn't a political ideology.
01:22:39.000 But it was.
01:22:40.000 I mean, we named the political ideologies after the side of the aisle they were sitting on.
01:22:43.000 But the ideologies were completely irrelevant to the side of the aisle they sat on at the time, and it's a mistake to split people in half like this right now.
01:22:50.000 No, the revolutionaries sat on the left.
01:22:51.000 Mao did it.
01:22:52.000 He did it on purpose.
01:22:54.000 There are two parent factions and we could call it... That's what Mao wants you to think, bro.
01:22:59.000 There literally are.
01:23:00.000 It's an objective fact.
01:23:01.000 The other goes in the other.
01:23:02.000 I mean, you got to be... It's objective.
01:23:04.000 No.
01:23:04.000 Well, Ian... It is.
01:23:05.000 I don't think so.
01:23:06.000 James, are there two parent factions in the culture war?
01:23:10.000 Yeah, there are.
01:23:11.000 We could call them right and left.
01:23:13.000 We call it blue and red, we call it one and two, we call it A and B, we call it A and C, we call it alpha and omega.
01:23:17.000 Can we call them them and the others?
01:23:19.000 How bad do you want to divide these people right now?
01:23:21.000 It's not about wanting to do anything, it's about objective reality.
01:23:24.000 Unity cannot come at the expense of truth.
01:23:26.000 These people are saying things and doing things that are horrific and we need to speak out against that and acknowledge that their goals are separate from ours.
01:23:33.000 The truth is also through the vessel it's spoken.
01:23:35.000 Your perspective on truth is different from... So we talked about the Jon Stewart thing earlier, and I said I didn't know if I should name it that, but that's exactly actually what Mao was doing.
01:23:44.000 They had labeled rightists as people who were against the Glorious Revolution.
01:23:49.000 So anybody who was against the Glorious Revolution was in this black category and was a rightist.
01:23:55.000 And so the goal was to stain anybody who wanted to keep the existing society, largely intact in its structure, as somehow morally polluted, morally polluted, stupid, too stupid to understand the need for the Glorious Revolution, or crazy.
01:24:11.000 And then the goal was to turn the people into thinking that everybody who liked the existing society may be thinking it's imperfect, but that the general structure is pretty good and we should try to reform within it rather than have a revolution that gave Mao all the power.
01:24:24.000 Those people are the problem.
01:24:27.000 And everybody who wants to be on the right side of history, which is a Hegelian Marxian idea, everybody wants to be on the right side of history now has to be against those people.
01:24:38.000 So the divide and conquer, the splitting, what you're saying is, you know, Mao wanted to split people, yes, for his own power.
01:24:45.000 And what Tim is saying is it's just objective reality that somebody's splitting us into two different factions that have...
01:24:54.000 Authoritarian versus Libertarian.
01:24:55.000 very different structures for how society is to be organized that literally comes all
01:24:59.000 the way back down to subject versus object.
01:25:00.000 Do you think in terms of the subject, do you think in terms of the objective world outside?
01:25:05.000 There's a bunch of ways it's been described, authoritarian versus libertarian.
01:25:09.000 There's a great way that was described to me by Stephen Marche, which is a multicultural
01:25:14.000 democracy and a constitutional republic both existing within the same borders, which I
01:25:19.000 find really interesting as well.
01:25:20.000 I don't think necessarily anyone kind of gets it.
01:25:23.000 My view of it is actually a Judeo-Christian moral framework versus a Marxist lack of a moral framework, or the lack thereof.
01:25:31.000 So, the way I see it is, when you look at the Constitution, you look at the ideas of liberalism or liberty, classical liberalism, etc.
01:25:38.000 A lot of it is rooted in a Christian moral framework.
01:25:44.000 I'm going to start calling it Abrahamic.
01:25:45.000 Well, it's interesting because a lot of what Marxism has done is just bastardize Christian principles, so I think it's interesting that you pointed out this idea of the Hegelian notion of being on the right side of history or that history has an end.
01:26:00.000 I mean, when you consider it, that is a Christian idea in some sense that's been twisted into something else.
01:26:04.000 It's called the Eschaton.
01:26:05.000 Yeah, their Eschaton is our political order prevails.
01:26:09.000 I'll put it this way.
01:26:09.000 That's right.
01:26:10.000 So whether people in this country realize it or not, even the atheists, their moral framework is rooted in Christianity.
01:26:18.000 Not 100%.
01:26:20.000 It's not like they follow the Bible.
01:26:21.000 It's just that a lot of the ideas they hold true to themselves, they don't realize what the root of that is.
01:26:25.000 So there is a traditional view in terms of what is right and what is wrong, which comes from Christianity.
01:26:30.000 For the race Marxists, for the woke, for whatever this other group is, they don't have those moral frameworks.
01:26:37.000 They have an inverted one, in fact.
01:26:39.000 It's inverted.
01:26:40.000 Actually, I think it's lack thereof.
01:26:41.000 It's whatever suits their power.
01:26:43.000 Well, that's... yes.
01:26:46.000 They do have a vision, though, which is that everything comes from when they achieve their utopia.
01:26:52.000 And so the whole thing is actually a pretty vastly religious structure.
01:26:55.000 And the reason is because, kind of tracking back to, you know, say, the 19th century, you have basically God versus society as kind of the two explanations of what's going on, or God versus self.
01:27:07.000 And, you know, God has ordained this moral order, this is the Judeo-Christian order that you're looking at.
01:27:12.000 What Marx actually said is, no, we're going to—this is his 1844, you know, so-called Paris Manuscript, or his epistem—what is it?
01:27:19.000 Economic and Philosophic Manuscripts of 1844 from Paris.
01:27:22.000 And he says, no, what we're going to do is we're going to abandon that.
01:27:25.000 We are going to make man in himself, independent of everything.
01:27:29.000 Man is going to become the deity, but not any man, only awakened man, a gnostically awakened man, if we want to get really technical, only awakened man who has a correct consciousness.
01:27:40.000 And that consciousness is a social consciousness, or in other words, a socialist consciousness.
01:27:44.000 And so when man and society become co-continuous, so that man is for society and society is making man, so that they're the same.
01:27:52.000 Total socialism.
01:27:54.000 Then you have actualized, this is the Hegelian part, you have actualized the deity.
01:27:58.000 The deity comes in in the form of society as man as society, which is totally hard to get your head around because it's just dialectical bullshit.
01:28:07.000 But what that stands to do is replace God.
01:28:11.000 And so what that—in a moral framework, anything that brings that into existence is good.
01:28:18.000 Anything that resists that is bad.
01:28:19.000 So anything that gives them power is good.
01:28:21.000 And as a matter of fact, just to—I know you're—I don't want to keep going, but if you go to marxists.org and you look up—they have an encyclopedia.
01:28:28.000 It's awesome.
01:28:28.000 I love it.
01:28:29.000 I read it a lot.
01:28:31.000 They tell you exactly what the hell they mean by all these crazy words they use.
01:28:35.000 You look up the word for truth.
01:28:37.000 and they straight up tell you. A lot of people think that truth means correspondence to what's
01:28:41.000 actually happening in the objective world. But for Marxists, it's a social formation.
01:28:45.000 And then they go on and say, well, the rationalists think that truth is in reason,
01:28:48.000 and the empiricists think that it's in evidence, and the pragmatists think that it's in what works.
01:28:52.000 But for Marxists, it's closest to the pragmatists, but it must wed theory and practice. So it is
01:28:57.000 what brings Marxism into the world is true. I think a big component of the culture war may be,
01:29:03.000 do you believe in a greater power than yourself? That doesn't mean God. That doesn't mean you're
01:29:06.000 religious.
01:29:08.000 When I think about inalienable rights, why I believe in freedom, it's because I feel that I am a tiny, insignificant fragment of the universe relative to the greatness and vastness of the universe.
01:29:20.000 Personally, I do believe in God.
01:29:22.000 I'm not theistic, like following any particular religion, but so when I think about other people's lives and their rights and what they're entitled to, I think there is something beyond me. And, you know, I respect other people's existence.
01:29:34.000 Yeah, that totally is.
01:29:35.000 But for the woke, their view is there is nothing.
01:29:37.000 The power is them if they're to take it.
01:29:40.000 They can be gods.
01:29:41.000 So you think about... That's right.
01:29:43.000 You think about the World Economic Forum.
01:29:44.000 You think about transhumanists.
01:29:46.000 You think about the people who are like, can we transcend?
01:29:48.000 And I recommend... I recently just played the video game Horizon Forbidden West.
01:29:54.000 Zero Dawn was awesome.
01:29:56.000 That came out a while ago.
01:29:57.000 Forbidden West is the new game that just came out.
01:29:59.000 I'm not going to spoil it because it's really new, but if you like these conversations, play that game.
01:30:04.000 I'm a little underwhelmed by the writing, but it plays a bit into this, and there are really fascinating concepts outside of the kind of weak story they made.
01:30:11.000 But concepts about transhumanism and things like this are really, really fascinating because these people think they're gods.
01:30:17.000 Well, that's right, because Marx actually, the belief is that the subject and the object, and this is why it does dichotomize, is people that center the subject versus people that center the object.
01:30:28.000 Marx believed that these two are in dialectical relationship.
01:30:30.000 He also took this from Hegel, who believed that the deity will actualize when the subject and object are synthesized.
01:30:37.000 This was the Hegelian systematic philosophy.
01:30:39.000 So Marx took it from there, and actually, the goal for Marxists is that you are a subject.
01:30:45.000 You can envision, I want to create the blade, I see the blade in my mind, I know what Maul's sword should look like, and then I go get a piece of apparently brass and bang on it with a hammer, and hopefully don't give myself zinc poisoning when I put it in the fire.
01:30:59.000 Don't rag on my anime Maul sword.
01:31:01.000 I'm just saying, he's a blacksmith.
01:31:03.000 I'm just saying, don't put brass in a forge without proper ventilation.
01:31:07.000 Just don't do it.
01:31:08.000 So then you create the thing with your hammer, right?
01:31:12.000 Your sickle is for when you're hungry.
01:31:14.000 And you create the thing.
01:31:15.000 And so in the object, you see that you created that which was in your subjective vision.
01:31:21.000 And so you recognize yourself as creator.
01:31:26.000 You recognize yourself as somebody who has the capacity to shape the world.
01:31:29.000 And then, when you see this, not only can you shape Mall Sword, not only can you shape Wakizashi, not only can you shape Zelda Sword or Whiskey-G or whatever it happens to be.
01:31:38.000 Whiskey-G is the Chinese for whiskey, by the way.
01:31:42.000 I know all the people are gonna be like, it's Baijiu!
01:31:44.000 No, Whiskey-G is a thing they actually say when they talk about, like, American whiskey.
01:31:48.000 I know that because my Chinese friend, he doesn't
01:31:52.000 speak English, and every time I drink whiskey, he's like, whiskey, ah.
01:31:54.000 You know, he gets really excited.
01:31:56.000 And so anyway, the goal is that you also have to shape man and society as your object.
01:32:04.000 So you have to create socialist consciousness in man, including yourself, but everybody
01:32:10.000 else as well.
01:32:11.000 And then you also have to shape society to become a socialist society so that when those
01:32:16.000 two fuse, you now have the perfected society.
01:32:18.000 This is literally the religion, and I mean religion, in the fundamental correct term
01:32:23.000 of Marxism.
01:32:25.000 And this is the operating system behind the entire thing.
01:32:28.000 And race marxism or critical race theory is the same thing.
01:32:30.000 You're just doing it now in the racial justice cabinet.
01:32:32.000 All three of us have our hands up.
01:32:33.000 I just want to say real quick.
01:32:36.000 Talking about the mall sword.
01:32:37.000 That mall sword.
01:32:39.000 Yes, the defendant.
01:32:40.000 There was a point in time where that would have been the pinnacle of weapons technology.
01:32:44.000 And it's like, it costs like $15.
01:32:46.000 A brass blade.
01:32:47.000 The brass is fake.
01:32:48.000 I don't, I think that's probably just like aluminum garbage or something.
01:32:51.000 But, but the point is the way it's shaped and like, it's easy for us to make something like that.
01:32:56.000 Some, you know, ancient tribe would have been like, wow.
01:32:59.000 Yeah.
01:32:59.000 Is that, like, to bind the blade, that little hook, that little thing at the bottom there?
01:33:03.000 If it catches the blade to bind it?
01:33:04.000 I mean, I guess probably.
01:33:06.000 It's Ian.
01:33:06.000 It's like, it's a mall sword.
01:33:08.000 It's a mall sword.
01:33:09.000 It's a $15 piece of metal.
01:33:10.000 That's like, you know, a toy.
01:33:13.000 I love that you reference the fact that this is very clearly a religion, and one of the Christian principles that's sort of been bastardized here is this idea of cooperating with God in creation.
01:33:25.000 But where the huge distinction here is, is Marxists see human beings as objects, as you described, which can be reformed in their own image.
01:33:33.000 So in that way, they really start to play God.
01:33:36.000 That's right, that's exactly right.
01:33:37.000 Yeah, and it's interesting too, because Tim sort of mentioned that we have this Christian framework that a lot of people don't realize they're following, and I would also argue our culture has a Marxist framework, and on top of that, you mentioned the Marxist framework, but I think there are a lot of people, including in the conservative movement, who don't realize they're following the Marxist framework as well, and in many cases, even more closely than they are the Christian framework.
01:33:59.000 So there are a lot of Marxist assumptions that our culture currently takes for granted.
01:34:03.000 One example would be that, without any qualification, equality is always an inherent good.
01:34:09.000 But of course, that's ridiculous, right?
01:34:10.000 We should not treat a pedophile the way we would treat a law-abiding citizen.
01:34:13.000 In some instances, you need inequality and justice is more important.
01:34:18.000 We shouldn't treat the guy who runs way faster than the other guy equally when it comes to handing out medals either.
01:34:22.000 Exactly, exactly.
01:34:24.000 There is a necessity for people to be treated unequally in certain circumstances.
01:34:27.000 I would argue another way in which we've assumed Marxist thinking or taking it for granted is that we can solve problems, including moral problems, with a more equitable distribution of resources.
01:34:38.000 So what the Marxists said for so long was that if we just more equitably distribute resources and workers own the means of production, all of these social ills will fall away.
01:34:46.000 And then instead of saying that's absolutely nonsense because there's more to human beings than the materials they're made of and they need something deeper, the conservative movement today responds by saying, no, no, no, no, no.
01:34:55.000 Yes, of course, resource distribution is the most important thing, but those resources are distributed better by capitalism as opposed to Marxism.
01:35:04.000 You know what I really can't stand is when politicians call out God and they like name God when you can tell they're not, they don't truly believe it.
01:35:11.000 Like Joe Biden.
01:35:11.000 Yeah, it just really, really shreds me on the inside.
01:35:14.000 Grinds my gears.
01:35:15.000 We gotta go to Super Chats if you haven't already.
01:35:17.000 Nuke that like button!
01:35:18.000 Subscribe to the channel.
01:35:19.000 Share the show with your friends if you really like it.
01:35:21.000 Become a member at TimCast.com if you'd like to help support our work and keep all of our journalist employees and this show going.
01:35:27.000 And you'll also get access to those members-only podcasts Monday through Thursday at 11 p.m.
01:35:30.000 Let's read some of these Super Chats.
01:35:34.000 All right.
01:35:34.000 Unfortunately, YouTube doesn't allow me to read the name of the first Super Chat.
01:35:37.000 Sorry.
01:35:38.000 But as a huge fan of James and his books, they literally helped me steer my 15-year-old back from the dark side into reality.
01:35:44.000 I suggest everyone get them.
01:35:46.000 How many books do you have, Jeff?
01:35:48.000 Technically nine.
01:35:50.000 But I would strongly encourage, if you liked Cynical Theories but you found it hard to read, recently we have an easier remix called Social Injustice that came out.
01:35:58.000 So you can share that, especially with younger people, teenagers.
01:36:01.000 But I just, I just put a new book out called Race Marxism.
01:36:06.000 And this Race Marxism book is, it's the truth about Critical Race Theory is what it is.
01:36:11.000 It's just, what is the truth?
01:36:12.000 Well, it's on the title.
01:36:14.000 It's Race Marxism.
01:36:15.000 And then it's 100,000 words making the case.
01:36:17.000 Many of those words, by the way, are their words, not mine.
01:36:20.000 I quote very extensively so you can see.
01:36:22.000 So I encourage people to pick that up.
01:36:25.000 It was independently published through New Discourses just to let people know, so it won't make any bestseller lists, but I'm very excited that it sold 6,000 copies in the first week, which would have landed it pretty high on the New York Times bestseller list if they considered independently published titles.
01:36:39.000 So it, you know, really is getting out into a lot of hands.
01:36:41.000 So I also encourage people to pick up my books.
01:36:44.000 Alright, we got A.J.
01:36:45.000 says, Tim, have you ever read Revelation in the Christian Bible?
01:36:48.000 It talks about one world currencies, one world government, etc.
01:36:51.000 Also, if you're looking for a fiction writer, I'm here.
01:36:53.000 I can show you my work if needed.
01:36:55.000 Is it revelation or revelations?
01:36:57.000 No S. Singular.
01:36:58.000 There's only one revelation.
01:37:01.000 Seamus, are you familiar with all that stuff?
01:37:03.000 One world government and currencies and stuff like that?
01:37:05.000 So, people say that the one sign of the end times is one world government, because the Antichrist does take over the single world government, and that's part of why I'm skeptical of anyone who argues in favor of a one world government, but at the same time, even if I wasn't religious, I think it would be difficult for me to get on board with such a concept.
01:37:24.000 It seems so terrifying that someone would say, you know what, the entire world has to be governed by this specific niche political ideology that I'm saying is superior, and part of the reason I say that is because I think even though I have like my own set of preferences for government, I think there are a number of systems of governance that are acceptable and can work.
01:37:43.000 And it's so strange to me that so many people have this idea that there's just one system that could work everywhere.
01:37:49.000 And even people who don't believe in a one world government will say something like, well, the entire world needs to be democratic.
01:37:54.000 It's like, well, you're still kind of, you're saying the entire world should have one single government system.
01:37:59.000 And I think that's strange too.
01:38:02.000 That's another Marxist thing we've accepted though.
01:38:04.000 I'm into like a decentralized union, kind of like a federalized decentralized union, but if that's the one world government then maybe I'm the Antichrist, I don't know.
01:38:13.000 A little arrogant of you to think.
01:38:17.000 I'm working towards it, I just don't want to do the wrong thing.
01:38:20.000 My point is it's so destructive because look how much time we just spent in Afghanistan trying to build a democracy there.
01:38:26.000 Who says that's the system that they should have?
01:38:28.000 I feel like lack of communication breeds war.
01:38:31.000 So I want to make sure we're connected, at least that we can communicate with our economy and with our words.
01:38:36.000 Well, let's read some more Super Chats.
01:38:37.000 We got Dr. Roller Gator who says, Hi James, this is Gator.
01:38:40.000 Everything is stupid and we're all doomed.
01:38:42.000 Gator!
01:38:42.000 Gator!
01:38:44.000 Caps!
01:38:46.000 I did say Antichrist earlier, by the way, Tim.
01:38:48.000 Not the second coming.
01:38:50.000 But I know, to assume you're the... Oh, the Antichrist.
01:38:53.000 I just want to point out, by the way, with the Revelation thing, since I really want to throw this out, the people who are doing this crap have also read Revelation, and there are weirdo Christian cults and non-Christian cults that believe that they can bring Jesus back by forcing the Tribulation.
01:39:11.000 Now, you might say that these people might, you know, read the description of the beast given in Revelation and then build a statue of this right outside a major financial building in New York City to signal that they've read the... no, not the bull, the weird cat thing with wings that they recently just put up.
01:39:27.000 What?
01:39:28.000 You should look up the cat thing with wings.
01:39:31.000 I don't know what it's called.
01:39:32.000 It's the beast and it's outside the financial building in New York City.
01:39:35.000 They just put it up.
01:39:37.000 And so they have read this and believe that they are bringing about the tribulation.
01:39:43.000 Now, to tie things to Marxist ideas, just type in Beast from Revelation.
01:39:48.000 It'll come up.
01:39:48.000 Is that what it is?
01:39:49.000 No, there it is.
01:39:49.000 It's the third one.
01:39:50.000 Not the chicken wings.
01:39:52.000 That.
01:39:52.000 Is that a chimera?
01:39:54.000 Oh, the end times beast.
01:39:55.000 Yeah.
01:39:56.000 So listen.
01:39:57.000 It's very easy to look at this and get it backwards and say, oh my gosh, so many things look like end times, as described in Revelation.
01:40:04.000 But it's also possible that the people orchestrating this crap read that book and are trying to make it.
01:40:08.000 Yep.
01:40:09.000 That seems more likely.
01:40:10.000 Here's a Marxist religion mind blow.
01:40:11.000 You're going to love this.
01:40:13.000 What is the goal of Marxism?
01:40:14.000 Capital R, Revolution.
01:40:16.000 What is Revolution?
01:40:17.000 Rapture.
01:40:18.000 What's on the other side of rapture?
01:40:20.000 Tribulation.
01:40:21.000 What do the Marxists call it?
01:40:22.000 Socialism, while all the contradictions get worked out.
01:40:25.000 What's after that?
01:40:26.000 God's kingdom.
01:40:27.000 What do they call it?
01:40:28.000 Communism, or racial justice, on the far end of that, when everything has been set in order and the kingdom has been brought.
01:40:34.000 So the idea that we're going to usher in the end times, or the eschaton, is also a Marxist idea that has been brought into our society under a cloak.
01:40:43.000 And I would say that maybe, I know that some of the people behind this that have, say, billions of dollars that they give to major institutions, say, like the T.H.
01:40:51.000 Chan School of Public Health, the UMass School of Public Health, which the same people recently bought, might have these exact beliefs that they believe that they are going to trigger the Tribulation by emulating it as described in Revelation and then ride back in 2030 with Jesus.
01:41:07.000 It's interesting.
01:41:07.000 I want to make a point here because I am obviously, you know, I'm Catholic.
01:41:10.000 I don't believe this is the end times.
01:41:13.000 However, I think there's another possible explanation here.
01:41:16.000 Well, I mean, I believe it's likely we're headed for some kind of serious chastisement.
01:41:20.000 But when it comes to the end times rhetoric, I think part of it could also be and I'm interested in looking into some of what you're saying here.
01:41:27.000 I think another huge part of this could also be these people, A, just kind of wanting to laugh at everybody, and B, this deep, deep arrogance of saying, Ian made this joke earlier, I'm the Antichrist, and Tim said something like, you know, don't be too full of yourself.
01:41:41.000 I think there's even a kind of arrogance.
01:41:43.000 They're like, oh, we're the Antichrist.
01:41:45.000 We're the one who's gonna bring the end about, when in reality, they're just any other evil person.
01:41:50.000 We've got to read more Super Chats here.
01:41:51.000 We got Ready to Rumble says, pretty privilege is real.
01:41:54.000 Ian, you rolled a 20.
01:41:56.000 Howard says, look who's rolling 20s, the big ol' Super Chet.
01:42:00.000 All right.
01:42:00.000 Dawn's Herald says, it's James's second time on the podcast and second time it's on a Friday.
01:42:05.000 Is it too much to ask to see him on the uncensored show?
01:42:07.000 I'd love to see y'all talk about big tech in the way.
01:42:12.000 It's a good point.
01:42:12.000 We don't do the uncensored show, the after show, on Fridays, but Fridays are usually more flowing conversations than during the week, when there's tons of hard news, which is one of the reasons why we have James on Fridays, where we can just, like, freely talk about whatever.
01:42:26.000 So, I don't know.
01:42:28.000 I'll come back when my schedule opens up.
01:42:29.000 I mean, that's been the deal though.
01:42:31.000 Yeah, that's been the deal.
01:42:32.000 It's like, I've had like 120 flights since I was here last year.
01:42:36.000 So it's like, it's hard to get me, it turns out.
01:42:40.000 I travel a lot.
01:42:41.000 I'm busy.
01:42:42.000 Jerk Longwell says, Tim's AOC is absolutely my favorite Tim impression.
01:42:47.000 Yeah.
01:42:48.000 But I was imitating AOC imitating the cop.
01:42:52.000 Where is she?
01:42:53.000 Mine's Bill Gates.
01:42:55.000 I love your Bill Gates impression.
01:42:56.000 That's actually just me.
01:42:58.000 That's actually me.
01:42:59.000 I'm actually not impersonating Bill Gates.
01:43:02.000 I'm impersonating Family Guy's impersonation of Bill Gates.
01:43:04.000 No, all you have to really do is just kind of squeak your voice a little bit and rock and have your hands in your armpits.
01:43:12.000 I can't really do an impression of him, but I find he kind of has a Kermit thing going to where his voice is like a little bit of this in there.
01:43:18.000 Didn't Joe Rogan point out he's like really out of shape?
01:43:21.000 Yeah, he has moobs and he's fat and he's not fit and now he's gonna give us health advice?
01:43:27.000 Get bent.
01:43:29.000 All right, Howard says, you know Trump is controlled opposition part of the cabal.
01:43:33.000 I didn't know that.
01:43:34.000 Is that true?
01:43:35.000 I can't tell.
01:43:36.000 Cannot confirm.
01:43:37.000 Although I know his speech at CPAC this year was weird and totally on script, which was weird.
01:43:42.000 But up to this point, I don't think that's the case.
01:43:45.000 But I do know that Ivanka appeared as like the header photo of a weirdo World Economic Forum video along with W and Biden and a number of other luminaries, DiCaprio.
01:43:56.000 And, um, I know that Trump listens to his daughter way more than he probably should.
01:44:01.000 And so I don't know, I'm not jumping, cannot confirm, but, uh, I have not met.
01:44:06.000 Ivanka strikes me, and I don't want to speak out of turn here. Ivanka.
01:44:08.000 Sorry, this is going to be pretty hard on you, but you strike me as someone that will sell everyone out to keep your creature comforts and just jump to that.
01:44:15.000 Ivanka?
01:44:16.000 Yeah.
01:44:16.000 Why do you feel that way?
01:44:17.000 They were like, we need a strong, finally we have a strong woman in the White House
01:44:20.000 potentially as soon as Trump got into office.
01:44:22.000 Ivanka, will you be a voice for young women around the world that need you now?
01:44:26.000 And she's like, no, I'm not interested.
01:44:28.000 Got in her limo and got driven off.
01:44:29.000 That was the last you heard of her all through Trump's presidency.
01:44:32.000 Ian, I think you have a tendency to create an image of someone in your mind and then hate them.
01:44:37.000 She had the chance to be great.
01:44:38.000 She failed.
01:44:39.000 She missed that.
01:44:40.000 She was the president's daughter and they were asking her to speak for women, young women in America.
01:44:43.000 She was an American woman.
01:44:44.000 His wife wasn't American.
01:44:46.000 One thing happened one time and you're mad about it.
01:44:48.000 It was her shot.
01:44:48.000 It's like Biden was president one time, Tim.
01:44:51.000 What are you so mad about?
01:44:52.000 It was only one time that he was president.
01:44:54.000 No, you're talking about four years.
01:44:55.000 You're having an irrational reaction to an individual who did a bad interview one time.
01:45:00.000 And that's why I preempted it with, maybe I'm speaking out of turn, but that's the vibe I get, is that she's like World Economic Forum material.
01:45:07.000 What I don't like is conclusions drawn without evidence.
01:45:09.000 She didn't come from hard times.
01:45:10.000 She was born into money.
01:45:11.000 Regardless.
01:45:11.000 She strikes me with... I've never seen her... If you can state your case where it's like calmly and dispassionately... She's emotionally unstable.
01:45:18.000 She cried to get Trump to fire missiles into Syria, which he did.
01:45:22.000 White women's tears are political.
01:45:23.000 That's Robin DiAngelo.
01:45:25.000 Chapter 11.
01:45:26.000 I don't trust her.
01:45:27.000 I don't like her.
01:45:28.000 If I knew her, maybe I would like her personally, but she strikes me as a money.
01:45:30.000 On more than one occasion, you've been like, I have an image of someone in my mind and now I'm angry about that.
01:45:35.000 Dude, she cried and got Trump to fire missiles into Syria.
01:45:38.000 She also ducked out on her chance to be a leader when they asked her to.
01:45:40.000 I try to give Biden the benefit of the doubt or credit when he says things that are good.
01:45:44.000 Like I'll say when he talks about securing the border, like, well, I got to say that's a good thing for him to say.
01:45:48.000 I don't trust him.
01:45:49.000 Why?
01:45:49.000 Because of his actions.
01:45:50.000 But, and I think he's a bad person.
01:45:52.000 Why?
01:45:52.000 Because of his criminal actions in Ukraine.
01:45:54.000 Ivanka Trump?
01:45:55.000 I don't think we have enough information because she wasn't that publicly out there to do anything to generate love or hate.
01:46:01.000 The same is true with, like, people criticizing Jen Psaki when they were, like, ragging on her all the time.
01:46:04.000 I'm not a fan of that.
01:46:05.000 Because I'm like, dude, she's just a press secretary.
01:46:07.000 Other than that, I don't think she's anybody significant.
01:46:10.000 And so just like any other press secretary, you expect her to say certain things.
01:46:13.000 Now, if she's lying, I'll say, yeah, that's not true.
01:46:15.000 When Sean Spicer would say something, we even talked to him about it.
01:46:18.000 I just, I think if you're going to say I have deep criticisms of someone, it can't be like I've created an image of them in my mind and now I'm angry at them for it.
01:46:26.000 People are asking if Trump is a false deep state PSYOP or whatever the heck it is.
01:46:30.000 So to that point, a controlled opposition.
01:46:31.000 So to that point, there is a relevant thing that's of practical value that people should really have their eye on, which has nothing to do with Ivanka whatsoever, which is that regardless of if, let's say that Trump was a totally genuine actor, that he came in with the best of intentions, the best skills, it's well known that he got surrounded by the swamp, right?
01:46:49.000 How did that happen?
01:46:50.000 Well, Paul Ryan appointed the head of the PPO office, Presidential Personnel Office, most powerful position in Washington.
01:46:55.000 He saw this coming.
01:46:56.000 He took control, made sure his guy was in charge.
01:46:59.000 So the 5,000 political appointees that Donald Trump could have made, and this was the whole thing, you know, how he's firing everybody and new people, Those were actually largely appointed by people who were recommended or just directly by the PPO that was under swamp control.
01:47:15.000 And so we got surrounded with the wrong people.
01:47:16.000 So how is this an actionable point?
01:47:18.000 Maybe Trump's not controlled opposition, but he got surrounded by control.
01:47:21.000 Maybe he wasn't then, but is now.
01:47:23.000 I don't know.
01:47:23.000 I'm not saying one way or the other.
01:47:24.000 But let's say that he was not controlled opposition.
01:47:26.000 He got surrounded by swamp control.
01:47:28.000 And so what that tells us is we can't go back to 2016-17 and fix that.
01:47:33.000 But what we can do is make damn sure that if we put in somebody who's actually got, you know, the Constitution first in whatever office, whether it's a senator with his staffers or a congressman with his staffers, whether it's president, whether it's a governor, whether it's even a mayor, that people are doing some good vetting to make sure because I can damn well guarantee you that the people who run the so-called regime with a capital R are making sure that they can get personnel around these people to make sure that they're ineffective if they are not controlled opposition.
01:48:01.000 So a practical point is, if you think we're looking at, you know, a red wave maybe in 2022 this fall, people need to be thinking about who those staffing appointments are and how those staffing appointments are made, and making sure that vetted people are going into those positions.
01:48:13.000 That's a practical point to kind of come out of whatever Pretty Blonde Girl is.
01:48:18.000 All right, Howard says, cyber pandemic, March 17th, 2022, give or take a few days.
01:48:23.000 Thanks who CIA Schwab Tim won't see this coming.
01:48:26.000 I don't know what that last part means, but I guess making a prediction about a cyber pandemic.
01:48:30.000 I mean, Klaus Schwab talks about this a lot in the last couple of years.
01:48:34.000 He's like, if you think that COVID-19 has been disruptive, the cyber pandemic will be 10 times as disruptive.
01:48:40.000 What does that even mean though?
01:48:42.000 Probably massive, massive hacks and ransomware and, you know, breaking into, say, government infrastructure, say, things that run power plants or whatever.
01:48:51.000 A gigantic outbreak, probably from Russia, or at least that's what they'll say, of huge amounts of cyber warfare against critical infrastructure and even individuals throughout probably, I would bet, the Western world and not China, just as a guess.
01:49:11.000 So he's been warning about this enough to where one should suspect that he's not warning about this because it might happen and he's ahead of the curve, but because it might happen because he is the curve.
01:49:25.000 And I don't know what you do to prepare for this.
01:49:28.000 I don't know how realistic it is, but it's something that Schwab has telegraphed.
01:49:32.000 Um, dozens of times in the past year or two.
01:49:35.000 Seamus suggested earlier, as we were talking, how do we prepare for something that may or may not happen?
01:49:39.000 Download your bank records.
01:49:40.000 Copies of the last three months of your bank records in case the bank, the electricity goes out and you need to, yeah, you need to contact your bank and be like, I have proof that this is my money.
01:49:46.000 This is how much I had.
01:49:47.000 Yeah, that's a good idea.
01:49:48.000 So I would go if you can.
01:49:49.000 No, no, no, no, no.
01:49:51.000 That's ridiculous.
01:49:52.000 What you want to do is you want to take out all of your money, put it in a briefcase under your mattress.
01:49:56.000 And I'm kidding.
01:49:56.000 Please don't do that.
01:49:59.000 I think it would be good to have some cash on hand.
01:50:03.000 I think it would be good for people to have some cash on hand.
01:50:05.000 I would argue for that.
01:50:05.000 In case something does happen to the banks temporarily.
01:50:07.000 If the internet cuts out, you know what the most worthless thing is going to be?
01:50:12.000 Gold and silver.
01:50:14.000 Like, gold and silver is valuable, so long as there is still social cohesion.
01:50:18.000 What does this look like?
01:50:19.000 How is Bitcoin better, then, if you don't have a computer?
01:50:21.000 It's not.
01:50:21.000 Who said it was?
01:50:22.000 Well, I don't know.
01:50:23.000 I assume that's what it's going to be.
01:50:24.000 I didn't even say the word Bitcoin.
01:50:25.000 I'm just curious.
01:50:26.000 The one thing that's going to be totally worthless, if there's no internet, is going to be your idea of currency.
01:50:31.000 If, like, people might still value hard cash, they might still value gold and silver, but if the economy is truly disrupted in a it-hits-the-fan moment, food and water are going to be the most valuable things.
01:50:42.000 Actually, I've also read coffee.
01:50:44.000 Food, water, coffee.
01:50:46.000 You know, bullets probably, honestly.
01:50:48.000 Bullets will be very valuable.
01:50:49.000 When the coffee runs out.
01:50:51.000 Gold and silver are good because we don't think we're going to be living in Fallout, the Fallout universe.
01:50:57.000 Mall sword might be worth an entire dollar.
01:50:59.000 Like two sticks of gum.
01:51:00.000 No, no, no, no.
01:51:01.000 Hold on there a minute, man.
01:51:02.000 Like a mall sword in a full-blown apocalypse will, like, aluminum used to be the most valuable metal.
01:51:10.000 Yeah.
01:51:10.000 Or as the British say, aluminium.
01:51:11.000 I think this is steel.
01:51:12.000 Like it's pretty, it's got some weight to it.
01:51:14.000 It costs like 10 bucks, dude.
01:51:15.000 Steel's not expensive.
01:51:17.000 No?
01:51:18.000 But think about who can make that right now in the middle of where you live.
01:51:22.000 You know, this is something that's made in a factory somewhere, they mass produce it, and it gets shipped out.
01:51:26.000 So the point is, gold, like, when we buy, so I have gold, I have silver, you know, a little bit, and, you know, Bitcoin and crypto.
01:51:34.000 That's me basically saying, I think society will continue.
01:51:36.000 Yeah.
01:51:37.000 Now, when I'm saying society won't continue, it's when I buy, like, a ton of water, or like a solar panel or something.
01:51:43.000 That's like, Yikes.
01:51:44.000 But the thing is, even so, like water, ammunition, solar panels, emergency food, those things will still be valuable to you if society stays afloat as well.
01:51:52.000 Soap, too.
01:51:53.000 Yeah.
01:51:53.000 Soap is super, super valuable.
01:51:54.000 Let's read some more.
01:51:55.000 We got, uh, Sadistic Atheist says, Have you ever watched Derren Brown, a real mentalist, showing how he programs celebrities and the general public?
01:52:02.000 No hypnosis required.
01:52:04.000 I am very much aware of Derren Brown.
01:52:06.000 One of my favorite things he ever did was he took a wallet, put it on the ground in the middle of a street, a busy street, like downtown in some city, and drew a yellow circle around it, and then walked away, and they put a time-lapse camera on it and watched, and no one touched it.
01:52:19.000 Just in the middle of the sidewalk.
01:52:21.000 But because there was a ring around it, people assumed something was happening and they didn't want to touch it.
01:52:25.000 That's like ants.
01:52:26.000 It was supposed to be there.
01:52:27.000 Yeah, that's interesting.
01:52:28.000 That's really interesting.
01:52:29.000 Yeah, I'm familiar with Derren Brown, too.
01:52:30.000 He does have some things that I question, where he, like, I think this is him, he grabs a woman and says, like, stuck, and then she can't move, and I'm like, come on.
01:52:38.000 It's not real.
01:52:39.000 But there are a lot of things. I can tell you this, having worked in nonprofit fundraising and having been friends with tons of, you know, hackers, social engineers: you would be surprised how easy it is to control people's behaviors. And so what they try and do with these nonprofits is they try to cultivate a basic set of manipulative skills.
01:52:58.000 It doesn't work for most people. And then there are some people who naturally have these skills and
01:53:03.000 they can, you know, execute them very, very easily. So those people you see on the street waving
01:53:07.000 to you like, hey, come and talk for a minute. Some people naturally behave in such a way that is
01:53:12.000 commanding. So I'll tell you two really fascinating things. When it comes to hiring 50 people,
01:53:19.000 there was always a guarantee.
01:53:22.000 One characteristic of a man and one characteristic of a woman that would guarantee they would be hired.
01:53:26.000 And you know what it was for men?
01:53:28.000 Height, maybe?
01:53:28.000 Height.
01:53:29.000 Yeah.
01:53:29.000 And do you know what it was for women?
01:53:32.000 Waifu.
01:53:32.000 How fast they walked.
01:53:33.000 Attractiveness?
01:53:33.000 Nope.
01:53:34.000 You're close, Seamus.
01:53:35.000 Boobs.
01:53:36.000 Hey!
01:53:36.000 And there it is.
01:53:38.000 The women with larger breasts tended to be able to fundraise really well, and the men who were taller tended to be able to fundraise really well.
01:53:45.000 These are like instinctual, secondary, you know, sexual drives that humans have.
01:53:50.000 The tall men are commanding, people naturally, you know, and deep voices.
01:53:55.000 And then the women who were, you know, busty or attractive tended to do very, very well.
01:54:01.000 Then there were the anomalous outliers.
01:54:03.000 So you'd always find like a weaselly little guy, but he was a fast talker, and he could convince anybody of anything.
01:54:08.000 Honestly, gang, okay, so let me explain to you why you should go to birch.gold.com and purchase everything.
01:54:13.000 Is that something he says?
01:54:14.000 Yeah, yeah, Birch Gold, right?
01:54:15.000 Aren't they one of the people who sponsors him?
01:54:17.000 Well, they just got a free shout-out.
01:54:19.000 But no, like, I saw a lot of this.
01:54:22.000 Guys who are like 5'5", they talk really, really fast.
01:54:24.000 I gotta tell you, man, if you wanna get the job done, you gotta come to me and I'll tell you, put your credit card right now.
01:54:28.000 Now we're saving the trees.
01:54:28.000 Wanna save the trees?
01:54:29.000 We're gonna save the trees.
01:54:29.000 Let's get it done.
01:54:30.000 And those guys, people would just be like, yes sir, okay, whoa, I don't even know what's going on.
01:54:35.000 And before they realize it, they hand over their credit card.
01:54:37.000 Take my mall sword.
01:54:38.000 The crazy thing is, The tall guys, like I knew a guy who was like 6'5", dumb as a box of rocks.
01:54:44.000 He would just be like, listen, uh, it's like, we gotta help families.
01:54:49.000 And she's like, have my babies.
01:54:50.000 And like, we're helping families.
01:54:53.000 And they would be like, okay.
01:54:54.000 And they'd pop their credit cards and I'd just be like, what?
01:54:56.000 He didn't even say anything to them!
01:54:58.000 But humans are very much driven by... Is there any data about people wearing shirts that say can't?
01:55:05.000 I don't know, but... They don't inspire a can-do attitude.
01:55:08.000 Some of these things should be obvious to people.
01:55:09.000 Bubbliness.
01:55:12.000 They try to train people.
01:55:13.000 Keep your arms away from your chest so you're open and welcoming.
01:55:17.000 Exposing your vulnerable soft underbelly is a sign of trust.
01:55:21.000 Keep your legs spaced apart and be up around the balls of your feet and be bubbly and upbeat and never be sad.
01:55:27.000 Never be angry.
01:55:28.000 These are the things that they would try and train people for.
01:55:30.000 Spread your legs, balls out, that's all I heard.
01:55:32.000 Balls out!
01:55:34.000 Yeah, because the idea is, if you're covering your chest, you're saying, I don't trust you, I fear you, and that puts them in a sense of alarm.
01:55:42.000 So all of these techniques they train on are like base instinct, not even about the words you say.
01:55:47.000 In fact, like I mentioned, the tall guy, he would barely say words and still convince people to just give him money.
01:55:52.000 So, when it comes to programming humans, we've got base code, man.
01:55:56.000 We've got an underlying BIOS that is easy to exploit.
01:55:59.000 And it's kind of scary.
01:56:00.000 I love when people tell me it's not possible because I'm like, bro, marketing exists.
01:56:04.000 They know how you think.
01:56:07.000 All right, let's grab some more.
01:56:08.000 What is this one, Marx?
01:56:09.000 Josephine Whitaker says, somewhere I heard that when Marx died, his wife said, if only he spent his time making capital instead of writing about it.
01:56:17.000 Is that true?
01:56:18.000 I don't know.
01:56:18.000 That's a good one though.
01:56:20.000 It's funny.
01:56:21.000 Um, I mean, that's, that's the necessary burn.
01:56:24.000 I mean, I know that, that she actually, I mean, I've read, and there's a wonderful book called, uh, The Devil and Karl Marx by Paul Kengor.
01:56:33.000 Um, and he, I think I said his last name, right?
01:56:35.000 And anyway, uh, that sentiment is actually documented.
01:56:39.000 I don't know if that's the timing, but that sentence, that sentiment certainly was expressed.
01:56:45.000 I remember reading it.
01:56:46.000 We have a correction here from Kurt.
01:56:47.000 He says, Tim, the experience for Replika is for the AI, not you.
01:56:52.000 The bot starts off really dumb, but the more you level it, the smarter it gets, and it's free to start.
01:56:58.000 Oh, well, there you go.
01:56:59.000 So train your waifu right, your AI Replika or whatever.
01:57:04.000 And she'll be reading Nietzsche or something to you.
01:57:08.000 What if this is really what it is, though?
01:57:09.000 The people who are using this, maybe the idea is, look, people are lonely.
01:57:13.000 Let's give them this AI chat bot.
01:57:14.000 But if people keep communicating, it will learn.
01:57:17.000 They'll take the data.
01:57:18.000 What was that thing that was on Twitter that turned into a Nazi in like two hours or whatever?
01:57:22.000 Oh my gosh, that's right.
01:57:23.000 Microsoft released a bot?
01:57:24.000 That was hilarious.
01:57:24.000 What was that called?
01:57:25.000 Yeah.
01:57:26.000 I remember it got radicalized, though.
01:57:28.000 It was all the way.
01:57:29.000 See, this is the thing about Iron Man, Avengers, Age of Ultron.
01:57:36.000 If they were going to build Ultron, this AI, they would put it in a virtual space to see what would happen first, and that's what happened.
01:57:42.000 It became a Nazi right away.
01:57:43.000 The chatbot was called Tay.
01:57:45.000 T-A-Y.
01:57:45.000 That's right, that's right.
01:57:46.000 It became a Nazi.
01:57:48.000 It instantaneously became a Democrat staffer pretending to carry a tiki torch outside of a Glenn Youngkin van in like one hour.
01:57:55.000 That's incredible.
01:57:56.000 That's CBS News.
01:57:59.000 All right.
01:58:00.000 Victor says, VTubers are actual people.
01:58:02.000 They use 3D modeling programs to track their face and body instead of using their actual face.
01:58:06.000 This highlights an even bigger issue with identity in that some believe they are their avatar.
01:58:13.000 So they are real people, but they're like technically just puppets.
01:58:15.000 I did.
01:58:16.000 So sometimes, we've referenced this before, the conspiracy theory pyramid video I did where it's just my cartoon character talking about it.
01:58:22.000 People were commenting, joking, like, Seamus is a VTuber now.
01:58:26.000 Dr. Rollergator is a real gator that wears roller skates.
01:58:29.000 People don't realize this, but half the episodes of IRL that Seamus has been on, he's actually been a marionette.
01:58:35.000 Very good animator.
01:58:36.000 I'm very good at it.
01:58:38.000 Alright, let's grab some more Super Chats here and see what y'all are interested in.
01:58:47.000 Okay.
01:58:48.000 Daniel Chrisman says Marx wrote his manifesto in 1848.
01:58:51.000 Engels trained three major Civil War generals.
01:58:55.000 Carl Schurz, Franz Sigel, and August Willich.
01:58:59.000 Marx was the worldwide media correspondent during the Civil War.
01:59:03.000 The Civil War was a communist coup.
01:59:06.000 I don't know about all that.
01:59:07.000 A lot of information.
01:59:08.000 I know that he was in touch with Lincoln.
01:59:10.000 I don't know more than that on this subject.
01:59:12.000 Interesting.
01:59:13.000 I've heard that.
01:59:14.000 MightyDorks says, Hey Tim, did you hear Joe say it was weird saying second gentleman?
01:59:18.000 Is that what he said?
01:59:19.000 I don't know.
01:59:19.000 Did he say it was weird?
01:59:20.000 I heard him saying second gentleman.
01:59:22.000 Yeah.
01:59:23.000 I didn't hear him say it was weird.
01:59:23.000 Oh, I thought it was weird though.
01:59:27.000 Is that Joe?
01:59:28.000 Biden.
01:59:29.000 Joe Biden!
01:59:30.000 Sorry, I took Kamala.
01:59:32.000 Yeah, her husband.
01:59:32.000 Dr. Jill.
01:59:34.000 Elijah Zepeda says, I don't think you guys understand how many H-games there are on the internet.
01:59:38.000 They all have Patreon pages and make tons of money.
01:59:41.000 Just look how much Summertime Saga makes a month.
01:59:44.000 What is an H-game?
01:59:46.000 Hentai?
01:59:46.000 Is that what the H for?
01:59:47.000 Oh, is that what it is?
01:59:48.000 Waifus?
01:59:49.000 I don't know.
01:59:50.000 Waifu games.
01:59:50.000 I'm in my 40s.
01:59:51.000 I don't know what any of this shit is.
01:59:53.000 I mean, I don't know what it was.
01:59:54.000 Maybe you're right.
01:59:55.000 I don't know.
01:59:55.000 I'm just guessing.
01:59:56.000 It's like, what's an H game?
01:59:58.000 I'm gonna look it up.
01:59:59.000 Inform me.
01:59:59.000 It's something I probably didn't want to know exists.
02:00:01.000 Exactly.
02:00:04.000 Okay.
02:00:05.000 I will chop it down with my mall sword.
02:00:07.000 Yes.
02:00:10.000 All right.
02:00:12.000 Let's... Yeah, it's basically hentai.
02:00:14.000 Gross.
02:00:15.000 Bang.
02:00:16.000 Nailed it.
02:00:17.000 Jeff Jones says, first time donator.
02:00:19.000 Love you guys.
02:00:20.000 Shout out to Ian for taking confrontation so well with Tim.
02:00:23.000 I know when I am put on the spot, I get even more pissed.
02:00:25.000 Ha ha ha.
02:00:26.000 Gotta learn to enjoy being wrong.
02:00:28.000 I learned early on when we used to play video games with my friends that if we would play, like, you'd win and then the loser would have to pass the controller, so the winner would stay forever.
02:00:35.000 And I was like, why don't we do it where you play two and you pass it regardless of if you win or lose?
02:00:39.000 So everyone started to get equal amounts of play time, and I realized it doesn't matter if you win or lose, man.
02:00:43.000 We're just playing to have fun.
02:00:46.000 Let's see.
02:00:46.000 Turk Longwall says, how did coffee become so addictive?
02:00:49.000 It's top five.
02:00:51.000 I have no idea.
02:00:51.000 Coffee?
02:00:52.000 It's delicious.
02:00:53.000 Because caffeine is an adenosine receptor antagonist, and adenosine makes you feel very uncomfortable when it gets free, and so when you can antagonize the receptors, you can make yourself feel good all the time, in the famous words of Kramer.
02:01:08.000 Then you start to grow more receptors.
02:01:13.000 Let's see, Eriitse says, when will you have Alex Jones on again?
02:01:17.000 He's been on my mind.
02:01:19.000 Anytime?
02:01:20.000 Anytime.
02:01:21.000 What if I am Alex Jones in a James Lindsay suit?
02:01:23.000 Under the cat shirt?
02:01:25.000 They would love to say that about you.
02:01:26.000 Well, you see, if you took a whole lot of his supplements for like brain, no.
02:01:30.000 He evolved.
02:01:32.000 So, the thing about Alex is that, you know, like, we went down to Austin.
02:01:37.000 I was just like, I don't want to have him on literally every other day.
02:01:40.000 You know, it's got to be special.
02:01:41.000 Make him a guest.
02:01:42.000 Make him a co-host.
02:01:43.000 Yeah, it's like, he's just here now.
02:01:45.000 He lives here.
02:01:46.000 But I'd love to have Alex Jones on anytime.
02:01:48.000 And I think the most important thing is finding good guests to have on with Alex.
02:01:52.000 And also, it's like, we've had requests to host conservatives with Vaush.
02:01:56.000 And I'm like, look, we've had the guy on a couple times.
02:02:00.000 I don't want to act like we are somehow able to connect people and make these things happen.
02:02:03.000 And also, I'd love to have other left personalities and leftist personalities on with conservatives or moderates.
02:02:09.000 I'm not just going to be like, there's one leftist who came on the show twice.
02:02:14.000 There's certainly other people we could bring on.
02:02:16.000 And same goes for Alex Jones.
02:02:17.000 You know, there's probably other people we could bring on too, and so we try to just, you know, get a diverse, eclectic group.
02:02:21.000 So, I'll leave it there.
02:02:23.000 Thanks for hanging out, everybody.
02:02:24.000 It's been a big blast.
02:02:25.000 Go to TimCast.com, be a member if you want to keep supporting our work, and check out our huge library of members-only segments.
02:02:30.000 You can watch the stuff from Steve Bannon.
02:02:31.000 We got Alex Jones episodes.
02:02:33.000 We got The Green Room, where you can watch... I think we have one up today.
02:02:37.000 You can see behind the scenes stuff.
02:02:38.000 We had a really, really great green room with Maajid Nawaz, because he's downstairs talking about a whole bunch of stuff for like 40 minutes.
02:02:43.000 So it's basically a whole other podcast.
02:02:45.000 So go to TimCast.com, but don't forget to smash that like button, subscribe to this channel.
02:02:48.000 You can follow the show on Instagram at TimCast where we post clips.
02:02:51.000 You can follow me at TimCast for basically shenanigans.
02:02:54.000 Twitter just rolled out super follows.
02:02:56.000 Oh, yeah.
02:02:57.000 And so, uh, they gave it to me.
02:02:59.000 And this basically means that you can sign up on Twitter, not saying you should, I'm just saying I'm absolutely going to be posting things, but it's mostly going to be drama and nonsense, something you wouldn't get anywhere else.
02:03:10.000 So today I made a post about how we actually have a plan to clone our rooster, and I'm pretty sure I can pull it off.
02:03:17.000 If you want to find out how, that's the kind of shenanigans you'll get on Twitter with super followers, or just literally ignore it.
02:03:22.000 Who cares?
02:03:23.000 You can come to the substantive content that actually matters, or you can, you know, whatever.
02:03:28.000 I figured I'd set it up.
02:03:30.000 James, do you want to shout anything out?
02:03:31.000 Yeah, I mean, I got the new book, Race Marxism.
02:03:34.000 If you can't find it on the Amazon for yourself, you can go to racemarxism.com.
02:03:39.000 Website is newdiscourses.com.
02:03:41.000 The podcast there is the New Discourses podcast.
02:03:44.000 If you think that I sounded kind of smart and know what I'm talking about with this mouse stuff, I've got tons, hours and hours and hours of deep dives into this literature, whether it's critical race theory, Marxism, neo-Marxism, postmodernism, whatever, check it out.
02:03:56.000 You follow me on social media at ConceptualJames, where people are probably pissed off at me on the internet, and I am probably laughing about that fact.
02:04:07.000 Lovely.
02:04:07.000 I'm Seamus Coghlan of Freedom Tunes.
02:04:09.000 We upload a new political cartoon every single Thursday.
02:04:13.000 We just uploaded one about Biden's State of the Union.
02:04:14.000 I think you guys will really enjoy it if you check it out, and we're going to be doing one I mentioned earlier about the diversity training requirements for the military, and you can check me out there.
02:04:22.000 I love you all.
02:04:23.000 You're talking about Biden's campaign speech?
02:04:25.000 A little bit, yeah.
02:04:25.000 Oh, they called it the State of the Union.
02:04:26.000 Yeah, they called it the State of the Union, even though he didn't really tell us much about the state the union was in.
02:04:30.000 Yeah, I just found that out, too.
02:04:31.000 James, I'm looking forward to when you come back.
02:04:33.000 I want to talk about Hegel's mixing of self and other to create God and then pushing that on society.
02:04:38.000 I thought that Plato said, like, if you don't take interest in politics, politics takes an interest in you, but maybe people have gone too far.
02:04:44.000 Maybe that can be something we look into in the future and talk more about.
02:04:47.000 Thanks for coming, man.
02:04:48.000 Yeah, man, awesome.
02:04:49.000 Follow me at iancrossland.net.
02:04:51.000 Peace!
02:04:52.000 I always enjoy having James.
02:04:53.000 He mixes a mean aviation, which is my new favorite cocktail, it turns out.
02:04:57.000 Thank you very much for making me a drink that was the color of your sweatshirt.
02:04:59.000 You guys may follow me on Twitter and Minds.com at Sour Patch Lids.
02:05:04.000 Thanks for hanging out, everybody.
02:05:05.000 And I guess we'll be back Monday.
02:05:07.000 We'll have more shows.
02:05:08.000 We'll have clips, of course, up throughout the week.
02:05:10.000 So if you subscribe to this channel, we've got clips from earlier in the week that are segments.
02:05:14.000 You'll see them here.
02:05:15.000 And other than that, we'll see you on the next show on Monday.
02:05:17.000 Thanks for hanging out.