Timcast IRL - Tim Pool - March 13, 2021


Timcast IRL - Minneapolis To Fund Antifa Autonomous Zone With $500k w/ James Lindsay


Episode Stats

Length

2 hours and 21 minutes

Words per Minute

206.44173

Word Count

29,270

Sentence Count

2,415

Misogynist Sentences

24

Hate Speech Sentences

44


Summary

In this episode, we discuss the city of Minneapolis's settlement with George Floyd's family, the new anti-free speech law in Scotland, and the push to cancel Eminem. We're joined by James Lindsay, a leading critic of critical race theory, to discuss all of this and more.


Transcript

00:00:38.000 How's it going ladies and gentlemen?
00:00:40.000 Welcome to Timcast's IRL Podcast.
00:00:42.000 If you're not a subscriber, please smash that subscribe button and that like button and that notification bell so you can make sure you stay tuned to the shows.
00:00:50.000 We got a bunch of interesting stories and topics to talk about today.
00:00:53.000 The big story that's just breaking now is that amid the ongoing Derek Chauvin trial, Minneapolis has agreed to settle with the family of George Floyd for a historic $27 million in a wrongful death suit.
00:01:08.000 There's a lot of questions around why they would do this and how this is going to impact the trial of this officer and whether or not he's going to be able to get a fair trial now that the city's basically admitting fault.
00:01:18.000 I think ultimately it results in riots. A couple other stories though. Scotland,
00:01:23.000 my friends over in Scotland, Count Dankula, man they got it bad. A new anti-free speech law is
00:01:28.000 coming into effect. There's some news about Cuomo. They're apparently trying to cancel Eminem because
00:01:34.000 you know Eminem is edgy and I guess racist and homophobic and transphobic and bigoted and
00:01:39.000 all those really awful things.
00:01:41.000 And these young people are now realizing the kind of music this guy was making.
00:01:45.000 So he made a rebuttal song.
00:01:47.000 I don't think the song is that good, but we'll certainly talk about it.
00:01:50.000 And to assist us in navigating the world of woke, we have one of the preeminent scholars engaging in criticizing critical race theory, wokeness.
00:01:59.000 We have James Lindsay.
00:02:00.000 Hey Tim.
00:02:02.000 Yeah.
00:02:03.000 Do you want to introduce yourself, explain what you do a little bit?
00:02:06.000 I wrote about woke.
00:02:08.000 So, I mean, you did a great job.
00:02:09.000 You called me a preeminent scholar.
00:02:10.000 What am I going to do better than that?
00:02:11.000 I like to tell people that I'm one of the best, I'm one of the leading scholars on critical race theory, et cetera, critical theory, among people who don't believe it.
00:02:19.000 And so there are probably people who are advocates and adherents who are in the faith who know it more deeply and more thoroughly than I do.
00:02:28.000 Maybe not.
00:02:29.000 But I don't believe it.
00:02:30.000 I think it's wrong.
00:02:30.000 I think it's actually racism reinvented, and go into as much depth as you want.
00:02:35.000 It's just racism.
00:02:36.000 So that's what I do.
00:02:37.000 I have a website, New Discourses.
00:02:39.000 It's newdiscourses.com, and I try to put up podcasts, videos, and written material.
00:02:46.000 I'm writing an encyclopedia explaining how the woke talk, how they misuse words.
00:02:51.000 And that's about it.
00:02:53.000 It's all I do these days.
00:02:55.000 And you're fairly pessimistic, I guess, huh?
00:02:57.000 I'm a little pessimistic right now.
00:02:58.000 Yeah.
00:02:58.000 Yeah.
00:02:59.000 I'm not.
00:02:59.000 I mean, very. It's like, so black pill is the thing.
00:03:01.000 Right.
00:03:02.000 So people don't know what a black pill is.
00:03:03.000 You got like all these pills because the friggin' Matrix came out and now everything's a pill.
00:03:08.000 Yeah.
00:03:09.000 Right.
00:03:09.000 Orange pills.
00:03:10.000 The Bitcoin one.
00:03:11.000 Is that Max Keiser's podcast?
00:03:12.000 Yeah.
00:03:12.000 I didn't even know about that pill.
00:03:13.000 There's too many pills.
00:03:16.000 I know about the blue pill.
00:03:17.000 The blue pill means you want to stay in the Matrix.
00:03:19.000 I know about the red pill and that means you think that the media is lying and you want out of the Matrix.
00:03:22.000 And then there's a clear pill that's like, I don't care about any of this.
00:03:27.000 It's when George Carlin, back in the day, you remember George Carlin had that thing where he was like, I don't care who wins, I think it's all crap.
00:03:34.000 You know, I have no stake in the outcome any longer.
00:03:36.000 I'm just going to write jokes.
00:03:37.000 And he's like near the end of his life and he's just going to have his old man ranting.
00:03:39.000 That's that's going clear pill.
00:03:42.000 Like you don't you have no stake in the outcome anymore.
00:03:44.000 Just watching.
00:03:45.000 And you just go clear for yourself.
00:03:47.000 Then there's black pill, which means it's over.
00:03:50.000 It's over.
00:03:50.000 It's over.
00:03:51.000 Depressed.
00:03:51.000 That's it.
00:03:52.000 Despair.
00:03:52.000 And then there's white pill where all of a sudden you see hope.
00:03:55.000 And I guess there's orange pill.
00:03:56.000 That's racist, bro.
00:03:57.000 Oh, I didn't write it.
00:03:59.000 Racism.
00:04:00.000 I mean, a lot of pills are white, but some of them are pink and some of them are blue.
00:04:04.000 Yeah, this is like, it's the weird world where colors all of a sudden represent race in any context.
00:04:09.000 So like, you know, Ian and I and a couple other people in the house, we all play Magic the Gathering.
00:04:13.000 It's a card game.
00:04:14.000 And they have colors.
00:04:16.000 It's like each card can have a different color.
00:04:18.000 And so because...
00:04:22.000 Because they use the colors white and black, there are certain cards they've deemed racist for simply using the mechanic of black magic.
00:04:29.000 Like this fantasy idea.
00:04:32.000 That's the depravity of wokeness.
00:04:34.000 Well, that's in their literature.
00:04:36.000 That's actually in their scholarly literature.
00:04:38.000 That's in the books.
00:04:39.000 They're literally in their scholarly books that they write.
00:04:44.000 The association with black magic being bad and white magic being good.
00:04:50.000 But you know what it really comes from?
00:04:54.000 Night and day.
00:04:55.000 That was literally what it was.
00:04:57.000 It was light and dark.
00:04:57.000 It was at night.
00:04:58.000 Things were dying.
00:04:59.000 It was dangerous.
00:04:59.000 There were predators.
00:05:00.000 In the winter, it got darker and everything died and it was scary and people got scared.
00:05:04.000 And then when the sun was out, we were warm and we were safe.
00:05:07.000 And so we just created this idea of light and dark, good and bad.
00:05:09.000 It had nothing to do with the color of someone's skin.
00:05:12.000 Now they've made it that way.
00:05:13.000 Yeah, yin and yang in Chinese, or as those gringos would say, yin and yang.
00:05:19.000 Yin is the black one, everybody, and yang is the white one.
00:05:23.000 And it's the creative and the receptive, if you go into the kind of Taoist cosmology.
00:05:28.000 Super racist, though, because Asians are like more racist now, I guess.
00:05:31.000 Asians are more white than white people, I guess.
00:05:33.000 Yeah, they were white-adjacent for a while, and they're also a model minority, and now, though, they're just white, but then they're super white, because they get to claim that they're also a minority, so they get to hide from being white.
00:05:43.000 So they're, like, extra super white.
00:05:47.000 It's, like, super straight, but super white.
00:05:49.000 But instead of being, like, good, it's bad.
00:05:51.000 It's all upside down.
00:05:53.000 It's all clown world.
00:05:54.000 We'll get into this stuff, too.
00:05:55.000 We've got Ian.
00:05:55.000 He's chillin'.
00:05:56.000 Oh, hey, everybody.
00:05:56.000 Ian Crossland.
00:05:57.000 IanCrossland.net.
00:05:58.000 And I just want to point out that black and white are not colors.
00:06:00.000 They are shades.
00:06:01.000 Oh, yeah.
00:06:01.000 That is true.
00:06:03.000 It's technical for a minute.
00:06:04.000 They are shades and tints or something like that.
00:06:06.000 Yeah, yeah.
00:06:06.000 And then there's me in the corner pushing buttons correctly this time.
00:06:10.000 And I have no say in this color argument, so I'll turn it back to Tim.
00:06:13.000 Before we get started, go to TimCast.com and sign up to become a member to get exclusive access to members-only podcast episodes and segments.
00:06:21.000 And I don't know if y'all are into this stuff, but the other day with Kim Iversen, she was talking about Destiny cards, and apparently I'm the Ace of Spades.
00:06:30.000 And as far as I'm concerned, that's like the best card.
00:06:32.000 And so naturally, I was like, tell me more, madame.
00:06:35.000 And we did like a 40 something minute segment where she broke down what destiny cards are.
00:06:39.000 It's a kind of astrology.
00:06:41.000 And I'm not, I'm not a big, you know, believer in any of this stuff, but she talked about how she predicted Donald Trump would be defeated, but not feel defeated and didn't understand what it meant until it happened.
00:06:51.000 I was like, now I get it.
00:06:53.000 So it's, it's interesting stuff.
00:06:54.000 But of course we also have Scott Presler who was talking about primarying these America Last politicians.
00:06:59.000 So if that's more your cup of tea, become a member, and don't forget to smash that like button, subscribe to the notification bell.
00:07:05.000 Let's check out this first story, and then we'll just get into this stuff, because this is from CBS News.
00:07:10.000 Minneapolis approves historic $27 million settlement with George Floyd's family.
00:07:16.000 They say the city council voted 13 to 0 to approve the settlement which directs half a million dollars to be used to benefit the George Floyd memorial site at 38th and Chicago.
00:07:28.000 Do you know what that means?
00:07:32.000 I'm gonna build the suspense on y'all for a minute.
00:07:34.000 Thirteen to zero, really.
00:07:35.000 This past weekend, the George Floyd Memorial site, which is an autonomous zone, someone was shot and killed.
00:07:41.000 The city has just announced they are going to fund this Antifa autonomous zone to the tune of half a million dollars.
00:07:48.000 That's the degree to which this insanity... I'm sorry, James, you may say that you're a little pessimistic, but boy, am I getting pessimistic on this one.
00:07:54.000 Twenty-seven million dollars in a civil settlement to the family.
00:07:57.000 Look, I'm upset the guy died.
00:07:59.000 I don't like it when anybody dies.
00:08:00.000 Twenty-seven million.
00:08:01.000 Historic.
00:08:01.000 Why?
00:08:03.000 Because the terror worked.
00:08:04.000 That's right.
00:08:04.000 The burning down the cities, it worked.
00:08:07.000 That's right.
00:08:07.000 So what are you gonna see?
00:08:09.000 More fire.
00:08:10.000 Yeah, they're giving half a million dollars to these people.
00:08:13.000 Now the crazy thing too is, Chauvin is still on trial.
00:08:17.000 And so you have these jurors who are supposed to be coming in and being asked if they can remain impartial, and now they're gonna be told, you know, in the civil case, they've already won.
00:08:26.000 Clearly he must be guilty of something the city's agreed to settle.
00:08:29.000 How can this guy get a fair trial?
00:08:31.000 O.J.
00:08:31.000 Simpson lost a civil case one- After, though, wasn't it?
00:08:34.000 I think the civil case was after the criminal case, though, in his case.
00:08:37.000 It's funny phrasing.
00:08:39.000 I don't know the history of civil case, criminal case, the adjunctives of it.
00:08:43.000 Is there a precedent for doing a civil case first, finding them?
00:08:47.000 I'm not a lawyer, I have no idea.
00:08:48.000 I do know that this was a wrongful death and that he's being tried for murder one, right?
00:08:53.000 No, Murder 2, 3, and Manslaughter.
00:08:55.000 Oh, okay. So, yeah, who knows then?
00:08:57.000 Because the city is sort of admitting fault here.
00:09:01.000 Yep.
00:09:01.000 And, uh...
00:09:03.000 I can't see that...
00:09:05.000 I don't see how you, yeah, I mean.
00:09:07.000 I mean, I guess theoretically, maybe the judge will be like, no, no, no, you've got to separate that.
00:09:11.000 And maybe when it comes to the defense, they'll say a settlement is not an admission of guilt.
00:09:18.000 The city just simply thought paying out would be less expensive than the damage.
00:09:22.000 And he might actually say the fact the city thought $27 million was cheaper than fighting the suit shows what they really feared was the riots from the extremists who are trying to destroy the city.
00:09:34.000 Well, yeah, of course.
00:09:35.000 I mean, that's what the whole gig is an extortion gig.
00:09:38.000 I get that there's like, oh, there's the passion.
00:09:40.000 People are locked down.
00:09:40.000 I don't want to like deny the, you know, what was the saying we had to endure?
00:09:45.000 It was a quote from Martin Luther King, but I felt like it was taking a bit out of context, which was that a riot is the voice of the unheard or something like that.
00:09:55.000 And I get it.
00:09:55.000 People were locked down for months.
00:09:56.000 People are like kind of out of their minds looking for something to do.
00:09:59.000 There's this flashpoint.
00:10:01.000 Everything goes crazy.
00:10:03.000 But...
00:10:06.000 It's an extortion gig.
00:10:08.000 Because the theory underlying it, if we look at critical race theory, is an extortion gig.
00:10:12.000 It works on public relations there, not riots.
00:10:14.000 But it also defends things like looting.
00:10:16.000 There was that book, In Defense of Looting, that's all rooted in critical race theory and all this crazy stuff.
00:10:20.000 That was written about Ferguson.
00:10:22.000 I was in Ferguson, and when that article came out, I threw up in my mouth a little bit.
00:10:27.000 It was like reading that was a combination of several different... What's the right word for this?
00:10:35.000 Like violations of ethics.
00:10:39.000 The first was to see the people desperately defending their town, their neighborhood from these violent rioters and looters who are exploiting them, to see the manipulation and fake news and the corruption of the media.
00:10:51.000 It was like all at once.
00:10:52.000 And I was like, oh, in Ferguson, the kids who lived there, some young men, linked arms around the convenience store and they were trying to defend the businesses from the looters who came from out of town to exploit and steal and burn things down because they thought it was funny and they didn't care.
00:11:10.000 And then along comes these ultra-woke white progressives from the suburbs who have no idea what it's like to live in poverty, cheering on the criminals who invaded this neighborhood and attacked the poor, marginalized people who lived there.
00:11:25.000 And I witnessed it.
00:11:26.000 That's in Ferguson, right?
00:11:27.000 That was in Ferguson.
00:11:27.000 OK, yeah.
00:11:28.000 So what happened in Minneapolis is, you know, the person who's now the vice president coming out and saying this needs to continue.
00:11:34.000 Let's bail them all out of jail.
00:11:35.000 Here's a link to the fund.
00:11:36.000 Kamala Harris.
00:11:37.000 Kamala Harris, yeah.
00:11:38.000 And people thought that if she got elected, this is true, that she was the tough cop, the top cop.
00:11:44.000 She was going to come and put an end to this Antifa stuff, this violence.
00:11:48.000 And I remember talking to some people I know in the Chicago suburbs.
00:11:53.000 They were talking about why they were supporting Biden.
00:11:54.000 And I was like, don't you think that under Biden, these extremists are emboldened and the riots are going to get
00:11:59.000 worse?
00:11:59.000 And there were people who said, dude, Kamala is like, was like locking up innocent people.
00:12:04.000 Pretty sure she's going to go nuts.
00:12:06.000 And I was like, dude, she tried bailing these people out.
00:12:09.000 She solicited funds to get them out of jail.
00:12:12.000 And what's happening now?
00:12:13.000 They set the Mark O. Hatfield courthouse on fire again in Portland.
00:12:18.000 That's right.
00:12:18.000 Yeah.
00:12:19.000 And speaking of Portland, you know, I remember talking to some folks that I'm friends with in Portland and their big concern going into the election.
00:12:26.000 These are people who are lifelong Democrats or progressives or leftists or whatever, as Portland would do.
00:12:31.000 And their concern was, well, our city isn't protecting us, our state isn't protecting us.
00:12:36.000 You look at Ted Wheeler, the mayor, you look at—was it Kim Brown? Is that her name? The governor
00:12:40.000 of Oregon? Maybe I've got Kim wrong, but I think that's right. And then the last line of defense,
00:12:46.000 it's like Trump. And you know, his hands are tied. He can't even send in like federal troops
00:12:51.000 or the National Guard or anything. And then their concern was, you know, if Biden and Harris win,
00:12:59.000 now every level of government, whether city, state, or federal,
00:13:03.000 is going to do nothing to protect the city of Portland.
00:13:06.000 And now, like you said, the federal courthouse was on fire with people inside, right?
00:13:10.000 It was set on fire with people inside.
00:13:12.000 Law enforcement.
00:13:13.000 And they had to rush out and try to put the fire out.
00:13:14.000 Yeah.
00:13:15.000 Do you think they should have sent the feds in?
00:15:16.000 Should Trump have sent the feds in?
00:13:19.000 I mean, it's complicated, right?
00:13:21.000 The way that Antifa works, and a lot of people don't understand this, and if you say, oh, it's Black Lives Matter when it's not Antifa.
00:13:27.000 Sometimes it is.
00:13:28.000 There's this wing of Black Lives Matter that I think is called Black Lives Matter Revolution, but we can look that up.
00:13:33.000 That's the activist wing, the paramilitary wing.
00:13:35.000 The way that they work is what's called inducing mid-level violence.
00:13:38.000 A lot of people don't understand that there are levels of violence, or how violence works, and what mid-level violence is, is like the kids, the brothers in the back of the car: I'm not touching you.
00:13:46.000 I'm not touching you. I'm not touching you. And what they want is you to either back down
00:13:50.000 and show weakness or to overreact.
00:13:53.000 And so we were referring to this over the summer as the Trump trap,
00:13:56.000 is that if he sent in federal troops, aha, he's a fascist.
00:14:00.000 He's sending in federal troops on American citizens. The narrative was there. This whole game
00:14:04.000 is driving narratives.
00:14:06.000 We see this with what's happening in Minneapolis now.
00:14:08.000 You see it there.
00:14:10.000 The entire operation of what's happening is a narrative-driven maneuver where you have a collusion between big media and the Democratic Party to seize power, and the narrative is the thing.
00:14:21.000 So the game was to put Trump in a lose-lose situation.
00:14:23.000 Oh, he's too weak.
00:14:24.000 He didn't do anything to protect anybody.
00:14:26.000 He didn't do anything.
00:14:27.000 But if he had done, the narrative would have flipped the other way.
00:14:29.000 Oh, he's a fascist.
00:14:30.000 See, we've been telling you for four years.
00:14:32.000 We've been conditioning you for four years.
00:14:34.000 Trump's a fascist.
00:14:35.000 He's going to seize power.
00:14:36.000 He's going to take over everything.
00:14:37.000 And now he's sending federal troops against American citizens.
00:14:40.000 That's exactly what would have happened.
00:14:42.000 So his hands were tied.
00:14:44.000 From what I understand, he was also told by military brass, if you make this order, we won't follow it.
00:14:50.000 And so what's he going to do?
00:14:52.000 Then he gets put in the same trap now with these other people.
00:14:54.000 He has to either force them or back down and look weak.
00:14:58.000 And the same narratives are going to be able to get spun out of this.
00:15:01.000 And this is what people don't understand.
00:15:02.000 And I mean, I don't want to make this like left, right or whatever, but the radical left understands narrative and political warfare, and nobody else does.
00:15:10.000 And so that's why they keep catching everybody with their pants down.
00:15:13.000 Political warfare is the most important concept you've never heard of.
00:15:17.000 And political warfare, our foreign adversaries reported a few decades ago, I think China and Russia both, had remarked that the American ability—because we rely so much on physical warfare, we have such high technology, we have badass jets, you know, the whole thing—we don't even think about political warfare anymore.
00:15:34.000 They said it's so degraded that it might as well not exist.
00:15:39.000 And this is what we're losing right now, is political warfare.
00:15:41.000 And the radical left is trained, they're excellent at it, they think about it constantly, and they put people in these traps, these mid-level violence traps.
00:15:50.000 We're particularly susceptible to it because this country is classically liberal.
00:15:55.000 People need to understand what that means, classically liberal.
00:15:58.000 It doesn't mean conservative or liberal in the colloquial sense.
00:16:01.000 It means believing in freedom and government for, of, by the people, things like that.
00:16:06.000 So this country is a constitutional republic with philosophically liberal values.
00:16:11.000 Liberal as a term seems to have just gradually evolved to mean, I guess, Democrat.
00:16:15.000 But it was like John Locke that invented that.
00:16:17.000 Right, right.
00:16:17.000 Classical liberalism is much more similar to, like, center-right libertarianism, if anything.
00:16:23.000 And then social liberalism is the civil rights kind of era, people who believed in free speech and respect for, you know, people and things like that.
00:16:30.000 So that came about.
00:16:31.000 But those of us who occupy this, like, moderate libertarian space, kind of left, kind of right, maybe, depending on who you are, Well, we're playing fair and we believe in respecting the rights and speech of our opponents.
00:16:45.000 And the problem then is I routinely stand up and defend leftists who get censored.
00:16:51.000 They gloat and laugh when it's us getting censored.
00:16:54.000 That's right.
00:16:55.000 Or I will absolutely, and I did, defend Taylor Lorenz in the wake of this criticism of her, because this woman from the New York Times put out a tweet saying that, you know, her life was literally destroyed by harassment.
00:17:09.000 I saw a tweet, and I thought it was a silly thing to say, but I'm not going to be bothered by yelling at someone on Twitter simply because I thought something they said was silly.
00:17:16.000 Otherwise, my whole day would be nothing but that.
00:17:19.000 Well, she started getting a lot of criticism, and I just ultimately said, look, I get it.
00:17:22.000 If you want to criticize the idea and the institution, please do so.
00:17:25.000 But, you know, getting into the weeds and getting into the drama, I think, is a waste of our time.
00:17:30.000 I absolutely tweeted in her defense multiple times, made videos in her defense, because I don't like the idea of people piling on and engaging in this culture war drama.
00:17:39.000 And then not only, I won't go into details, but now there's literally a harassment campaign being promoted by large, powerful institutions, putting insane and ridiculous lies out about me, talking about where I live and my home, and absolutely engaging in a targeted harassment campaign, and they're all laughing about it.
00:17:57.000 They're all laughing about it.
00:17:58.000 So I can stand on principle and say, like, let's not do this, guys.
00:18:01.000 And then what happens then is the people on, you know, the anti, you know, woke side, the people who believe in freedom say, it's warfare and we have no choice.
00:18:09.000 They started it.
00:18:11.000 That's the problem.
00:18:12.000 You know, I believe in a principle, and if my principles are, we don't do these tactics, then I am at an extreme disadvantage, where I can only get, I can sit back and turn the other cheek when they do these things to me, to my family, to my friends.
00:18:25.000 But if we dare speak up, you know, Glenn Greenwald points this out, they will come for you and they will use the weight of all of these institutions to destroy you.
00:18:32.000 And even when I still defend them, they don't care.
00:18:35.000 They are bad people.
00:18:37.000 No, this is why I say that, I mean, I've written an article, I think it had a podcast, both of these things that I actually called wokeness, but it could be radical leftism in general, or even any certain totalitarian strain.
00:18:48.000 I called it radical left wokeness in particular, I called it a virus on the liberal body politic.
00:18:55.000 It takes advantage of certain liberal, cultural, and ethical mores specifically to do exactly as you just described, to absolutely neuter the host's ability to defend itself.
00:19:08.000 Your immune system's not there.
00:19:10.000 I like to actually kind of say, frankly, I've kind of gotten hard about cancel culture, and my belief there is that the asymmetry is the story.
00:19:19.000 It's no longer—it's not culture war.
00:19:22.000 There's a power grab happening.
00:19:23.000 I'm not quite as far as the right, who are like, it's war, we have to act like war.
00:19:27.000 But the asymmetry is the story.
00:19:30.000 And so for me, until we adopt an attitude of, if someone wants to cancel, you cancel them first, four or five times, because it won't take that many
00:19:37.000 before there is a realization that this is a mutually assured destruction,
00:19:43.000 they win,
00:19:43.000 and this horrific thing's going to continue. But what it requires is a suspension of one's
00:19:49.000 principles as a classical liberal, which puts us in that decision dilemma yet again.
00:19:54.000 And also, by the way, I don't know if it's called GFAZ. Is it GFAZ? Is that the George
00:19:58.000 Floyd Autonomous Zone? Is that what that's called? But that's the same thing, though, right? Just
00:20:03.000 wanted to point out, since we don't, so we don't lose it.
00:20:05.000 That's also mid-level violence, right?
00:20:07.000 And the city has decided to show weakness by giving them, whatever, half a million dollars.
00:20:12.000 Yeah. So their other option, of course, is to send in the troops and just bulldoze this.
00:20:16.000 They won't do it. Apparently, it's been there for like nine months. They put in half a million
00:20:20.000 dollars now as part of this settlement, and someone just died there recently,
00:20:23.000 and the police couldn't get in to help this person.
00:20:26.000 So, I'll say this now.
00:20:28.000 My opinion very much changed in the Autonomous Zones.
00:20:30.000 I'm here for it.
00:20:31.000 You know what?
00:20:32.000 Roll with it.
00:20:33.000 I left these cities.
00:20:34.000 You know, it's funny.
00:20:36.000 I'm in the Philadelphia area previously, and I was like, there's too many riots, and they crossed the bridge, and they're coming into these neighborhoods.
00:20:42.000 So, I think it's time to start considering not being in these cities, because it's not about Antifa, it's not about Black Lives Matter, it's just about opportunistic violence when things start, you know, going crazy.
00:20:52.000 And we had a year of riots.
00:20:53.000 Well, sure enough, I think it was like a week after, we hadn't moved out officially, we'd started the process and come down to the new location, riots broke out, mass shootings, there was like a hostage situation, and I'm like...
00:21:05.000 There you go, man.
00:21:06.000 I mean, homicides are on the rise.
00:21:08.000 Everywhere, yeah.
00:21:08.000 And so, I think people need to realize, I think the last group of people I'm worried about in a big city is people like Antifa.
00:21:15.000 I mean, a lot of these people are like suburban privileged, scrawny, you know, angsty youths.
00:21:21.000 I really don't care.
00:21:22.000 My bigger concern?
00:21:23.000 In Philadelphia, there was a bunch of armed dudes, like repeat offender criminals, occupying buildings and shooting at cops.
00:21:30.000 And you've got that kind of crime.
00:21:33.000 You've got the homicide skyrocketing.
00:21:35.000 You've got the gun crime skyrocketing.
00:21:37.000 You've got just theft across the board, even petty theft in these crimes in a lot of these cities, partly due to the demoralization of the police, the defunding of the police.
00:21:45.000 Hold on, hold on.
00:21:46.000 Nobody knows why it's happening.
00:21:48.000 That's what I learned on the media.
00:21:49.000 Yeah, nobody knows.
00:21:51.000 Last summer, it was because it was summer.
00:21:52.000 And in the winter, maybe it was because it was winter, but I don't know.
00:21:56.000 But nobody knows.
00:21:57.000 Yeah, they're like, you see, in the summer, it's warm, giving more people the opportunity to go out, and more people outside means more crime will happen.
00:22:07.000 In the winter it's darker, which means more opportunity to commit crime.
00:22:11.000 And then in the springtime, as the weather starts getting nicer, people want to go back outside again, so crime just exponentially increases non-stop.
00:22:18.000 So there's nothing to do with defunding the police, nothing to do with, uh, you know, a demoralization of police, just happenstance, just magic.
00:22:25.000 Nothing to do with telling them to stand down, nothing to do with DAs letting people off if they get arrested, nothing to do, nothing to do, nothing to do.
00:22:31.000 Nothing to do with the city of Minneapolis putting half a million dollars into an Antifa autonomous zone.
00:22:35.000 Hey, we're at that point now where it's like the autonomous zone has official government sanction, you know, in this place.
00:22:41.000 And it's, it's, it's remarkable.
00:22:44.000 So.
00:22:45.000 Where are the conservative or right-wing autonomous zones?
00:22:49.000 Right here in this house.
00:22:50.000 Not really.
00:22:52.000 No, I'm talking about... Not at all.
00:22:54.000 What I'm specifically talking about was when are we going to start seeing right-wing dudes standing in the road, the U.S.
00:23:02.000 State Highway or whatever, that leads into a small town of a few thousand people, just doing checkpoints.
00:23:08.000 They did this in the Pacific Northwest when the fires kept starting.
00:23:12.000 And this is one of the things, you know, Joe Rogan, one of the things I'm disappointed with him about is that he had said on his show, he was wrong, he made a mistake, that Antifa was going around starting these fires.
00:23:23.000 He then made an apology where he said, none of it was true, it's not happening.
00:23:27.000 The issue was that there was a guy who was like a leftist who was caught starting the fires.
00:23:31.000 And there were other people, many, who were not politically affiliated who were just crazy firebug types.
00:23:37.000 So you need to explain to people, when these photos came out showing right-wing dudes with signs saying, you know, checkpoint, and then they were explaining people are starting fires, the media made it seem like conspiracy crackpots were tracking, were scared of Antifa.
00:23:50.000 Sure, some of them were.
00:23:51.000 There was one story about a leftist who got arrested.
00:23:54.000 But a lot of them were just regular people, like, I don't care who they are, or what their affiliation is, they started fires.
00:23:59.000 And there was a handful of people across the west coast that did.
00:24:02.000 So, you know, Joe's apology kind of just got it wrong.
00:24:06.000 But it's, you know, it's fine.
00:24:07.000 I don't think it was intentional.
00:24:09.000 But there are a lot of people on the left that absolutely engage in ongoing and sustained violence.
00:24:15.000 And over the past year, it's only worked.
00:24:18.000 So my point is... And they're opportunistic to that stuff.
00:24:20.000 Right.
00:24:21.000 When are we going to see, in response to these autonomous zones, Right-wing groups just setting up in their communities.
00:24:27.000 There was a viral video where, you know, an Antifa group went into a neighborhood in Colorado, and a bunch of regular guys just chased them out.
00:24:35.000 And it did not go well for those guys, because actual working-class, like, union boys, they're not scrawny, suburbanite, privileged white kids.
00:24:44.000 No, they're men who went out and were basically like, welcome to my neighborhood.
00:24:49.000 And the Antifa people ran off.
00:24:51.000 So I'm wondering, if the city's gonna give half a million dollars to benefit the George Floyd memorial site at 38th and Chicago, which has become an autonomous zone lockdown where black-clad individuals threaten journalists and refuse to let police in, and the police haven't been in in months apparently, when do we see conservatives have a peaceful autonomous zone where they set up checkpoints?
00:25:10.000 I mean, you know the answer to that question.
00:25:13.000 Never?
00:25:13.000 And the answer is because the left owns the political warfare, like I just said.
00:25:17.000 It's the most important concept you've never heard of.
00:25:20.000 And why?
00:25:21.000 What's going to happen?
00:25:22.000 Well, you know what the media is going to do immediately.
00:25:24.000 It could be as peaceful as peaceful.
00:25:26.000 It could be literally doing nothing but growing flowers, feeding the hungry, you know, bringing out whatever the most pro-social thing you can possibly imagine is.
00:25:34.000 And it's going to be a crazy right-wing militia group of KKK, blah, blah, blah, is what they're going to say about it.
00:25:40.000 It's going to be white supremacists, white supremacists, white supremacists.
00:25:42.000 And because those words, those ideas, it's not just that they have power, because that's one thing.
00:25:48.000 It's actually that they stick.
00:25:50.000 Like, the word conservative, as somebody who, like, has never identified as one, has been tainted almost to the point of, like, absolute, I guess, taint.
00:26:00.000 It's just completely poisoned.
00:26:01.000 And so, conservatives don't seem to understand this.
00:26:03.000 I put this on Twitter a while back, and I said that the meme is mightier than the AR-15.
00:26:08.000 And a bunch of gun guys were like, They'll show you mine, and I'm like, shut up.
00:26:14.000 I don't care.
00:26:14.000 It's like, shut up.
00:26:15.000 It's not true.
00:26:16.000 The only AR-15 rounds that were fired, I don't know how many actually fired, but three hit people from Kyle Rittenhouse.
00:26:23.000 Right, yeah, he had two to three rounds.
00:26:25.000 Yeah, so the only rounds that came out of an AR-15 were like those, like all summer.
00:26:30.000 Thereabouts, you know, approximately.
00:26:32.000 And why?
00:26:32.000 Because every law-abiding gun-owning citizen knew that the second somebody opened fire, guns are gone.
00:26:41.000 Or the bid to take guns is going to start.
00:26:43.000 The meme of gun-toting right-wingers.
00:26:46.000 And look what happened to Kyle.
00:26:47.000 That was brutal.
00:26:49.000 Yep.
00:26:49.000 The meme was actually more powerful.
00:26:51.000 An AR-15 that you can't fire because you know it's going to get turned against you in the narrative war, and you're going to lose that political war over it.
00:26:58.000 That meme is more powerful than the gun.
00:26:59.000 It's an ancient thought.
00:27:01.000 The pen is mightier than the sword.
00:27:02.000 Yeah, that's what I was, I mean, I was riffing.
00:27:04.000 But it's an ancient idea.
00:27:05.000 Listen, listen.
00:27:06.000 I didn't quite think of that one myself.
00:27:07.000 I'll tell you what the real divide is, right?
00:27:09.000 So you've always, you've always been like a fairly liberal person.
00:27:11.000 Yeah.
00:27:11.000 I have as well.
00:27:12.000 Yeah.
00:27:12.000 And Ian, I think you are pretty like... I used to be super liberal.
00:27:16.000 I mean, super liberal.
00:27:18.000 Like, free everyone, we don't need weapons, everything can be fine, just pie in the sky.
00:27:25.000 Let everyone in, we can help everyone.
00:27:27.000 All the time.
00:27:28.000 Crazy.
00:27:29.000 When I was younger, I was an anarcho-punk skateboarder.
00:27:32.000 I could play Baby I'm an Anarchist on the guitar, and I still can from Against Me, and I know a lot of the Against Me songs.
00:27:37.000 I love that band.
00:27:38.000 And, you know, these days it's all changed quite a bit, but the one thing that I think defines the actual war, the culture war, is those who read beyond the headlines and those who don't.
00:27:53.000 That's it.
00:27:54.000 There's really good examples of this if you go online, and you can actually see it.
00:27:59.000 So I'm actually interested in maybe hiring a researcher to start tracking this as a data point, to go to the subreddits of prominent progressive YouTube personalities, and read the comments, and then actually mark them for, like, who read beyond the headline and who didn't.
00:28:18.000 And then go to conservative ones and do it.
00:28:20.000 Because what I've noticed recently is that as I've been browsing the subreddits of several prominent progressive YouTubers, they don't actually read the articles.
00:28:29.000 They'll comment on the headlines.
00:28:30.000 And while it's true for most people, when I go to conservative, you know, sites, not so much.
00:28:37.000 And this is also exemplified in left-wing memes about conservatives.
00:28:40.000 So there was one that I saw on Reddit, where they were mocking the conservative subreddit because the conservative subreddit made a point about not allowing billionaires the ability to manipulate and control our elections and big tech and things like that.
00:28:53.000 And they were mocking conservatives as if the conservatives were the ones who did it to themselves.
00:28:57.000 And it's like, I don't understand.
00:29:00.000 If you actually read what they're talking about, they're agreeing with you.
00:29:03.000 Like, you've convinced them.
00:29:07.000 Or do you not agree with them anymore?
00:29:07.000 The issue was, the caricature of the conservative from the headline does not accurately represent the real world.
00:29:13.000 And that's true for libertarians, be it left or right, or centrists, or whatever.
00:29:17.000 And so what I've found is, why is it that the people who would follow you, James, or would watch this show are more likely to be informed?
00:29:26.000 Is it because of this show?
00:29:28.000 No, I don't think so.
00:29:29.000 It's because of the nature of the individuals who would watch this show.
00:29:31.000 Yeah, I agree.
00:29:32.000 The people who go and turn on CNN and hear Wolf Blitzer say, a bunch of white, you know, white supremacists stormed the Capitol, they go, wow, and they walk away.
00:29:42.000 And then you have the headline that appears on Twitter and it says, far-right white supremacists storm, you know, you know, let's do a better example.
00:29:48.000 You'll turn on CNN and they'll say, the white supremacist group, the Proud Boys, you know, their leader was arrested and they'll go, wow.
00:29:56.000 And you'll see the headline, White Supremacist Leader of Proud Boys Arrested.
00:30:00.000 And people will see the headline and go, wow.
00:30:02.000 And then people who are more inclined to watch shows like this will say, okay, click it and see a picture of a black man as the leader of the Proud Boys and go, wait, what?
00:30:08.000 Something doesn't make sense here.
00:30:11.000 And that's like the red pill moment.
00:30:12.000 No, that's exactly right.
00:30:13.000 You actually start reading the news.
00:30:15.000 That's Red Pill Level 1 right there.
00:30:17.000 That's where you see the news, the headline especially, is lying to you.
00:30:22.000 And then you start finding the buried leads, and you start finding the inconsistencies in the stories, you start noticing that the reporting doesn't match reality, and then all of a sudden you're like, wait a minute.
00:30:33.000 And what it's all been framed up to be is, like, now you're a conservative.
00:30:36.000 Right.
00:30:36.000 Because you've read, right?
00:30:38.000 But you're not a conservative.
00:30:40.000 You're actually somebody who engages information, like, fully or at least more than almost zero.
00:30:46.000 I'll tell you what else is really interesting, too, in this whole debacle.
00:30:50.000 There's another level to this.
00:30:51.000 Those who are algorithmically fed red-pilled type information countering the narrative, and those who are actually engaged with the content in general.
00:31:02.000 What happens is, there are a lot of people, so I was talking to some guy I know today, and he said, he's like, listen man, you used to be really good at everything, and now all you do is just basically, you're like a hardcore Trump supporter, and it's all you do is just always, always, always protecting Trump. And I was like, do you watch my show?
00:31:19.000 And I was like, because we do defend him, we do criticize him.
00:31:22.000 He's like, I only ever see from you defending Trump.
00:31:25.000 And I said, have you considered that Facebook and YouTube are only giving you the videos I produce out of the, you know, three and a half hours per day that I produce?
00:31:35.000 They're only showing you the Trump article, you know, the Trump defense ones, and they're ignoring the times we've talked about him when we were critical of him on war and John Bolton and things like that.
00:31:45.000 I think we said John Bolton 5,000 times and criticized Trump for hiring a bunch of dumb people.
00:31:50.000 Certainly we defended him and I did vote for him.
00:31:52.000 That's true.
00:31:53.000 But there's a lot of people who were Trump supporters who only ever got from YouTube the videos that were like, Trump isn't that bad.
00:32:01.000 And then the times when I was like, here's what I don't like about Trump and here's what I'm mad about, it doesn't go to them because they're less likely to click it.
00:32:07.000 So YouTube isn't incentivized to share those videos.
00:32:10.000 Then you'll get people on the left who are only fed that content because it's more likely to be watched by Trump supporters.
00:32:15.000 And then they're like, this is all he produces.
00:32:17.000 But then the people who actually engage with the content, who would watch every episode, I see them commenting online and they're like, what are you talking about?
00:32:23.000 He criticizes them all the time, especially Ian.
00:32:26.000 And he's like, when Luke and Dave Smith were on the show, it was like two hours of just nothing but ragging on Trump.
00:32:30.000 Awesome.
00:32:31.000 But people don't get that because the algorithm doesn't give it to them.
00:32:35.000 Trump has a lot to be ragged on, not to take it too far away from what you're saying, but all these people have enough to be critical about, I think.
00:32:41.000 I mean, sure.
00:32:42.000 And this was actually one of the points that I raised as one of the main reasons I voted for Trump was because the media is like relentlessly, like totally, actually unfairly critical of Trump constantly.
00:32:54.000 And they were at the time covering up the Hunter Biden laptop story.
00:32:58.000 It's like, There's no evidence they're going to be critical of Biden.
00:33:02.000 They're actually covering up a story that's critical of Biden right now.
00:33:05.000 And I was like, if the media, I mean, the media's function, the point of a free press is to be able to criticize power in a free country.
00:33:12.000 Yep.
00:33:13.000 And if that's not happening by whatever set of, you know, for whatever set of reasons, then you have to become skeptical of the side that benefits from that.
00:33:22.000 That was the reason that I wanted to vote for Trump.
00:33:23.000 Seems like a symbiotic reason, like the clicks, like you were saying, it's a
00:33:27.000 clickbait thing, so they're making money off of it, and it's also politically
00:33:31.000 infusive for this liberal economic order, basically.
00:33:34.000 Well, right.
00:33:35.000 Plus, I think Tim was describing that there's an echo chamber aspect to the
00:33:39.000 algorithm. The algorithm gets better and better and better.
00:33:41.000 It's driven by machine learning.
00:33:43.000 So it gets better and better and better at predicting what kinds of things you will click on and therefore play the first five seconds of an ad of, that they make some fraction of a penny off of, unless you accidentally click on the ad and then they make some slightly more fraction of a penny.
00:33:55.000 Accidentally click on it?
00:33:56.000 I mean, has anybody ever actually clicked on one of those things on purpose?
00:33:59.000 Not lately.
00:33:59.000 Unless I missed the skip it button.
00:34:00.000 I click on ads on Instagram all the time.
00:34:02.000 Yeah, Instagram's dangerous.
00:34:03.000 Dude, there's cool stuff.
00:34:05.000 There was one where it was like a mini lightsaber.
00:34:07.000 All I can tell you then.
00:34:08.000 It was a torch, but it was on like a, like a, like it was awesome.
00:34:11.000 All I can tell you then is that your algorithms are getting to know you very well.
00:34:16.000 Dude, they showed me this solar torch thing.
00:34:17.000 It's called whatever.
00:34:18.000 And it's got like a jet, like super long.
00:34:20.000 And it like melts through a tin can.
00:34:23.000 And I was like, I have to buy that.
00:34:24.000 That is sort of a seriously man toy that every man does need.
00:34:28.000 Like, could you send me that ad?
00:34:29.000 There is one here.
00:34:31.000 But literally, I was just like, I can get a torch faster.
00:34:34.000 So I just went and bought a torch.
00:34:35.000 And I was like, cool.
00:34:36.000 And then we have fires in the back sometimes.
00:34:38.000 So I was like, I'll just use a torch.
00:34:39.000 No, that's fun.
00:34:40.000 But no, the truth is, though, that the whole point is that the algorithm is supposed to learn what you'll click on because it drives their ad revenues.
00:34:49.000 There doesn't have to be a nefarious plot.
00:34:50.000 You don't have to have whoever the directors of YouTube in these shadowy rooms saying, oh, we're going to turn this all left, all left, all left.
00:34:57.000 It doesn't have to be that.
00:34:59.000 These algorithms are going to feed people echo chambers.
00:35:01.000 And that's how you end up getting like conspiracy theories, like whether it's QAnon or BlueAnon or whatever Anon or WhaleAnon I made up on Twitter.
00:35:08.000 WhaleAnon?
00:35:09.000 Oh, that was when Twitter, you know, they made that Birdwatch thing, you know, and they put a fake tweet.
00:35:15.000 I don't know if it still exists, but they made that like fake news is everywhere on Twitter and they put, you know, a sample fake tweet and it said whales.
00:35:22.000 Whales are not real.
00:35:24.000 They're robots paid for by the government to control you.
00:35:27.000 And I was like, this is whaling on.
00:35:28.000 And so I'm like making memes, like they look like a queue and there's a whale though with a tail.
00:35:34.000 So here's what happens.
00:35:35.000 There was a study a few years ago where they tracked various networks on Twitter and then created a visualization.
00:35:41.000 I saw that.
00:35:42.000 They found that digital marketing overlaps with the resistance, the establishment democratic position.
00:35:48.000 That means the... What is that?
00:35:51.000 Hashtag?
00:35:51.000 Hashtag the resistance.
00:35:52.000 Yeah.
00:35:53.000 Right, right, right.
00:35:53.000 Yeah.
00:35:54.000 So what happens is Pepsi and Coke and Oreo and whatever big brands, Nabisco, their ad campaigns live in the same universe as anti-Trump Democrats.
00:36:05.000 So their advertising is crafted around this core group of individuals, which means if they see a video where someone's sitting there like with their eyes half closed and they go, I like Donald Trump, they go, get our ads
00:36:17.000 off that video!
00:36:20.000 And then Google says, okay, okay, okay.
00:36:22.000 And then Google goes, we just lost a $50 million contract.
00:36:25.000 What happened?
00:36:26.000 The guy said he likes Trump.
00:36:27.000 Okay, well ban it, ban it.
00:36:28.000 We're losing $50 million.
00:36:29.000 There it is.
00:36:31.000 There it is.
00:36:32.000 I mean, I can tell you something a little scarier than that.
00:36:35.000 I happen to know some people who work in, you know, AI development and all of this.
00:36:39.000 And so this guy called me a while back and he's like kind of big into all this.
00:36:43.000 He's like a PhD in it or whatever.
00:36:45.000 And he's like, yeah, I don't know if you know that they're actually, you know, you have machine learning or whatever, the algorithms and how they work.
00:36:52.000 They actually, the people who are behind that, this, you know, you have to, with a machine learning algorithm, it's like the machine makes a guess and then humans are training it by saying, yeah, it was a good guess or a bad guess, right?
00:37:02.000 In many, not always, but in a lot of cases.
00:37:06.000 They're literally using intersectionality as the guide to decide what the right and wrong answers are.
00:37:12.000 And so it's like, you know, oh, was this white supremacy?
00:37:14.000 You know, it's intersectionality is how they're using to decide that kind of stuff.
00:37:18.000 And that's what's getting baked into those algorithms.
00:37:22.000 Meanwhile, they publish articles with the Iron Law of Woke Projection saying white supremacy is on all of the algorithms.
00:37:28.000 We have to put more intersectionality into the algorithms.
00:37:30.000 A few years ago, I was hanging out with you and Helen Pluckrose and Peter Boghossian.
00:37:34.000 And I got into an argument with Peter.
00:37:36.000 That's a nice surprise.
00:37:38.000 And it was because I was saying that the media was creating this.
00:37:41.000 It was social media and the algorithms, and he was adamant that it was coming from the universities.
00:37:45.000 I think both are true.
00:37:47.000 Both are true.
00:37:47.000 But the point I made was this.
00:37:49.000 I literally watched this happen.
00:37:51.000 I experienced this.
00:37:52.000 I started working in digital media spaces end of 2011.
00:37:57.000 I officially started working for Vice in 2013.
00:38:00.000 I went to Disney.
00:38:01.000 I watched how they implemented these things, why they wanted to do it.
00:38:03.000 They explained it to me, and I explained to them why they were wrong.
00:38:06.000 But what happened was, it's very, very simple.
00:38:08.000 It's a simple algorithmic equation.
00:38:11.000 When Facebook started rising in prominence in attention and generating ad revenue, they started to notice that police brutality videos were skyrocketing in viewership.
00:38:21.000 Something about it was just getting more shares and more attention, and there's like a viral song, this is what happens when you call the cops.
00:38:28.000 There were websites dedicated to nothing but police brutality videos in the Alexa Top 500.
00:38:33.000 Absolutely crazy.
00:38:34.000 All they would do is just aggregate police brutality videos.
00:38:37.000 Now, sidetrack, I'm sure a lot of young people who are swimming in that are now activists saying defund the police because their whole brains were mashed by it.
00:38:46.000 But something else happened.
00:38:48.000 If you made a post and said, police brutality, the algorithm recognized people love this stuff.
00:38:55.000 Send it to them more.
00:38:57.000 But then something else happened.
00:38:59.000 There were also videos about social justice, racism, sexism.
00:39:04.000 The algorithm also said, man, people love these videos.
00:39:06.000 Send it to everyone.
00:39:09.000 And then something magical happened.
00:39:11.000 Racism skyrocketing, police brutality skyrocketing, and then all of a sudden someone made racist police brutality.
00:39:17.000 And then if a racism video would get X views and a police brutality video would get Y views, a police brutality and racism segment would get XY views.
00:39:28.000 Well, I should say x plus y, because I don't want to assume it's exponential.
00:39:30.000 Or xy minus 20%, or some algorithm of increase.
00:39:34.000 It would be an increase.
00:39:35.000 And so this created a space.
00:39:38.000 When digital news outlets started getting tons of venture capital funding, they started to realize this is how you make money.
00:39:45.000 A good example is there's an exposé on Mic.com.
00:39:47.000 Are you familiar with Mic.com?
00:39:49.000 Yeah.
00:39:49.000 It still exists.
00:39:50.000 Right.
00:39:51.000 When they started, they were libertarian.
00:39:54.000 They were Ron Paul libertarian, apparently.
00:39:56.000 So it's been reported.
00:39:58.000 And it was because Ron Paul was very popular online.
00:40:00.000 The Ron Paul Love Revolution.
00:40:02.000 So they were like, OK, let's write these articles.
00:40:04.000 Then they started to play that game.
00:40:07.000 At least it's been reported this.
00:40:08.000 I'm going to avoid litigation.
00:40:10.000 So this is what I read, and I could be wrong.
00:40:12.000 But my understanding is that they started Putting out articles on social justice and then creating formulas where it was like, you know, X has a Y problem and then all of a sudden combining these different things was getting more and more views and traffic because if you had a community of people who watched police brutality and a community of people who watched social justice, you mixed those communities together and you maximized your viewership.
00:40:39.000 This pushed intersectionality as the perfect ideology.
00:40:43.000 All of a sudden, there's this one article from Vice, I can't remember the exact title, but it was like, trans women of color being beaten by police proves why we need Black Lives Matter.
00:40:54.000 And it was just like every possible keyword mashed into a headline to get as much traffic as possible to maximize revenue.
00:41:01.000 And so long as advertisers don't mind being on these psychotic, I mean, this is like, when you read these articles, like there was a joke meme where a guy is like sitting in a room and he's trying to write a name for a Vice article.
00:41:17.000 He's trying to come up with a Vice article.
00:41:18.000 So he pulls an adult toy out of a box and throws it at the wall and it just sticks.
00:41:22.000 And then he's like transgender ketamine dealers of Colombia or something, just random.
00:41:27.000 That's basically what was happening.
00:41:29.000 And then the advertisers are like, I'm okay with this.
00:41:32.000 And so as long as that system functioned and advertisers didn't care, it was fine.
00:41:36.000 And then they found their boogeyman with Trump.
00:41:39.000 We saw Gamergate happen.
00:41:40.000 This was, I mean, the start of the culture war, it's a very interesting thing.
00:41:44.000 And I'm not, I can't get into history, mostly because it's very complicated.
00:41:47.000 You know, I think the Gamergate was the first battle.
00:41:50.000 But what I think happened was these video game websites.
00:41:54.000 What do you write about?
00:41:55.000 You have a job.
00:41:56.000 Your job is to write five articles per day.
00:41:58.000 Or three articles per day.
00:42:00.000 You've already written every walkthrough for, you know, the new Zelda game.
00:42:04.000 What do you write?
00:42:06.000 No new game coming out until the holiday season.
00:42:09.000 Speedrunner did a speedrun?
00:42:11.000 Alright.
00:42:12.000 Oh!
00:42:13.000 Video games.
00:42:14.000 Racist.
00:42:14.000 Ooh, yeah.
00:42:15.000 And sexist.
00:42:17.000 And anti-gay.
00:42:18.000 KKK go away.
00:42:19.000 Put it all in the headline and we got a hit article.
00:42:21.000 And then every day they had to do that.
00:42:23.000 And then all of a sudden that's what gaming became.
00:42:26.000 You say the sexist thing too and you put like a picture of like a bikini anime chick or something like that.
00:42:30.000 And then like you're also just getting guys that are like, oh yeah, look at the chick in bikini.
00:42:34.000 Because even though it's a cartoon, click.
00:42:36.000 Whoops.
00:42:37.000 You'd have, like, a sexist article, a racist article, an anti-gay article, and then you'd make the sexist, racist, anti-gay article, and then those writers would go to other companies and write an anti-gay article over there that would enable them to, you know, reference it.
00:42:49.000 Game of Telephone.
00:42:50.000 Game of Telephone.
00:42:51.000 So, it happened fast, too.
00:42:52.000 Yeah, big time.
00:42:53.000 That's true.
00:42:53.000 Because people were so hungry, and they desperately want this confirmation bias, and they love tribalism. But what happened is you'd get one article, you know, where this outlet would write some nonsensical drama about something that was completely, just total BS.
00:43:06.000 Video gamer accused of, you know, oh, they're stealing $100 from their ex.
00:43:10.000 And then.
00:43:15.000 People would click on it, and they'd be like, I can't believe he really did that!
00:43:18.000 And the story nowhere in it would actually say it actually happened.
00:43:21.000 But then another outlet would pick it up and be like, I can't believe so-and-so was accused of doing this.
00:43:25.000 Then the next article would say, multiple reports now, you know, citing that this happened.
00:43:30.000 And then it loops all the way back around, and they create a circular feedback loop of sourceless garbage.
00:43:36.000 And the whole thing is just algorithmically trying to generate revenue.
00:43:40.000 Here's the best part, though.
00:43:42.000 They've created this feedback loop where they're basically parasites, right?
00:43:46.000 You know, I think the conversations we have are imperfect.
00:43:50.000 We're not the best.
00:43:50.000 We engage in culture war stuff too, and nobody is perfect.
00:43:54.000 But I think we try to have legitimate conversations, really break down these ideas to the best of our abilities.
00:43:59.000 What they do is they just leech and parasite.
00:44:02.000 So recently we had this journalist from Axios got a job at Teen Vogue as the editor-in-chief.
00:44:08.000 Well, the people who work there wrote a letter trying to get that editor-in-chief cancelled for 10-year-old tweets.
00:44:14.000 In response, a seven-figure ad deal was pulled, frozen.
00:44:20.000 The article said they lost it, and then it said it was frozen because of the controversy.
00:44:25.000 So this company Teen Vogue, which is supposed to be a fashion magazine, and now it's like writing about Marx and social justice and critical theory, hires on people who are good at writing about intersectionality because it gets a lot of clicks, and then those people attack their own company, costing their own company seven figures.
00:44:42.000 So, perhaps there's some optimism in all of this, in that these people are just consuming themselves and will eventually just be a withered husk in the corner of nothing.
00:44:50.000 Yeah, I mean, that is actually the hope, and that's sort of the mentality behind the people that are so-called accelerationists.
00:44:58.000 They want to kind of encourage this to happen faster or to get people to see it.
00:45:02.000 I'll just take the story that you told just to kind of complete, since we mentioned scholarship and that's what I do, I can plug the scholarship into your story, because the majority of those people, you know, you said they work for a video game magazine or whatever.
00:45:13.000 Some of them did.
00:45:14.000 A lot of them are freelancers.
00:45:16.000 In both cases, these people probably studied something like media studies, especially the freelancers.
00:45:21.000 What was the joke?
00:45:21.000 You know, we're talking GamerGate was 2014, right?
00:45:24.000 Yeah.
00:45:24.000 So what was the joke?
00:45:25.000 The joke was, you know, you get a degree in something like gender studies or media studies or something and you're going to be a barista.
00:45:32.000 No, well, probably you might be, but you're also going to be a freelancer because you think you're a great writer and you have great insights.
00:45:38.000 So what are they going to start writing these sexist, racist, homophobic, blah, blah, blah, you know, keyword articles about?
00:45:44.000 Well, they're going to start infusing that theory.
00:45:46.000 And then various touch points happen in society that then mainstream it.
00:45:50.000 But what's happening is these bloggers or freelance journalists, I should say, are essentially starting to infuse that theory into pop culture.
00:45:59.000 And that theory has been cranking for 50 years saying, you know, by the way, guys, Society actually is secretly fascist, it's secretly racist, it's secretly sexist.
00:46:09.000 Critical race theory's kind of central thesis could be boiled down to racism never gets better, it just hides itself better.
00:46:15.000 And then you have these little cutesy, you know, graduates in these stupid degrees become freelance writers and then it's keyword city, right?
00:46:23.000 It's buzzword city, racist video game, racist movie, racist Dr. Seuss, racist your mom, whatever it happens to be.
00:46:31.000 And one thing after another, and then these are the people who are poised to write these little fluff pieces.
00:46:36.000 Well, they're like, it's like the hit version of a fluff piece, right?
00:46:39.000 No content.
00:46:40.000 It's like, it's like a, it's like a bundle of thorns or something instead of a fluff.
00:46:44.000 But these are the people who are going to write these things and they start injecting the ideas of systemic racism and systemic sexism and all of these kind of critical theory
00:46:52.000 ideas into the pop culture through those mediums. And that if you go way back, I mean, that was
00:46:58.000 actually Gramsci's plan, not to get heavy into theory, but Antonio Gramsci, who I'm supposed to
00:47:03.000 always say is Albanian Italian. I said that he was Italian, which is true, and these Italian
00:47:07.000 people who wrote me these emails are like, call him Albanian Italian, please, don't stick him. Don't
00:47:13.000 make it... He was brilliant, actually.
00:47:16.000 The guy was probably one of the sharpest minds of the 20th century.
00:47:19.000 Unfortunately, he was also a communist, and he was the one who realized that you have to undermine the pillars of culture in order to take over a society, and he identified those as being in religion, family, media, education, and law.
00:47:35.000 So that media pillar, that's what the goal was. He said what you have to do is, if you want to take a pillar of culture down, you get inside of it and you create a counter-hegemony within the existing hegemony and start making it grow. And basically, when you say parasite, that's what you're talking about: you latch on and then grow the strength of that thing, like a Sith or something. I don't know, I'm not a Star Wars... you're the one with the torches, but
00:48:00.000 But I mean, that's the idea, right, is to get in there and to start infusing these critical ideas.
00:48:07.000 This is a plan that's been like cooking up scholarly for 100 years that had actually no real way to work until all of a sudden we cooked up this.
00:48:15.000 I've been thinking about this a lot lately, the last, I don't know, three months.
00:48:18.000 Social media and then, I don't know what to call it, but like, just kind of like grubby media.
00:48:23.000 You know, like, you know, little grubby articles like these different... Gawker was like just a propaganda place or whatever.
00:48:30.000 It got taken down.
00:48:31.000 Games of telephone articles that regurgitate the same thing written by somebody else.
00:48:35.000 Grub media.
00:48:35.000 It's called churnalism.
00:48:36.000 Churnalism, yeah.
00:48:37.000 Or urinalism, according to Jeremy Hambly of The Quartering today.
00:48:40.000 Urinalism!
00:48:43.000 You said that the Sith were parasitic.
00:48:44.000 I like that, because the Sith Apprentice, parasite off the Master, grows strong and then kills the Master.
00:48:50.000 I mean, right there.
00:48:52.000 I mean, it's like the whole quarterstips plan.
00:48:53.000 It's basically, you know, Emperor Palpatine taking over, becoming the Chancellor, and everyone cheers for it.
00:49:00.000 I mean, there you go.
00:49:01.000 But I'm convinced more and more regular people hate all of this.
00:49:06.000 Oh, totally.
00:49:06.000 I would bet.
00:49:07.000 I was talking to somebody the other day about it, and the guy said, I think that we're in the majority.
00:49:13.000 And I was like, we're not just in a majority.
00:49:15.000 We're not even in a super majority.
00:49:16.000 I bet it's north of 90%.
00:49:17.000 Yeah.
00:49:19.000 North of 90% of people, at best, to the degree that they're aware of it or annoyed by it or insulted by it or think it's ridiculous.
00:49:25.000 And cowardly.
00:49:26.000 But that's the problem.
00:49:28.000 I'll get to a point where I'll have a clear mind.
00:49:30.000 I'll wake up in the morning and I'll put on like Facebook or something.
00:49:33.000 That's a mistake.
00:49:33.000 I'll feel like I'm being twisted and destroyed into some dark tunnel and I'll have to stop and put it away and leave.
00:49:41.000 I mean, it's a formal term that comes out of Jacques Derrida on Postmodern Philosophy, but I used to call Twitter a deconstruction machine.
00:49:49.000 It deconstructs anything you put.
00:49:51.000 You put, for myself, a video of myself using a sword.
00:49:55.000 Immediately people add Benny Hill music and try to make a joke out of it.
00:49:59.000 You can't make a joke out of something I think is awesome.
00:50:01.000 It's just not going to work.
00:50:03.000 Like, I know I look awesome, just shut up, you know?
00:50:05.000 Somebody changed it into an eggplant, a giant eggplant instead of a giant sword.
00:50:09.000 I'm like, I just downloaded that and that's like a meme I share now.
00:50:12.000 Yeah, that's fun.
00:50:13.000 But no, anything you put on the internet is going, on Twitter, on social media I should say, not the internet, is going to get, I'm so stuck on the German now, it's going to get aufgehoben almost immediately.
00:50:24.000 It's going to get deconstructed.
00:50:26.000 It's going to get taken apart.
00:50:27.000 It's going to get turned into this dialectical soup, where it gets chewed up and spit back at you into broken pieces, like you said, that eventually become the grist that goes into that closed circle of garbage media.
00:50:42.000 And this is actually the process.
00:50:43.000 Because that crap on Twitter becomes the article that becomes your Wikipedia page.
00:50:47.000 Right, yeah.
00:50:48.000 I actually made some food like this once.
00:50:50.000 I have a meat processor.
00:50:51.000 And I put in a nice, fine steak.
00:50:53.000 And then I put in some fish and some chicken and it became a goop.
00:50:56.000 And then I fed all the goop right back in.
00:50:58.000 And then I fed all the goop right back in and it became a grayish paste.
00:51:01.000 And that's basically what it is.
00:51:02.000 Totally, I have no idea what it is, but apparently you're supposed to eat it.
00:51:05.000 It's chicken nuggets.
00:51:06.000 Chicken nuggets.
00:51:06.000 Well, I mean, it was just meat medley, meat punch.
00:51:09.000 The Wikipedia thing though, you're absolutely right.
00:51:11.000 Dude, do you know what was on my Wikipedia for a while?
00:51:14.000 It's fairly new.
00:51:16.000 I've only had a Wikipedia entry for a few months, and right off the bat, somebody said, like on my Wikipedia, it said, you know, James Lindsay portrays himself as a serious commentator.
00:51:26.000 He wrote a book, How to Have Impossible Conversations, and yet he goes on Twitter and makes your mom jokes.
00:51:31.000 That was on my Wikipedia.
00:51:32.000 And then that got taken off.
00:51:33.000 And then somebody else, I don't know who put it, I'm assuming a fan of mine, went on and put that James Lindsay has gigantic balls of brass.
00:51:40.000 That was on my Wikipedia for a while.
00:51:41.000 Wikipedia is the meat goop.
00:51:48.000 This is probably bad in the bigger picture, but I'm here for it.
00:51:51.000 Do you know the Zeppelin story?
00:51:53.000 For like seven years, my Wikipedia claimed that I invented a Zeppelin,
00:52:00.000 or some kind of autonomous flying camera on a Zeppelin. And it was because, I guess, there was some writer who overheard,
00:52:09.000 just like, we were having like a hackerspace kind of conversation
00:52:12.000 about crazy ideas. And then somehow they wrote that I actually invented a Zeppelin,
00:52:17.000 and then that's it. It's fact. And no matter how many times I was like, bro, I
00:52:21.000 didn't invent a Zeppelin, man. It's like, we're talking about, I bought a consumer drone and
00:52:26.000 me and my buddy hacked the drone to broadcast live footage during Occupy Wall Street.
00:52:31.000 It was like one of the first times, I think maybe the first time, there was a live news broadcast via drone.
00:52:36.000 We weren't looking for a big corporation, so it was just online, but it was amazing.
00:52:39.000 I was like flying a drone over this Occupy protest, and then it couldn't fly that long.
00:52:43.000 We talked about a bunch of crazy ideas and apparently an article came out that was total BS that wouldn't correct it and didn't want to correct it because it was a fun story.
00:52:51.000 And then Wikipedia was like, Tim Pool invented a drone.
00:52:53.000 And I was like, I didn't.
00:52:54.000 And they're like, too bad.
00:52:55.000 You're not reliable.
00:52:56.000 So you know what?
00:52:57.000 I'm here for it.
00:52:58.000 I think.
00:52:59.000 That was the Tim Pool blimp is what that was, right?
00:53:02.000 Or the good Tim blimp.
00:53:03.000 It said Zeppelin.
00:53:04.000 It literally said Zeppelin modification or something.
00:53:07.000 Tim Zeppelin.
00:53:09.000 I want to preserve the important cultural institutions that have helped bring about civil rights, real social justice.
00:53:16.000 But at this point, I think the entire system, it's already falling apart.
00:53:20.000 The Jenga Tower is in free fall.
00:53:22.000 That's the perfect metaphor.
00:53:23.000 It's these guys are just knocking pegs out and it's only a matter of time until the thing falls down.
00:53:27.000 So you think accelerationism is where it's at?
00:53:31.000 Maybe a little bit, but not necessarily.
00:53:33.000 It's very targeted.
00:53:33.000 I think I know where you're going.
00:53:35.000 Well, here's what I'm going to say.
00:53:38.000 Wikipedia is fundamentally broken and it's extremely susceptible to attack right now.
00:53:46.000 But there is a I wouldn't call it accelerationism, but an opportunity to prove the paradox.
00:53:51.000 Ridicupedia.
00:53:53.000 Well, so here's what happened, right?
00:53:54.000 I tweeted, impeach the Queen.
00:53:56.000 We talked about it the other night.
00:53:57.000 And a NewsGuard-certified website wrote an article saying Tim Pool calls for the impeachment of Queen Elizabeth.
00:54:03.000 People supported him saying, let's convene Congress.
00:54:05.000 Like this person who wrote it was one of the worst journalists imaginable.
00:54:09.000 Didn't understand you can't impeach the Queen.
00:54:11.000 That Congress has no power, even if you could, to interfere with another country's monarch.
00:54:16.000 Wrote the article!
00:54:17.000 And then here's the best part.
00:54:18.000 On Wikipedia, they're actually arguing over whether or not it should be included.
00:54:22.000 And for a while, somebody put it in, and they tried to take, like, Tim Pool has called, you know, in 2021, Tim Pool called for the impeachment of Queen Elizabeth, and then people are like, we have to put it in, and people are like, you can't put it in, and then someone actually linked the video where I was laughing, saying, I made the whole thing up!
00:54:36.000 I know how this system works, and I can make a joke, and then Wikipedia must treat it as fact.
00:54:43.000 So you have one faction saying, we know he's basically screwing around, and it was a joke tweet.
00:54:48.000 There's a wink meme on it.
00:54:51.000 And then someone saying, we don't have the right to choose what we determine is true or false.
00:54:56.000 That's original research, and it's banned on Wikipedia.
00:54:59.000 You have to go by the sources, and NewsGuard has certified the source.
00:55:02.000 The whole thing is broken.
00:55:04.000 That's totally broken.
00:55:05.000 Ridicipedia, that's the name of this project.
00:55:10.000 Name your project Ridicupedia, turn it ridiculous. You talked about how Twitter deconstructs everything. This is
00:55:14.000 a deconstruction. It's like a counter deconstruction. Like they want to build a counter hegemony,
00:55:18.000 I want to do a counter deconstruction. So there's a lot of journalists who will write whatever you
00:55:22.000 tweet as law. And so if you engage on Twitter seriously and say things that you actually mean,
00:55:28.000 they'll write articles about what you say, or they'll take things out of context to make it
00:55:32.000 worse. So as far as I'm concerned, I'm all for Biden, baby.
00:55:36.000 100%.
00:55:38.000 Biden 2024!
00:55:38.000 Joe Biden.
00:55:40.000 I'm tweeting things like, you know, I believe in you, Joe.
00:55:44.000 I tweeted when I was like, if we all cheer for Joe, maybe he'll send us the next $600.
00:55:47.000 I believe in you.
00:55:48.000 And so at a certain point.
00:55:50.000 You have to do jazz hands for Joe.
00:55:51.000 Well, here's the important thing.
00:55:53.000 All of my tweets are 100% serious.
00:55:55.000 That's it.
00:55:56.000 That's all I care about.
00:55:56.000 Um, I think I've crafted a strategy that creates a paradox that cannot be countered in any reasonable capacity.
00:56:06.000 I'm sure there will be attempts, but the ultimate issue is this.
00:56:09.000 I tweeted, abolish the ATF.
00:56:11.000 I tweeted, abolish the IRS.
00:56:13.000 Which one of those, if any, are my actual opinions?
00:56:16.000 I also tweeted, you know, there was a study saying that, you know, Fauci says something like COVID lockdowns, or they said something like, we don't know if there's ever going to be an end to the pandemic.
00:56:28.000 And so then I said, okay, then release all the restrictions.
00:56:31.000 There's actual opinions in there.
00:56:32.000 Which one's the real one?
00:56:34.000 Maybe you can come to one of my, you know, podcasts and try and figure out how I really feel about things.
00:56:39.000 Do I really want to abolish the ATF?
00:56:42.000 I said abolishing the police is wrong.
00:56:43.000 What's my real opinion?
00:56:44.000 Go ahead and figure it out, journalists.
00:56:46.000 You won't be able to, which means any tweet ever used by any publication from me will be them publishing complete bunk BS because you will not be able to determine which one of my tweets is real.
00:56:58.000 Yeah, I do this, only I change my name when I do it.
00:57:00.000 You know, I often change my name on Twitter.
00:57:02.000 So, this doesn't inspire me, by the way.
00:57:04.000 I gotta change my name on Twitter back to King of Your Mom, so that some journalist puts that in my Wikipedia.
00:57:08.000 But he really does think he's the king of your mom.
00:57:12.000 Because I am.
00:57:13.000 But no, I changed my name, though, to Lames Ginzy.
00:57:15.000 Lames Ginzy is my troublemaking, like, if I'm gonna say, you know, very smart opinions, if I'm a very, let's see, James Lindsay is a deplorable or something now.
00:57:24.000 And so Lames Ginzy is my very smart person persona.
00:57:28.000 He has all the correct opinions.
00:57:30.000 But by doing that, you're making sure they know which one is the real and which one is the... Well, no, no, no.
00:57:36.000 The problem here is that when you change your name, it changes your name on all your tweets.
00:57:40.000 So I do it and I make the tweets and like an hour later, it doesn't make any sense.
00:57:44.000 So then I have the same paradox.
00:57:46.000 I just throw a wink and a nod to my peeps that actually pay close attention to me.
00:57:50.000 I'm also going to tweet a lot of nice things about lefties.
00:57:54.000 Good.
00:57:54.000 Just to be nice.
00:57:57.000 My favorite thing to tweet about lefties is what the president said.
00:58:03.000 I don't know which president.
00:58:04.000 A president.
00:58:05.000 Some president said that you can't hate the haters and losers.
00:58:08.000 You gotta love them because they can't help that they were born effed up.
00:58:13.000 It's true, love your enemy.
00:58:14.000 I mean, it does help, even to win a war, if you love the enemy, you'll be better served at destroying them, you know, if you were fighting them.
00:58:22.000 I look at it this way, you know, I was approaching everything wrong.
00:58:25.000 For the longest time, I knew that my word was meaningless for Wikipedia and for these journalists.
00:58:31.000 There was, you know, like the Zeppelin thing should have been a really strong wake-up call for me.
00:58:35.000 No matter how many times I kept screaming I didn't make a Zeppelin, people were like, the guardian, I think it was the guardian that wrote it.
00:58:39.000 So like, shut up, you're not allowed.
00:58:42.000 And so it's like, okay, why don't I just give ridiculous quotes to journalists and let them write things and then have that be... Could you imagine the historical record?
00:58:50.000 I did that to Zach Beauchamp whenever he interviewed me about the Grievance Studies Affair.
00:58:53.000 He called me and it was like the most leading conversation ever.
00:58:56.000 He was like, wow, that's very interesting.
00:58:58.000 I learned something new.
00:58:59.000 I had no idea that was true.
00:59:00.000 Could you tell me a lot more about that?
00:59:03.000 And I was like, I'm just going to plant quotes and see what he publishes.
00:59:07.000 And he wrote the best article ever.
00:59:09.000 I love it.
00:59:09.000 I mean, it's awful to me, but it's so funny because I know what I said and I know what I wanted.
00:59:15.000 It's like nonsensical.
00:59:15.000 He published things I wanted him to publish.
00:59:17.000 Right.
00:59:18.000 So I owned him and he doesn't even, to this day, he has no idea that I absolutely owned him on that article.
00:59:22.000 I think there's two things people can do that they don't realize.
00:59:24.000 Become active on Wikipedia.
00:59:26.000 And then the interesting thing was watching people on Wikipedia who are like, they have legitimate accounts and they're longstanding editors arguing over the philosophy and merits and the rules of Wikipedia.
00:59:36.000 And it just becomes absolute, it's bedlam.
00:59:39.000 Like, you can't use original research, but Tim Pool admitted it, I don't care if he admitted it, and they're just fighting.
00:59:44.000 Yeah, this is beautiful.
00:59:45.000 What are you gonna do?
00:59:46.000 Because, I mean, it reduces Wikipedia to absurdum.
00:59:48.000 Like, it's not even like it's an unreliable source, like all your college professors might have told you, or your high school teachers, or whoever.
00:59:54.000 It's like, it's actually, it's actually just reduced to absurdity.
00:59:59.000 Yep.
01:00:00.000 One of my favorite things is that, like, I haven't taken Twitter seriously in a really long time.
01:00:04.000 Oh no, that's key.
01:00:05.000 You can't take Twitter seriously.
01:00:07.000 You're doing it wrong.
01:00:08.000 All these people are mad at me.
01:00:09.000 They're like, what are you doing on Twitter?
01:00:10.000 I'm like, having fun.
01:00:11.000 It's like a sewer pipe.
01:00:12.000 Like, as fun as a sewer pipe.
01:00:14.000 I mean, it's not like... That's what I used to say.
01:00:16.000 It just sends information out, which is in the form of whatever you want to send through it.
01:00:19.000 You said it was a sewer that followed you back in the day, but my other buddy said, no, it's more like a dumpster fire you pack up in a backpack and take with you.
01:00:25.000 And I thought that was the best way to ever put it.
01:00:28.000 Accelerationism, I think, is the wrong idea.
01:00:31.000 Typically, people use it to refer to physical chaos and violent stuff.
01:00:35.000 But I think screwing with the press to prove the failures and the paradox of these systems, I wouldn't call that accelerationism.
01:00:44.000 I would call that...
01:00:46.000 It's almost like sabotage.
01:00:49.000 Well, it is.
01:00:50.000 Sabotage can be a form of acceleration.
01:00:51.000 Like if a dam is cracking and you're like, dude, that dam's going to break and kill millions of people.
01:00:56.000 So you go and you smash the dam.
01:00:58.000 You accelerate what's going to happen anyway, because you're like, let's just get it over with.
01:01:02.000 But then you're the one that's blamed for destroying the dam.
01:01:04.000 I'm not talking about acceleration.
01:01:06.000 What I'm talking about is, so long as the media goes unchecked, it will not be a dam that's about to break.
01:01:11.000 It will last forever and it will grow and then make lots of money.
01:01:13.000 I think it has to break.
01:01:14.000 I don't think the truth always lasts.
01:01:17.000 Not okay, so until they drive people to kill each other... See, my usual answer with
01:01:20.000 that is, the truth is always gonna win in the end, right? So it will break. But the
01:01:24.000 Soviet Union lasted 70 years. I don't think that's... that's a long time, so I don't
01:01:28.000 think the truth always lasts. I think the winner of the war writes the history. I
01:01:32.000 mean, reality is still gonna bat last. You could... The lie only maintains so long.
01:01:36.000 And whether you know what actually happened or not is a different question.
01:01:40.000 But the dam, which I'm thinking of it as the ability to continue maintaining the lie, that will give, and eventually the truth will out.
01:01:52.000 But the problem is that, A, like I said, the Soviet Union lasted 70 years of not that good, to put it mildly, and China is still going.
01:02:02.000 Yep.
01:02:02.000 It's been a long time.
01:02:03.000 It's still not great.
01:02:05.000 And simultaneously, you know, when you're talking about something like a cultural revolution, which is frankly what we're in the midst of, accelerationism can push people like you're talking about a dam breaking and millions of people dying.
01:02:17.000 Yes.
01:02:18.000 Tens of millions of people died in the cultural revolution.
01:02:20.000 You are correct.
01:02:21.000 And accelerating toward that rather than trying to figure out a way to divert the course of the river is probably not the best strategy.
01:02:29.000 I have an analogy.
01:02:31.000 It's like you're in a car, right?
01:02:33.000 And you're, like, on an abandoned train track, and you can see that up ahead, the bridge is out.
01:02:39.000 And if you try and slow down, you might just go off the cliff.
01:02:42.000 But if you slam that gas, baby, you could jump the gorge and then...
01:02:46.000 Or, or reach, what, 88 miles an hour?
01:02:48.000 That's what I'm going to say.
01:02:48.000 You got to hit 88 miles per hour.
01:02:50.000 Back in time when the bridge was still there.
01:02:53.000 Yeah.
01:02:53.000 No, no.
01:02:53.000 Forward in time.
01:02:55.000 When the bridge was completed.
01:02:55.000 Even better.
01:02:57.000 But you need to have like a pink... Actually, that's a really interesting... In Back to the Future, the bridge wasn't there yet.
01:03:03.000 And they hit 88 and then hit the future where the bridge did exist.
01:03:06.000 And it wasn't about going super fast.
01:03:07.000 It was just the right speed.
01:03:09.000 So nobody died.
01:03:10.000 There was no destruction.
01:03:11.000 So perhaps if we... Oh no, the train went off though and blew up.
01:03:15.000 You're right.
01:03:15.000 There was some destruction.
01:03:17.000 But the train was blowing up anyway because that crazy chemist put those pink logs in there that were blowing its gaskets and stuff, right?
01:03:24.000 Great Scott!
01:03:25.000 So here's the analogy.
01:03:26.000 1.21 gigawatts.
01:03:27.000 We're all in the woke train, and if we don't get it to the right speed, the DeLorean won't make it to the future where the bridge is complete and we can all live peacefully, but the woke train is gonna go off no matter what.
01:03:38.000 That's correct, actually.
01:03:39.000 I think that's a good way.
01:03:42.000 Pop culture reference!
01:03:43.000 I was like, this is going right off the gorge.
01:03:46.000 No, and then it landed.
01:03:47.000 You landed that, Tim.
01:03:48.000 There's a lot of progressive pundits who get really triggered by pop culture references.
01:03:52.000 Yeah.
01:03:52.000 I'm like, you know, like, you don't have to like it. I don't care. Probably a lot of people might not like it. One point
01:03:57.000 twenty-one gigawatts! Regular people understand pop culture, and it's interesting to me.
01:04:03.000 Who are these progressives who, like, don't watch movies and don't watch TV shows and, like, are not part of culture?
01:04:08.000 Maybe that's why they're in this weird space where they believe these weird things.
01:04:11.000 I don't know.
01:04:13.000 They're sad.
01:04:15.000 They're sad.
01:04:15.000 How fun is it to talk about Back to the Future?
01:04:17.000 It's way better.
01:04:18.000 It's a great movie.
01:04:19.000 It was fun.
01:04:19.000 That was three, right?
01:04:20.000 That was three.
01:04:21.000 Three was great.
01:04:22.000 I loved it.
01:04:22.000 Christopher Lloyd.
01:04:23.000 He's incredible.
01:04:24.000 He's good.
01:04:24.000 Okay, so we're talking about...
01:04:26.000 Accelerationism, the culture war.
01:04:28.000 Do you think that it's like a Chinese or a communist intentional plot to subvert the political structure?
01:04:35.000 One of many that this guy, what was the guy's name?
01:04:37.000 The Albanian guy?
01:04:38.000 Oh, Antonio Gramsci.
01:04:39.000 This communist conceptual artist, Antonio Gramsci.
01:04:43.000 Gramsci.
01:04:44.000 Do you think they're using that philosophy now to try and undercut American society?
01:04:49.000 Yes and no, more yes than no.
01:04:52.000 The infiltration of the institutions and the establishment of a counter-hegemony and a counter-state, which the peeps on the right call a deep state, is certainly something that's been an objective, that's been explicit by the radical left intelligentsia.
01:05:08.000 It was definitely adopted by big players like Herbert Marcuse in the 1960s.
01:05:12.000 And you're like, who?
01:05:14.000 He was a rock star.
01:05:15.000 He had huge followings.
01:05:16.000 You're talking 1960s and he could, like, he could pack a house at 300,000, you know, sales of his book in the first, like, couple years it came out or something in the 60s.
01:05:25.000 So this is a, he was a rock star on the left.
01:05:27.000 He was mentor to Angela Davis.
01:05:29.000 We're talking about BLM and the prison stuff.
01:05:31.000 He was, he was her PhD mentor at UCSD.
01:05:34.000 And so it's like, he's not a fringe figure.
01:05:38.000 And he deliberately said in the 1960s that it's time for the leftist intelligentsia in
01:05:43.000 the universities to start teaming up with the cultural outsiders and the racial minorities,
01:05:48.000 specifically, to form kind of a bloc that would operate in kind of unison by the racial
01:05:54.000 minorities.
01:05:55.000 He didn't, of course, mean racial minorities.
01:05:56.000 What he meant was the radicalizable black liberationists who are operating within his
01:06:01.000 paradigm, which was the paradigm that was identified by the Communist Party in the 20s
01:06:04.000 and 30s as the wedge issue that would open America.
01:06:07.000 So there is an element to where it's very deliberate.
01:06:10.000 Now, your typical woke person has never heard of any of this, has no idea.
01:06:13.000 They're just trying to be a good person.
01:06:14.000 They put their black square on their Instagram.
01:06:17.000 They're like, but I do care about racism, and they have no idea about any of it.
01:06:20.000 Your typical professor of gender studies probably has a dim idea at best, maybe hasn't even ever heard of Gramsci.
01:06:28.000 So, probably has no idea that this actually was a communist plot, although they're going to still say, you know, down with capitalism, down with capitalism.
01:06:35.000 On my flight in here, I was actually reading Herbert Marcuse's essay on liberation, which was written in 1969.
01:06:42.000 And he's just like, all through it, he's like, You know, what we have to do is abolish capitalism in order to make room for socialism.
01:06:50.000 So this was certainly something that was in that line of thought all the way back, you know, in the 60s, and was deliberately implanted into the scholarly literature in the universities very intentionally then.
01:07:02.000 So there is an intentional aspect to it.
01:07:05.000 It's also modeled after what we are experiencing as a cultural revolution.
01:07:08.000 You know, you brought up China.
01:07:10.000 The Chinese had a cultural revolution in the 60s, '66 through '76, or '65 through '75, or something like that.
01:07:18.000 The way for that was paved—I know Mao is his own special kind of character—but the way for that was paved with the same rhetoric as we see in critical race theory, applied to the Han race instead of the white race.
01:07:28.000 We talked about Han supremacy starting in the 1920s in China.
01:07:31.000 We started talking about how some people were good Hans and some people were Han supremacists.
01:07:36.000 Like, separating all of it out, the rhetoric mirrors really closely exactly what's going on now, and they started to create basically racial disharmony throughout China, and that was kind of part of the grounds, that instability, upon which Mao was able to step in.
01:07:53.000 Would you say, though, is your typical critical race theorist a Marxist?
01:07:57.000 Well, some of them are.
01:07:58.000 I mean, you had the founders of Black Lives Matter come out and say, you know, we're trained Marxists.
01:08:02.000 They got caught on video saying that.
01:08:04.000 Was that Patrisse Cullors?
01:08:04.000 I don't think they were caught on saying it.
01:08:06.000 I think they were probably saying it.
01:08:06.000 Well, I know they probably were saying it.
01:08:07.000 It was like a public thing.
01:08:08.000 Yeah, I mean, and if you read their website before they scrubbed it, it was like comrades this, comrades that.
01:08:13.000 The Black Lives Matter fist is actually just the communist fist.
01:08:16.000 Right.
01:08:16.000 Yeah, exactly.
01:08:17.000 It's not even the Black Lives Matter fist, but people call it that.
01:08:19.000 Right.
01:08:20.000 I mean, that's exactly right.
01:08:21.000 It is just a communist fist.
01:08:24.000 So there's something very deliberate there.
01:08:26.000 But again, your average critical race theorist may not know very much of this, because they're just caught up in the theory, just going along with the theory.
01:08:33.000 So there's like this lack of intentionality.
01:08:35.000 But then you say, who's funding this?
01:08:37.000 Of course, the universities are giving it space, but who's funding this?
01:08:41.000 And you start looking at the various, you know, huge organizations that are funding this, throwing billions of dollars into it.
01:08:48.000 You have, for example, Kimberlé Crenshaw is one of the—she is the creator of intersectionality.
01:08:53.000 She was one of the creators of critical race theory.
01:08:56.000 She was a—she cites Angela Davis, who we just mentioned is a student of Marcuse.
01:09:02.000 Well, she runs this thing called the African American Policy Forum that's almost completely funded by the Open Society Foundation.
01:09:07.000 This is one of these big billionaire philanthropists that's basically dumping money into this.
01:09:12.000 That's George Soros.
01:09:13.000 That's Soros, yeah.
01:09:14.000 What's the agenda there?
01:09:15.000 Was he a communist?
01:09:16.000 Well, I don't know.
01:09:17.000 I have no idea.
01:09:18.000 Mackenzie Bezos also put $2 billion into critical theory.
01:09:21.000 Yeah, I mean, tons of them.
01:09:22.000 I mean, the Oregon Ethnic Studies Ethnic Math Program that just made headlines everywhere, where they were, like, saying focusing on the right answer is white supremacy culture.
01:09:29.000 Two plus two is five.
01:09:31.000 Two plus two is five.
01:09:32.000 Well, that one's not in there, but it's in the wheelhouse.
01:09:35.000 Yeah, but Ethnomathematics for the state of Oregon, which just got pushed through by law, I think, into their schools, that's funded by Bill and Melinda Gates.
01:09:43.000 The whole program was funded by the Bill and Melinda Gates Foundation.
01:09:46.000 And so it's like, are they communists?
01:09:50.000 I mean, it's like, I don't know.
01:09:51.000 I don't think so.
01:09:52.000 Maybe.
01:09:52.000 I don't know.
01:09:55.000 What's a good word for people like that?
01:09:57.000 You know, like megalomaniac?
01:09:59.000 I mean, megalomaniac is usually kind of the thing, right?
01:10:03.000 It's people who think that they have the capacity to just step in and do good.
01:10:07.000 Like the bad guy in the movie.
01:10:10.000 Do they know how to work?
01:10:11.000 Yeah.
01:10:12.000 I'm glad you brought that up.
01:10:15.000 Isn't that film just a bit on the nose?
01:10:18.000 Yeah!
01:10:18.000 Like, they even make the My Fair Lady reference.
01:10:21.000 What's that?
01:10:21.000 Okay, so you know the story of My Fair Lady, right?
01:10:24.000 It was like a woman was like a tramp or something?
01:10:27.000 Yeah, yeah, Eliza Doolittle.
01:10:29.000 And so this is written by George Bernard Shaw.
01:10:31.000 It was actually originally titled Pygmalion and modeled after the Pygmalion myth, but we don't have to go into Pygmalion.
01:10:37.000 So Eliza Doolittle is this tramp, basically.
01:10:42.000 And these two rich guys find her and they're like, I have a bet, you know, and they're gonna bring her in and teach her her manners well enough to fake it and if she can get to the end of this, you know, fancy dinner at the end of a certain amount of time and convince everybody that she's not basically a flower girl, which I think is what it actually is.
01:11:01.000 she's selling flowers or something, and that's of course a symbol for something.
01:11:05.000 Then, you know, one or the other wins the bet and it's all this gentleman's bet or
01:11:08.000 whatever. And so that's what My Fair Lady is about. So that was written though by
01:11:12.000 George Bernard Shaw, who was a leader of the Fabian socialists, which had the
01:11:16.000 crest of a wolf in sheep's clothing, and it had the explicit agenda. If you can
01:11:23.000 look up there, they have a stained glass window that's very famous called the
01:11:25.000 Fabian glass or the Fabian window, and it actually says something like, it shows
01:11:29.000 the world heated up on a forge, set on an anvil, and they're hitting it with
01:11:33.000 hammers. And one of the people hitting it is George Bernard Shaw, and it's like
01:11:36.000 heated up and remolded to the heart's desire or something like that.
01:11:42.000 So they're trying to reshape the world, and they were socialists, and the goal was incremental sneaking in of socialism, wolf in sheep's clothing, so that people don't realize what you're doing.
01:11:49.000 It says, remold it nearer to the heart's desire.
01:11:52.000 Yeah, that's what it says, that's right.
01:11:52.000 Pray devoutly, hammer stoutly.
01:11:54.000 Wow.
01:11:55.000 Yeah, so this is actually a really famous piece of glass.
01:11:57.000 The Fabian Society is not well known now.
01:12:00.000 It, I think, still exists.
01:12:01.000 It spun off the Labour Party in the UK and probably the other Labour parties.
01:12:05.000 It also spun off the London School of Economics, which is probably its main operating base of think tanks now.
01:12:12.000 And the reference, like, at the very beginning of Kingsman, you know, you have, again, this kind of, like, roughneck Essex Cockney kid, right?
01:12:22.000 And he's just, like, all kinds of in trouble because he grew up without his dad.
01:12:25.000 His dad died in the thing.
01:12:27.000 And he has a special, like, Kingsman thing if he ever gets in trouble.
01:12:31.000 And he's always going around and horsing around getting in trouble.
01:12:33.000 And he has a foul mouth.
01:12:35.000 It's really kind of funny the way he talks, you know.
01:12:38.000 Eggsy is his name.
01:12:39.000 And then he ends up getting in trouble and he calls and it's like these gentlemen appear and now they're gonna make him into a gentleman warrior.
01:12:47.000 Yeah.
01:12:47.000 Secret service agent.
01:12:48.000 Gentleman spy.
01:12:49.000 And you know the movie's hilarious.
01:12:51.000 It's really worth watching.
01:12:52.000 It's like a semi... I don't know.
01:12:54.000 It's not even semi-serious.
01:12:55.000 It's a spoof off of like James Bond type stuff.
01:12:58.000 It's pretty serious.
01:12:59.000 But it's pretty serious too.
01:13:00.000 Yeah, but like it kind of it plays into the silliness of the gentleman spy.
01:13:03.000 Right, and so the theme of the first, there's two of the two films, and the theme of the first film though, you have, is it Samuel L. Jackson?
01:13:10.000 Yeah.
01:13:11.000 Is playing this megalomaniac, cell phone company, big tech mogul, Vincent Valentine or something, something Valentine, I forgot his first name.
01:13:22.000 And he basically has decided that we are the virus and the planet is going to die from global warming.
01:13:27.000 And so he devises this technology that can make people go insane and kill each other if they hear a noise.
01:13:37.000 And so what he offers is everybody in the world gets free cell phones, free SIM cards, free internet forever or whatever.
01:13:42.000 So everybody goes to get their free SIM cards and then he plays the sound when everybody has them.
01:13:46.000 And everybody starts killing each other everywhere they are to bring down the global population.
01:13:53.000 But the tramp in this movie stops him and saves all the people.
01:13:58.000 Well, that's right.
01:13:58.000 That's exactly right.
01:13:59.000 So is that kind of defying Shaw's story?
01:14:05.000 I mean, I watch this and I'm just... I mean, with the point, of course, I just brought up Shaw because they reference the movie, right?
01:14:12.000 And so they reference the play, I should say, My Fair Lady.
01:14:17.000 What I mean is, in the movie, the bad guy is the guy who wants to kill all the people to stop global warming.
01:14:22.000 Correct.
01:14:23.000 Using a high-tech media device that makes people crazy, which is super on the nose.
01:14:28.000 Exactly.
01:14:29.000 And the good guy is the regular person who didn't want to be involved and got dragged in and found an opportunity and then says, I'm going to stop these people and save the ones that I love.
01:14:38.000 Oh, right.
01:14:39.000 A regular guy who along the way learns manners maketh man.
01:14:42.000 Yeah.
01:14:42.000 Right.
01:14:42.000 So that's like their like little line of the gentleman warrior or spy or whatever.
01:14:47.000 But yeah.
01:14:48.000 So the question I watch this and I'm like, Did the people who wrote this film know?
01:14:54.000 Like, they made the My Fair Lady reference.
01:14:55.000 Did they know about the Fabian Socialists?
01:14:57.000 The plan of the Fabian Socialists?
01:14:58.000 They know where this is, like, what could be going on here?
01:15:02.000 Are they tapped into the idea that we're using media devices, like, in our pocket?
01:15:06.000 It's not noises, it's social media, to drive ourselves nuts and hate each other?
01:15:11.000 Have you seen, um, what's that, uh, what's that TV show?
01:15:14.000 The, the, uh, the virus one?
01:15:16.000 Oh, Utopia?
01:15:17.000 Utopia!
01:15:18.000 Yeah.
01:15:18.000 Have you seen Utopia?
01:15:19.000 I have not, but what a name.
01:15:20.000 They cancelled it.
01:15:20.000 Do you want to know why they cancelled it?
01:15:21.000 Too on the nose?
01:15:23.000 Yep.
01:15:23.000 Yep.
01:15:24.000 Whoa.
01:15:24.000 The show is about a, uh, there's these people and, spoiler alert, it's an old show but they
01:15:30.000 remade it on Amazon.
01:15:31.000 The Amazon remake apparently got cancelled because it was too on the nose.
01:15:34.000 Let me explain that, but, you know, let me explain.
01:15:39.000 There's a comic book, and in the comic there's clues to what's really going on.
01:15:44.000 To put it simply, a piece of visual entertainment gives a group of people foreknowledge as to what's really going on and what's really going to happen.
01:15:53.000 As it turns out, this tech mogul guy who thinks that we're overpopulated and the planet is destroying itself stages a pandemic and then offers up everybody a vaccine which sterilizes them.
01:16:08.000 And they canceled it because they were like, yeah, no.
01:16:15.000 And so the point was conspiracy theorists were saying a piece of visual entertainment that we can watch is talking about a virus that's being staged, you know, that's where a tech mogul is pushing a vaccine.
01:16:29.000 Little on the nose with what's going on right now.
01:16:31.000 And so, you have a show, they're reading a comic, learning the future, or what the plan is, the code names, Tech Mogul.
01:16:40.000 Well, we got a Bill Gates, we got a COVID-19, we got a vaccine program, and we got it free for everybody.
01:16:46.000 And so they were like, you know, stop the show.
01:16:48.000 Yeah, wow.
01:16:48.000 And the original version of it actually was back in like, I think 2011 or something in the UK.
01:16:52.000 Oof.
01:16:53.000 Yeah.
01:16:54.000 I mean, so, I mean... Life imitates art, I guess?
01:16:57.000 Do you ever see that video game Plague Inc.?
01:16:59.000 No.
01:16:59.000 Plague Incorporated?
01:17:00.000 It was like, the object... It's still super popular.
01:17:02.000 Yeah, well it got banned in China right after COVID.
01:17:05.000 Yeah, I remember that.
01:17:06.000 They banned it.
01:17:06.000 But the point of the game is you're playing as a virus or a pathogen of some sort and you want to infect all of humanity and destroy them.
01:17:11.000 And so after China banned the game, some time went by, they were like, we feel really bad about this.
01:17:16.000 They just released their expansion, The Cure.
01:17:18.000 So now you play as a doctor trying to vaccinate and stop the plague.
01:17:23.000 It's free for everyone until COVID's done, by the way.
01:17:25.000 Yeah.
01:17:26.000 So good, bad, I don't know, but super keyed into the manipulation of media.
01:17:30.000 That's for sure.
01:17:30.000 You know, there's a gender studies paper that's titled Women's Studies as a Virus, right?
01:17:36.000 Is that yours?
01:17:36.000 No, no, no.
01:17:38.000 This one's real.
01:17:41.000 It's by Michael Karger and Breanne Fahs.
01:17:41.000 And they actually argue that the virus makes the ideal metaphor for feminist and women's studies and gender studies pedagogy.
01:17:48.000 And they say that what it should be is that people are being infected in their disciplines and by maybe minoring in the subject or whatever, then going off into graduate school to infect other disciplines.
01:17:59.000 And then they compare themselves.
01:18:01.000 I kid you not.
01:18:02.000 They compare themselves favorably to All of HIV, Ebola, and cancer.
01:18:08.000 Wow.
01:18:09.000 They compare their own discipline favorably.
01:18:11.000 They say HIV is immune suppressing and so you have to take up steps that make you immune suppressing because it's a very effective virus.
01:18:18.000 Ebola is extraordinarily powerful and dangerous and contagious.
01:18:22.000 I'm not joking.
01:18:24.000 They talk about viruses entering cells, maybe like the HPV or whatever, and changing the DNA in the cell, and then that leads to cancer.
01:18:30.000 They said that represents permanent transformational change of disciplines and other departments and other walks of life and affinities.
01:18:39.000 Cancer?
01:18:40.000 That's how they want to permanently change systems?
01:18:42.000 They literally call themselves Cancer, HIV, and Ebola.
01:18:46.000 I wish the grand conspiracies were real sometimes, because life's kind of boring.
01:18:50.000 I mean, it's like... And it's probably just you look for patterns.
01:18:52.000 I think they are sometimes.
01:18:54.000 That's the crazy part.
01:18:55.000 Yeah, like Gulf of Tonkin, and we know that.
01:18:56.000 Yeah, we do know that.
01:18:57.000 For a fact.
01:18:58.000 I mean, for admission, anyway.
01:19:00.000 Mostly.
01:19:01.000 I think there is still some, you know, pushback.
01:19:03.000 But sometimes these crazy conspiracies are real, which makes life worth living.
01:19:07.000 I mean, it's just so interesting.
01:19:08.000 It also makes life very dangerous, if you look into certain things, like State Street, BlackRock, and, uh, what's that other big investment firm?
01:19:16.000 There's like three investment firms that run the earth, basically.
01:19:19.000 It's the real government.
01:19:21.000 Yeah, yeah.
01:19:21.000 At some point, and this is true, a lot of people realize because I've actually talked to politicians who've said this.
01:19:27.000 It's called Vanguard?
01:19:28.000 Vanguard, BlackRock, and State Street.
01:19:30.000 Vanguard.
01:19:31.000 Biggest investment firm in the world.
01:19:32.000 I've actually met politicians who say it's sometimes better to have the power in the shadows than it is to be the public face.
01:19:38.000 And so a lot of politicians are straight-up performative.
01:19:40.000 They're there to entertain and keep stability in the sense that, like, It's almost like, you know, with the Colosseum of Rome.
01:19:48.000 People were suffering, they were hungry, there was chaos, there was corruption, so give them fights and throw bread at them.
01:19:53.000 Bread and circuses, right?
01:19:55.000 Well, now they're effectively doing that in Congress.
01:19:57.000 You'd see, for the longest time, these politicians yelling, like, you know, my colleague over there across the aisle just hates everybody and I am sick of it!
01:20:04.000 And then once it's over, like, you see the cameras, like, you know, the actual news network would pull away, but the C-SPAN never turns off, but nobody's watching it.
01:20:18.000 You'd watch them shake hands and grab a drink together and be hanging out and laughing, and it was like all a performance.
01:20:24.000 Because they keep the people in this country divided to a certain degree.
01:20:28.000 Maybe it's on purpose, maybe it's not, but it's actually what the machine does.
01:20:33.000 Whether these politicians intentionally do it, I'm sure a lot of people would say they would, the machine does it regardless.
01:20:39.000 Right.
01:20:40.000 The city people and the rural people are divided.
01:20:42.000 Even in red states, like, small towns are blue.
01:20:46.000 It's the weirdest thing.
01:20:47.000 I was in rural West Virginia and I drove through a tiny town and it was all critical race theory.
01:20:52.000 And I was like, wait, I just drove like a mile away and it was all Trump signs.
01:20:56.000 But in a small town where you had proximity, more density, it was like Black Lives Matter flags and stuff.
01:21:01.000 So you think they're playing performance, these politicians, because it's too dangerous to go deep on, like, the real issues?
01:21:07.000 The real companies, like Vanguard and stuff?
01:21:09.000 No, I think the problem is they don't care and they don't pay attention.
01:21:12.000 That might be true.
01:21:12.000 So it's an easy opportunity for those with massive power to just keep gaining power and no one will stop them.
01:21:17.000 I mean, Kennedy got killed.
01:21:19.000 I don't know if it was a political coup or not, but the dude was speaking out against the military-industrial complex and got killed.
01:21:25.000 That sucked.
01:21:25.000 So, and not since have any president ever come out against the military.
01:21:30.000 Yeah, he did a little bit.
01:21:30.000 Actually, not really.
01:21:31.000 No, he didn't really.
01:21:32.000 Yeah.
01:21:32.000 I mean, a little bit.
01:21:33.000 Weapons deals.
01:21:33.000 He acknowledged it, but he didn't like do anything.
01:21:35.000 He wasn't saying, I'm going to break it up.
01:21:37.000 He was all in favor of the weapons deals because of the money, but he was not in favor of the war because he promised to end it.
01:21:42.000 And so it wasn't, perhaps the, uh, if you believe in the conspiracies, it wasn't a kill worthy thing.
01:21:48.000 It was like, he's still selling our weapons.
01:21:49.000 We're getting money.
01:21:50.000 Using drone bombers.
01:21:52.000 How do we sell more if he's not using them?
01:21:54.000 You know what I mean?
01:21:55.000 So they didn't like that.
01:21:56.000 They wanted more war.
01:21:57.000 But he wasn't, like, overtly like, I'm gonna break up the military-industrial complex, like Kennedy said.
01:22:02.000 He was very overt about it.
01:22:04.000 Wow.
01:22:04.000 Fearless.
01:22:05.000 So, back to the China question.
01:22:08.000 Uh-oh.
01:22:09.000 Well, so, we had China Uncensored on the show.
01:22:11.000 I asked him about this, and I'm wondering, are you familiar with Thucydides' trap?
01:22:15.000 Yeah, vaguely.
01:22:16.000 Basically, like, we're facing a potential war with China based on historical precedent.
01:22:20.000 I'm wondering if we're already... Well, actually, we're already in it.
01:22:23.000 You know, a lot of people might not want to acknowledge this, but cyber war between the US and China has been pretty intense for a long time.
01:22:29.000 Let me just draw back to that word, political war.
01:22:31.000 Right.
01:22:32.000 Political warfare has been happening.
01:22:35.000 The biggest sign that I ever saw of that was the Chinese have a term to make fun of the woke, which is baizuo.
01:22:39.000 Baizuo, yes.
01:22:40.000 Baizuo, yeah.
01:22:41.000 It means white left.
01:22:43.000 Yeah, white left.
01:22:44.000 It does, yeah.
01:22:44.000 And then all of a sudden, like, the Wumao accounts on Twitter... Yeah, Twitter stopped making fun of them.
01:22:50.000 It's like all of a sudden it stopped making fun of them and then the accounts were like, America has a big systemic racism problem.
01:22:56.000 And it's like all of a sudden they realized that this is a weapon.
01:23:00.000 Because they used to make fun of the white left.
01:23:02.000 Now they can see that it's destroying the country and they should embolden it.
01:23:05.000 Was Mao's revolution a form of political war in the beginning?
01:23:09.000 I mean, yeah, they were engaging in political warfare for sure.
01:23:12.000 The goal with the Cultural Revolution was kind of to cover up for the failure of the Great Leap Forward so that Mao could reclaim power.
01:23:19.000 So the Great Leap Forward was when he took all the farmers out of the farms.
01:23:22.000 Yeah, and he took all the farmers and said, you guys are going to all make steel, because steel is the one big measurement.
01:23:26.000 Pig iron.
01:23:26.000 Yeah, and they all ended up making crappy pig iron, and nobody was growing food.
01:23:31.000 Command economy, baby!
01:23:33.000 It was the Great Leap into the dumpster, is what it worked out to be.
01:23:36.000 The Great Leap into 50 million dead people, pretty much.
01:23:38.000 Okay, now that's not funny.
01:23:39.000 Not so funny.
01:23:41.000 Yeah, so that was a huge failure and they actually kind of chased Mao out of power.
01:23:45.000 I'm not real sharp on the Chinese history with remembering who all the names are in the Gang of Four and all of this stuff.
01:23:58.000 But then Mao came back, basically, with his little red book, indoctrinating students and young people in universities and schools,
01:23:58.000 and unleashed what he called the Red Guard and the Cultural Revolution to destroy the
01:24:03.000 four olds of society.
01:24:05.000 And you know, old habits, old ideas, old customs, old ways of thinking.
01:24:10.000 What do you think is happening right now?
01:24:12.000 It's like, you know, Dr. Seuss was a man of his time.
01:24:14.000 We have to destroy that.
01:24:16.000 His time was terrible.
01:24:18.000 Thomas Jefferson was a man of his time.
01:24:19.000 Throw a statue in the lake.
01:24:22.000 Man of his time.
01:24:23.000 Take the name off the school in San Francisco.
01:24:26.000 Throw his statue in the lake, too.
01:24:28.000 You know, write like the Antifa A or whatever it is on his face with spray paint.
01:24:33.000 Anarchy A. Which apparently they don't understand what that means.
01:24:36.000 I love this.
01:24:38.000 Yeah, exactly.
01:24:40.000 That's a better one.
01:24:42.000 Authoritarianism.
01:24:43.000 By the way, in that essay, not to dive into this, but in that essay on liberation I was reading, Marcuse repeatedly refers to the need to raise the red and the black flag.
01:24:54.000 That's what he's always referring to throughout that, in case you wonder who he's connected to.
01:24:58.000 Yeah.
01:24:58.000 What's that, the fascist and the communist flag together?
01:25:00.000 No, no, red and black is left anarchy.
01:25:02.000 It's a symbol of, what is it, black is labor, and red is...
01:25:08.000 Yeah, it's socialism and anarchy is what it is.
01:25:12.000 And then there's black and yellow.
01:25:13.000 That's why Luke really likes black and yellow.
01:25:15.000 That's capitalism.
01:25:16.000 That's the fancy symbol of Antifa, has a black and a red flag.
01:25:21.000 That's what it is.
01:25:23.000 And so he's referring to that in 1969.
01:25:24.000 It's like empowering the black and the red.
01:25:26.000 He even says the black and the red flag must be empowered behind this.
01:25:30.000 And it's like, huh?
01:25:32.000 Antifaschistische Aktion, or however it's said in German.
01:25:35.000 My German pronunciation is not very good.
01:25:36.000 Do you know what the Berlin Wall's real name was?
01:25:38.000 No.
01:25:39.000 The Anti-Fascist Protection Rampart.
01:25:41.000 That was the actual name of the Berlin Wall.
01:25:43.000 We just call it the Berlin Wall because that's the colloquial name.
01:25:45.000 That's the common name.
01:25:46.000 But I think it's important we remember what the true name was.
01:25:49.000 Yeah, exactly.
01:25:49.000 I mean, this is the thing.
01:25:50.000 So I was reading Bella Dodd's testimony again.
01:25:53.000 I read it in December or November last year, and I was reading it again a little bit today and a little bit last week.
01:26:01.000 And Bella Dodd was a member of the Communist Party USA who confessed, and she actually left the party and defected.
01:26:09.000 And in 1953, she confessed everything basically about the Communist Party's ideas to the House Committee on Un-American Activities, now known as the House Judiciary Committee.
01:26:18.000 And, well, not quite.
01:26:20.000 It assumed all of that former committee's roles and duties, which are technically not dissolved.
01:26:25.000 But she's confessing, and she says, you will always find—I put this on Twitter just yesterday, or the day before—you will always find that the communist plan is to dress up its activities in high-minded terms.
01:26:37.000 So you'll find the communists saying that we are the anti-fascists.
01:26:40.000 Right.
01:26:41.000 And so if you oppose communism, you must be pro-fascist.
01:26:44.000 She was saying this in the 1950s about how this would happen.
01:26:49.000 I'm an anti-Republican.
01:26:51.000 Good.
01:26:51.000 I mean, I have those memes about becoming an anti-Republican.
01:26:53.000 Yeah, yeah.
01:26:54.000 See, I'm an anti-Republican.
01:26:56.000 That means I believe in free speech.
01:26:58.000 It means I believe in the right of the individual to assemble, the Constitution.
01:27:02.000 And if you oppose me, you're pro-Republican.
01:27:05.000 Yeah, that's the problem.
01:27:06.000 Yeah.
01:27:06.000 Yeah, exactly.
01:27:07.000 I'm glad you understand the logic.
01:27:08.000 The communists would say, um, rightists.
01:27:11.000 They hunted down the rightists.
01:27:12.000 They did.
01:27:12.000 Rightist intent, rightist beliefs.
01:27:14.000 This was big with Mao.
01:27:15.000 Also, of course, bourgeois, but, um, but they did go right after rightist.
01:27:19.000 It's too naked.
01:27:21.000 But I mean, Marcuse put that in his essay, Repressive Tolerance, from 1965.
01:27:24.000 The thesis statement is literally almost word for word.
01:27:28.000 Movements from the left must be tolerated under all circumstances and movements from the right must not be tolerated.
01:27:33.000 What terrifies me is when I find myself saying people on the left, I feel like I've been indoctrinated and subconsciously being used as a pawn to segregate this culture war.
01:27:45.000 No, that's true.
01:27:46.000 What needs to be done is that the willful actors who are participating in this need their own identifiable name that actually sticks.
01:27:52.000 Woke is a thing, but it's slangy.
01:27:56.000 Critical social justice is a term that shows up in their literature, but it's scholarly.
01:28:00.000 You know, we could get really technical, and it's a critical constructivist.
01:28:03.000 That's what they do.
01:28:03.000 The Fighting Mongooses.
01:28:04.000 That's a good team name.
01:28:05.000 Yeah, I know, right?
01:28:06.000 It's not too bad.
01:28:07.000 Flying Eagles.
01:28:09.000 But I don't know, you know, what exactly to call them.
01:28:13.000 The Repressive Left would work, but people have tried variations.
01:28:16.000 It was Regressive Left for a long time.
01:28:18.000 Repressive is actually more... Let's call them Identitarians.
01:28:21.000 Identitarians is good, yeah.
01:28:22.000 Identity Marxists fits.
01:28:25.000 I think Identitarian works because it's the core ideology for a lot of what they push.
01:28:30.000 And Google search what Identitarian means and then, you know.
01:28:35.000 Authoritarian, I like a lot too.
01:28:37.000 But it's Identitarian.
01:28:38.000 These are people who want law based on race.
01:28:41.000 They want law and policy based on identity.
01:28:43.000 That's better.
01:28:44.000 Because if we think of it as left, then they'll use that to hunt people down.
01:28:49.000 And it's easy to be opposed to racism in all its forms, right?
01:28:53.000 Yeah.
01:28:53.000 Racism's wrong in all its forms.
01:28:54.000 There's a very powerful, simple statement that also works as what Robert Lifton would have called a thought-stopping technique, which is what they use on you all the time.
01:29:04.000 When they call you a racist, they're using thought-stopping techniques.
01:29:07.000 Lifton was, by the way, describing how Mao was able to use the Red Guard to take over when he talked about cults being built, and he was specifically analyzing the Maoist one.
01:29:17.000 They're going to put half a million dollars into this autonomous zone, apparently.
01:29:20.000 I don't know what that means.
01:29:20.000 Did they say what it was for?
01:29:22.000 No, it's GFAS.
01:29:23.000 Yeah, that's cool.
01:29:24.000 I'm not going to lie.
01:29:25.000 Stepping out of the system for a minute.
01:29:26.000 GFAS sounds cool.
01:29:28.000 Don't you want to go down and hang out at the GFAS?
01:29:30.000 I don't know.
01:29:30.000 Yes, but they'll shoot me, so I'm not going to.
01:29:32.000 Oh, never mind.
01:29:34.000 Yeah, journalists showed up and they were like, something bad's about to happen right now.
01:29:36.000 He's like, OK, we're leaving.
01:29:38.000 Yeah, that's true.
01:29:39.000 What a loser.
01:29:40.000 Yeah.
01:29:40.000 If a journalist shows up, he's like, what do you mean?
01:29:42.000 I'm outside.
01:29:43.000 And they're like, you better leave.
01:29:43.000 I'd be like, shut your mouth, kid.
01:29:45.000 Go home.
01:29:46.000 Bring it.
01:29:47.000 Make it happen.
01:29:48.000 That's a decision dilemma too.
01:29:49.000 They're going to put in a decision dilemma.
01:29:51.000 Yep.
01:29:51.000 And they lost.
01:29:52.000 And people are cowards.
01:29:54.000 Wait, what do you mean?
01:29:54.000 What's the decision dilemma?
01:29:55.000 You have to either show weakness or you have to overreact.
01:29:58.000 Interesting.
01:29:59.000 When they get up in your face and they're like, you better get out of here.
01:30:01.000 Well, I was actually talking about the, uh, just the establishment of an autonomous zone at all.
01:30:06.000 They're basically playing chicken with the city.
01:30:09.000 If the city overreacts, they're going to be like, help fascists.
01:30:12.000 This is why we need clown bloc.
01:30:14.000 Clown bloc.
01:30:15.000 Yeah.
01:30:15.000 Is that like bat anti-shark spray?
01:30:17.000 A hundred people dressed as clowns, just dancing and playing music and, like, banging drums and, like, you know, just because there's no aggression.
01:30:25.000 There's no attack.
01:30:25.000 There's no affiliation.
01:30:27.000 It's bizarre.
01:30:28.000 And it occupies space and sullies their ideology.
01:30:32.000 That's very Butlerian.
01:30:33.000 Listen.
01:30:34.000 If you send in a group of Proud Boys, it's horrifying white supremacists attacking poor Black Lives Matter protesters.
01:30:39.000 So my friend told me that the Proud Boys should all put on, like, they should march socially distanced on purpose.
01:30:45.000 They should all wear rainbow masks.
01:30:47.000 They should have, like, you know, like, the feminist power fist, like, in the circle with the plus, on their, like, emblazoned on their chest or whatever.
01:30:55.000 And they should go in singing, like, John Lennon's Imagine.
01:30:58.000 And then just see what happens, because that puts them in a decision dilemma.
01:31:01.000 But sure, but the story would then be, far-right white supremacists troll black protesters?
01:31:04.000 Yeah, maybe. Maybe.
01:31:07.000 What happens if a bunch of unnamed, unidentifiable clowns show up playing music and banging
01:31:12.000 drums and giving out funnel cake and cotton candy? Four hundred articles about the Joker.
01:31:16.000 They'll just say a bunch of clowns showed up and are occupying the space and playing music and
01:31:21.000 when they're asked about what they're doing they just go, and then who are the clowns?
01:31:26.000 The people that occupy the Autonomous Zone, of course.
01:31:28.000 Exactly.
01:31:29.000 They're all clowns.
01:31:30.000 The Autonomous Zone is just occupied by a large group of clowns.
01:31:34.000 That's it.
01:31:35.000 Each and every clown is unidentifiable.
01:31:37.000 They're not a political faction.
01:31:38.000 They have no political ideology.
01:31:39.000 Their faces are painted and they're wearing wigs.
01:31:42.000 They're unidentifiable.
01:31:43.000 They don't bring weapons.
01:31:44.000 They don't fight anybody.
01:31:45.000 Literally just clowns having fun riding little tricycles and laughing and hugging and dancing.
01:31:49.000 And what would happen if a parade of clowns with balloons got brutally beaten by Antifa?
01:31:54.000 It would be art that was watched for millennia.
01:31:58.000 People would reference those videos of the clowns being taken away in handcuffs.
01:32:01.000 By who?
01:32:01.000 Cops?
01:32:02.000 Cops can't go in.
01:32:02.000 It would be terrifying.
01:32:03.000 It would be hilariously horrifying.
01:32:04.000 It's a decision dilemma.
01:32:06.000 What can these groups do if clowns occupy their space?
01:32:09.000 Peaceful, happy-go-lucky, family-friendly people, there just to bring a smile to the faces of people who may be sad.
01:32:17.000 You are creating the decision dilemma back for them.
01:32:19.000 Right.
01:32:20.000 Which is actually what you have to do.
01:32:21.000 No public announcements, no organization, just a hundred clowns showing up on a certain date just to bring joy to people.
01:32:28.000 Joy and peace and love and funnel cake, of course.
01:32:31.000 Anti-violence.
01:32:33.000 And if you oppose the anti-violence, then you're pro-violence.
01:32:35.000 That's right.
01:32:36.000 You'd have to sustain it to really have an impact.
01:32:38.000 Like, seven weeks of clowns.
01:32:41.000 And they could always be the same clowns, but different people could come in.
01:32:44.000 But you could do this everywhere.
01:32:45.000 You could do it everywhere.
01:32:47.000 Because when you look at the black bloc, what the activists do,
01:32:51.000 and they all wear black, you can't figure out who these people are.
01:32:53.000 You don't necessarily know what their ideology is or what they want, and they do this on purpose
01:32:57.000 so they can't get caught when they commit crimes. Yeah, exactly. How are you gonna smear a bunch of clowns?
01:33:01.000 Just a bunch of clowns. Touché, dude.
01:33:04.000 I'm in it.
01:33:04.000 No, this is good.
01:33:06.000 It creates the necessary dynamic that you have to do.
01:33:09.000 People ask me all the time, like, what's basically the strategy?
01:33:12.000 What kind of thing do you have to do?
01:33:14.000 They're trying to put you in a decision dilemma.
01:33:15.000 You have to put them back in a decision dilemma.
01:33:17.000 And mid-level violence, beating them with mid-level violence, that's how you do it.
01:33:20.000 One of these leftists would dress like a clown and throw a molotov.
01:33:23.000 Oh, yeah, that's true.
01:33:24.000 That's the other thing, because, like, I was talking to a guy who wants an app built. Like, I can't build technologies, and maybe you guys can.
01:33:30.000 He was saying it would be really helpful if you had, like, this app where, even if it's just anonymous, that you know, you know, you could type in your school board or whatever, like, Brown School Board or whatever.
and you could find it, or your school district, and you could find out... like, parents can go in and opt in anonymously or whatever. Maybe there's a community function, maybe there's not, but you could actually just know there's, like, 90 other parents in the school district who oppose critical race theory. Yeah. So you feel more emboldened to be able to speak up, and if you could connect with one another, even if it's anonymous at first or whatever. But I was like, it's just going to get... like, activists are just going to make fake accounts and descend on that like crazy. Right. Um, maybe if it's, like, you know, just the... yes, there are people here. You can make it anonymous, like, you know.
01:34:09.000 Yeah.
01:34:09.000 No one can see your name, so you can say whatever you want.
01:34:11.000 Right.
01:34:11.000 I mean, that would have to be the case, because the second they find out who you are, you know, there's trouble.
01:34:16.000 I don't think so.
01:34:17.000 I think the only reason there's trouble is because there's 100 people, 10 of them are pointing, you know, the baseball bat, and the other 90 are like, but no one will have my back.
01:34:26.000 Yeah, well, the odds aren't that good for them, actually.
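For what it's worth, a minimal sketch of the kind of anonymous opt-in tally described above might look something like this. Everything in it is a hypothetical illustration rather than anything the guests built or named: the district string, the hashing scheme, and the in-memory class are assumptions made only to show the core idea of counting opt-ins without storing anyone's identity.

# A minimal sketch of the anonymous opt-in idea discussed above.
# All names here (the class, the district string, the contacts) are hypothetical.

import hashlib
from collections import defaultdict


class AnonymousOptInTally:
    """Counts anonymous opt-ins per school district without storing identities."""

    def __init__(self) -> None:
        # district -> set of opaque tokens, so one person can't inflate the count
        self._optins: dict[str, set[str]] = defaultdict(set)

    def opt_in(self, district: str, contact: str) -> None:
        # Hash the contact detail (e.g. an email) so only an opaque token is kept;
        # the same contact opting in twice does not raise the count.
        token = hashlib.sha256(contact.strip().lower().encode()).hexdigest()
        self._optins[district].add(token)

    def count(self, district: str) -> int:
        # All anyone else can see is the number of opt-ins, never who they are.
        return len(self._optins[district])


if __name__ == "__main__":
    tally = AnonymousOptInTally()
    tally.opt_in("Brown School District", "parent1@example.com")
    tally.opt_in("Brown School District", "parent2@example.com")
    tally.opt_in("Brown School District", "parent1@example.com")  # duplicate ignored
    print(tally.count("Brown School District"))  # -> 2

As the conversation itself points out, a sketch like this does nothing about fake accounts; any real version would need some way to verify that an opt-in actually comes from a constituent, which is exactly the hard part.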
01:34:30.000 I've heard so many stories now, because I get preferential stories of this sort sent to me.
01:34:36.000 I've heard so many stories where you have a school district, for example, that serves maybe 5-10,000 families total in a community.
01:34:43.000 And the people do some digging and find out that it's a dozen activists.
01:34:47.000 Twelve against thousands.
01:34:48.000 Right.
01:34:49.000 And I've had cases where there's been small cities in the U.S.
01:34:53.000 where people on either like the city council or whatever have contacted me and said, you know, we did some digging, you know, we're getting constant relentless harassment and tons of comments on all the message boards, blah, blah, blah, from different accounts.
01:35:04.000 Well, they're all fake.
01:35:06.000 Turns out there's like 20 people.
01:35:08.000 Only one of them is a constituent because you have to have one to be able to file stuff.
01:35:11.000 And what tipped him off is like, why does the same person file everything, right?
01:35:15.000 You know, that requires a constituent to file?
01:35:17.000 But it's like, it looks like there's 400 people.
01:35:20.000 This happens to me on Twitter all the time.
01:35:22.000 So I tweet something about Hegel's philosophy, like anybody cares, right?
01:35:27.000 And, uh, all of a sudden, you know, I start getting dunked on, which is strange enough.
01:35:32.000 It must be touching some wire.
01:35:34.000 And then, I'm having, like, cartoon characters with, like, 60 followers reply tweet to my tweet, getting 300, 400 likes within an hour or two.
01:35:43.000 Like, you can't tell me that's organic.
01:35:46.000 Like, that's not normal Twitter behavior in any universe.
01:35:48.000 I just don't check notifications.
01:35:50.000 So just don't check notifications.
01:35:52.000 But no, the point is that what you actually have with a lot of these activists is you
01:35:57.000 have a perception that there are lots of them.
01:36:01.000 But what there actually is, is a very small number of them, usually with lots of accounts that
01:36:06.000 are swarming in artificial fake ways to make it look like there's lots and lots and lots
01:36:11.000 of people.
01:36:11.000 That's a military tactic where you'll make it seem like there's more of you surrounding
01:36:15.000 the enemy to confuse them, set off explosives over there and noise over there.
01:36:19.000 Yeah, I think these people feel like they're insurgents. They are insurgents. They know they're insurgents, the ones who are not just, like, your average dopey foot soldier wokey. They know what they're doing.
01:36:30.000 They're very informed. They know that they're just trying to claim power.
01:36:33.000 They know what the tactics are.
01:36:34.000 They have books like Beautiful Trouble that are activist handbooks that tell them how to do this.
01:36:39.000 I'm at a bit of a loss.
01:36:40.000 Maybe you could help.
01:36:41.000 Now, I'm a big advocate of turn the other cheek.
01:36:44.000 I believe if someone wrongs you, the best thing you can do is just allow it, understand, see their humanity.
01:36:48.000 But I also see they crucified Jesus after he told people to do that.
01:36:53.000 One time a friend of mine would smack my leg a lot and I was like, ah, it really hurt.
01:36:56.000 And I was like, stop, please stop hitting me.
01:36:58.000 But she, she just was doing it.
01:37:00.000 It was just programmed to do it.
01:37:00.000 And one day I said, if you do it again, I'm going to hit you as hard as you're hitting me.
01:37:04.000 So she did it again.
01:37:04.000 And I hit her, smacked her super hard.
01:37:07.000 And she screamed out and never hit me again.
01:37:10.000 This situation was solved.
01:37:11.000 Yeah, you hear this with parents and children who bite a lot, is that they finally just, you know, bite the kid back, and it's like, that's what it feels like.
01:37:19.000 The kid never bites again.
01:37:21.000 And it's like, there is a point there.
01:37:25.000 And this is where I was saying earlier, you know, cancel that.
01:37:27.000 If someone comes to cancel, you cancel them first.
01:37:30.000 It's kind of like a, it's semi-humorous, you know, thing, but I don't think until—I keep saying the asymmetry is the story.
01:37:38.000 The asymmetry is the goal.
01:37:39.000 The goal of repressive tolerance from 1965 was to create this asymmetry—left good, right bad.
01:37:46.000 Marcuse, in that essay, by the way, if you want to see how close to what is happening now it is, he says that the problem isn't even that right-wing movements exist and do things and that they would be violent.
01:37:57.000 They have to be stopped at the point where the thought enters their head.
01:38:00.000 Therefore, it requires not even just censorship, but pre-censorship of movements from the right.
01:38:05.000 And that asymmetry is what's being attempted.
01:38:07.000 That's what cancel culture is meant to do.
01:38:09.000 Cancel culture is meant to create pre-censorship.
01:38:12.000 In other words, the idea is not even out there anymore.
01:38:15.000 People are afraid, even if they hold the idea to say it, because if they say it, they're going to get cancelled.
01:38:19.000 They're going to get shot down off of their platform.
01:38:21.000 They're going to lose whatever it is that's their living, or whatever it happens to be.
01:38:23.000 The kids are going to get harassed at school.
01:38:25.000 So pre-censorship, to prevent people from even feeling like they can say the thing, is really their objective.
01:38:31.000 Until the symmetry is restored, it's just going to be L after L after L after L stacking up.
01:38:37.000 Until people grow spines and just say, shut up, I don't care.
01:38:40.000 Well, I mean, historically speaking, from what I've been told, but I'm not a scholar of this, I can't talk about it deeply, is until the children of the actual elites start having an impact, or the parents, I should say, of the children see an impact in their children's lives.
01:38:56.000 But it's already happening.
01:38:58.000 Sort of.
01:38:58.000 And they're playing into it.
01:39:00.000 Sort of.
01:39:00.000 I mean, that's the problem, is a lot of people at that level have bought into the ideology.
01:39:05.000 I hear from parents all the time that say, like, the one thing I wish more desperately than any is that I could talk to other parents at my kid's school and say, don't you just think this is all bogus?
01:39:15.000 Don't you think it's a terrible idea that they're bringing in trans strippers to dance?
01:39:18.000 Those people need to have someone throw a large ice-cold water balloon in their face.
01:39:26.000 Well, they said what's happening is they know that if, especially, and this is true in private schools, elite private schools, that if they say that to the wrong parent, their kid's going to get pushed out of the school.
01:39:35.000 Good!
01:39:35.000 Why is your kid in that school in the first place?
01:39:38.000 I mean, that's kind of the feeling that I have.
01:39:40.000 I've been kind of pissed off at people for a long time, which it's like, you know, people are like, I'm scared I'm going to lose my job.
01:39:46.000 It's like, everybody's going to lose their job if this goes badly.
01:39:48.000 Why do you want to work for a company that's funding this?
01:39:51.000 You know, look, I've stopped caring a lot about these autonomous zones to a certain degree in that Portland voted for the rioters.
01:39:58.000 They literally voted for the rioters.
01:40:00.000 And a lot of people are like, dude, I can't leave.
01:40:02.000 I'm stuck.
01:40:02.000 I sympathize.
01:40:03.000 I empathize.
01:40:03.000 If you're stuck and you can't afford to leave, I understand.
01:40:06.000 But maybe at a certain point to realize that we live in trying times and the world is not perfect.
01:40:11.000 See, for me, maybe it's easier.
01:40:12.000 I don't have children.
01:40:14.000 I've been homeless before.
01:40:15.000 I'm not scared of it.
01:40:16.000 If I had to just walk into the woods, I'd be like, okay.
01:40:20.000 But too many people are scared.
01:40:21.000 They've lived in comfort for too long, and they're scared of losing that luxury because they don't understand what it's like to actually have to survive on the streets.
01:40:28.000 And you know what?
01:40:29.000 Surviving on the streets?
01:40:30.000 Infinitely easier than surviving in the woods.
01:40:32.000 And you don't even have to do that.
01:40:33.000 No one's talking about, you know, making you a homeless person living under a bridge.
01:40:37.000 We're just saying, they might take your kid out of the school, and then you gotta figure out something for your kid.
01:40:41.000 Yeah.
01:40:42.000 Homeschool them.
01:40:43.000 This is the truth, is that the thing that you said, that we live in trying times now, and people have to start living as though... I mean, this is something I get mad about at the very smart people a lot, is, you know, my opinion is, I would agree with you if the world was the same way it was five years ago.
01:40:56.000 I haven't even said that.
01:40:58.000 I got asked a question, I gave a talk recently at a major university, a very famous major university, and I got asked a question that was very much to that, you know, they were like, well, don't you think you turn people off by some of the way that you behave on Twitter and the way that you behave?
01:41:14.000 And I was like, first of all, I don't care at all.
01:41:18.000 Good.
01:41:18.000 Bye.
01:41:19.000 But second of all, most importantly, is like, I can't first, why bye?
01:41:23.000 Because I can't put a chain around my neck and constrain the way that I need to feel like I need to talk to say what I need to say, because it might upset somebody who follows me on Twitter.
01:41:31.000 I don't care.
01:41:32.000 But secondly, and much more importantly, if the world was the way it was five years ago, or the way we thought it was five years ago, I probably would believe you or agree with you.
01:41:43.000 But it's not.
01:41:45.000 We're in a bad situation right now.
01:41:47.000 And if you don't act like we're in a bad situation right now and start living like what time it actually is, then this just keeps getting worse.
01:41:54.000 And it gets harder.
01:41:55.000 I told people this at the beginning of the riots last summer.
01:41:57.000 This won't blow over.
01:41:58.000 The longer you wait to stand up, the harder it gets.
01:42:01.000 Well, I'll put it this way.
01:42:01.000 We're all in a boat, and that boat is taking on water.
01:42:05.000 And you've got people like me and James Lindsay and Ian and other people we have on the show who are looking you in the face right now and say, start bailing.
01:42:13.000 And they're going, but, you know, I don't know if I should, because, I mean, you know,
01:42:36.000 people are gonna find out that I know the ship is sinking, and then I'd rather... look, if I climb to the
01:42:36.000 top of the Titanic and go to the back, it's not sinking over there. It's actually going up. I think that'll be fine.
01:42:36.000 It's like, bro, the whole thing's gonna split in half. It's gonna go under. And I'll tell you this: we're all making our
01:42:36.000 way toward the lifeboats. We tried bailing water.
01:42:36.000 Maybe we're past that point.
01:42:37.000 Now we're at the point where those of us who have been paying attention, and many of you who are watching, have already started getting ready for what those lifeboats figuratively are.
01:42:44.000 And a lot of people who think they can sit back and shut up and keep their head down are going to be in for a very, very rude awakening.
01:42:49.000 I will tell you, there's a meme where death goes knocking on every door.
01:42:53.000 And every door has like a trail of blood coming out.
01:42:56.000 And the meme will show like video game companies or movies or whatever.
01:42:59.000 And then he's knocking on the next door.
01:43:01.000 People seem to think, I will sit in my house and as these psychopaths go door-to-door threatening people, if I put up the flag, they'll walk past me and then, oh no, one day they don't.
01:43:12.000 They knock on your door and they say, well, are you going to give us money now?
01:43:15.000 Like they did to the businesses in Louisville.
01:43:18.000 You know that story?
01:43:19.000 Yeah.
01:43:20.000 Black Lives Matter went around to all the local businesses downtown and said, tithe or else.
01:43:25.000 And when one Cuban immigrant said no, they shattered a pot and said tithe or else.
01:43:30.000 And then they set up barricades and called him racist and shut down his business.
01:43:34.000 That's what happens.
01:43:35.000 Tithe or else.
01:43:37.000 They demanded a percentage of the income from these businesses.
01:43:40.000 Sit back at your own peril, fine.
01:43:43.000 But don't come crawling to me when you keep saying, but I'm scared to talk to the other parents.
01:43:49.000 Okay, well then you reap what you've sown.
01:43:51.000 You are the one watering the flowers of destruction, and then you're gonna come and complain about it later?
01:43:56.000 Dude, the rest of us are getting into the weeds and pulling out the trash and getting rid of the grubs, while you're sitting back saying, I don't want to get my hands dirty.
01:44:02.000 Okay, I'll say.
01:44:03.000 Yes.
01:44:04.000 Bail.
01:44:05.000 Pick up a pail and bail the water.
01:44:07.000 Love it.
01:44:07.000 Also, get the lifeboats ready.
01:44:09.000 But, I think we can salvage the hull.
01:44:11.000 So, if you're interested in the structure of the system and you want to repair it, that is also possible.
01:44:15.000 But we need to get the lifeboats ready and we need to bail.
01:44:18.000 That is by speaking your mind online.
01:44:20.000 You make a video, you let people know.
01:44:22.000 It's by talking to other parents and saying, that's ridiculous. No, I'm sorry, that makes no sense.
01:44:29.000 And then if they get mad and they huff and puff, be like, you keep your cool and you be polite and you be nice and
01:44:35.000 you let them freak out.
01:44:36.000 You want them to show their true colors when they have the tribalist panic attack.
01:44:41.000 And this is why I always try to be very nice on Twitter.
01:44:46.000 Because I want people to look at the interactions between these lunatics and see someone saying, I'm sorry, I didn't mean to upset you, and them saying, F you, you slimeball, I hate your guts.
01:44:55.000 And then when a regular person who's uninitiated sees that, they go, I don't want to hang out with the mean person.
01:45:00.000 Yeah, yeah.
01:45:03.000 You can't let the lie come through you.
01:45:04.000 I said earlier the Soviet Union lasted 70 years, and it was pretty crap, to put it mildly.
01:45:12.000 Do you know what it was, when the Soviet Union started to actually collapse?
01:45:16.000 I mean, there's lots of economic factors and some other stuff and whatever, but it's when the people, the citizens, actually started to laugh at it.
01:45:24.000 It's like everybody knew for a long time it was a lie and they just kind of did it.
01:45:27.000 At some point they started to laugh at it.
01:45:30.000 And so when Tim said, you know, it's one parent talking to another and saying, that's ridiculous.
01:45:36.000 Like, I mean, it's stupid, I guess, but think about Harry Potter, right?
01:45:40.000 So you had your, your literature reference.
01:45:42.000 I can have my dorky literature reference.
01:45:44.000 Harry Potter.
01:45:44.000 Come on.
01:45:45.000 Harry Potter three, with the werewolf.
01:45:48.000 Come on.
01:45:49.000 Good one.
01:45:49.000 And which one does he teach?
01:45:51.000 Right?
01:45:51.000 You have this, yeah, the shape-changing boggarts, the thing that you're afraid of.
01:45:56.000 It turns into the thing you are the most afraid of.
01:45:58.000 It's going to take away the thing you're most worried about.
01:46:01.000 It's a shape-shifting thing.
01:46:03.000 You don't know what it is and what is the magic spell that breaks it, that destroys it.
01:46:06.000 Ridiculous.
01:46:07.000 You call the thing ridiculous.
01:46:09.000 You just laugh at it.
01:46:11.000 And this is actually a very useful parable.
01:46:14.000 Because the truth is this ideology, as horrible and dangerous as it is, is also ridiculous.
01:46:20.000 It is patently ridiculous.
01:46:21.000 The things that it says, the things that it claims, are laughably dumb.
01:46:25.000 Like, you know what you do?
01:46:27.000 It's really, really simple.
01:46:28.000 Get your friends, if you're a parent and you're anti-woke, and then invite an unsuspecting woke person to a barbecue.
01:46:36.000 And then when they're sitting there, start chanting, gooble gobble, one of us, gooble gobble, one of us, and that will absolutely convince them to change their minds.
01:46:44.000 Or runaway screaming, you know, like in the movie.
01:46:46.000 You could do a drum circle too.
01:46:49.000 Yeah, it's more subliminal.
01:46:51.000 Well, you gotta be careful not to appropriate anything there.
01:46:55.000 Yeah, that's right.
01:46:55.000 Snare drums.
01:46:56.000 Alright, let's go to Super Chats, my friends.
01:46:58.000 If you haven't already, smash the like button and subscribe.
01:47:02.000 Hit the notification bell if you're listening on iTunes or Spotify.
01:47:05.000 Give us a good review because, well, if you think we deserve it, it really does help.
01:47:09.000 Let's see what we got here.
01:47:10.000 Tim deserves it, by the way.
01:47:12.000 We deserve all of the five stars.
01:47:14.000 Five stars and thumbs up.
01:47:15.000 Why aren't you all liking the like button?
01:47:17.000 Smashing.
01:47:18.000 Tapping.
01:47:18.000 Like.
01:47:18.000 All of the above.
01:47:19.000 Liking all of the like buttons.
01:47:21.000 I won't say banging.
01:47:22.000 Banging.
01:47:23.000 Yeah.
01:47:24.000 Bang that like button.
01:47:25.000 I didn't say it.
01:47:27.000 People are saying.
01:47:28.000 People are saying like the like button or bang the like button.
01:47:31.000 Alright.
01:47:33.000 Acme says, it's been incredible reading your new discourses essays and then seeing the same ideas months later in articles written by serious journalists.
01:47:41.000 Dare to name any names?
01:47:42.000 You're always on the cutting edge of this ideology.
01:47:45.000 Aren't I?
01:47:47.000 What are some of the essays that have sparked that? Well, I mean, you know, I might have done a four-part podcast series explaining Herbert Marcuse's Repressive Tolerance, and Matt Taibbi might have written an article right after that that got a ton of attention.
01:48:00.000 That's cool, though.
01:48:00.000 Right on.
01:48:02.000 No, see, I'm not actually jelly, because I want the ideas to get out there.
01:48:05.000 That's the goal, right?
01:48:06.000 And I understand the Country Club mentality, and that's fine.
01:48:09.000 The goal is to get the ideas out there.
01:48:11.000 I see that Claire Lehmann of Quillette fame is now talking about critical theory, where for like a year she told me it was too obscure.
01:48:21.000 It's normal, man.
01:48:21.000 It's good.
01:48:21.000 You know, it was a really weird moment for me when I started seeing some of my ideas be repeated by other people.
01:48:27.000 I mean, imitation is the sincerest form of flattery, right?
01:48:30.000 There was just a great article.
01:48:31.000 I mean, it's truly great.
01:48:32.000 But I'll say this.
01:48:33.000 I don't think it's imitation.
01:48:34.000 I think it's a good idea.
01:48:36.000 And then people who hear that and say it's a good idea come to believe that idea and come to share it in the same way you did when you came up with it.
01:48:41.000 That's correct.
01:48:42.000 Yeah, there's another great article, and I don't even remember for sure who it's by, Batya Ungar-Sargon or something.
01:48:48.000 Sargon Ungar.
01:48:49.000 I don't remember exactly what her name is.
01:48:50.000 I'm not familiar with her.
01:48:52.000 She's new to me.
01:48:53.000 hyphenated last name which makes things complicated for me.
01:48:56.000 If one of you can find it, that's cool.
01:48:58.000 But she wrote a beautiful article.
01:49:00.000 I even, it's really funny because I predicted on Twitter the day before it came
01:49:03.000 out that somebody would probably be writing an article about Hegel which I kept getting made fun
01:49:07.000 of on Twitter about, that the Hegelian philosophy has relevance to what's going on.
01:49:11.000 And she writes this beautiful article about the Hegelian relevance to the woke movement.
01:49:15.000 I mean, that's part of it.
01:49:16.000 I literally tweeted about that yesterday.
01:49:18.000 There you go.
01:49:19.000 It's like you're developing the theory or the scholarly presentation and then other
01:49:24.000 people are making it more like mainstream palatable.
01:49:27.000 I mean that's part of it.
01:49:28.000 I do understand that I write dork stuff that's hard to read.
01:49:32.000 I need to figure out if more people are talking about the psychopathy aspect.
01:49:35.000 I actually think the most important article I've written in a long time, I published it on Christmas, because of COVID, because, like, we didn't have a Christmas this year, because my mom had a COVID scare at work.
01:49:44.000 My brother is absolutely not hanging out like he's totally COVIDed out.
01:49:49.000 And then my kids are scattered to the winds for the moment, and COVID makes everything weird.
01:49:57.000 I had just gone on a trip, and so it was like, eh, let's just all do Christmas at home this year.
01:50:02.000 So I was like, well, there are a lot of people who are doing Christmas at home alone, so I'll put an article out that I was going to save until a little bit after, and it's called something like, Psychopathy and the Origins of Totalitarianism.
01:50:12.000 And it's super important to understand that a lot of the origins of what's going on here are people who are failing to cope with the world as it is and create what I called in there, borrowing from, what's his name?
01:50:25.000 It's terrible to forget a guy's name right in the middle of something important, but Josef Pieper.
01:50:30.000 Pseudorealities.
01:50:31.000 They create false realities and then they push those through with a false morality and a false logical form, called a paralogic and a paramorality.
01:50:39.000 And I lay out this whole article that basically they're coping with the inability to cope with the world by trying to force everybody else to live in their delusions.
01:50:45.000 There you go.
01:50:46.000 And I haven't seen the psychopathy article, or argument, making it out far yet.
01:50:51.000 I really want to though.
01:50:52.000 Let's, uh, we'll read some more of these.
01:50:54.000 The Crazy One says, Hey Tim, look up the dates the U.S.
01:50:57.000 declared war on the Confederate States.
01:50:58.000 Huh.
01:50:59.000 Someone want to pop that up real quick?
01:51:01.000 All right, let's see.
01:51:02.000 Ian Hall says, isn't all rap pretty much anti-woke?
01:51:05.000 Like, even Nicki Minaj is alphabet.
01:51:07.000 I don't know what that means.
01:51:09.000 What does that mean?
01:51:09.000 Don't know.
01:51:10.000 Is there any woke rap?
01:51:11.000 Is it any good?
01:51:12.000 I mean, like... Probably not.
01:51:14.000 I mean, there's no such thing really as woke art.
01:51:17.000 It's like, it's technically kind of anti-art, because it's propaganda.
01:51:22.000 All right, here we go.
01:51:22.000 Dr. Roller Gator says, hi James, hope you made friends today and that everyone on the internet was nice.
01:51:27.000 Is he in all caps?
01:51:28.000 No.
01:51:29.000 Oh, what?
01:51:29.000 What the heck?
01:51:30.000 What?
01:51:31.000 Caps!
01:51:33.000 I love him.
01:51:35.000 Deplorable Pirate Captain Gunbeard says, my advice to everyone left of Marx, quit being a consumer, embrace self-reliance, making, and right to repair.
01:51:44.000 Corporate cop sellouts will hate and fear a culture built around that.
01:51:47.000 Yeah, I think so.
01:51:50.000 Paul Jimikowski says, Thank you for being a light in the dark, James.
01:51:54.000 Keep up the fight.
01:51:55.000 The only thing necessary for the triumph of evil is for good men to do nothing.
01:51:59.000 Do not let the lie come through you.
01:52:02.000 Solzhenitsyn.
01:52:04.000 Softshell Crab says, So I have two questions.
01:52:06.000 What can a person from a small town do to influence policies that help small towns and not big cities?
01:52:13.000 Do any of y'all play Warhammer 40k or know the lore?
01:52:16.000 I do not.
01:52:17.000 I used to.
01:52:17.000 I played Space Wolves for a while.
01:52:19.000 Space Marines.
01:52:20.000 I love Tyranids.
01:52:21.000 I don't really love them, but I appreciate their psychic ways.
01:52:24.000 They're giant ant creatures that have enslaved other races.
01:52:27.000 But what was the other question exactly?
01:52:29.000 How do you influence policies that will help small towns and not big cities?
01:52:31.000 The first thing I thought is make internet videos, because that affects everyone.
01:52:35.000 And give your personal experience.
01:52:36.000 Build culture.
01:52:37.000 That's true.
01:52:38.000 That's right.
01:52:39.000 Don't be afraid to actually write to your representatives, especially if you have some video stuff that you've put out.
01:52:44.000 Send it to them.
01:52:45.000 Share them.
01:52:45.000 Yeah.
01:52:46.000 Um, I also have not played this game.
01:52:48.000 The last game I ever played, I actually, it's not quite true it's the last game I ever played, but I was playing World of Warcraft in like 2005 or something.
01:52:54.000 Oh yeah.
01:52:55.000 And I'm like launching fireballs at something or another, pirates or something.
01:52:58.000 And I just realized, I was like, I'm putting a lot of effort into making this avatar of myself awesome.
01:53:04.000 When I could be putting effort into making myself awesome.
01:53:07.000 Sounds like you were barely in the game.
01:53:08.000 Sounds like Deadmines.
01:53:09.000 I mean, like level 15 or something.
01:53:11.000 Come on bro.
01:53:12.000 I was like 57 or something.
01:53:14.000 Fighting pirates?
01:53:15.000 Big pirates.
01:53:16.000 I guess, yeah, maybe it's been a while.
01:53:20.000 I forget where it was.
01:53:20.000 Maybe like Tanaris or something.
01:53:21.000 Yeah, I don't remember.
01:53:22.000 I played Warcraft for a long time.
01:53:24.000 It was 2005.
01:53:25.000 Me too.
01:53:26.000 And it might not even have been pirates.
01:53:27.000 I might be making that part up.
01:53:28.000 It might have been like, you know, raptors and dinosaurs.
01:53:31.000 I'm just making that up.
01:53:31.000 Stranglethorn Vale?
01:53:32.000 Yeah, it could have been Stranglethorn, Booty Bay.
01:53:34.000 It was Booty Bay, I think.
01:53:35.000 40-something.
01:53:37.000 Yeah.
01:53:37.000 And so anyways, I realized I could be making myself awesome with that time.
01:53:42.000 And like 30 minutes later, I just kept playing.
01:53:44.000 It's just the thought crossed my mind.
01:53:45.000 And like 30 minutes later, I was like, I'm done.
01:53:48.000 And I've never been able to get into a video game again.
01:53:51.000 I've tried a few times.
01:53:51.000 I just literally can't.
01:53:53.000 Well, I feel similar to what happened to me.
01:53:56.000 I kind of seeped back into Warcraft 3 around 2009, but between 2006 and 2009, I just... I took my Warcraft disc, smashed it, threw it into the sewer.
01:54:03.000 I was like, I'm done.
01:54:04.000 I have YouTube and I just went deep on YouTube videos.
01:54:07.000 I've got, uh, I just, I just bought Genshin Impact.
01:54:09.000 It's actually a really good game, but I just can't get into it because I'm like, I'm just not inspired to level up these abilities.
01:54:16.000 You know, it's an open world RPG and I'm like, I'd rather just, you know, build Chicken City.
01:54:20.000 You know what I mean?
01:54:21.000 Like a real life.
01:54:22.000 Real life.
01:54:23.000 Yeah.
01:54:24.000 I actually like- Like chickens are hilarious, dude.
01:54:25.000 I started turning the circle and training my martial art and meditating in place of video games.
01:54:30.000 And it was like, wow, I turned off World of Warcraft and I have like so many extra hours a day that I don't even know how to fill them all.
01:54:36.000 All right, we got, um, McChewy says, Diamond hands, my fellow apes.
01:54:41.000 AMC and GME to the moon.
01:54:42.000 What's up, Tim and friends?
01:54:43.000 Keep up the good work, y'all.
01:54:45.000 My friends, you may notice that we have some merchandise pinned to the chat.
01:54:48.000 If you go to timcast.com and click shop, or click the pinned merchandise above the chat, you can get your very own Diamond Hands gorilla t-shirt.
01:54:59.000 It is the traditional I am a gorilla shirt, except now he's wearing a suit, holding wads of cash, smoking, got sunglasses on, and he's particularly happy because, you see, he had diamond hands.
01:55:10.000 He held his stonks until he maxed out his tendies, and now he is reaping the reward, so he's having a good day.
01:55:15.000 There were a lot of people that were... I can't believe it.
01:55:17.000 You know the GameStop thing?
01:55:19.000 People sold at like 50 bucks.
01:55:21.000 I'm not giving anyone advice.
01:55:22.000 I'm just giving you my opinion on what happened.
01:55:24.000 And then it hit like 350.
01:55:25.000 I don't know what happened today, but it's just been insane.
01:55:29.000 And I know some people who bought them and they're laughing like it's doubled their money.
01:55:33.000 This, this, it's a crazy scenario.
01:55:35.000 It's crazy.
01:55:35.000 I literally, I was on a flight a few weeks ago and I sat next to a literal hedge fund manager, and I got upgraded.
01:55:41.000 I was in first class and I sat next to a literal like hedge fund manager and I was like, what'd you think of that GameStop?
01:55:46.000 And he's like, I didn't even know about that.
01:55:48.000 I was like, what in the world?
01:55:50.000 Memes, man.
01:55:51.000 Meme is mightier than the sword.
01:55:53.000 Alright, The Civic Nationalist says, James, I use the work you've done to make my arguments against the racist and sexist.
01:56:00.000 You have done amazing work, but they don't care.
01:56:02.000 So good idea is greater than bad ideas.
01:56:04.000 God save the Queen.
01:56:05.000 Long live Britain.
01:56:06.000 The sun has not yet set.
01:56:07.000 No, we're impeaching the Queen now.
01:56:09.000 Impeach the Queen!
01:56:10.000 That's right.
01:56:11.000 People are doing it.
01:56:11.000 People are posting a hashtag.
01:56:13.000 Hashtag Impeach the Queen, I guess.
01:56:14.000 It's like a reference to like... The more absurd stories come out, I can just like... I wonder at what point there will be a crisis of historical record on me when they don't know which stories are real stories.
01:56:29.000 So like, there's a bunch of stories about me.
01:56:31.000 Which ones are the real ones?
01:56:32.000 I tried to do that with academic papers.
01:56:34.000 Right, exactly.
01:56:35.000 Oh yeah, you talked about that on Rogan. You said you published, like, ten... how many papers?
01:56:40.000 Seven of them got in, there were twenty that we wrote.
01:56:43.000 Turning Mein Kampf into feminist literature and getting it accepted.
01:56:46.000 I almost... you know, so Tim's, like, a journalist or whatever, and he was actually at Peter Boghossian's house while we were writing them.
01:56:51.000 And we had a couple of events in Portland and so we got together in Portland and we were working together and at one point, I mean the famous video when we put out where we're all celebrating and laughing in what appears to be a nice house and Pete's wearing like a suit jacket or whatever.
01:57:05.000 Like Tim's downstairs and we're like completely forgot he was there and we're celebrating.
01:57:08.000 But at one point, and I bet you don't remember this, but you might remember this.
01:57:12.000 We were in Pete's kitchen.
01:57:14.000 and an idea for a paper hit me, and, like, I completely forgot that you weren't in on it, and I just turned to everybody, and it's like the whole room went weird, and I was like, oh my god, guys, gentrified cornbread. You were like, what? And everybody was like, you know, stop, stop, stop.
01:57:34.000 Well, you know, I, I take, uh, sources seriously though.
01:57:37.000 So I wouldn't just like publish private details of somebody.
01:57:41.000 No, of course, of course.
01:57:42.000 You infiltrated.
01:57:43.000 You were, we were, uh, we were doing a, uh, what we're doing, we're doing an interview on critical race theory stuff.
01:57:49.000 Yeah, I couldn't remember if you were there.
01:57:51.000 This was when James Damore was there, so I couldn't remember if you were connected to that event, or... Yeah.
01:57:55.000 I don't think at the exact same time, but we stuck around.
01:57:58.000 We drove back and forth, I guess.
01:57:59.000 That's right, that's right.
01:58:00.000 Yeah, we went to the James Damore thing in Portland.
01:58:02.000 Yeah.
01:58:02.000 I think it was Portland, right?
01:58:03.000 Yeah, we hung out a little bit at Pete's house.
01:58:05.000 And then we all argued about universities versus media and stuff.
01:58:08.000 And turns out we were both right, and of course we were.
01:58:10.000 All right, let's see.
01:58:12.000 Brandon Beck says, have you seen George Alexopoulos' comic of you, Tim?
01:58:16.000 It's you feeding your chickens and you get abducted by aliens.
01:58:18.000 It's really funny.
01:58:18.000 I love the show.
01:58:19.000 I do.
01:58:20.000 It's amazing.
01:58:21.000 And we completed Chicken City today.
01:58:23.000 We have two little chicken houses, then two little chicken, like, bungalows.
01:58:28.000 Yeah, and then one chicken town hall, and we're gonna put addresses on them.
01:58:32.000 Yes!
01:58:33.000 So then, you know, it's like an actual city.
01:58:34.000 We gave Bucko a little tour.
01:58:36.000 Let him sniff it out.
01:58:37.000 He was stoked.
01:58:38.000 He was running around.
01:58:39.000 Is there gonna be like a squirrel autonomous zone in the middle?
01:58:42.000 Okay.
01:58:43.000 We'll find out.
01:58:43.000 That sounds great.
01:58:44.000 Maybe.
01:58:44.000 They can get through the wiring, I think.
01:58:46.000 No.
01:58:46.000 No, I don't think so.
01:58:47.000 I hope not.
01:58:47.000 A squirrel might be able to slip through.
01:58:49.000 We have to take it seriously, though, because we're in the middle of nowhere.
01:58:51.000 There's predators and stuff.
01:58:52.000 No, that's true.
01:58:53.000 I mean, yeah, you'll see an uptick in your foxes and hawks.
01:58:57.000 I think it's fine.
01:58:57.000 I think it's fine.
01:58:58.000 They have little houses, and we surrounded the whole thing.
01:59:01.000 It's caged in.
01:59:02.000 But I don't think the squirrels would attack chickens if they dig down there.
01:59:05.000 Chickens might attack squirrels.
01:59:06.000 Yeah, the chickens would eat the squirrels.
01:59:08.000 No, my neighbor has chickens.
01:59:09.000 They're cool.
01:59:09.000 Unsettling.
01:59:10.000 They're cool.
01:59:11.000 So, uh, another question.
01:59:13.000 Jordan Nick says, great, uh, guys, great work.
01:59:15.000 Love the show.
01:59:15.000 Quick question for Ian.
01:59:16.000 Who's your go-to army for 40k?
01:59:18.000 You seem like a Tau or Eldar player.
01:59:20.000 Oh, I love the Eldar man, but Josh picked him first.
01:59:24.000 I mean, the Harlequin Kiss was so deadly in the early days.
01:59:27.000 The Distortion Cannon, if it hit any tank, the tank would just completely disappear.
01:59:32.000 It was so overpowered.
01:59:34.000 So I always wanted to play Eldar, but since Josh already had my Space Wolves.
01:59:37.000 I think, I like Tyranids.
01:59:39.000 I like the psychic worm creature that can control armies of gene stealers.
01:59:43.000 Now I know what it's like for everybody when we mention Magic: The Gathering.
01:59:45.000 Yeah, you're gonna love 40K.
01:59:46.000 Tap your swamp.
01:59:47.000 Alright, this one's important for you.
01:59:49.000 Brian Kluver?
01:59:49.000 Do you know who that is?
01:59:51.000 I do.
01:59:53.000 Hey look, it's Jimmy Lindsey.
01:59:54.000 He and his sibling were raised in the old neighborhood where they were my younger sibling's friends.
01:59:58.000 He was just a kid down the road.
02:00:00.000 Hopefully will be known as a modern-day hero.
02:00:02.000 Keep going, James.
02:00:03.000 Right on, thanks Brian.
02:00:05.000 Thanks Brian.
02:00:06.000 Good to hear from you, man.
02:00:07.000 That's cool, huh?
02:00:08.000 Yeah.
02:00:10.000 All right, let's see.
02:00:12.000 Amalashok says, Tim, you hung the picture.
02:00:15.000 Now I have to fix the leaky toilet.
02:00:16.000 Told my wife I'd do it when you did.
02:00:18.000 Thanks.
02:00:19.000 A Pisces and nine of spades.
02:00:21.000 So yeah, we changed the image behind James because it was the Donald Trump Joe Rogan one.
02:00:27.000 But now that it's just Biden, I was like, we have two images of Biden eating children, you know?
02:00:32.000 Oh, okay.
02:00:32.000 Yeah, so behind you it's these eating boom... okay, boomer girl.
02:00:35.000 That's normal.
02:00:35.000 Not a child.
02:00:36.000 But then behind James we have Joe Biden just literally being handed a little girl who he eats.
02:00:39.000 Biden is all over this room.
02:00:41.000 I don't know what George Alexopoulos was thinking when he made the one of Joe Biden just eating the little girl and everyone cheering and giving thumbs up as he does it, but it is one of the greatest pieces of modern art I have ever seen.
02:00:52.000 Have you seen the meme where they took pictures of him, like, creeping on people's hair and stuff?
02:00:56.000 Uh-huh.
02:00:56.000 Pew, pew, pew, pew, pew, yeah.
02:00:57.000 Pew over it.
02:00:58.000 We'll bring him back.
02:00:59.000 Yeah, memes.
02:01:00.000 I love it.
02:01:02.000 Let's see.
02:01:03.000 Tim Pauls says, Hey Tim, I always enjoy your show.
02:01:06.000 I just did a metal cover of Will of the People.
02:01:08.000 I would be honored if you would check it out.
02:01:10.000 My channel is just my name.
02:01:12.000 Tim Pauls.
02:01:13.000 It almost sounds like the exact same name.
02:01:15.000 I will look into it.
02:01:16.000 That sounds nice.
02:01:18.000 Michael Leone says, Been watching since Fukushima.
02:01:20.000 You do great stuff.
02:01:21.000 You often mentioned making culture to impact the world.
02:01:24.000 So Minecraft playthrough when?
02:01:26.000 Probably never, but Chicken City.
02:01:28.000 Yeah, did you hear the song that I wrote, Willow People?
02:01:30.000 No, I didn't.
02:01:30.000 I'll have to show it to you later.
02:01:32.000 Everybody else, you can check it out.
02:01:33.000 It's an original song, and you'll love it.
02:01:35.000 It's very, very, very political.
02:01:38.000 In fact, I think people who are into politics would probably like it more than your average music fan, but I think it's a good song.
02:01:43.000 I mean, I wrote it.
02:01:43.000 I have to be proud of myself, I suppose.
02:01:45.000 You know what would be cool is if we had a Minecraft world of this house.
02:01:49.000 It might be a security breach type thing, where we don't want people knowing the layout of the house, necessarily. But if people could, like, walk around your house, they'd be like, oh, I get to see what it sees. What we'll do is we'll put an iPad on one of those, like, a big Segway or something, and then you can log in and control it.
02:02:06.000 Mm-hmm. There used to be, and they probably still have these, but there was this thing that my brother's friends had, where it was a little robot that could be remote controlled by anyone you give access
02:02:16.000 to.
02:02:16.000 And so this guy gave his friends access to drive the robot around his house.
02:02:20.000 And it has a camera on it, so you're driving around and you can yell things at people.
02:02:23.000 And it's just really funny to be this little robot in your friend's house.
02:02:27.000 And you go like, the dog is running, you're chasing the dog.
02:02:30.000 Yeah, it's fun. It was a fun little toy.
02:02:32.000 Alright, let's see.
02:02:34.000 Woody says, lived in SF and worked for a huge tech company.
02:02:38.000 Quit after seeing the Critical Race Theory stuff.
02:02:41.000 JL had been warning about, and now thanks to him and Timcast's crew, I finally moved out of crazy California and never voting Dem or lazy Republican again.
02:02:48.000 Here's to you all.
02:02:49.000 Hey, there you go.
02:02:50.000 Awesome.
02:02:51.000 Oh, hello.
02:02:52.000 Pablo Martina says, look at the post-millennial article of Sarah Silverman.
02:02:56.000 She's now claiming to be politically homeless, and that's a former Hollywood elite to respect.
02:03:00.000 Never been a big fan of Sarah Silverman's kind of humor, shock humor, where she says, like, really offensive things, but I love that she has the ability to do it.
02:03:09.000 And so I absolutely welcome Sarah Silverman in with open arms to whatever group we are in terms of not like politically tribed, but still anti-censorship.
02:03:21.000 Believing in truth and weird stuff like that.
02:03:24.000 Look, Sarah should be allowed to say all the stupid, crazy jokes that she wants.
02:03:27.000 I agree.
02:03:28.000 I got a really good vibe from her in 2007 and just saw, like, she could see through it a little better than a lot of people.
02:03:34.000 She looks like somebody I actually know in real life, so I get kind of confused.
02:03:37.000 There you go.
02:03:38.000 All right, here you go.
02:03:38.000 Alan Ortega says, Thomas Sowell was once a Marxist.
02:03:41.000 When asked what caused him to change, he simply replied, facts.
02:03:46.000 There you go.
02:03:46.000 I mean, that's how it works.
02:03:48.000 It's a pseudo reality.
02:03:49.000 Yeah.
02:03:51.000 Ethos Tattoo says, I've seen someone post about how much Fox makes off service providers for their channel.
02:03:58.000 Off service providers?
02:04:00.000 But I'd love to see how much MSNBC and CNN make off ad revenue when a black man is shot by police.
02:04:05.000 It's probably high.
02:04:06.000 They probably get a lot of big boost in ratings.
02:04:09.000 You can actually Google search the ad rates for all of these cable channels, all these shows.
02:04:14.000 Brandon Toms says, James Lindsay, the Woke Jedi, the Woke Jedi, excuse me, will bring balance to the Force.
02:04:21.000 Yes.
02:04:21.000 That's the goal.
02:04:22.000 There you go.
02:04:23.000 It's like, watch out, you know.
02:04:24.000 You kind of seem like Qui-Gon.
02:04:28.000 I didn't know if, like, that was going somewhere weird.
02:04:31.000 No, he's just a Jedi Master.
02:04:32.000 Well, he said, he adds, thanks for harrowing Woke Hell for the sake of all mankind.
02:04:36.000 I'll show you all my, my Punggen later.
02:04:39.000 I showed Lydia my pungent earlier.
02:04:41.000 Yeah, it's pretty cool.
02:04:42.000 It was very thick.
02:04:43.000 Oh.
02:04:44.000 Kevin Pilgrim says, this guy is freaking awesome.
02:04:47.000 Also, I wish Sour Patch Lids saying Yes played every time I accomplished something.
02:04:51.000 There you go.
02:04:51.000 It's an achievement sound.
02:04:54.000 We got a lot of really great superchats from people just giving superchats, so I really appreciate those superchats.
02:05:02.000 I love it.
02:05:05.000 by an intersectional centipede. Also, Tim, you should look up Protosaber." Oh, I know
02:05:10.000 about the Protosaber. And there was one guy, was it the Hacksmith?
02:05:13.000 Is that his name? He made a Protosaber. So do you know about Protosabers in Star
02:05:18.000 Wars? I don't know the technicalities of them. It was like they wore backpacks because they didn't
02:05:22.000 have the power source for the ... For the little handheld?
02:05:26.000 Yeah, exactly.
02:05:27.000 And so what this dude did was he made a gas-powered plasma... It's, like, it's crazy.
02:05:32.000 It's... It looks like a lightsaber.
02:05:34.000 Whoa.
02:05:35.000 But it's just, like, a plasma jet, so it, like, cuts through stuff.
02:05:37.000 Is that the thing that you saw as an ad on your Instagram?
02:05:40.000 No, it was... That one's just a little torch.
02:05:42.000 It's a little torch lighter.
02:05:43.000 It's got, like, a good beam.
02:05:44.000 It's, like, you can melt metal.
02:05:46.000 That's cool.
02:05:48.000 Alright, let's see.
02:05:50.000 David Hogan says, biggest super chat yet.
02:05:52.000 Broke warehouse worker from the Chicago Burbs.
02:05:54.000 Been watching you guys for two years.
02:05:56.000 You're the best.
02:05:56.000 Love you guys.
02:05:57.000 Thoughts on the change to propaganda laws snuck into the NDAA back in Obama's era.
02:06:03.000 I think that was more about foreign propaganda.
02:06:05.000 I'm not entirely sure.
02:06:05.000 I have to reread it.
02:06:06.000 Yeah, I don't know the law.
02:06:08.000 I think it was something about how propaganda that was made by the U.S. for overseas audiences could now be allowed to be played here in the U.S.
02:06:15.000 Hmm.
02:06:15.000 Yeah.
02:06:16.000 I'm not entirely sure.
02:06:17.000 That's been a while.
02:06:18.000 I mean, we're definitely inundated in propaganda, and there's some books about propaganda that you should definitely read.
02:06:23.000 You should also read Edward Bernays' The Engineering of Consent, which is an important essay.
02:06:28.000 It's like eight pages long.
02:06:29.000 He's the father of modern propaganda, right, Bernays?
02:06:31.000 That's correct.
02:06:32.000 I mean, we're supposed to say public relations, but that's a piece of propaganda in and of itself.
02:06:36.000 Whoa, it's meta.
02:06:37.000 Yeah.
02:06:38.000 Infohole says Richard Dawkins developed the concept of a meme out of his contempt for Christianity, but the idea of a meme as a virus that spreads rapidly between minds is oddly useful for leftism.
02:06:48.000 There you go.
02:06:50.000 Yeah, memes, I mean, the evolution of conceptual or symbolic conceptual things totally makes sense.
02:06:56.000 And what happens with what we call memes, like taking Joe Biden being a creeper and sticking Pepe Le Pew's face on it?
02:07:03.000 It really fits, like, that's genetic material, memetic material, combining with one another. D Mills says, Haven't watched the show yet, but just wanted to say Lindsay is an absolute boss. I've shared the New Discourses podcast on Antonio Gramsci with everyone I know. So good. A lot of people like that one.
02:07:21.000 You should check it out. Yeah, I laid out Gramsci.
02:07:23.000 We talked about Gramsci, laid out where he came from, what he was thinking, what his plan was, and how it's relevant to today.
02:07:30.000 It's on New Discourses. You can check it out. All right.
02:07:34.000 Corey Steinfield says, I really love the show and I appreciate the center, center-left viewpoint.
02:07:38.000 You guys are awesome.
02:07:40.000 I'm getting into crypto and I was wondering what your methods of buying and storing crypto were.
02:07:44.000 Keep up the great work.
02:07:46.000 Um, I don't think it's probably appropriate to talk about the ways in which we store.
02:07:51.000 Oh, like what websites?
02:07:53.000 There's multiple ones you can use.
02:07:54.000 You can use Coinbase, BlockFi.com is another interesting one.
02:07:57.000 You can use Binance.
02:07:58.000 You're not storing your crypto if you do that.
02:08:00.000 Well, someone's storing it.
02:08:01.000 It's on the blockchain.
02:08:02.000 You're just using that to get to it.
02:08:04.000 Basically, if you go to any one of these crypto distributors or whatever, and you buy Bitcoin on their website, and just have an account with them, they have the Bitcoin.
02:08:14.000 Okay?
02:08:14.000 You just agreed that they owe it to you.
02:08:17.000 So, if something bad happens, like what happened with Mt.
02:08:19.000 Gox back in the day, you have no Bitcoin.
02:08:22.000 Yeah, it's risky, but super convenient to move it around if you leave it on the internet.
02:08:26.000 For the layman, you know, if you're not... Cold storage is more difficult.
02:08:29.000 If you have a huge amount, get a gorgeous cold storage thing like the Nano X. I have one of those.
02:08:34.000 And then you can store it locally.
02:08:35.000 It's still on a blockchain, but you get to it through local keys.
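For anyone trying to picture what "local keys" versus an exchange balance means in practice, here is a minimal sketch in Python. It assumes the third-party eth-account package (a library chosen here purely for illustration, not something mentioned on the show), and none of this is financial advice:

```python
# A minimal sketch of what "local keys" means, assuming the third-party
# eth-account package is installed (pip install eth-account). The point:
# whoever holds the private key controls the coins on-chain; a balance on an
# exchange is just the exchange's promise to pay you.
from eth_account import Account

# Generate a brand-new Ethereum key pair locally; nothing touches the network.
acct = Account.create()

print("Address (safe to share; this is where funds live on-chain):", acct.address)
print("Private key (never share; losing it means losing the funds):", acct.key.hex())

# A hardware wallet ("cold storage") keeps this key off any internet-connected
# machine entirely: transactions are signed on the device, and only the signed
# transaction ever leaves it.
```

The takeaway matches what's said above: an exchange account is an IOU from the exchange, while a key you generate and store yourself, ideally offline, is the thing that actually controls the coins.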
02:08:39.000 What you need to do is really just make memes.
02:08:41.000 Memes?
02:08:42.000 Like you put a laser on your crotch and make that your Twitter profile.
02:08:45.000 You tweet HODL a lot.
02:08:48.000 You know, you tell everybody that James Lindsay's going to the moon.
02:08:51.000 I'll take Tim Pool with me.
02:08:52.000 I think it's the other way around, but we're going to the moon later tonight.
02:08:56.000 Everybody buy Conceptual Coin.
02:08:58.000 It doesn't exist, but buy it anyway.
02:09:00.000 We can make it very, very easily.
02:09:01.000 Manspreading.
02:09:02.000 It is extremely easy to make an ERC20 token.
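On the "extremely easy" point: a real ERC-20 token is a small smart contract, usually written in Solidity and deployed to Ethereum, and the standard itself is only a handful of functions. As a rough, hypothetical illustration, here is a toy in-memory Python model of that interface, borrowing the joke "Conceptual Coin" name from the conversation; nothing here touches an actual blockchain:

```python
# Toy, in-memory model of the ERC-20 interface (balanceOf / transfer / approve /
# transferFrom). A real token is a smart contract on Ethereum; this sketch only
# shows how small the standard's surface area is.
class ConceptualCoin:
    def __init__(self, supply: int, owner: str):
        self.total_supply = supply
        self.balances = {owner: supply}   # address -> balance
        self.allowances = {}              # (owner, spender) -> approved amount

    def balance_of(self, addr: str) -> int:
        return self.balances.get(addr, 0)

    def transfer(self, sender: str, to: str, amount: int) -> bool:
        if amount < 0 or self.balance_of(sender) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[to] = self.balance_of(to) + amount
        return True

    def approve(self, owner: str, spender: str, amount: int) -> None:
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender: str, owner: str, to: str, amount: int) -> bool:
        if self.allowances.get((owner, spender), 0) < amount:
            return False
        if not self.transfer(owner, to, amount):
            return False
        self.allowances[(owner, spender)] -= amount
        return True


# Example: mint a million tokens to "tim", send 500 to "james".
coin = ConceptualCoin(supply=1_000_000, owner="tim")
coin.transfer("tim", "james", 500)
print(coin.balance_of("tim"), coin.balance_of("james"))  # 999500 500
```

A real deployment adds events, decimal bookkeeping, and gas costs, but the interface really is about this small, which is part of why new tokens appear so quickly.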
02:09:05.000 Ted 2 says the military is implementing a mandatory extremism in the ranks stand down per Secretary of Defense to address extremism in the wake of January 6.
02:09:14.000 We are being required to watch a two-plus hour speech about white supremacy and its influence.
02:09:19.000 Look into it on Timcast, please.
02:09:21.000 I think I did talk about it back when that happened before.
02:09:23.000 I wonder if that's a new thing.
02:09:25.000 No, there's a lot going on.
02:09:26.000 The military has been woke-ifying pretty steadily for a while.
02:09:29.000 I hear from people at a lot of levels, you know, from officer to brass to enlisted, and it's like, they're woke-ifying the damn thing. It's not good. It is not good. So if that's you, you know, you've got to know the facts for yourself and don't lose your head. You're not going to, like, reject this; you're not going to reject the structure of the military if you're in it, so you have to keep your head.
02:09:55.000 What's with all the 40k?
02:09:57.000 Warhammer 40,000.
02:09:57.000 People love that game.
02:09:59.000 It's such a popular game.
02:10:03.000 Let me read this.
02:10:04.000 People love games, as I was about to say.
02:10:06.000 So, 20-something Drifter says, Tim, we need a clown block.
02:10:09.000 We need a Death Korps of Krieg block.
02:10:12.000 Worst comes to worst, we expose the normies to 40k.
02:10:14.000 The Emperor wills it, and you shall obey.
02:10:17.000 Honestly, I bet there's a lot of metaphor between the... There's the Empire, the Imperials, and the Psychic Emperor, who's basically, they have him, like, in a... His body's, like, wasted away after thousands of years, but his psychic energy persists.
02:10:34.000 Then there's Space Marines, which are, like, genetically altered humans.
02:10:36.000 I could go on and on.
02:10:37.000 I don't know too much about it, though.
02:10:38.000 Tap your swamp.
02:10:40.000 Yeah, you can always just tap the swamp.
02:10:41.000 There you go.
02:10:42.000 Tap your swamp.
02:10:43.000 And then cast Dark Ritual to get three black mana.
02:10:46.000 And then, you know, cast Phyrexian Negator and get your turn 1, 5, 5.
02:10:50.000 And then you win.
02:10:51.000 7, 7, Flying Trample.
02:10:53.000 That's all I remember.
02:10:53.000 Block with what?
02:10:54.000 Turn 1, 5, 5?
02:10:54.000 Lord of the Pit?
02:10:55.000 Lord of the Pit.
02:10:56.000 You knew which card I was talking about.
02:10:58.000 7, 7, Flying Trample is the only thing I remember from Magic.
02:11:03.000 Biggest baddest card in the early days.
02:11:04.000 Yeah, like way back.
02:11:06.000 All right, uh, Siri, uh, Siren McGowan, probably pronouncing it wrong, says, I'm a 37-year-old skateboarder. We look at life from a different perspective than the rest of the world. We need little. My wife and I pulled our young kids out of school March 2020. We've seen this coming. Yes, and if you would like to watch me skateboard, you can search on YouTube for Tim Pool, Nollie Hardflip Rewind, and Hang 10 Hardflip. My good friend Brett Novak filmed and produced those videos, and they're like some of the best tricks ever on flat ground, trust me.
02:11:39.000 If you want to see me skateboard, you can download the same video and get one of those deepfake apps and put my face on Tim.
02:11:45.000 Boom!
02:11:46.000 That's all I got.
02:11:47.000 Alright, let's see.
02:11:48.000 Aurora Diaz says, any thoughts on Unrestricted Warfare, written by CCP members, that calls for the weaponization of everything to destroy the West?
02:11:56.000 Is there a chance this is just a new kind of warfare utilizing useful idiots in another country to destroy it from within?
02:12:02.000 It's an old kind of warfare using new tools.
02:12:04.000 It's political warfare.
02:12:05.000 It's not new.
02:12:06.000 We just forgot what it is in the West.
02:12:08.000 Like I said, the communists assessed this 40 years ago and said that, literally, the Americans' ability to detect political warfare is so degraded that it may as well not exist.
02:12:18.000 So, it's an old form of warfare.
02:12:21.000 It's the political warfare.
02:12:23.000 It is the most important concept you've never heard of.
02:12:25.000 It is what we're in the middle of right now.
02:12:31.000 Not looking for financial advice, but how many people on the show are invested in AMC or GME just out of curiosity?
02:12:36.000 Well, I can say for myself. I have no AMC or GameStop stonks.
02:12:41.000 I do not. I don't know. Nope. Sorry. We're just not diamond hands
02:12:46.000 enough. I'm so deep in crypto.
02:12:48.000 I think that that was an important event though.
02:12:50.000 I didn't want to buy those because, as someone covering the story, I didn't want to be a contributor to any, like, rise or gains and then have a stake in it.
02:12:58.000 But I do have Nokia because I actually like Nokia.
02:13:01.000 I used to do a lot of tech work with them, and I would have, like, every single cell phone, and we did some mobile apps, and I was doing mobile live streaming.
02:13:07.000 So when I heard there was a report that came out that Nokia was doing well... It is one of the meme stocks, but I guess it's not doing well; like, nobody really cared.
02:13:13.000 I just like the idea of, you know, having some stock in a tech company and then like reports are saying that Nokia is going to do well, so I own some, but I'm not giving anyone advice.
02:13:21.000 That was just my opinion on why I did.
02:13:22.000 Yeah, I don't have any advice at all, except, you know, I comment on this.
02:13:28.000 I think the GameStop thing was super, super actually symbolically important.
02:13:32.000 Yeah.
02:13:33.000 Sonny James says they probably can't fund the police in those areas anyway.
02:13:37.000 Half of the cost I would charge to risk my life for those coward politicians.
02:13:43.000 CDC issued warning about zombie virus.
02:13:45.000 When all of them peeps pour out of those starved cities, me thinks.
02:13:50.000 Did you guys hear about the zombie, the CDC zombie virus?
02:13:53.000 That's like a joke thing they've always had.
02:13:54.000 Yeah, but I was thinking today, we were talking about earlier, me and Adam were on his show, about how Hitler framed the Jewish population as rats.
02:14:02.000 And if we get to a situation where people are so homeless and destitute that they start to roam the streets in desperation and start eating other humans' dead bodies, like cannibalizing.
02:14:16.000 People are desensitized to it because of propaganda like that.
02:14:18.000 Dude, this is like some of the best left-wing cred that I still have.
02:14:21.000 Nobody believes me.
02:14:22.000 I used to see, like I never really watched TV, but my friends were all into The Walking
02:14:26.000 Dead and I was like, this is a bad metaphor.
02:14:30.000 Every right-winger is into killing zombies big time and seeing them everywhere.
02:14:35.000 This is not good.
02:14:37.000 I was a little afraid that The Walking Dead was propaganda for right-wingers to start thinking of leftists as zombies.
02:14:44.000 No dehumanization is good.
02:14:45.000 Don't do it.
02:14:46.000 I agree.
02:14:47.000 All right, we'll do a couple more Super Chats.
02:14:48.000 We got one from Jason Hopper.
02:14:49.000 He says, Diamond Hands Gorilla in pillow form and Harumph.
02:14:53.000 I'll buy both.
02:14:54.000 All right, I will get those graphics transferred over to pillows and make them available for everybody.
02:14:59.000 It is our Gorillo.
02:15:02.000 So that's the mock-up, our pillow, the graphic.
02:15:05.000 But then you see the burlap sack over there?
02:15:07.000 Yeah.
02:15:07.000 That's the official prototype.
02:15:08.000 Oh, that's the real one.
02:15:09.000 Yeah, we're doing a commercial where we're legit gonna buy airtime on TV.
02:15:13.000 Oh, no kidding.
02:15:14.000 It depends on if we can get it timed properly.
02:15:16.000 So we're working on it, we are.
02:15:18.000 And we want to do it at a specific strategic political moment.
02:15:21.000 I'll just leave it at that.
02:15:23.000 And I've already talked to Fox about it.
02:15:25.000 I told him on Twitter that if he doesn't make one the size of those beanbags, like huge beanbags that you can use as a bed, and call it the Hogzilla, that it's not real.
02:15:35.000 Like he's not committed until that happens.
02:15:37.000 Alright, Luminescent says, Hi Tim, good podcast.
02:15:40.000 Any news when Laowai and Serpentsa will appear here?
02:15:43.000 They are concerned about COVID.
02:15:46.000 That's their main hold up right now, but we are going to make it happen.
02:15:48.000 But, but, but everything's reopened.
02:15:51.000 Not everything.
02:15:52.000 Yeah.
02:15:52.000 I mean, I'll follow up with them.
02:15:54.000 Okay.
02:15:54.000 Laowai is really good though.
02:15:56.000 Yeah.
02:15:56.000 That'd be cool.
02:15:56.000 He's really good.
02:15:58.000 Alright, let's see.
02:15:59.000 I think maybe we can grab one more.
02:16:01.000 L says, great show.
02:16:03.000 James, you are brave and please know you are in my family's nightly prayers.
02:16:06.000 How do you see this critical race theory thing ending?
02:16:09.000 I've seen you mention bullets.
02:16:10.000 Scary stuff.
02:16:11.000 I don't have, I mean, like, it depends a lot on whether I'm blackpilled at the moment you ask me that question.
02:16:19.000 If people will actually do, like we've just said on the show repeatedly, which is start saying it's ridiculous, start speaking plainly and truthfully about it, start showing up and using what's left of the institutional mechanisms, because a lot of the institutions are not dead.
02:16:35.000 A lot of the institutions are filled with people, and I know because I just spoke with some in the last two days, that just don't know what it is.
02:16:42.000 They have no idea.
02:16:43.000 I'm talking about legislative bodies.
02:16:45.000 I'm talking about school boards.
02:16:47.000 They have no idea that this is what's actually happening.
02:16:50.000 If you start informing these people, this is like the Hail Mary, then there's a way out of this where, in fact, it all can almost evaporate, as I think Helen put in Cynical Theories, as like a puff of its own contradictions or something like that.
02:17:05.000 But if it continues to take over more and more and more institutions, the ways out are very slim, and I don't advocate bullets.
02:17:17.000 It's a terrible thing.
02:17:19.000 That is the worst thing ever.
02:17:21.000 When I say that, what I'm actually thinking is that the United States is not a country that is likely to go the route of peasant societies like China was in the 60s, or like Russia was in 1917.
02:17:34.000 They're not likely to just kind of go along with this and fall into it, like Germany did in the 1930s.
02:17:40.000 it's likely that there will be fighting back, and that's a worst-case scenario short of just
02:17:46.000 outright losing, in my opinion. So start speaking up and speaking honestly, and the way out of this
02:17:51.000 works out by just speaking the truth and pointing out this is not what it purports to be. It is
02:17:56.000 reinventing racism. Nobody wants this. And start telling other people that in plain, easy language,
02:18:03.000 and you can actually convert people into understanding this.
02:18:06.000 And then it's like, we're not beholden to just sleepwalk off a cliff here. That's
02:18:12.000 so important.
02:18:14.000 Right on.
02:18:15.000 All right, everybody.
02:18:16.000 Thank y'all so much for hanging out.
02:18:17.000 Smash the like button, because it really does help.
02:18:19.000 And if you're listening on iTunes, Spotify, or other podcast platforms, leave us a good review.
02:18:23.000 Give us all those stars, because it helps.
02:18:25.000 It really does.
02:18:26.000 And if you really love the show and you're a big fan, you gotta share it.
02:18:29.000 That's really the only way these shows grow.
02:18:32.000 You know, I guess marketing campaigns don't really work.
02:18:35.000 You got to just have good word of mouth.
02:18:36.000 So we appreciate it.
02:18:38.000 You can follow me on all social media platforms at Timcast.
02:18:41.000 If you follow me on Twitter, you'll probably be confused half the time because it's meant to be a chaotic garbled nonsense mess for fun.
02:18:48.000 And I wonder how long until no one takes anything I post on Twitter seriously, which is kind of the point.
02:18:53.000 But my other YouTube channels are YouTube.com slash Timcast, YouTube.com slash Timcast News.
02:18:58.000 This show is live Monday through Friday at 8 p.m.
02:19:01.000 So we'll be back on Monday.
02:19:03.000 But I mean it this time.
02:19:06.000 We are going to have a special exclusive episode.
02:19:10.000 It's not going to be a podcast.
02:19:11.000 It's going to be the Chicken City.
02:19:14.000 Okay.
02:19:14.000 Okay.
02:19:15.000 Now the Chicken City is built, but it's an issue of whether or not we can actually get the chickens because a lot of places are saying it's not quite chicken time yet.
02:19:23.000 But if you want younger chickens, you can have them, and I think maybe we'll just get some, you know, we'll figure it out.
02:19:28.000 I want to make sure we do it properly for the sake of the chickens and their well-being, and I want to be able to film it when we do, so we might, I'm hoping that we'll be able to actually procure some chickens tomorrow and film the process and show you our little chicken village.
02:19:43.000 It's not very big.
02:19:44.000 And that will be bonus content available at timcast.com.
02:19:47.000 So we're really hoping with TimCast.com to do more than just podcast shows.
02:19:51.000 So if you're a member, there will eventually start being more stuff.
02:19:54.000 There will be, you know, training videos on the range.
02:19:55.000 Maybe Luke can do some, you know, drills, or he can talk about Airsoft and other things like that.
02:20:01.000 And we could just have some experts talk about some fun stuff.
02:20:03.000 But that's all available over there, so greatly appreciated when you sign up.
02:20:07.000 James, you want to shout out anything?
02:20:08.000 I mean, you can follow me on pretty much all social media at ConceptualJames, or you can follow my outlet, New Discourses, at New Discourses.
02:20:18.000 It's on most, if not all, of the platforms.
02:20:20.000 YouTube channel has all the podcasts.
02:20:22.000 It's also on the other podcasting stuff.
02:20:24.000 You want to check that out.
02:20:25.000 That's the main thing I'm kind of doing.
02:20:26.000 If you subscribe, even at like low levels, if you subscribe at all to any of the subscriber things that I have through New Discourses, then you have access to a second podcast I do for subscribers only that I call James Lindsay Only Subs.
02:20:38.000 I usually don't script those.
02:20:39.000 I just try to talk like man-to-mic, man-to-you, and connect.
02:20:44.000 So different, shorter form, more personal content.
02:20:49.000 So I'm going to try to, in fact, get into some advice-giving and personal content on there, too.
02:20:54.000 So go sign up at New Discourses on any of the platforms that you like and get into that.
02:21:00.000 Cool.
02:21:00.000 You guys can also follow me at iancrossland.net if you want to check out my social stuff.
02:21:03.000 James, really cool to meet you, man.
02:21:05.000 Good talk.
02:21:06.000 Heck yeah.
02:21:07.000 I was on Adam Crigler's show earlier, AdamCast, so if you want to go to youtube.com slash AdamCastIRL and check out that episode, I would highly recommend.
02:21:15.000 And thanks for having me, everybody.
02:21:17.000 Very cool.
02:21:18.000 And I am Sour Patch Lids on Twitter and Minds, and I am Real Sour Patch Lids on Gab and Instagram if you guys want to follow me there.
02:21:25.000 I'm so excited for the chickens.
02:21:27.000 I cannot wait.
02:21:28.000 Hopefully we get little ones that are still fluffy.
02:21:31.000 Fingers crossed because we keep saying we're going to have some weekend content.
02:21:34.000 We're going to have some weekend content.
02:21:34.000 We're going to go to the range and then some, you know, something just falls through.
02:21:37.000 But I think tomorrow we're going to film getting these chickens and then we will have a special Chicken City episode at TimCast.com.
02:21:44.000 Thanks for hanging out, everybody.
02:21:45.000 We'll see y'all then, I suppose.
02:21:47.000 Yep.