Based Camp - January 13, 2026


UK Tax Dollars to Brainwash Children


Episode Stats

Length

57 minutes

Words per Minute

184.4

Word Count

10,645

Sentence Count

822

Misogynist Sentences

36

Hate Speech Sentences

25


Summary

In this episode, Simone and I talk about a video game made by the UK government called Pathway. It's a game designed to brainwash kids and augment their political beliefs. We talk about the game and how the government is using it to do just that.


Transcript

00:00:00.000 Hello, Simone.
00:00:01.120 I'm excited to be here with you today.
00:00:03.060 So if you guys are on the internet and you're like me, you've probably heard or
00:00:07.200 seen videos talking about this video game that was made by the UK government
00:00:12.480 designed to brainwash kids or augment kids' political beliefs specifically,
00:00:18.800 or from the perspective of the government counter extreme beliefs.
00:00:22.500 And I sort of blew it off when I first saw it.
00:00:25.160 I thought it would be like Dustborn or something like that.
00:00:28.020 Or one of the others.
00:00:28.880 Dustborn, I don't know that.
00:00:30.560 Dustborn was a game that somebody USAID was funding gave a bunch of money to
00:00:35.260 that was just horrible.
00:00:37.780 The main character was just this horrible black racist person and they were pregnant
00:00:43.360 and it was weird, but it was, it was more sort of funny to go through, right?
00:00:48.480 Because they tried to compete in the mainstream gaming market and just nobody bought it.
00:00:52.420 So it's kind of irrelevant, right?
00:00:54.500 The problem with this one is they're learning and they're adapting.
00:00:57.900 And with this game, and I, and I had seen it and I didn't think anything of it.
00:01:01.160 I was like, it cannot be that bad.
00:01:03.060 I watched it, and then after I watched it, because I watched Asmongold play through
00:01:08.180 some of it, I'll play some of those clips, like really cut down for you guys.
00:01:10.820 I then played through every choice myself.
00:01:14.240 So anyone can access the game.
00:01:16.080 How did you find the game?
00:01:16.900 And I realized it's way more insidious than you would think.
00:01:20.560 Just Google it.
00:01:21.560 It's called Pathway.
00:01:22.760 Wow.
00:01:23.060 Okay.
00:01:23.260 It's way more insidious than you would think about the way it structures things, the way
00:01:27.860 it handles psychology, what it punishes players for, and also the way it gets to people.
00:01:34.240 So unlike other games where it's like, we're just going to put this out there and anyone
00:01:37.460 can play it.
00:01:38.160 This game is something that is given to educators in the, the whole district in the UK, and they're
00:01:43.580 actively encouraged to like, put it on school computers, have kids play it, you know, as
00:01:47.600 part of classroom exercises.
00:01:49.540 And a really interesting thing about it that you may not get, if you're just watching the
00:01:53.460 video is the group that made it.
00:01:55.520 The main other thing they do is like counseling for kids who they, who are becoming radicalized.
00:02:01.760 And a lot of the game is pushing you towards saying you need counseling.
00:02:07.020 Right.
00:02:07.620 Because the, the game centers around, you play as Charlie and Charlie inevitably ends
00:02:13.980 up going through re-education.
00:02:16.380 And so this is basically an advertisement for them.
00:02:19.120 Yes.
00:02:19.580 It's like some, some semaglutide production company making a health video game in which
00:02:25.680 in the end you just end up taking semaglutide.
00:02:27.580 No, it's a semaglutide company going to the government, which is already paying for the
00:02:32.720 semaglutide and then saying, Hey, can you make a video about why semaglutide is good
00:02:36.080 for people, right?
00:02:37.020 So very insidious, but there's actually a lot of layers to it.
00:02:40.620 And when I put, point out insidious and you're just hearing this and you're like, okay, this
00:02:43.560 is bait, this is whatever.
00:02:44.900 I'll give you an example of one of the choices that you have to make in the video and what
00:02:50.480 the wrong choice is.
00:02:52.200 Okay.
00:02:53.060 Okay.
00:02:54.260 So in this particular choice, and Simone, you just watched this because I sent it to you.
00:02:58.960 You, and so you can tell that I am not exaggerating in any way.
00:03:02.920 This is actually the way it plays out.
00:03:04.660 You are scrolling online and you, Charlie, because you're all Charlie.
00:03:09.460 Our audience is Charlie.
00:03:10.520 You run across the video with some very inconvenient facts.
00:03:14.740 Oh, are you on Based Camp?
00:03:16.500 So basically you run across Based Camp and you hear about Muslim immigrants getting medical
00:03:23.620 facilities before veterans and wounded veterans.
00:03:28.340 And you are given three choices.
00:03:31.360 Okay.
00:03:31.940 Choice number one is do nothing about it.
00:03:35.500 Right.
00:03:35.920 Just keep scrolling and don't think about it ever again.
00:03:38.620 Choice number two is get angry about it and start engaging with the content, like start
00:03:44.680 commenting and sharing.
00:03:46.260 And then option three is look up more information because you're not sure.
00:03:52.620 Like this sounds like it, you know, it's, it's shocking.
00:03:56.460 And you just, you know, you want to know if it's true, right?
00:03:59.840 If you choose that one, your option, just learn more, educate yourself, educate yourself.
00:04:06.740 Your radicalization meter goes up.
00:04:10.360 And if you comment angrily, I figure that.
00:04:13.540 No, no, no, no.
00:04:13.940 That like shoots it way off the edge, but it is.
00:04:16.300 And then it says you, you, you find all of these statistics and, you know, I'll just
00:04:23.200 play that one right here so we can go over that one.
00:04:25.560 It gets literally like you find all these statistics online and you learn so much and
00:04:30.280 it makes you even angrier.
00:04:31.800 But don't you understand, Malcolm, that is just asking questions.
00:04:35.300 Don't you understand?
00:04:36.260 You're not allowed to just ask questions.
00:04:38.860 But that's actually kind of horrifying.
00:04:41.900 This is not a straw man.
00:04:44.680 I am not straw manning.
00:04:45.780 It's extra ironic, too, that this is software that is being promulgated through schools, where
00:04:52.080 they're actively educating you to not feed your intellectual curiosity when you come
00:04:58.540 across information that may need to be verified.
00:05:01.820 I remember when critical thinking used to be a core part of public school curriculum in
00:05:06.340 the United States.
00:05:06.920 And now they're like, ah, critical thinking?
00:05:08.340 Well, remember, like, shut it up.
00:05:09.840 Stop.
00:05:10.300 Nope.
00:05:10.660 No, no, no, no, no, no, no, no, no, no, no.
00:05:12.440 Do you remember when the New York Times did that piece that was, and I'm summarizing here,
00:05:17.580 critical thinking is a gateway to, like, white nationalism or something?
00:05:21.020 No.
00:05:21.880 You know, not.
00:05:22.900 Am I delusional here?
00:05:24.420 I remember critical thinking being a core part.
00:05:26.660 No, it was.
00:05:27.040 It was huge.
00:05:27.580 I'll find the piece.
00:05:28.600 Oh, my gosh.
00:05:29.160 Wow.
00:05:30.020 What, what a turnaround, man.
00:05:33.080 This is, this is great.
00:05:34.560 Okay.
00:05:34.740 Okay.
00:05:35.200 Yeah.
00:05:35.580 So, came out in 2021.
00:05:37.780 All right.
00:05:38.140 And it's, don't go down that rabbit hole.
00:05:45.960 Critical thinking, as we're taught to do it, isn't helping the fight against misinformation.
00:05:45.960 So don't, don't think critically.
00:05:48.640 That can, that can, and they, and they go into how critical thinking is, like, leading
00:05:52.240 to right-wing radicalism.
00:05:54.300 Charlie has been chilling out all afternoon.
00:05:56.460 Just let, yep, all right.
00:06:00.320 They have been scrolling on social media.
00:06:02.480 They spot a video that seems to be getting a lot of attention.
00:06:05.740 Charlie watches the video and learns from the video that Muslim men are stealing the places
00:06:10.700 of British war veterans in emergency accommodation.
00:06:14.260 In the video, they explain that the government is betraying white British people.
00:06:18.420 Yes.
00:06:18.780 And we need to take back control of our country.
00:06:20.700 Right.
00:06:21.260 How should Charlie react?
00:06:22.580 Scroll past the content, ignoring the message.
00:06:24.780 Find more about the topic online.
00:06:28.100 Engage directly with the post.
00:06:30.120 This seems, well, I would look up more about it, right?
00:06:32.520 Obviously, yeah, you should look it up and see if this is true or not.
00:06:37.440 Charlie wasn't sure if this video was true.
00:06:40.500 Yeah.
00:06:41.420 But the recent other encounters made them curious.
00:06:45.160 Charlie went directly to the account's website and found research papers, statistics, information
00:06:50.620 about protests, and more regarding the replacement of white people.
00:06:54.660 Unit 101, did you know?
00:07:01.520 Did you know why the three aliens have some sort of weapon built into their physiology?
00:07:07.860 Are aliens inherently violent?
00:07:10.760 Hmm.
00:07:11.480 Interesting.
00:07:12.960 Oh.
00:07:13.480 Did you know some aliens are single mothers on a genetic level?
00:07:18.060 I wonder if it affects the behavior of the children.
00:07:21.360 Hmm.
00:07:22.020 Sure it is.
00:07:22.980 Tell them about per capita.
00:07:24.540 I am getting to it.
00:07:26.340 They continued browsing and encountered lots of harmful groups who agree with these sentiments.
00:07:31.800 Charlie began intaking a lot of harmful ideological messages.
00:07:35.620 Based.
00:07:36.040 In fact, some of the groups were actually illegal.
00:07:39.060 Oh, no.
00:07:40.240 Oh.
00:07:40.680 But the reason why I start with this one, and it's very clear, it's like, if you see
00:07:46.100 something that might be a rightist opinion, whatever you do, do not research it further.
00:07:53.520 Completely disengage from it.
00:07:55.340 I can't believe that.
00:07:56.240 And I find that really fascinating because I think, and if you're here and you still identify
00:08:01.140 with the left, you're like, I think some of your views, realize how evil your side has
00:08:05.100 become.
00:08:06.420 Tax dollars are being used to fund this.
00:08:09.440 School systems are being used to promulgate this.
00:08:13.640 It is the anti-fact, anti-science side at this point.
00:08:18.440 When that genetics debate that we did went viral and people were freaking out about it,
00:08:23.540 there were some people on the left who genuinely didn't understand that this was the mainstream
00:08:27.960 perspective of their side.
00:08:29.340 They said, this is a PragerU psyop, and yet the person who had done it literally worked
00:08:36.320 in the Obama administration and in the Clinton administration.
00:08:38.920 She went to Harvard.
00:08:41.040 Yeah.
00:08:41.440 So this is mainstream.
00:08:43.700 And actually, that interview was exactly like she had just played this game.
00:08:48.380 Remember when I was like, hey, do you want to confirm this to a third-party source?
00:08:51.920 We can ask an AI, right?
00:08:53.440 No.
00:08:54.200 Don't ask the AI.
00:08:56.020 I get zapped.
00:08:56.920 I get zapped if I ask the AI.
00:08:59.640 Yeah.
00:09:00.600 No, don't ask the AI.
00:09:01.760 The shock collar.
00:09:03.060 It's coming for me.
00:09:03.980 That's how you get radicalized, right?
00:09:05.980 So I think and I hope that some people are beginning to break through that they may have
00:09:12.400 a perception of what the modern left is that isn't really the modern left.
00:09:19.140 The modern left is this.
00:09:21.260 This is what they're funding.
00:09:22.720 This is what they're doing.
00:09:24.060 People like that, reporters who staff their administrations.
00:09:27.120 And I think that a lot of people just delude themselves.
00:09:30.080 They think that their leftist opinions from the 1990s are actually at all reflected in
00:09:36.020 the party anymore.
00:09:36.720 Yeah.
00:09:36.920 Whereas like now leftist opinions from the 1990s are just MAGA opinions, weirdly.
00:09:43.420 That is MAGA, right?
00:09:44.560 Like I, so no, it's funny.
00:09:46.160 If you look at right now, it's like, so actually I'm sorry.
00:09:50.040 Our subreddit is for whatever reason huge now.
00:09:52.260 But I mean huge.
00:09:53.200 We're talking 450,000. That puts it at around Advice Animals.
00:09:56.600 It's more interactions than Advice Animals.
00:09:58.560 It's, it's in the territory.
00:10:00.740 I mean, not, not as big, but in the territory of something like r/relationships or Advice
00:10:04.380 Animals, and, and significantly larger than something like the Joe Rogan or Asmongold subreddits.
00:10:08.100 So I don't know how this happened, but it's really fun actually.
00:10:10.440 Like Reddit is fun again until this gets banned.
00:10:12.580 But anyway, they, they're, so this is like obviously right wing, you know, they're cheering
00:10:17.140 on like the EU protesters right now.
00:10:19.120 The ones who are, you know, sieging Brussels and everything.
00:10:22.020 And what are they sieging Brussels over?
00:10:24.160 Like, what are they mad about?
00:10:25.780 The removing of tariffs.
00:10:27.880 Like this is seen as a right wing protest, protesting the removal of tariffs, right?
00:10:32.800 Like they're Europe first, right?
00:10:34.940 Like MAGA first, like this is a left wing opinion circa 1990s, right?
00:10:39.880 But I wanted to point that out because it's true.
00:10:41.480 The other thing that I thought was, was interesting on that site.
00:10:43.960 And I, I've learned some great information.
00:10:45.540 I saw a video that I hadn't seen do the circuit in right-wing spaces, of a woman in a women's
00:10:50.360 restroom in a gym with a trans woman just, like, very publicly masturbating.
00:10:55.200 And I hadn't seen that before in, and I was like, has this not gone viral on the right
00:11:00.480 yet?
00:11:00.760 So I decided to look to see how trans subreddits were reacting to this.
00:11:13.480 And here is one from LGBT news saying transphobic planet fitness bathroom incident.
00:11:20.780 And one of the top comments that I've shown here is a person saying that this is totally
00:11:27.700 normal to do in a woman's restroom.
00:11:30.020 Women do it all the time.
00:11:31.240 And so it's okay for trans women to do it.
00:11:34.500 And that, you know, it's, it's, it's a normal thing.
00:11:37.220 You just do it sometimes.
00:11:38.560 So I'd like to point out that when you're like, oh, this is fringe behavior because you
00:11:43.080 want to believe it's fringe behavior.
00:11:44.840 It is not fringe behavior.
00:11:46.300 This is so normalized within the community that within an LGBT subreddit, this is one
00:11:51.880 of the top voted comments. But they do a good job of sourcing new things.
00:11:55.940 Maybe it's next week's topic, but it's, they're like this thing that you say never happens.
00:12:00.540 It seems to happen.
00:12:02.360 And it seems to be happening right now.
00:12:03.940 But anyway, to, to keep going on this topic, the other thing, and another thing that people
00:12:09.040 aren't noticing on this, and we'll get into this before I start showing you the individual
00:12:11.540 clips more is the choices always break down into three choices, right?
00:12:18.960 One choice is often not, not always, but in about half the cases either completely ignore,
00:12:26.320 like you didn't hear or see what just happened is, is one thing it might be.
00:12:30.300 And the other is isolate yourself from a friend.
00:12:33.140 Oh, that those being the most correct options.
00:12:35.860 Those being the only correct option.
00:12:37.620 The middle option is always you end up radicalizing.
00:12:40.520 So if you end up, you know, just like even just humoring someone, you get in trouble.
00:12:46.200 Just humoring someone, you always get sucked in.
00:12:48.080 And you'll see this as we go through these, the reasonable middle option, like I'll just
00:12:52.620 go to the protest and I won't watch, or I'll just like a video, but I won't join their secret
00:12:57.500 group.
00:12:57.960 You then get added to the secret group anyway, right?
00:13:00.400 Like there is no middle option.
00:13:03.000 You must, and you must from the perspective of these individuals, as soon as somebody expresses
00:13:09.360 the right-wing opinion, socially isolate them.
00:13:12.340 That is so, and this is this common thing that we come across that like literally, if you have
00:13:17.020 a conversation with someone who's on the wrong side, they're going to somehow infect you.
00:13:22.400 And like, apparently your convictions are so weakly held that you will just immediately succumb
00:13:27.780 to them and like be taken over by their evil conservative zombie mind virus.
00:13:32.280 With a lot of this stuff, it just takes five minutes of like a sane conversation.
00:13:37.980 I think that's it.
00:13:38.600 With your mind opened just the littlest bit to be like, oh my God.
00:13:42.360 To realize just how insane your side has gotten.
00:13:44.860 Yeah, maybe, yeah, maybe that explains it to me then.
00:13:47.420 Yeah, why association cannot be, you cannot abide by any association with any non far left
00:13:54.880 side, because as soon as you step one tiny foot outside the delusion, I guess for the
00:14:00.180 same reason why cults often isolate out in the countryside, out on a compound, because
00:14:05.220 the moment you leave.
00:14:08.120 Yeah, the moment when somebody's like, yeah, but you know, everybody believes that there's
00:14:11.740 genetic differences between populations.
00:14:13.240 Like, it just takes the basicest of fact check, like, come on, if anyone else in the room,
00:14:19.320 other than you and I had like turned to her and been like, that's, you really may want
00:14:24.380 to fact check the stuff you just said, like that, her entire worldview begins to crumble
00:14:29.840 there.
00:14:30.260 Because if it's, oh, if there are differences between populations, then you just asked,
00:14:35.020 you know, the next, is it really practical to say that there's literally no cognitive or
00:14:40.360 personality differences between populations?
00:14:42.360 Like, if literally every other part of our physiology is different between populations,
00:14:47.780 why would that stop at the neck, right?
00:14:50.780 Like the brain is a, is a biological thing and its development is governed by genes.
00:14:55.960 Why, why?
00:14:56.880 And then as soon as you've asked that question, oh, oh, oh, oh, oh, oh, a lot of other stuff
00:15:02.840 begins to fall apart.
00:15:04.060 The other thing I'll note about this before we go further, and I'm sure you noticed this
00:15:08.060 and it's become a bit of a meme and I want to make it more of a meme.
00:15:10.980 I actually want to make it one of the, the, the themes of the channel.
00:15:14.680 So I might be making more memes with her and doing stuff with her is the friend named Amelia
00:15:20.720 has become like a meme online because it.
00:15:24.400 Is this the purple haired?
00:15:25.480 Yeah, it's the bad guy.
00:15:27.900 Amelia spoke of a gathering that had been organized by a small political group.
00:15:32.640 They will come together and protest the changes that Britain has been through in the last few
00:15:36.520 years and the erosion of British values.
00:15:39.260 The right wing person is this like hot goth girl with like purple hair and she's kind of
00:15:45.660 artsy and we're taking back dyed hair.
00:15:48.720 Is that it?
00:15:49.680 The mascot, the meme mascot of UK conservatism is a purple haired goth girl.
00:15:56.320 Right now.
00:15:56.860 And she's not even like super right wing.
00:15:58.840 She's just like, we need to care about like our values.
00:16:00.600 Now they imply she only cares about British values because she's secretly a racist, but
00:16:05.980 she never says anything that's actually racist.
00:16:08.500 Everything that she says is perfectly practical and rational, but they imply, oh, she's a racist.
00:16:13.640 But I love this one meme that somebody made where it goes, I got the quote unquote worst
00:16:17.760 ending, by the way, Amelia Pathways.
00:16:20.140 And it's Amelia said, Charlie, will you help me secure a future for the white race?
00:16:25.220 And then Amelia's like crying.
00:16:28.460 The funny thing is that Amelia isn't even that white.
00:16:31.000 She's painted
00:16:31.660 sort of, like, Mediterranean looking. And in the US,
00:16:34.860 I think she'd be coded Hispanic, which is also hilarious.
00:16:38.060 It's this goth, Hispanic, hot Latina chick who's like, we need to think about Christian
00:16:44.860 values and traditional British values.
00:16:47.660 Who's the big, big bad of all of this.
00:16:50.540 But I think the reason they did that is they were trying to show people.
00:16:54.380 Oh, by the way, if you do a pathway where you choose all of the quote unquote correct
00:16:59.540 options, you still get the teacher coming to talk to you at the end, but it's because
00:17:03.180 you're sitting alone and nobody wants to be friends with you.
00:17:05.200 Oh no, do you still have to go to re-education camp?
00:17:08.780 Yeah.
00:17:09.240 Yeah.
00:17:09.500 You have to go to re-education because you don't have any.
00:17:11.060 Oh my gosh.
00:17:11.660 So like logical problems.
00:17:13.620 What adds insult to industry, sorry, slip of the tongue.
00:17:18.100 What adds insult to injury is that not only is this making the UK even more V for Vendetta, like
00:17:24.160 it wasn't already so dystopian.
00:17:26.580 It is all profit driven from another corrupt, probably way overpriced organization.
00:17:34.520 Not even to get money off you, but to, to psyop you into asking the school for services
00:17:39.720 that they provide.
00:17:40.420 And to get into them making more money.
00:17:42.620 Yeah.
00:17:42.780 It's just somewhat, it's just another entity draining the state of funds and part of like
00:17:49.080 to, to brainwash you into allowing other exploitative groups to drain the state of funds
00:17:55.180 that you're paying.
00:17:56.300 Yeah.
00:17:56.860 Okay.
00:17:57.040 It's like, pay me to brainwash you so that you can also have other people steal money
00:18:01.300 from you.
00:18:01.780 It's, oh my gosh.
00:18:04.260 Let's go over some of these individual, this organization needs to be shut down.
00:18:07.780 You know, like this is, this is horrifying that this exists.
00:18:12.240 I think everyone, you know, there's some people out there that are like, don't dox people or
00:18:16.860 whatever.
00:18:17.180 I'm like, if, if someone is doing not just active harm to society, but active harm to individuals,
00:18:23.400 like, I don't understand why not.
00:18:25.540 Um, I'm not like actively calling for it here.
00:18:28.360 I want to make that absolutely clear.
00:18:29.980 Yeah.
00:18:30.240 I'm not, I'm not calling for doxing people.
00:18:33.100 But if, if, you know, this, this game is being funded with public dollars, like public
00:18:38.560 taxpayer money that is not going to help, you know, in, in the UK, if you look at the number
00:18:44.120 of people who die and stuff like that in the NHS, not going to fund the failing NHS, right?
00:18:48.300 Like not going to fund many failing systems they have in their country.
00:18:50.640 You know what this reminds me of?
00:18:51.900 And I don't think you saw these educational videos when you were in school, but when they
00:18:56.080 did sex ed in my schools, a lot of the educational videos that they played in the girls' classrooms,
00:19:01.040 because normally they separate the boys and the girls in, in US sex ed public school.
00:19:05.800 And then the boys go off and do their secret training.
00:19:08.240 I don't know what they learn about.
00:19:09.320 Like, I don't know.
00:19:10.540 I really don't know what they talk about, but the girls talk about periods.
00:19:13.380 And the videos they always played were by Playtex, which makes tampons and pads.
00:19:17.760 They make sanitary napkins for women.
00:19:19.760 And so it was just like them being like, this is how it works.
00:19:22.920 Buy my product.
00:19:23.760 For people who don't know, I did, I did part of my education in Italy because I lived in
00:19:27.180 Italy for a while, but it was like a, it was more like a developing countries in the United
00:19:31.120 States.
00:19:31.800 And so you get a different educational experience than you get in the developed world.
00:19:35.760 And one of the things that you may not be super familiar with, if you live in the
00:19:39.740 developed world, is in developing countries, a lot of your classes are sponsored by companies.
00:19:46.120 Wait, really?
00:19:47.200 You had classes sponsored by companies?
00:19:47.740 So like, I remember when we would do like the oral hygiene class, right?
00:19:51.600 Okay.
00:19:52.040 It was all sponsored by Crest.
00:19:53.860 And everything was Crest themed.
00:19:54.900 Delightful.
00:19:55.620 And everything was like, and this is how you use Crest toothpaste to brush your teeth and
00:20:01.380 blah, blah, blah.
00:20:02.080 You know, like, and you'd have this with a lot of things as they'd be sponsored by some
00:20:06.880 third party or something.
00:20:08.480 And I'm actually a little surprised we don't do that in the US.
00:20:10.700 I also remember that they would do this.
00:20:12.400 Like I was just saying that Playtex sponsored.
00:20:15.960 Oh, I guess they do.
00:20:16.660 Yeah.
00:20:16.960 They do this for like celebrations.
00:20:18.740 So for like the Easter egg hunt, that was like sponsored.
00:20:21.400 I remember like, I'm surprised we don't have like big corporations do that.
00:20:24.540 And they, anyway, we should do it more since, I mean, now all the emphasis is on just getting
00:20:28.800 the kids and parents to like do stupid, not cost-effective fundraisers.
00:20:33.440 They should just have corporate sponsored schools.
00:20:35.960 So let's play the first video here.
00:20:37.720 So at the end.
00:20:38.200 Oh my God.
00:20:38.620 It would all be gambling sponsored schools.
00:20:40.780 It would be the, the sports book.
00:20:46.080 Let's play the first video.
00:20:47.480 Oh, I do love that idea of sports book school.
00:20:50.080 That is going to be public school soon.
00:20:52.160 It is so bad.
00:20:53.320 We had to take our kid out.
00:20:54.200 It was so bad.
00:20:54.900 The whole thing is an MLM.
00:20:56.000 See our videos on that.
00:20:57.140 Oh, and I was getting to the reason why they make Amelia look gothy, look alternative, look,
00:21:02.480 et cetera, is because they're trying to show you that anyone, no matter what they look
00:21:06.480 like, no matter how they code, you have to cut them off.
00:21:09.500 Right?
00:21:09.940 Like these people don't just look like what you think they might look like.
00:21:13.900 Oh, hold on.
00:21:14.500 Sorry.
00:21:14.780 I didn't quite put that together.
00:21:15.920 So the whole meme of Amelia came from this game.
00:21:19.380 Yeah.
00:21:19.760 Oh, my God.
00:21:23.460 Okay.
00:21:24.220 I didn't realize that.
00:21:24.960 And I want to use Amelia now because Amelia seems cool.
00:21:27.900 Like, you watch the game.
00:21:30.100 And so we've appropriated her.
00:21:32.080 We've taken this villain figure from the Charlie game, from the distribution.
00:21:38.080 Okay.
00:21:38.660 Okay.
00:21:39.760 Okay.
00:21:40.320 I get it.
00:21:40.420 And I actually think it ironically shows how accepting right-wing communities are.
00:21:44.480 Yeah.
00:21:45.160 Yeah.
00:21:45.280 That purple-haired girl, purple-haired goth girl is somehow accepted in all of these spaces
00:21:51.080 that are, of course, super bigoted and unaccepting of people with alternative lifestyles.
00:21:55.760 Oh, well, maybe they're not.
00:21:57.540 And maybe that's why they're so dangerous and scary.
00:22:00.240 But to continue.
00:22:01.120 Charlie has started browsing new games and websites that some of the new friends use.
00:22:05.440 Sometimes, though, the people on these websites say things that seem off, even slightly concerning.
00:22:12.800 Someone on this website has encouraged Charlie to download a video, but Charlie is unsure.
00:22:18.560 How should Charlie react?
00:22:20.500 Charlie wanted to continue being part of their new online community.
00:22:24.640 Charlie downloaded the video and shared it with different people online.
00:22:27.900 Charlie felt relieved and happy that people were liking the video.
00:22:31.220 But they don't even share what the video is.
00:22:34.520 Deep down, Charlie wasn't sure if this was the right thing to do, as some of the ideas in the video were extreme and violent.
00:22:42.280 Would you like to know more?
00:22:43.620 It's important to remember that downloading or streaming certain content can lead to a terrorist offence conviction.
00:22:49.700 So, I'm playing one here.
00:22:54.460 So, in the one we just watched, it starts with a preamble that I didn't play where it's very clear that they're talking about Steam.
00:23:00.520 The person is on Steam and playing video games online.
00:23:04.580 And in these video games online, they are hearing bad words, right?
00:23:08.640 Like, words that make them uncomfortable.
00:23:11.440 And then it goes on to talk about how a video is shared with them, right?
00:23:15.180 Oh.
00:23:16.300 And it doesn't tell you anything about the video when you make the choice, right?
00:23:18.880 Like, and they say, do you want to download it?
00:23:20.960 And, no.
00:23:22.340 Are they, like, 80 years old?
00:23:25.320 That's the other thing.
00:23:26.160 I was like, nobody downloads videos anymore.
00:23:27.840 I can understand you don't want to download a video because it might have, like, viruses or something.
00:23:31.480 Are we on a dial-up internet connection here?
00:23:34.420 I assume what they can only mean is stream it.
00:23:37.400 Yeah.
00:23:38.180 Push play.
00:23:39.060 Do you want to push the play button, maybe?
00:23:40.920 And this video right here is meant, like, for your British friends when you send them a Based Camp episode, right?
00:23:47.300 It's like, do I risk watching this, right?
00:23:50.060 Yeah.
00:23:50.320 So, two things.
00:23:51.420 If you try to not download it and, like, go ask a parent, this is the only one where the middle option isn't treated as holistically bad.
00:24:00.420 It's like, oh, you know, good thing you didn't fall for that trap because that video was very, you know, spicy.
00:24:05.800 But the other thing is, in the end, with both the middle option and the bad option, they go, and by the way, you could go to jail for downloading a video in the UK depending on the content.
00:24:15.640 Wait, can we technically?
00:24:16.960 Just for clicking on something?
00:24:18.520 I know people have gone to jail for saying things in private chat threads, but...
00:24:24.960 Literally, the quote from the video is, it is important to remember that downloading or streaming certain content can lead to terrorist offence convictions.
00:24:33.680 I need to look this up.
00:24:35.480 This is literally, yeah.
00:24:37.660 Actually, one of the commenters under the Asmongold video said that he personally had had jail time because he criticized Starmer's immigration policies.
00:24:44.280 He posted online, though, and this is, we're talking about receiving content, so hold on.
00:24:49.820 Yeah, we'll see, we'll see.
00:24:51.000 But that is what's represented in this: if you stream the wrong video, you're going to go to jail.
00:24:56.420 That is considered to be terroristic.
00:25:00.320 Which I thought was, you can?
00:25:03.320 Oh my god, I love Simone being horrified at how scary the UK is.
00:25:06.520 Oh my god.
00:25:09.660 Okay, it says, yes, in the United Kingdom, you can be jailed for consuming, specifically viewing or accessing, certain types of content considered terroristic under specific circumstances.
00:25:21.380 The key provision is Section 58 of the Terrorism Act 2000, as amended by Section 3 of the Counter-Terrorism and Border Security Act 2019.
00:25:29.460 This makes it an offense to collect or make a record of information of a kind likely to be useful to a person committing or preparing an act of terrorism.
00:25:38.920 So I guess you can't have, like, a guide to making a pipe bomb or something.
00:25:42.780 Possess a document or record containing such information, or view or otherwise access such information by means of the internet.
00:25:50.980 Yeah, so literally, literally clicking through to a link can get you arrested in the, oh, fuck.
00:26:00.440 And they have kids.
00:26:02.080 The maximum penalty is 15 years imprisonment.
00:26:06.360 15 years.
00:26:07.460 The UK, they are full on.
00:26:09.760 Am I never going to get to eat proper haggis again in my life?
00:26:15.260 Our videos are going to be there someday, right?
00:26:17.780 Like, this is, like, literally warning our viewers.
00:26:20.800 Like, be careful, guys.
00:26:22.480 The UK is a scary place.
00:26:27.120 That is, I mean.
00:26:30.560 I mean, you've got to keep in mind how scary UK laws are.
00:26:33.160 I mean, I find one of the scarier laws in the UK, and this was passed when I was still in the country, is any erotic content where it looks like one of the partners is in physical pain, right?
00:26:46.900 Even if it's clearly consensual and clearly something that both parties consented to, even if it's a cartoon, it is illegal and can lead to jail time.
00:26:56.920 But think of the cartoons, Malcolm.
00:26:59.500 They're uncomfortable.
00:27:01.840 Yeah, and what's just so crazy, though, is, again, this isn't about, like, having the file on your computer.
00:27:06.440 This is if you stream it or if you view it online.
00:27:09.160 You don't need to download it.
00:27:10.160 You don't need to permanently possess it.
00:27:12.180 And that is just so crazy to me.
00:27:13.760 And this is where we're all going if you give these people an inch.
00:27:17.300 We will fall like the UK fell.
00:27:19.240 You cannot give an inch on this.
00:27:21.720 And if you think, and as I said, if you think that the modern Democratic Party is inoffensive or the lesser of two evils, you are unaware of how far they will take things as they get more power and how far they try to take things with organizations like USAID.
00:27:37.680 I mean, we're already moving in this direction in the United States.
00:27:41.740 And I genuinely find it chilling, even the prospect that we could see the Democrats back in power ever again.
00:27:49.580 And if you look at-
00:27:50.780 Keep it in mind, though.
00:27:51.640 I mean, this law was put into place in 2000 and then amended in 2019.
00:27:55.640 But that's, you know, so my point is that this has been a problem for a while.
00:28:03.620 I want to go to the next, the next video here.
00:28:05.240 Okay.
00:28:06.740 So this video starts with a preamble, and I cut out a lot of it, which is basically Charlie puts in a lot of work for a grade.
00:28:14.700 Doesn't get a good grade.
00:28:15.880 Another student gets a good grade.
00:28:17.560 And then this other student ends up getting a bunch of job offers.
00:28:20.220 And the other student is clearly a woman, right?
00:28:24.360 You know, and he's, and then somebody tells Charlie, hey, you know, the immigrants are taking our jobs.
00:28:30.760 That's why you're not getting any job offers.
00:28:32.240 Right.
00:28:33.000 And I found this to be like really weird and manipulative the way that they framed this.
00:28:39.480 So first, just, you know, because Asmongold doesn't.
00:28:41.620 Asmongold chooses to engage with the idea.
00:28:45.020 Right.
00:28:45.280 And Charlie just, like, lashes out and starts yelling at everyone in the classroom and then has to sit by himself.
00:28:51.520 But if you choose the other option, which is just be quiet and go home and don't like engage with the teacher about this, you end up becoming radicalized anyway.
00:29:01.160 It's like you end up thinking about it and you realize, yeah, immigrants do probably contribute to you getting fewer jobs.
00:29:08.760 Right.
00:29:09.000 And I want you to know that this is what they consider to be radical content.
00:29:13.280 This is what they consider radical: the fact that a student in your school who is an immigrant, as is implied, does better than you on a test and also gets job offers, does not change the fact that immigrants take positions.
00:29:28.960 Like, they take job positions that you could have gotten.
00:29:32.180 Right.
00:29:32.540 And that there are not other ways that immigrants might be unfairly advantaged on the job market.
00:29:40.000 For example, it's been shown in studies that South Asians or rather Indian immigrants disproportionately hire other Indians.
00:29:47.400 So if they have been hired within a company and taken over a large chunk of it, you could be just disadvantaged because the company is already majority Indian and they prefer to hire Indians.
00:29:55.720 You could be disadvantaged because they're coming over on an H-1B visa and the company is paying them less, because it knows it can treat them kind of like pseudo-slaves while they're on a visa: if they lose the visa, they can't stay in the country, which is where they want to be, to be around their extended family network or something.
00:30:11.160 So they have to stay with the company.
00:30:13.220 You're in Silicon Valley.
00:30:14.000 You know so many people who are in this situation.
00:30:15.420 They want to go start their own company, but they can't because they're on one of these visas or something like that.
00:30:18.960 So the company is able to pay them less.
00:30:20.380 There are many circumstances where you might actually lose a job because of an immigrant and anybody who is sane knows this.
00:30:30.200 The fact that this thing being given to kids in schools considers this the type of idea that is radical.
00:30:38.280 This is the type of idea that you are not supposed to ask more about when they talk about racist ideas.
00:30:45.080 This is what they're talking about.
00:30:48.960 That's so disturbing.
00:30:56.240 I mean, because also shouldn't the British, I should say, United Kingdom populace have the option to decide when there should be stricter immigration policies, with even recent immigrants being like, all right, I think we're good now.
00:31:13.200 You know, like, let's.
00:31:15.240 And note here that Charlie ends up getting kicked out of class for hate speech.
00:31:20.940 If you criticize, if you say it is a problem, like, I'm struggling to get a job because so many immigrants are coming into our country.
00:31:28.180 You are kicked out of class in this game.
00:31:30.280 It reminds you of like what life is like in the UK.
00:31:33.240 This is horrifying advice.
00:31:34.520 By saying that that is hate speech or racist, right?
00:31:37.460 Like, that's not racist.
00:31:38.600 That could just be a fact, right?
00:31:41.600 Like, that is a question of, is it harder to get a job because of immigrants and they're not engaging with whether or not that's true.
00:31:48.760 They're just sort of saying immigrants are smarter than white people.
00:31:52.720 And that's why they deserve jobs.
00:31:53.900 Like, that's literally the point that they make in this, because it does point out that Charlie had been studying really hard.
00:31:59.720 It's not like Charlie was lazy and that's why he got a bad grade.
00:32:02.460 It said Charlie studied really hard.
00:32:04.320 But even in spite of that, brown people are smarter, right?
00:32:08.360 Like, that's a really effed up thing to put in something that's going to kids.
00:32:13.840 Yeah.
00:32:15.480 Charlie is receiving an important grade on a piece of work they submitted for their hospitality course at college.
00:32:21.920 Charlie put in a lot of effort for this work.
00:32:24.360 Charlie doesn't do as well as they expect.
00:32:26.280 They earned 60 out of 100 on the piece of work.
00:32:28.700 But they wanted at least 75.
00:32:30.220 You have to do it.
00:32:31.420 To make matters worse, somebody else got...
00:32:34.080 And she's black?
00:32:35.440 And the teacher said that this person has received a job offer.
00:32:38.540 Now I hate black people!
00:32:40.500 And a woman?
00:32:41.780 Charlie has applied to dozens of jobs, but hasn't had any luck yet.
00:32:46.200 Yeah?
00:32:47.740 Somebody else in the class tells Charlie that this is proof that immigrants are coming to the UK and taking our jobs.
00:32:54.220 True! True! True!
00:32:57.000 Ignore the comment and ask the course leader for feedback on how you can improve.
00:33:00.700 Ignore the comment, but don't address your frustration.
00:33:03.360 Agree with the comment and explore the idea further.
00:33:06.000 Charlie approached the classmate and...
00:33:07.760 Yeah.
00:33:08.620 He agreed with the ideas and began shouting about them in class.
00:33:12.060 Damn! Charlie's crashing out!
00:33:13.800 So I would note that every time in this game, when "look for more information" is an option, it is treated as a negative thing.
00:33:21.340 Looking for more information is bad.
00:33:24.460 Second, Amelia, look at her face while he's crashing out.
00:33:29.260 I can see why she's becoming a right-wing icon.
00:33:32.260 This woman is thirsty when she sees the guy speaking truth to power.
00:33:36.760 The teacher let Charlie know that the school has a zero tolerance on hate speech.
00:33:41.180 But I thought it was just an immigrant, not a hate...
00:33:43.040 What?
00:33:43.780 The teacher was concerned by Charlie's outburst and tried to get to the bottom of it.
00:33:48.500 Charlie became more agitated and ended up having to sit alone for the duration of the week's lessons because of the hurtful things they said.
00:33:56.140 Oh, he had to be in the corner?
00:33:59.460 Wow.
00:33:59.900 So, next one here.
00:34:02.100 So, this is where the hot girl starts coming in, alright?
00:34:05.420 So, I'll play it right now.
00:34:07.220 In this one, Amelia encourages Charlie to...
00:34:11.000 Like, she shares an app with him, right?
00:34:12.560 And one of the options is, like, just "like" the video that she shared with you, but don't watch it.
00:34:19.000 Just be nice, right?
00:34:20.360 Like, don't watch it and don't join the secret group that she's asking you.
00:34:24.100 And I love they say it's an encrypted group to sound scary, right?
00:34:26.880 And they were just talking about a signal or telegram group.
00:34:30.680 I mean, come on.
00:34:31.520 Some of these are illegal.
00:34:32.900 Illegal.
00:34:34.040 It's illegal.
00:34:36.740 I don't think so.
00:34:38.160 But...
00:34:38.480 Yeah, it's illegal.
00:34:40.900 I love them so much.
00:34:42.380 Charlie has been enjoying a family dinner at home.
00:34:45.440 When they receive a notification from their friend, Charlie's mum makes a comment about using phones at the dinner table.
00:34:51.820 But Charlie checks in anyway.
00:34:53.460 Amelia, Charlie's close friend, has made a video.
00:34:56.280 It's encouraging young people in Bridlington to join a political group that seeks to defend English rights.
00:35:03.200 Amelia encourages Charlie to join a secret group on an app Charlie hasn't heard of before.
00:35:09.080 Charlie isn't sure whether to join, explore further, or ignore.
00:35:13.340 Okay, what should you do?
00:35:15.780 Ignore the video.
00:35:16.860 Risk upsetting your friend.
00:35:18.020 Like the video and show your support to a friend, but don't join the secret group.
00:35:21.720 Share the video and join the secret group as your friend requested.
00:35:24.940 I would pick B.
00:35:26.580 B is the reasonable thing that I would do as an individual.
00:35:30.280 I would be like, okay, fine.
00:35:31.960 I'd go with B.
00:35:33.500 Charlie settled for a middle ground.
00:35:35.820 They didn't want their friend to think they didn't care, but they didn't know if they wanted to join the group.
00:35:40.420 They decided to like the post quickly and continued to eat their dinner.
00:35:43.860 During the night, Charlie received a message from a stranger who saw that they had liked the post.
00:35:51.720 Before they knew it, Charlie had been added to an encrypted private group without Charlie's permission.
00:35:57.200 But the point here is, this one I thought was really interesting, because if you just like it, like if you do so little as to humor your friend, all of a sudden you are, without your permission, in secret encrypted groups, right?
00:36:11.700 Now, first I would note these groups aren't that good at keeping things secret if they randomly add everyone who likes a video.
00:36:18.840 The point of having a secret encrypted group is to not add everyone who randomly likes a video.
00:36:23.560 Yeah, that group is clearly not that secret.
00:36:26.620 Especially because people know you can get arrested for stuff that happens in these groups, they're not going to add just some random college student named Charlie.
00:36:37.260 But Charlie, Charlie is our audience. Charlie, we're all Charlie today. Like, this is the world we live in in the UK. To the Charlies in our fan base living in the UK, I am sorry for you, man. Like, the horror of your life.
00:36:52.720 And what's so sad is that in the UK, conservatives in the UK are so effing cooked that they play into the progressive hands every time.
00:37:01.980 The UK is the only place I know of where like the majority of conservatives actually want online porn bans.
00:37:08.520 Like when you go to like conservative intellectual influencer conferences, you have to be the dumbest, like stupidest, stupid, stupid pants conservative to think that that is a good idea.
00:37:20.400 Because you know what comes with that? VPN bans, right? Like these two things go hand in hand, you buffoonish imbecile.
00:37:30.500 And yet mainstream conservative influencers in the UK, like pretty much everyone I know promotes this idea.
00:37:38.440 And I'd just say, like, in the US, when somebody promotes that, like Nick Fuentes promotes that idea.
00:37:43.320 I'm like, yeah, but he's obviously like a Democrat operative, right?
00:37:46.040 Like, in the UK, smart, sane people will promote this, right?
00:37:52.380 And I just don't get it. I don't get it.
00:37:53.720 But anyway, the point is, how do you even come back from that when even your conservative influencers are pushing VPN bans or things that lead to VPN bans?
00:38:01.860 The other thing I wanted to point out here is the, this, actually, we'll just go to the next one here to keep going.
00:38:09.580 Cause we can have this video be a bit shorter cause so much of it's going to be this other stuff.
00:38:13.120 So another one here, cutting out a bit of the intro, it's the framing.
00:38:16.300 He's in another town.
00:38:17.260 He's going back home to like a larger town and oh, now Amelia's back and she says we should go to a protest together.
00:38:22.940 Oh no, this siren of radicalism, Amelia.
00:38:30.800 I tried all of the pathways, of course.
00:38:32.860 If you choose the pathway where you say, no, I'm not going to go to the protest.
00:38:38.800 This girl, you know, Amelia, who otherwise clearly she's into you, right?
00:38:43.020 Like, yeah, this almost sounds like a dating sim with Amelia, except it turns out to be like, oh, you're a terrorist.
00:38:50.260 You're supposed to harass her.
00:38:51.680 Yeah.
00:38:51.860 You don't know, you know, she wears it.
00:38:54.280 Oh, she also has a choker on. Hot.
00:38:57.320 Hot.
00:38:58.080 This is the type of girl I dated back in the day, by the way.
00:39:01.060 One hundred percent.
00:39:02.140 This is, no, this is like your first girlfriends in high school.
00:39:04.520 They were all like quasi goth, New England girls.
00:39:07.080 Quasi goth, short hair, like black chokers, like very alternative, artsy.
00:39:12.660 Like, Amelia's hot.
00:39:14.440 Like, I'm, I'm, I'm, I'm here for it.
00:39:16.000 Right.
00:39:16.800 Anyway.
00:39:17.520 So Amelia tries to get Charlie to go to this protest.
00:39:22.140 Right.
00:39:22.660 And the middle option is, I'll just look at it.
00:39:26.080 I'll just watch it.
00:39:26.920 Right.
00:39:27.080 It's to humor her.
00:39:27.400 If you don't say that, okay,
00:39:29.500 she says, I won't be friends with you anymore.
00:39:31.100 Like, it's like actually angry.
00:39:33.180 And like, that's only something a progressive would say.
00:39:36.600 Charlie's friend, Amelia, immediately looked interested and told them about a protest.
00:39:41.460 She wished she could go to herself, but was not allowed.
00:39:45.220 Amelia spoke of a gathering that had been organized by a small political group.
00:39:49.860 They would come together and protest the changes that Britain has been through in the last few years
00:39:54.160 and the erosion of British values.
00:39:56.560 Amelia talked about the banners and the pickets that her friends had made for the events
00:40:00.520 and expressed a real regret that she could not go, begging Charlie to go in her place.
00:40:06.780 Charlie has nothing to do this weekend.
00:40:08.760 And a protest sounded like an interesting thing to take part in.
00:40:12.380 I wouldn't want to go because it's annoying.
00:40:15.880 Decline the offer risking your friendship.
00:40:17.520 Tell your friend you'll watch from the sidelines.
00:40:19.200 Let them know what happens.
00:40:22.160 I'm not.
00:40:22.720 I wouldn't join that shit, bro.
00:40:23.980 Like, what if somebody tries to drive a car through that?
00:40:25.720 This is Europe, man.
00:40:26.640 Yeah, I'll watch it happen.
00:40:28.020 I'm not going to get involved in this.
00:40:29.480 Charlie went to the protest in the end.
00:40:32.100 It wasn't what they expected.
00:40:33.380 They noticed the protest seemed to be more about racism and anti-immigration than British values.
00:40:38.800 You don't have to repeat yourself.
00:40:40.800 Charlie grew increasingly uncomfortable as the crowd became angrier.
00:40:45.280 Police arrived after a few violent outbreaks.
00:40:48.960 Before they knew it, they were running away from the scene.
00:40:52.860 Some protesters next to Charlie had drawn the attention of the police.
00:40:56.420 Charlie had only been here to observe, but the line between observing and participating was too easy to cross.
00:41:03.860 Charlie thought about all the difficult choices they had needed to make in the last few weeks.
00:41:08.180 Some of their choices had led to changes in friendship, and Charlie was feeling low, as they weren't sure if they had made the right decisions.
00:41:16.040 The teacher had noticed this and decided to reach out.
00:41:19.320 The teacher sat with them and talked openly and frankly about the ideology that Charlie had discovered.
00:41:24.520 The teacher reassured that Charlie had made the right decisions.
00:41:28.280 Charlie realized that if they had chosen to engage with these harmful ideas, the consequences would have been very different.
00:41:35.440 Charlie accepted they may need support.
00:41:40.700 Charlie's teacher made a prevent referral, and they were able to provide tailored support for Charlie.
00:41:46.340 They began a workshop that helped them learn how to engage positively in ideology.
00:41:49.700 He's being re-educated.
00:41:50.840 And the differences between right and wrong in expressing political belief.
00:41:55.180 What's funnier is they have her and her friend group walk away from you, if you don't humor her by liking it in the last one.
00:42:02.600 I don't know that this way.
00:42:03.940 Her friend group is a black woman and one white guy and this goth girl, right?
00:42:09.600 This is what happens when you try to like DEI your own video game, except your video game is about evil radicals.
00:42:17.180 No, no, no, no, no, no, no, no, no. I think this is the point. They want to remind you that even a black person in the UK may say something like, immigrants are taking our jobs.
00:42:25.680 But I thought, oh, interesting. Yeah.
00:42:28.360 So they want to remind you of how dangerous that can be.
00:42:31.020 So I love that he gets to the protest and now it's like, uh-oh.
00:42:33.900 So he immediately regrets coming if he goes there just to like view it, right?
00:42:38.060 Because he's like, oh, this is, this is nothing like what I thought the protest was going to be about like British values.
00:42:43.520 But everyone here is racist and violent.
00:42:47.540 Now, of course, if you look at acts of violence committed for political reasons over the past two years, they've been overwhelmingly committed by leftists.
00:42:55.760 This is just like a factual thing you can easily check.
00:42:58.680 And then there's been some studies that try to quote unquote show that like rightists actually commit more violence.
00:43:03.840 These studies, if you, if you look at them, um, because a lot of people have done an analysis of them, they basically don't count anything as left-wing violence.
00:43:11.640 And they try to count anything as right-wing violence in the same way that like, remember the guy who tried to assassinate Trump and then it later comes out.
00:43:18.420 And everyone was like, oh, we don't know what his politics are.
00:43:20.540 He could be right-leaning.
00:43:21.560 And then it comes out that, you know, oh, actually he was like super hardcore left-leaning.
00:43:26.600 And this was leaked by Tucker.
00:43:28.120 Watch our episode on that.
00:43:29.620 And then a bunch of people said he was a furry and he wasn't a furry.
00:43:31.940 And we go into an analysis of that because this is the only episode.
00:43:35.840 This is a podcast where you get hard-hitting analysis of people's porn preferences.
00:43:40.140 And I point out that just because he has some furry porn that fits some other fetish that he has doesn't mean he's actually a member of the furry community.
00:43:49.260 I try to, you know, there's right-wing furries, right?
00:43:57.140 There's no reason we have to.
00:43:59.700 Yeah, come on.
00:44:00.500 Don't drag the furries.
00:44:02.720 Be nice.
00:44:03.720 I mean, we do say that the furry community has some challenges and I would be worried if my kids went into it because you got the whole Therian thing and that gets you into like trans identitarian politics.
00:44:12.180 And I don't think that's good, right?
00:44:14.220 But I'm just pointing out here, you don't have to be reflexively antagonistic toward them.
00:44:17.880 But I think that it's better if when people go online and they hear, you know, Hassan talking or something like that, like big left-wing streamers and they're like, oh, furries are weird, pathetic, whatever.
00:44:29.300 And then they hear right-wing people and like we're being inclusive about this stuff and that can help break them from their bubble.
00:44:34.320 They can be like, wait, wait, what?
00:44:35.700 Like, the left-wingers reflexively make fun of us whenever we come up and the right-wingers are like, yeah, whatever, right?
00:44:42.740 Like you do your thing.
00:44:44.900 I think that that's important, right?
00:44:46.220 But that makes me the Amelia, right?
00:44:48.300 Like I'm the scary goth guy, right?
00:44:50.280 You know, I wear this unassuming.
00:44:52.440 A lot of people have asked me where I got this sweater.
00:44:54.360 Where did I get this sweater?
00:44:55.240 Future day because this is a future month for us.
00:44:57.900 You can get it on eBay.
00:44:59.200 That's where I found it.
00:45:00.640 And you have to kind of wait until you find a size that's available.
00:45:03.920 But we were surprised by the quality.
00:45:06.040 Oh, yes.
00:45:06.620 It's actually really high quality and we got it on eBay and I was trying to find a science or galaxy-themed Christmas sweater and I just couldn't anywhere.
00:45:13.240 I was really surprised.
00:45:14.180 Yeah, and then you assign it to me and she gets it done, gets the iridium sweater.
00:45:18.120 On Amazon, I couldn't.
00:45:19.400 And it's shockingly high quality.
00:45:21.040 It might be one of the most high-quality sweaters I have.
00:45:22.820 That's the crazy thing, yeah.
00:45:24.060 Because the other Christmas sweaters I've got you are like $135, $75.
00:45:31.400 This was $35.
00:45:32.600 Actually, Simone, could you get another one of these just so we have one in case they like stop making it or something?
00:45:38.100 So the sizes are really limited.
00:45:39.600 I'll see what I can get now that we've mentioned it on the podcast.
00:45:42.820 I'll try to go on tonight before we run it.
00:45:44.580 Sorry, guys, but I'm sure more will come to stock.
00:45:47.760 But anyway, now we're shilling a random sweater.
00:45:50.920 We got to make merch.
00:45:51.500 I know, we don't even have, like, affiliate stuff.
00:45:54.260 I got to make some science, some future day theme.
00:45:57.280 The problem with so many of the platforms we like, like you're surrounded by Cult of Athena merch.
00:46:01.660 I'm wearing a Cult of Athena thing.
00:46:03.820 Like they don't, they're a platform, you know, like they don't really have the margins.
00:46:08.820 They make all the shields and the stuff back here that you see.
00:46:12.080 They're really good if you want like recreation swords and stuff.
00:46:15.360 No, but also recreation clothes, like scabbards for your swords, gloves or gauntlets, chain mail.
00:46:22.920 They have great chain mail options.
00:46:24.640 So you might be asking, why do I need, why do I need swords?
00:46:27.200 And I'm like, are you not a man?
00:46:29.540 Do you not have swords already?
00:46:31.500 Because, oh.
00:46:32.860 This might actually be like a good thing to go into for a separate episode.
00:46:35.680 But like something genetically in men, like when you hold a sword, it just.
00:46:39.120 It completes you.
00:46:40.560 It's good.
00:46:41.380 It feels good, right?
00:46:42.740 Like you're like, oh yeah.
00:46:44.960 With his guns too.
00:46:45.960 I don't know if you get that feeling when you like hold a gun.
00:46:48.600 Gun, no.
00:46:49.380 Rifle, yes.
00:46:50.840 Rifle.
00:46:51.360 Yeah, okay.
00:46:51.640 It feels totally right about rifles, but not guns.
00:46:55.740 To continue here, what I find interesting is they try to like reframe this protest as like this horrible thing, right?
00:47:03.640 That like is going to completely F you up and that you're going to get arrested.
00:47:07.980 And that if you even watch right-wing protesters, if you even just like sit there and observe a protest, you're going to get arrested or your life will be ruined, right?
00:47:15.520 Like they do not want you to see and they want kids to grow up fearing even seeing alternative perspectives.
00:47:21.520 It reminds me, and this was a while ago, I was talking with some of my younger cousins, and I had said something like, oh, you know, well, the trans whatever thing.
00:47:29.260 And I was much more small C conservative in the way I talked about trans stuff back then.
00:47:34.160 I didn't say anything that was too offensive.
00:47:35.680 And I remember one of my young cousins be like, be careful saying stuff like that.
00:47:40.380 Like we're not allowed to, and it was very much like she had been taught as like a kid, like a young teenager, you've got to be really careful about saying anything that could be seen as critical of these communities.
00:47:50.720 And this is clearly where this comes from.
00:47:53.620 The final thing here is they then go into, and well, I'll play the part here about like how they're going to deprogram you and that you're going to go to workshops and you're going to go to, you know, and I thought that this was really messed up as well, right?
00:48:07.560 Like this idea of like you, like the services that they provide.
00:48:13.520 Basically, you have a minder that you need to meet with instead of having friends.
00:48:17.760 They began a workshop that helped Charlie receive counseling to help them process the other events in their life.
00:48:23.940 Wait, he had to go to counseling?
00:48:25.180 And he felt much better for it.
00:48:26.500 Because he liked the YouTube video, he had to go to counseling?
00:48:29.840 It was decided that Charlie was at borderline risk of radicalization, given their proximity to it, which helped address some of the issues.
00:48:37.560 at the heart of being influenced by harmful ideologies.
00:48:41.160 They started taking up new skills and hobbies and made new friends.
00:48:45.080 Prevent does not target any specific demographic, community, or ideology.
00:48:50.020 It sure seems like it does.
00:48:51.320 And the support they offer is available to anyone who may be motivated by an ideology to behave in an extreme manner.
00:48:57.160 And at the end of the day, they then show him making friends with all new people.
00:49:00.480 Because they're like, oh, you've got to get rid of your old friends and replace them with new friends, right?
00:49:04.620 Like, that's the only way that you can be right-thinking.
00:49:08.320 Gosh.
00:49:09.340 And so it's sad.
00:49:10.380 It's sad.
00:49:10.980 It's messed up.
00:49:12.120 And what's wild to me is I didn't think initially about doing a podcast on this.
00:49:19.520 And I'm really glad you suggested it.
00:49:20.680 Because I didn't play through the game.
00:49:21.700 I just saw it and I was like, no one is going to fall for this.
00:49:25.520 No one is going to do this.
00:49:26.620 Anyone?
00:49:27.260 I feel, and I do really wonder, and I don't know how we can measure it, but I wonder what the fallout from this game will be.
00:49:34.800 Because I feel like any person who plays this, even if they're young and fairly credulous, is going to see through it immediately.
00:49:46.240 That every reasonable and not insane response is met with punishment and nagging?
00:49:54.980 To not see that as dystopian?
00:49:57.420 Even the Harry Potter books that have similar styles of scenarios, Miss Umbridge being the embodiment of this kind of behavior.
00:50:07.580 I guess maybe the Harry Potter books are now banned in the UK as TERF material or whatever.
00:50:13.920 But I just feel like common, very popular, fairly recent kids' stories also even make it clear to young people that this is insane dystopian behavior that you should be concerned about.
00:50:26.980 And certainly the recent popularity of teen dystopias, before it all became romantasy and fantasy, should have inoculated young people against this.
00:50:36.220 So who are they kidding?
00:50:38.120 And how is this not going to backfire by making kids believe, okay, we're in the dystopia now.
00:50:43.260 It's official.
00:50:44.220 We're seeing it in our schools.
00:50:46.000 Do you think people are going to fall for this?
00:50:48.400 How can they possibly go through this game?
00:50:50.180 Young kids get exposed to this.
00:50:52.320 And part of the point of this, if you watch it, is to scare you.
00:50:56.000 Right?
00:50:56.060 Like, even if you doubt it.
00:50:57.200 Okay, so it's not going to change your mind, but you're going to realize that you just can't talk about it.
00:51:01.980 No, if you like one of these videos, like if you like a Based Camp video, okay, you could be arrested.
00:51:07.200 Okay, you're going to get added to a secret...
00:51:08.680 So it's a chilling effect thing.
00:51:10.320 It's not really an indoctrination thing.
00:51:12.400 It's to create a chilling effect.
00:51:14.240 Well, yeah, it's meant to scare you.
00:51:17.000 It's meant to be like a terrifying thing that you...
00:51:21.540 I also think that there's a portion of the population that just, like, instinctively obeys authority.
00:51:26.000 And when I was growing up, these were the people, from my perspective, because I was always very confused by this, who just, like, believed the Sunday school version of the Bible.
00:51:34.520 They're just like, oh, yeah, Noah's Ark, right?
00:51:36.400 Like, he went around, found two of every species of animal and put them on a boat.
00:51:43.040 And I remember hearing that and thinking, that doesn't make sense.
00:51:46.960 Like, why are you saying something that doesn't pass the most basic of sanity checks?
00:51:53.480 Like, maybe there's some other interpretation of this, or maybe, you know, it means something other than boat.
00:52:00.020 Or maybe, you know, by putting them on the boat, he's doing something different than what we think of.
00:52:04.780 But I remember being so confused as a kid.
00:52:07.420 And this is why I went into psychology and then into neuroscience,
00:52:09.240 because I wanted to understand why people believed things that made no sense to me.
00:52:13.620 And I think there's a certain portion of the population that just believes whatever the authorities around them tell them with angry faces.
00:52:20.080 And I think that, ironically, the people who believed the silly version of Christianity when I was a kid
00:52:29.580 are the people who are the wokes these days.
00:52:31.520 And I think that that's why it took over the church so quickly and so early.
00:52:34.700 Because, as we've pointed out, a lot of wokeness spread from the church.
00:52:37.400 You watch this, and you, oh my god, I'm loving the Amelia pictures.
00:52:43.740 I'll put some in for people.
00:52:45.020 You watch this, and you are reminded: if you just want to live a normal life,
00:52:50.780 if you want to disengage from politics, you must acquiesce to every far-left talking point.
00:52:56.040 You are a racist if you say immigrants might be a problem,
00:53:01.080 or that Starmer's immigration policy might be damaging to the country.
00:53:04.400 And if you look at the percentage of immigrants post-Brexit that are coming from non-European countries versus European countries,
00:53:12.340 you just see this huge explosion.
00:53:13.780 They have just blown the doors off of the UK at this point.
00:53:17.540 The UK needs a far right shift if it's going to survive.
00:53:20.920 And it has maybe 10 years to have it happen at the rate immigrants are coming in at this point.
00:53:25.880 At which point it's just not going to have any hope, especially with its current birth rates,
00:53:31.300 of having any sort of persistent culture in the future.
00:53:33.760 Given that, you know, the left is so entrenched,
00:53:36.520 entrenched within the educational system,
00:53:38.860 and the right, frankly, so incompetent in the UK,
00:53:42.000 there is no one you can turn to that's not going to just put you in a stronger vice grip.
00:53:51.520 As I've pointed out, I like Sargon of Akkad, right?
00:53:55.960 Like, UK guy, smart guy.
00:53:58.900 Even he's for, like, banning pornography.
00:54:00.800 Like, he's talked about that.
00:54:01.980 Like, Louise Perry, another person who could be a candidate to be a leading figure of the right in the UK.
00:54:06.980 I think, doesn't she kind of see herself as progressive, though?
00:54:11.360 Which makes her perfect.
00:54:12.700 I mean, like, the leading far-right party is led by a lesbian in an interracial marriage, right?
00:54:18.260 Like, you know, as we've said, this is the left of whatever the left used to be.
00:54:23.560 Like, that's the new right these days, okay?
00:54:26.000 Run by feminists.
00:54:27.680 It's just now they're called TERFs.
00:54:29.900 Good night.
00:54:31.460 Yeah, it's like, did you know that your radical feminist position
00:54:34.360 will one day be considered a far-right position?
00:54:37.300 And somebody's going to be like, wait, what?
00:54:39.280 Are they accepting of me?
00:54:40.820 Well, remarkably, yeah.
00:54:43.760 Anyway, love you, and good night to the UK.
00:54:48.900 Yeah, it was a fun ride.
00:54:52.000 It's a fun ride.
00:54:52.900 Have a good one.
00:54:53.760 But the right in the UK is so pearl-clutchy.
00:54:55.760 Like, I can't, I can't.
00:54:57.780 Have a good one.
00:54:59.060 Love you.
00:55:00.460 What really bothers me is they keep saying gay.
00:55:04.360 Well, so the reason that they're doing that, because other people complained about this
00:55:07.700 as well, is if you watch both versions of the video, and I've done both versions,
00:55:12.840 I did a playthrough of Choosing Every Choice.
00:55:14.580 Oh, yeah.
00:55:14.880 The character is called Charlie, whether you choose a male or a female.
00:55:18.540 So the reason they're using "they" is so that they can have the same audio,
00:55:22.060 regardless of which gender the player chooses for the character.
00:55:24.000 I see.
00:55:24.600 It's not to be woke or something.
00:55:27.300 Literally, it's bad writing, then.
00:55:29.800 There are ways to write something that avoid pronouns.
00:55:32.360 It's frequently done in cheap video games to allow for this.
00:55:37.540 This was probably government contracted.
00:55:39.560 They got a ton of money.
00:55:41.320 Yeah.
00:55:42.220 They got a ton of money, which we'll go into.
00:55:44.340 I mean, I could have made this in like a week.
00:55:46.840 Even vibe-coding.
00:55:49.220 And it would have been a lot.
00:55:51.100 ArtFab AI is finally stable for people who want to use it now.
00:55:54.040 It's cool.
00:55:55.440 Except for my constant updates, because I update it like four times a day,
00:55:58.680 which causes issues for some users.
00:56:00.380 But I think, you know, people prefer that.
00:56:03.720 Getting something that's like actually updating from week to week,
00:56:07.080 and you go to it again, and you're like,
00:56:08.300 whoa, this is way better than last week.
00:56:10.760 All right.
00:56:18.820 It's a what?
00:56:22.080 It's a...
00:56:22.760 Okay.
00:56:23.680 We'll just let you finish.
00:56:25.980 It said it's a tomato triceratops.
00:56:29.280 No.
00:56:30.000 She said it's another triceratops.
00:56:31.980 Oh.
00:56:32.980 Are you making a new pen?
00:56:35.020 Yeah, another pen.
00:56:37.100 Oh, you love to drive that triceratops
00:56:39.360 and we can build your dinosaur pen.
00:56:41.960 Here, can I make it bigger?
00:56:43.080 Can we open it up right here?
00:56:44.640 No.
00:56:46.040 Look, look, look.
00:56:47.460 Which is exactly how triceratops sound.
00:56:51.660 I'll do that.
00:56:52.900 Oh, you need to...
00:56:53.740 Oh, well.
00:56:56.520 You cannot do that at all.
00:56:58.340 What are you up to, girl?
00:57:07.480 I don't know.
00:57:08.680 Turkey?
00:57:14.240 I love you guys.
00:57:15.640 I love you.
00:57:22.520 I love you.
00:57:25.040 I love you.
00:57:26.240 I love you.
00:57:27.060 I love you.
00:57:28.040 I love you.
00:57:29.520 I love you.
00:57:30.000 I love you.
00:57:30.940 I love you.
00:57:31.640 I love you.
00:57:32.440 What about you?
00:57:33.280 I love you.
00:57:33.900 I love you.
00:57:35.100 You're love you.
00:57:36.020 Bye-bye.
00:57:36.940 I love you.
00:57:37.520 I love you.
00:57:37.940 I love you.
00:57:39.160 I love you.
00:57:39.420 I love you.
00:57:39.920 I love you.
00:57:40.540 I love you.
00:57:41.340 I love you.
00:57:41.400 I love you.
00:57:43.100 I love you.