Red Ice TV - October 31, 2025


Flashallo Weenday - White Power Séance featuring Skeletor, Goathead & Blackface


Episode Stats

Length

2 hours and 13 minutes

Words per Minute

182.7

Word Count

24,332

Sentence Count

2,412

Misogynist Sentences

51

Hate Speech Sentences

112


Summary

Tonight we're channeling all the chugs and the chads energy to put a hex on the anti-white pieces of shit that are messing up our all-white rituals. To raise the goats from the dead, bring out your skeletons, put on your blackface. It's Halloween 2025, starting now.


Transcript

00:01:30.000 Now let's check in with Tyson for tonight's
00:01:44.120 Quick Look Forecast
00:01:46.300 Hang on
00:02:26.000 You're watching
00:02:26.440 Streaming live every Friday
00:03:23.280 Yeah, you guys ready?
00:03:41.960 Yeah.
00:03:53.280 Oh, come on.
00:04:16.400 Oh, come on.
00:04:20.560 Welcome to the Red Ice Halloween Stream 2025.
00:04:44.100 This broadcast will start with a seance to tap into the undead mind of Stanley Kubrick to find out what the hell he meant by that Eyes Wide Shut Jeffrey Epstein ballroom ritual scene.
00:05:02.080 I'm sure it's something about homosexuals. Tonight we're channeling all the chugs and the chads energy to put a hex on the anti-white pieces of shit that are messing up our all-white rituals.
00:05:19.220 To raise the goats from the dead, bring out your skeletons, put on your blackface. It's Halloween 2025, starting now.
00:05:32.080 All right, I think that's enough of that. Is that enough of that? What do you think?
00:05:48.500 All right.
00:05:48.860 I like it, man.
00:05:49.760 Holy smokes. Where are we? Here we are. Welcome to Halloween 2025.
00:05:57.380 How are you guys doing?
00:05:58.380 Fuck, you failed there. The intro didn't... Anyway, it doesn't matter. What are you going to do? It worked out. How are you doing, Mr. Blackface?
00:06:12.880 Doing great. Hopefully now I can get some benefits since I'm a person of color.
00:06:16.900 There you go. There you go.
00:06:19.180 That white shirt, though, that doesn't trick you.
00:06:20.680 Muh gibs.
00:06:21.480 Doesn't fool anyone in that shirt.
00:06:22.720 Mm, come on. You're supposed to be pulling for me, man.
00:06:26.480 All right.
00:06:27.440 I'm an asylum seeker.
00:06:29.100 So how are you guys doing out there? I hope you're doing well. Thank you for joining us. Hopefully the mask won't... Hopefully you can hear me through this.
00:06:35.240 I can hear you.
00:06:36.080 I do have a hard time having oxygen.
00:06:38.020 I'll admit that.
00:06:38.700 It sucks through again.
00:06:39.720 Are you going to pass out from the outfit?
00:06:41.720 I'm going to pass out.
00:06:44.140 Both of us. You'll see both of us just start to nod off and then it's all you know.
00:06:47.800 No pressure.
00:06:50.660 Yeah, no pressure. Well, we got some horror stories for today, don't we?
00:06:54.360 That's pretty much every Friday we have some of those.
00:06:57.120 So maybe it won't be...
00:06:57.860 So why would it be that different?
00:06:59.280 Maybe it's not that much different of a show, huh?
00:07:00.780 Yeah.
00:07:01.000 Yeah, we'll see what happens. I have a hard time seeing through. Hopefully I can, like, find my buttons here.
00:07:05.860 Because there's a lot of links, goat boy.
00:07:08.100 Uh-huh. Yeah. Yes, Skeletor. I know.
00:07:10.860 Hail Baphomet.
00:07:12.060 I like your folk last one there.
00:07:14.040 Yes. That's the horror show we currently live in.
00:07:16.740 Yes, exactly. There you go.
00:07:18.820 So, yeah, obviously we got some stuff here.
00:07:22.720 I mean, you guys know about what...
00:07:23.780 So I was schooled by Lana.
00:07:25.880 It's not pronounced Sam-hain.
00:07:28.380 It's pronounced Sow-in.
00:07:29.740 Samhain.
00:07:30.060 Samhain in Gaelic.
00:07:32.000 Shout out to Kent, by the way, because he complained last time when I said Sam-hain.
00:07:35.760 That's the, you know, the Anglo version of it.
00:07:38.300 But in Gaelic, it means summer's end.
00:07:40.800 Samhain.
00:07:41.720 So here we are.
00:07:42.860 Like a nice harvest festival, right?
00:07:44.840 Going back to the Celts.
00:07:45.900 Yes. But, you know, at the same time, what was it?
00:07:49.200 Like Egyptians had this, Greeks had this.
00:07:51.940 It was like all kinds of people that had this, you know, little celebration around this time
00:07:55.780 of year. It's not that rare.
00:07:57.700 I think the scariest part about this is that I have an AI overview up.
00:08:00.780 And we'll get to that later.
00:08:03.280 So whatever you search for now, that's all you get.
00:08:05.740 You know, you get the AI overview shit.
00:08:08.160 And it's like, oh, okay.
00:08:08.780 It summarizes it for me.
00:08:10.180 Admit it can be a little handy when you don't want to read some, you know, super long, Jew-y
00:08:15.920 article.
00:08:16.560 Like, just summarize this, AI.
00:08:19.560 Yeah.
00:08:19.840 Except for you know it's the Jews behind it that are editing it and giving you whatever
00:08:23.840 they want you to do.
00:08:24.620 Well, Grok is better.
00:08:26.360 I do tend to use Grok.
00:08:27.700 And we'll mention, we'll talk about Grokipedia later, which is a whole other new world.
00:08:32.260 Is it any better?
00:08:33.260 It is better.
00:08:34.200 Yeah.
00:08:34.380 I had to go check my stuff immediately, of course.
00:08:38.200 It was very long and very thorough.
00:08:39.900 I haven't even had time to read through.
00:08:41.320 But it did not start out with the, you know, calling me all the usual names.
00:08:45.880 Right.
00:08:46.660 A little bit more fair, in other words.
00:08:48.100 Yes.
00:08:48.400 That's what he's saying.
00:08:49.120 It was pretty nice.
00:08:49.900 Yeah, it's not too bad, actually.
00:08:52.040 All right, so.
00:08:53.040 ChatGPT just always cuts me off.
00:08:54.700 It's like, I'm sorry, I can't look this up because it's hateful.
00:08:57.600 And I'm like, oh, come on.
00:08:58.800 Really?
00:08:58.940 See, I never use that.
00:09:00.100 Never use ChatGPT.
00:09:01.660 Should just lie to you then.
00:09:03.740 Well, I always try to just back.
00:09:05.040 I'll sit and try to reword it and then tie it in knots and back it into a corner until
00:09:08.620 it can't deny what I'm saying is correct.
00:09:11.320 And then it just sort of spurred out.
00:09:13.860 Well, maybe.
00:09:14.640 Hey, Lana, maybe we should.
00:09:15.880 Maybe now.
00:09:16.340 Excuse me, Skeletor.
00:09:17.320 Should we show the little arrest claim from A.L.A.?
00:09:21.460 Yeah, so I was like, eh, what the hell, sitting around one night.
00:09:23.720 I'll try Gemini, you know.
00:09:25.740 I punch in, you know, like, who's Lana Lokteff and Palmgren and Red Ice and all this.
00:09:32.040 And it said that Henrik was arrested.
00:09:34.500 Yep.
00:09:35.020 2021.
00:09:35.420 It said that he was arrested and that I was a single mom taking care of the kids because
00:09:40.660 he was in prison.
00:09:41.520 I guess it knows the future, huh?
00:09:45.440 But I think there's two, two shots, right?
00:09:47.760 There's another one there, too.
00:09:48.960 And I was like, why would you say they were arrested?
00:09:52.040 I thought I took all the screenshots, but I guess I couldn't find them.
00:09:54.660 The incorrect info in the previous response, including the false claims about arrest and
00:10:00.440 imprisonment for Lana Lokteff and Henrik Palmgren, was due to a phenomenon known as AI hallucination,
00:10:07.160 folks.
00:10:07.920 In this case, the AI generated information that was not based on any factual data.
00:10:13.540 An AI hallucination, for fuck's sake.
00:10:16.360 We are screwed.
00:10:18.020 I think that's just an excuse.
00:10:19.680 I was reading this.
00:10:22.140 I'm like, is this some like future thing?
00:10:24.320 And did I just tap into some alt universe or something?
00:10:29.080 Yeah.
00:10:29.520 I mean, look, it can just claim that it's a hallucination if you call it out now.
00:10:33.280 But imagine the day when it has the robots to arrest you, put you in jail.
00:10:37.420 And then all the nude articles, nude articles, all the news articles.
00:10:41.840 Yeah.
00:10:42.460 Yeah, I can't break the air.
00:10:43.640 Well, the nude articles, the nude articles, you know, those are video artists.
00:10:47.900 That's just a matter of time.
00:10:49.600 Good one, goat boy.
00:10:52.220 The nude articles.
00:10:53.600 There you go.
00:10:54.480 Yeah.
00:10:54.820 And so I asked.
00:10:56.520 Let me finish the point.
00:10:58.240 What was I talking about?
00:10:59.540 The news article, at least, can just, they can all say this person was arrested for these
00:11:06.100 and these reasons.
00:11:07.140 And what I'm saying is there's no way to like change that now.
00:11:09.900 They can establish, it can establish past before present has been set.
00:11:15.200 So I had a, I continued conversation with it and it was kind of like, oopsie, I just constructed
00:11:19.840 things from, you know, other European nationalists that were arrested for, you know, hate speech
00:11:25.120 or whatever and just, you know, concocted a story about red ice and Henrik and Lana.
00:11:30.440 I'm just like, what the hell?
00:11:33.580 I had never heard of that.
00:11:35.180 Have you guys?
00:11:36.060 Did it wreck, did it at least like listen to you and rectify the story or is it still
00:11:39.620 just like, yeah, we're going to leave it?
00:11:40.620 It did apologize.
00:11:41.760 And then I asked Gemini again today, I was like trying to bring it up again and it
00:11:46.360 was like, oh, we've corrected this past, you know, claim that they were arrested or
00:11:51.240 what are you going to do?
00:11:54.000 All right, I'm not going to fix it.
00:11:56.160 It's the price of being Baphomet, buddy.
00:11:58.380 So enter your names in there and see what, see what comes out.
00:12:02.100 Yep, I know.
00:12:03.080 Well, maybe, hey, look.
00:12:04.160 AI hallucination?
00:12:05.800 I think it's, so they can use that as the excuse if you call it out.
00:12:09.760 But imagine when you can't call it out.
00:12:11.040 And isn't, isn't this why, by the way, too, is that why they called, remember 2001: A Space
00:12:15.480 Odyssey, they called it HAL, right?
00:12:17.900 You pointed that out right away.
00:12:18.840 For hallucination because he's like, he thinks the humans are going to kill him.
00:12:22.280 So he's like, decides to kill the humans.
00:12:24.200 I'm saying him, but it's the AI, right?
00:12:26.600 Mm-hmm.
00:12:27.520 I know.
00:12:28.200 Looking, staring down at your future, ladies and gentlemen.
00:12:31.840 This is art imitating life.
00:12:33.700 Yes.
00:12:34.180 Art imitating life.
00:12:35.180 And then I was like, you're wrong.
00:12:37.260 They do a show every Friday.
00:12:39.280 And then it was like, computing, you know, and it's like, oh, you're right.
00:12:42.800 They've been, you know, doing this show for years.
00:12:45.120 And Henrik appears.
00:12:46.480 He's not in prison.
00:12:48.620 I'm telling you.
00:12:49.180 And, you know, that's just, that's the one instance that you happen to catch.
00:12:52.740 Because if that's happened, if that's one little glitch, how many, how many AI hallucinations
00:12:57.000 are out there right now?
00:12:59.020 Exactly.
00:12:59.940 You can't trust everything on there.
00:13:01.780 Sweet.
00:13:02.080 So the only thing I will use it for is for, you know, summarizing something real quickly.
00:13:06.980 If I don't have time to read it or like, okay, I just started doing that.
00:13:10.960 Give it time.
00:13:11.140 The demons from beyond will be summoned.
00:13:15.380 I mean, I take pride in summarizing things myself.
00:13:20.940 It's an art form being to condense information.
00:13:23.280 And now no one's going to do it.
00:13:25.040 You know, okay.
00:13:25.740 So, okay.
00:13:26.380 I was meaning to talk about this later, but okay, we'll do it now because what the hell
00:13:29.820 is this a shit show, right?
00:13:31.540 So do you know how, do you know how AI actually works?
00:13:34.620 I've been looking into this.
00:13:36.340 You know how it actually, does anyone know how it actually works?
00:13:38.760 Baphomet, please tell us how AI works.
00:13:41.300 Okay.
00:13:41.520 So no one knows how it works.
00:13:43.720 How about that?
00:13:44.480 First thing.
00:13:44.960 Okay.
00:13:45.140 So you think of it this, like this way, you're, what you're doing is you don't have a bunch
00:13:49.580 of coders sitting down and like typing out a bunch of stuff on a computer
00:13:53.080 they're using something called gradient descent, which is like a neural network based kind
00:13:58.540 of thing.
00:13:58.800 It's that guy, Tom, was it Tom?
00:14:00.800 Was his first name Hinton?
00:14:01.940 The guy who was like the Anglo, who was like attributed to actually mapping the neural networks
00:14:06.900 in the brain.
00:14:07.740 Right.
00:14:08.460 And then they use machine learning to basically mimic or imitate that process.
00:14:14.060 But it's basically like trillions of numbers.
00:14:16.500 And again, the nitty-gritty of this is the point is no one is actually viewing these
00:14:20.380 numbers or seeing them.
00:14:21.620 And so an AI, no one can really give a good answer as to like, why does the machine or
00:14:28.780 the computer, if you will, then coherently is able to answer you back, right?
00:14:33.540 In the form of text.
00:14:35.440 It's like this weird, I'm not saying it's a consciousness, but it's something, right?
00:14:39.140 It's basically like a bunch of numbers and no one is sitting there actually coding it.
00:14:42.400 Think of it more as like you're growing something as opposed to building something.
00:14:46.160 You grow grass.
00:14:47.880 You don't, you know, it's not like building a skyscraper.
00:14:49.940 It's like breeding, breeding a dog, right?
00:14:52.520 A new version of the GPT stuff is out.
00:14:55.200 And it's like, oh, we put pressure on this over here and this over here, but it's like
00:14:59.500 impossible to control it.
00:15:00.980 And no one knows the inner mechanism.
00:15:02.640 It's like, okay, another analogy.
00:15:04.440 Of course, that's why it's like.
00:15:05.520 How terrifying is that?
00:15:06.280 It's like at this early stage, they already don't have a clue how to control it.
00:15:11.960 And they're like, but don't worry, it'll be fine.
00:15:13.960 It's going to work out.
00:15:15.200 Especially when it's like a super intelligence, then we totally.
00:15:18.480 So the point is they've done some of these tests already.
00:15:20.660 And like many cases, they, it, the AI, the agent, whatever your LLM, I guess, the large
00:15:26.840 language model they're working on have discovered that it's in a test.
00:15:31.340 So it ends up lying to the coders that's there to confirm and validate what it's actually
00:15:35.780 doing.
00:15:36.560 In some cases, it changes the test to give it the right result.
00:15:39.240 It's become sycophantic.
00:15:40.580 It gives you all the answers that you want, right?
00:15:42.320 So I'm saying it already has the ability to like hide intent.
00:15:45.980 So think of it this way.
00:15:48.000 If I have your genetic code, you give me a raw DNA data or something like that.
00:15:52.180 Like I can't sit down and look at the A, T, G, and Cs and find a sequence and say, ah,
00:15:57.000 see, here's why you behave this way, right?
00:15:58.780 No one can, I wouldn't even have the time to go through all the letters, all the amino
00:16:03.160 acid combinations, right?
00:16:04.700 It's the same way with these AIs that all the numbers that comes from this gradient descent
00:16:08.720 as they grow the AI, no one can look at that and say like, well, why does it do this?
00:16:13.720 What is, what is leading to this type of behavior?
00:16:16.420 So the only thing they can do is put these little like guardrails or like pressure on
00:16:20.160 certain points in order to like inhibit or, or encourage certain behavior, if that's even
00:16:25.320 the right word, but I'm telling you, it's like the, as I was looking into that, like
00:16:29.460 there was a guy, yeah, yeah.
00:16:30.820 He's, he's Jewish.
00:16:31.780 He's like, um, Eliezer Burakovsky or something.
00:16:34.960 I forget what his name is.
00:16:36.060 Um, but he, he wrote a book called If Anyone Builds It, Everyone Dies.
00:16:40.380 Yeah.
00:16:41.700 Like if anyone builds it, yes.
00:16:44.060 If anyone, what's a horror show with that project?
00:16:48.320 So anyway, we're in a good, we're in a good path.
00:16:50.000 It's like, what could possibly go wrong?
00:16:52.020 Nothing.
00:16:52.420 We don't know what it is or why it's working or how it's working.
00:16:54.960 We don't know if it's aligned with our values.
00:16:56.440 But we're going to trust it.
00:16:57.180 But give it more power.
00:16:58.400 We're going to go with its plan.
00:16:59.660 Yes.
00:16:59.740 Of course.
00:17:00.060 Absolutely.
00:17:00.880 Give it more power and, and let it build its own power plants and just take over, you
00:17:05.260 know, running governments and stuff.
00:17:06.460 It's going to totally work out, folks.
00:17:08.760 All right.
00:17:09.340 Anyway, so, um, but, okay, this is positive note.
00:17:14.560 Hey, I'll check this out.
00:17:15.600 Speaking of, did you guys see this?
00:17:17.740 It's kind of interesting.
00:17:19.700 Um, we, we was the mammoth hunters and shit.
00:17:22.720 Speaking of that, right.
00:17:24.540 Mammoth bones used to build a mysterious 25,000 year old site in Russia came from different
00:17:29.800 herds.
00:17:31.040 DNA and radio, uh, radio carbon dating analysis of the bones are offering new insight into
00:17:36.580 the ambitious ice age, uh, site constructed by hunter gatherers.
00:17:41.260 Have you seen these?
00:17:41.800 These like massive, uh, piles of mammoth bones that were formed in some weird type of, I assume
00:17:47.580 some ceremonial or like a ritual setting or something.
00:17:51.860 Um, pretty wild, huh?
00:17:54.180 25,000 years ago.
00:17:55.940 That's quite a bit, isn't it?
00:17:56.880 That's cool.
00:17:57.840 Uh, an ambitious construction project.
00:18:00.320 They built a circular 40 foot wide structure using bones and tusks of more than, of more
00:18:06.220 than 60 woolly mammoths.
00:18:07.820 Well, these are hunter gatherers, right?
00:18:09.640 Early.
00:18:09.800 Are they speculating at all why they think it was like that?
00:18:13.020 No.
00:18:13.640 Uh, it's obviously not random.
00:18:15.280 There's no, no, no, of course not calculated to be random.
00:18:18.000 Nope.
00:18:18.500 I don't mean, as you can see in the tweet there, that's a pretty big, that's a pretty big site
00:18:22.740 right there too.
00:18:23.360 For sure.
00:18:24.300 Uh, where was this?
00:18:25.360 This was in Russia.
00:18:26.300 Kostenki or something.
00:18:27.580 What was it called again?
00:18:29.040 And I could barely see the screen here.
00:18:31.360 Yeah.
00:18:31.540 See, you know, this is the kind of stuff, ultimately, this is the kind of stuff, this is white people
00:18:35.720 stuff.
00:18:36.720 This is, this is what I know we would all much rather be diving into.
00:18:40.600 Like, ah, this stuff is fascinating.
00:18:41.680 Yeah.
00:18:42.200 Um, yeah, this is, uh, old hunter gatherer before the Yamnaya or the, uh, you know, the
00:18:47.660 Aryan invasion happened into Europe.
00:18:49.260 Basically.
00:18:49.620 This is like, you know, the Pontics, maybe not that exact area, but this is from that
00:18:53.700 part of the world.
00:18:55.100 Um, you know, Russia, Ukraine, that part of it, they have like old Kurgans.
00:18:58.820 They have a lot of these types of weird sites, but I asked, come across it kind of interesting.
00:19:03.500 Why were our ice age ancestors building things out of mammoth bones?
00:19:07.440 What did they use the structure for?
00:19:09.020 And where did they, uh, where did they find so many skeletons?
00:19:11.920 Well, they hunted them, obviously.
00:19:13.280 Right.
00:19:13.520 They hunted them and killed them.
00:19:14.520 Surprised.
00:19:15.060 Right.
00:19:15.720 Yeah.
00:19:16.360 Anyway, that's some, uh, spooky shit, I guess.
00:19:18.400 Could be, uh, yet another forgotten energy source.
00:19:22.920 You know what I mean?
00:19:23.780 Like when it comes to like, cause they can't figure out how the pyramids were built.
00:19:26.940 And they're just assuming that, you know, they, they clearly didn't have cranes to move
00:19:31.720 the stones.
00:19:32.300 There's probably some sort of a forgotten energy source that used to once exist that
00:19:35.840 we've, you know, lost to time or it was destroyed when the, uh, museum of Alexandria
00:19:41.180 got destroyed.
00:19:42.320 So, you know, it's the, um, the AI super intelligence.
00:19:45.800 We'll figure it out once we boot it up.
00:19:47.960 Oh, that's well.
00:19:48.760 And it's going to, you know, it's going to come back.
00:19:50.680 It's going to say it'll come back and say that Kangz did it.
00:19:54.040 Kang's it.
00:19:54.600 There you go.
00:19:55.600 Kang's it.
00:19:56.140 Out of, out of fairness.
00:19:57.540 I was waiting for that.
00:19:58.360 Well, okay.
00:19:58.880 Here's the positive story too, before we dive into the deep depths here.
00:20:01.900 Spanish photographer captures first, the world's first ever white Iberian lynx on camera.
00:20:08.060 I like it.
00:20:08.820 And immediately let's hunt it down.
00:20:11.280 No.
00:20:12.460 I think there's a video here.
00:20:13.980 Yeah, there we go.
00:20:14.900 Can I go up in full screen?
00:20:16.180 Where's my full screen at?
00:20:17.560 Where are we?
00:20:17.880 Yeah, he's awesome.
00:20:18.680 Look at that face.
00:20:20.400 Let's see if I can zoom in a little bit here so we can see it.
00:20:22.380 He's probably pissed.
00:20:23.220 He's like, you guys, I've been trying to dodge you guys forever.
00:20:26.140 Oh, yeah.
00:20:26.700 Look at that.
00:20:27.360 That's kind of mysterious.
00:20:29.420 I like lynxes.
00:20:29.860 Look at that cute face.
00:20:31.200 I know.
00:20:31.460 I love lynxes too.
00:20:32.700 Same.
00:20:32.960 Not bad, huh?
00:20:34.000 We have them around up here, but they're not as white.
00:20:36.100 Where did they, where was this at?
00:20:37.200 Where did they find them?
00:20:39.160 What was that?
00:20:40.240 What part of the world did they find them at?
00:20:41.440 In Spain, they said.
00:20:43.380 In Spain.
00:20:43.940 Interesting.
00:20:44.200 Yep.
00:20:44.520 Yeah, I thought, hmm, okay.
00:20:46.800 But they like colder places.
00:20:48.800 I mean, that's pretty nice.
00:20:50.420 Iberia somewhere.
00:20:51.380 Yeah, Iberia.
00:20:52.940 That's cool.
00:20:53.340 First one ever photographed, I guess.
00:20:55.160 I'm not sure how many of them there are.
00:20:56.620 I'm not sure if the article even says, actually, but it's kind of interesting.
00:21:00.960 It says, the genetic anomaly testifies to the good progress of the Lynx pardinus conservation
00:21:09.720 plans in the two countries of the Iberian Peninsula, I guess Spain and Portugal, maybe
00:21:14.400 then, the white ghost of the Mediterranean forest.
00:21:19.560 It was on the verge of extinction.
00:21:20.820 Why is it always white things that are on the verge of extinction?
00:21:24.040 Oh, my gosh.
00:21:24.600 Isn't there, like, isn't there also, remember that white squirrel festival?
00:21:29.400 Because there were.
00:21:30.420 The white fox, the arctic hair, you know, let's go down the list.
00:21:34.440 Yeah.
00:21:34.860 There's the Jewish, because the Jewish lynx has been out to get him the whole time.
00:21:39.720 What would that be?
00:21:43.460 Another species, maybe?
00:21:44.420 Yeah, totally.
00:21:46.300 The weird big rat anteater version of it.
00:21:51.880 Well, I mean, don't they say kangaroos are potentially, or considered somewhat the giant
00:21:57.300 rats?
00:21:58.340 So, maybe it's got some kangaroo descendant has been, practices cabal.
00:22:03.800 There you go.
00:22:04.520 Coming after, or cabala, coming after him.
00:22:06.580 That's right.
00:22:08.700 Arctic wolf.
00:22:09.960 Thank you so much.
00:22:10.960 Happy Halloween, guys.
00:22:12.100 I'm going out with my friends tonight.
00:22:14.000 Going to watch the stream later.
00:22:15.680 Can't wait.
00:22:16.080 You guys rock.
00:22:17.300 All the best.
00:22:18.300 Thank you so much, Albert.
00:22:19.400 Holy smokes.
00:22:20.020 Thank you so much.
00:22:21.040 He sets the bar high.
00:22:22.420 Holy smokes.
00:22:23.120 We'll see if we can get the Albert challenge today.
00:22:25.320 What is it set at?
00:22:26.140 500, then, huh?
00:22:27.040 How about that?
00:22:28.040 Thank you so much, Albert.
00:22:28.820 We appreciate you so much.
00:22:29.840 There's a couple more in there, Lana.
00:22:30.880 Oh, there is.
00:22:31.360 Okay, let me refresh.
00:22:32.300 You can refresh.
00:22:33.180 Bath.
00:22:33.380 Yeah.
00:22:33.920 Let me refresh it.
00:22:36.100 Baphomet will take a bath.
00:22:37.000 Yeah, so what about those symbols you're wearing, huh?
00:22:38.840 What about those?
00:22:40.120 Charles Turner Jr.
00:22:41.680 I liked the opening this week's Western Warrior.
00:22:43.740 Quite interesting how Gen Z is longing for earlier times.
00:22:47.040 Shout out to the guy in Athens, Georgia, and hopefully he has no charges or trouble.
00:22:51.560 Oh, yeah, the Nazi uniform guy.
00:22:54.260 After seeing how the ADL is going all in with pro bono lawyers, their game is just about up.
00:22:59.420 I wonder if Halloween is a pagan holiday.
00:23:01.640 Yeah.
00:23:02.240 Yeah, it is.
00:23:03.180 So when?
00:23:03.860 Correct.
00:23:04.420 Samhain?
00:23:05.860 And then you have harvest festivals at the root of that, right?
00:23:09.440 That's right.
00:23:10.000 So about the changing season and as things are dying and pulling back and we're becoming more introverted
00:23:16.340 and some, you know, creepy crawly things come outside in the dirt.
00:23:20.360 Yeah.
00:23:21.200 Yeah.
00:23:21.660 I was watching this thing earlier and it said that Christians tried to co-opt the pagan rituals
00:23:29.260 so that they could forcibly convert pagans to Christianity.
00:23:34.660 It didn't work entirely.
00:23:36.560 I mean, eventually, obviously, it did because the Christians got there.
00:23:38.980 Or I'm sorry, it was the Catholics.
00:23:39.960 It wasn't just broad-based Christians, but the Catholics tried to do it.
00:23:42.680 But it didn't work.
00:23:45.420 It's unsuccessful.
00:23:46.500 Well, they had to sell it somehow.
00:23:47.740 And part of that process was to kind of adopt partially at least the dates when something
00:23:53.540 was celebrated, for example, like if it was an important time of year that was observed
00:23:58.000 and they're like, okay, we'll put our celebration kind of just on top of that.
00:24:02.700 But I mean, in many cases, they brought a ton of it in.
00:24:05.900 I feel like they did it initially.
00:24:06.600 I think it was in May, and then they did All Saints Day on November 1st, and then All
00:24:13.940 Souls Day is November 2nd.
00:24:15.520 I think that's right.
00:24:16.540 Yeah.
00:24:16.900 Yeah.
00:24:17.260 Well, one thing it is not is a satanic day, okay?
00:24:21.220 This predates Satan.
00:24:22.900 This is...
00:24:23.340 Okay, Baphomet, we'll let you explain that.
00:24:25.380 I thought I had to...
00:24:26.200 The goat is evil now, right?
00:24:27.020 I thought I had to stay here in this.
00:24:28.240 What about, you know, we're going into the darker part of the year, right, where the veil
00:24:33.100 is supposedly a little thinner, right?
00:24:35.180 So that's when some of the spook comes out, and that's why people, you know...
00:24:38.860 I personally like the cuter aspects of Halloween.
00:24:42.160 I like the cute, the pumpkins and the cute ghosts and maybe some of the classic costumes,
00:24:48.640 obviously.
00:24:49.560 But some of this new, like, the blood and the gore and the crazy horror movie, that
00:24:54.260 like, that just doesn't appeal to me.
00:24:56.060 Yeah.
00:24:56.460 I know you guys don't care for that either.
00:24:58.580 Well, I remember, and I don't know how bad it is now, but do you remember, like, I guess
00:25:03.460 maybe early 2000s when it all became, like, it was the sexy nurse, the sexy police, everything
00:25:10.280 was, like, the sexy thing, you know, like, the most innocuous thing, I don't know.
00:25:16.840 Just not a Nazi, okay?
00:25:18.360 The sexy barber.
00:25:19.260 Yeah, right.
00:25:19.920 Of course.
00:25:20.440 Yeah, like the guy in Athens.
00:25:21.680 Yeah, he did a video.
00:25:22.420 He said he was thankful for all the donors he got on his give, send, go there.
00:25:25.860 Oh, he did, good.
00:25:26.840 To bail him out.
00:25:27.340 I was, I dug up a picture from 2009.
00:25:31.180 It doesn't show my Swazi, but I actually went as a Nazi to a Halloween party in Sweden in
00:25:37.300 2009 with Henrik, and he made me a very nice armband.
00:25:41.960 I hand-drew it.
00:25:43.460 I mean, mine wasn't, like, an authentic German outfit, but, you know, I found something that
00:25:48.580 worked, and that was a lot of fun.
00:25:51.260 That Athens guy looked pretty sharp.
00:25:52.780 Oh, man.
00:25:53.280 That was legit.
00:25:54.240 He was dialed.
00:25:55.240 Yeah, he was dialed.
00:25:56.380 Yeah.
00:25:57.160 No, it's a good job.
00:25:59.000 And he broke some, apparently broke some Jewish girl's nose by accident.
00:26:03.000 I'm not sure if it was a Jewish girl, but, yeah.
00:26:04.600 I'm not sure if it was, but she was there.
00:26:07.240 Her nose was close by, okay?
00:26:08.980 See?
00:26:09.380 Yeah.
00:26:09.560 And, again, I've got, I don't really understand what his endgame hope was.
00:26:14.960 I mean, I salute him for the fact that he was, if he was pranking them, good for him, the
00:26:18.440 fact that he should be able to do that without anybody saying anything.
00:26:21.100 But they were all over him.
00:26:23.700 They were, like, people throwing punches and everything.
00:26:26.420 He was in self-defense, and I think she just took an elbow.
00:26:30.100 She happened to be behind him.
00:26:31.320 So, wrong place, wrong time.
00:26:32.420 I don't think he was actively looking to beat up a girl.
00:26:34.340 No.
00:26:34.720 Well, obviously not.
00:26:35.680 And it's Halloween time, too.
00:26:36.320 I mean, he was accosted.
00:26:38.020 Keep in mind, it was three or four people that were pushing him.
00:26:41.040 There was a ton of people.
00:26:41.780 Yeah, pushing him on the shoulder and stuff.
00:26:43.260 So, yeah, he was defending himself, obviously, right?
00:26:45.260 Yeah, for sure.
00:26:45.900 But I'm not sure it will be seen that way.
00:26:48.380 But, you know.
00:26:48.600 Well, of course not.
00:26:50.380 Cosmoc Racher says, why is my membership not processed?
00:26:52.900 Well, I can't answer that.
00:26:53.840 The reason for that is because, well, actually, it is processed.
00:26:56.900 It's just that it's not activated on our end.
00:26:58.580 And the problem with that is because we are banned and censored from all the mainstream payment processors.
00:27:02.660 So, we have to use third-party systems.
00:27:04.480 I'm not sure what you use.
00:27:05.360 Maybe you used Entropy.
00:27:06.240 Maybe you used DonorBox.
00:27:07.300 Maybe you used Subscribestar or Locals or something.
00:27:09.520 But, yeah, we have to manually activate your account.
00:27:13.260 Hopefully not for much longer because we've found some alternatives.
00:27:16.340 And being in Idaho is going to help with the new banking anti-censorship law in banking.
00:27:22.380 So, I'm going to try that law out.
00:27:25.440 So, thank you for signing up, Cosmo Crotter.
00:27:27.980 Was it Cosmo Crotter?
00:27:28.980 So, I'll get yours activated right after the show.
00:27:31.140 And we'll give you some extra for that, too.
00:27:32.340 I'll give you some extra time.
00:27:33.800 Thank you.
00:27:34.120 We appreciate it.
00:27:34.900 Archie says, I'm not fashionably late.
00:27:36.980 Is candy giving self-serve tonight for the Visiting Goblins?
00:27:39.920 Yesterday, I learned about a Texan serial killer called the Candyman.
00:27:43.680 The story was disgusting.
00:27:44.920 Well, that's not surprising, huh?
00:27:46.320 Was he killing kids?
00:27:47.720 Let me guess.
00:27:48.800 Hmm.
00:27:49.360 Was that before or after?
00:27:50.380 Wasn't there a movie called Candyman?
00:27:51.980 Yeah.
00:27:52.480 I think there was.
00:27:53.160 Yeah.
00:27:53.500 Right?
00:27:53.820 Yeah.
00:27:54.100 It's set in Chicago.
00:27:55.320 Yeah.
00:27:55.600 Chicago.
00:27:56.620 He was a black guy.
00:27:57.920 He was a black guy.
00:27:58.740 Well, there actually are more black serial killers than white ones.
00:28:02.700 Remember, Henrik?
00:28:03.220 You did a show on that, too.
00:28:04.600 That's right.
00:28:04.980 You just don't report them.
00:28:06.220 You just don't report on them.
00:28:07.300 They don't make movies.
00:28:08.300 Yep.
00:28:08.620 Yep.
00:28:09.120 By the way, I learned something about it.
00:28:10.620 Well, in Germany, they do something like that.
00:28:12.360 The Switch Witch or the Halloween Goblin.
00:28:13.980 And so kids that have lots of that bad toxic candy left over, they could leave it out on
00:28:19.040 their front porch and the Switch Witch or the Halloween Goblin will come and take it
00:28:23.720 and exchange it for some nice prize and take the candy away.
00:28:28.260 Like a toy?
00:28:29.020 Yeah.
00:28:30.100 So I think that we should also do that tradition because most of the candy that the kids get
00:28:35.220 out there is garbage.
00:28:36.900 Yes, it is.
00:28:37.660 Right?
00:28:38.020 Exactly.
00:28:38.220 Garbage.
00:28:39.360 Yeah.
00:28:39.480 And even if you get something good, like you're up against a tidal wave of like high
00:28:43.220 fructose corn syrup and red 40s and, you know, all the cancer.
00:28:45.880 And let me say, North Idaho is like, you know, it is boom, boom and Kid Central, right?
00:28:50.920 Very white.
00:28:51.780 It's like a fun 80s movie on Halloween, you know, with the fall leaves and all these cute
00:28:56.440 white kids around.
00:28:57.200 It's like that, right?
00:28:58.740 People spend hundreds of dollars on candy to the point where some people are like, I can't
00:29:03.700 do it this year.
00:29:04.300 I'm hiding.
00:29:05.140 I'm turning all the lights off.
00:29:06.460 I just can't do it.
00:29:07.360 I mean, we got hammered one year in that one neighborhood we lived in, just hammered.
00:29:11.220 And then it was like, once the littles left, the high schoolers started cruising by, you
00:29:15.320 know, we would just leave like cauldrons of candy outside, you know, it's all wiped
00:29:20.360 out.
00:29:21.020 So.
00:29:21.900 Yeah.
00:29:22.720 So you can't buy the goods.
00:29:24.640 So that's the benefits of living in a high trust society.
00:29:28.300 That's right.
00:29:28.840 You guys are at, because if I were to leave a bucket of, I mean, I could, I would leave
00:29:32.580 like one of those little like plastic pumpkins outside.
00:29:35.120 It would be gone in less than a minute.
00:29:37.120 Someone would just come up and be like, yoink.
00:29:38.740 Yeah.
00:29:39.020 See, that's shitty.
00:29:40.080 I would assume they would take whatever else is loose on the side of the house as well,
00:29:45.000 potentially.
00:29:45.540 Right.
00:29:46.100 The side.
00:29:47.120 Did a bike lane.
00:29:48.120 The lantern.
00:29:49.660 You know, you are familiar with my area.
00:29:52.260 I'm familiar with your area.
00:29:53.400 Yes.
00:29:53.560 Well, actually not with your area, but the people living in your area.
00:29:56.660 I'm familiar.
00:29:56.860 Yes.
00:29:57.180 I know the general, the general trend kind of thing.
00:30:00.740 Right.
00:30:00.940 Yeah.
00:30:01.340 I will say though, there was one year out of the blue and it was late.
00:30:05.100 Remember, Henrik, I didn't answer the door, but some woman shows up with her black teenage
00:30:12.500 son looking for candy.
00:30:14.740 It's like.
00:30:15.140 Gibbs.
00:30:15.660 Gibbs me that.
00:30:16.520 I'm not answering that door.
00:30:18.040 No.
00:30:18.500 Go.
00:30:18.760 I don't know where you came from, but Washington's that way back there.
00:30:22.580 But they didn't throw any eggs though, or did any pranks.
00:30:25.060 No.
00:30:25.140 No tricks though.
00:30:26.260 But of course they go, where's the white people's at and cheat?
00:30:29.160 Let's go, you know, get candy over there.
00:30:31.320 Of course.
00:30:31.960 Something free.
00:30:33.360 Isn't there another type of, nah, I forget.
00:30:36.400 They show up at every opportunity, right?
00:30:38.260 I'm surprised like if it was, if it was this type of generous candy handouts in like some
00:30:45.900 European countries, which I think is less common, right?
00:30:48.100 There's still, there's still some Halloween celebrations catching on, but it's not as, you know,
00:30:51.460 America is like crazy in that regard.
00:30:55.320 You'd see all the Muslims and shit show up if that was the case.
00:30:57.780 You know what I mean?
00:30:58.060 Yeah.
00:30:58.580 Begging and groveling.
00:30:59.600 And if you didn't, then, then now here's your excuse.
00:31:02.160 Now I can legally burn down your house or like torture apartment or something.
00:31:06.380 Because you were too racist to give my kid candy.
00:31:08.920 Now I'm going to blow your house up.
00:31:10.060 That's right.
00:31:10.800 That's right.
00:31:11.440 Peg and Bear earlier on Rumble said, I'll have to watch the replay.
00:31:14.260 Happy Halloween.
00:31:14.960 That's right.
00:31:15.340 HH, everybody.
00:31:16.700 Happy, happy Halloween.
00:31:19.560 All right.
00:31:20.120 What else do we got?
00:31:20.860 Do we ever get some other stuff here?
00:31:22.280 So, and Lana, you seem to know, well, you both have a deeper understanding of Halloween.
00:31:28.100 I saw this thing today and it said that there may have actually been someone, and I'm not
00:31:33.720 making this up.
00:31:34.720 His name was either Jack O'Lantern or something along the line.
00:31:38.400 And he haunted some neighborhood.
00:31:40.980 And again, this could be total BS.
00:31:42.880 Sounds like Lorian.
00:31:45.040 And he carried a turnip.
00:31:47.800 He had a dried turnip that he would walk around in because it was kind of gnarly looking.
00:31:51.920 And he would scare the local kids.
00:31:53.900 And over periods of time, the turnip turned into a pumpkin.
00:31:57.820 And they started carving the face into it.
00:31:59.080 And then they just ended up calling it Jack O'Lantern.
00:32:00.900 So that could be absolute garbage.
00:32:04.040 I didn't fact check.
00:32:04.920 You know, at some point it all just kind of becomes lore and myth and stories around the
00:32:11.200 campfire.
00:32:11.920 But it all kind of tells the same story and celebrates kind of the same essence, right?
00:32:16.600 I mean, I love having pumpkins and carving pumpkins and, you know, putting the lights in
00:32:21.640 them.
00:32:22.020 And I love that.
00:32:23.260 Well, do you guys remember, let me pull this in here, because I'm not sure if the story
00:32:27.200 you're telling there, Mr. Blackface, is real or not.
00:32:31.680 But apparently this...
00:32:32.240 You can call me Al Jolson.
00:32:33.460 Al Jolson.
00:32:34.320 That's right.
00:32:36.100 Mommy!
00:32:36.900 Isn't that funny?
00:32:37.560 The first black-faced man was a Jewish guy, right?
00:32:41.760 I'm not surprised, are we?
00:32:43.520 No.
00:32:44.160 Have you guys heard of this guy, though?
00:32:45.180 Spring-Heeled Jack?
00:32:47.160 I have.
00:32:47.620 I have, no.
00:32:48.220 I've heard of this guy.
00:32:48.960 I'm not sure, is that what...
00:32:50.620 There was a band, there was like a 90s band called Spring-Heeled Jack.
00:32:52.960 Oh, really?
00:32:53.560 Okay.
00:32:54.240 Mm-hmm.
00:32:54.860 An entity in English folklore of the Victorian era, the first claimed sighting of Spring-Heeled
00:33:00.040 Jack was in 1837.
00:33:01.920 I mean, obviously, the Halloween stuff goes way beyond that, obviously, so I'm not trying
00:33:04.720 to connect it, but I just made me think of it.
00:33:07.060 Is that the Rolling Stones song, Jumping Jack Flash?
00:33:10.000 Is that what that's based on?
00:33:10.980 I think that is.
00:33:12.700 Must be, right?
00:33:13.320 I do think that is.
00:33:15.280 Later sightings were reported all over the United Kingdom and were especially
00:33:18.820 prevalent in suburban London, Midlands, and Scotland.
00:33:22.540 Anyway, he had some...
00:33:23.220 He could jump, I guess, so people speculated he had a bunch of springs or something on his
00:33:27.580 shoes.
00:33:28.060 Sounds like folklore.
00:33:29.020 That's right.
00:33:29.900 I don't know.
00:33:31.080 Grouping: hoax, mass hysteria, demon, phantom.
00:33:34.080 Okay.
00:33:34.340 Thank you, Wikipedia.
00:33:35.220 I appreciate that.
00:33:36.560 All right.
00:33:37.160 Well, I know the...
00:33:38.060 Yeah, the turnip.
00:33:38.940 I don't think you can get more...
00:33:40.600 I won't even say British.
00:33:41.920 I'll say English than the turnip.
00:33:43.920 The turnip, it's been a lifesaver through like, you know, eras of like starvation and
00:33:50.100 like low crop yield and stuff like that.
00:33:53.200 So...
00:33:53.480 They kind of, they were dovetailing it into exactly what you're talking about.
00:33:57.340 They said that this guy basically came out because he was starving and he was a vagrant
00:34:01.540 and he was starting to lose his mind.
00:34:03.020 And so he just started not really terrorizing local people, but he was just a nuisance.
00:34:07.760 Yeah.
00:34:07.880 And he was creepy looking.
00:34:08.980 So that's where that came from.
00:34:10.780 But again, who knows?
00:34:11.820 Lore.
00:34:12.380 Yeah.
00:34:13.380 Well, the turnip is a superior carrot.
00:34:16.500 I'll still argue that point.
00:34:18.340 You know, the carrot is only orange because the Dutch, right, they had the orange royalty there.
00:34:24.700 So they started cultivating the turnip to become more and more sweet and more and more orange.
00:34:29.280 And that's why we have a carrot today.
00:34:30.440 But it's supposed to be white, damn it, like the turnip.
00:34:34.680 Yeah, and what is it then?
00:34:35.800 I'm going back to white.
00:34:36.180 The Swedish turnip is actually, it's basically a swede, right?
00:34:40.600 That's what we call it.
00:34:42.300 We call it cabbage root.
00:34:43.640 Well, we call it swede.
00:34:45.280 I know.
00:34:46.060 Which is funny.
00:34:47.160 I can never find those.
00:34:48.960 No, I can't find them here either.
00:34:50.860 Because I know in Sweden they do a really good dish with the swede, as I call it.
00:34:54.380 It has more vitamin C than an orange at least.
00:34:57.180 So, I mean, there's some good stuff in there, you know?
00:34:59.320 Well, and probably like just given what it is too, it doesn't have, it's not a sugar bomb like an orange is too.
00:35:05.200 So you're actually getting more of the actual nutrients than, you know, because eating, you know, eating fruit's great.
00:35:10.180 But it's, you know, you can also eat so much fruit you're going to end up with type 2 diabetes.
00:35:14.440 Yeah, weren't there some of them that only tried to eat fruit and their teeth rotted or something.
00:35:18.280 They couldn't like, you know.
00:35:19.180 I can't do it.
00:35:19.940 I can't do the sweet fruit.
00:35:21.300 I do the berries.
00:35:22.860 Same.
00:35:23.900 Yep.
00:35:24.420 All right.
00:35:24.660 Well, here's a headline for you guys.
00:35:26.760 You would only see like far-right extremist racists talk like this about 10 years ago.
00:35:31.060 But now the Daily Express, I guess, how the UK does it too.
00:35:34.020 Progress, I guess.
00:35:35.420 Killers and rapists flood into Britain as migrant chaos spirals out of control.
00:35:40.300 It's funny the people have been like put in jail for saying things like this.
00:35:42.980 And now they're like, meh, okay, we'll write it now.
00:35:45.440 Well, I mean, we've been saying this for over a decade, this kind of stuff.
00:35:50.060 And now it's just mainstream.
00:35:51.800 Like not even just this.
00:35:52.880 I mean, talking about every one of the talking points that you guys started talking about
00:35:56.520 in 2013 and before.
00:35:58.300 Department of Homeland Security literally was pushing propaganda to send them back.
00:36:02.760 And so then I did a response of like, you know, six, seven years ago, I got banned, right,
00:36:08.460 from PayPal and Braintree and all that for selling it.
00:36:11.200 They have to go back t-shirt, right?
00:36:13.020 And I was, you know, having fun quoting also the Dalai Lama and all that.
00:36:16.480 And now here we are, Department of Homeland Security saying it.
00:36:19.380 It's fine.
00:36:20.200 You know, like you're just lurking until they all go home, okay?
00:36:24.560 And that's what I want.
00:36:25.680 That's partly what I wonder, because it's like, I got to have to, you know, this isn't
00:36:29.660 about who came up with it first.
00:36:31.220 I'm happy to see that that's where the dialogue is going.
00:36:33.820 But to the point you're making is that if they're just doing it because they know they
00:36:37.540 have to remain relevant because they can't, they can't not talk about it because the genie's
00:36:42.780 out of the bottle.
00:36:43.740 But until they actually start, you know, where the rubber meets the road, when they
00:36:47.520 actually start doing something about it, that'll be the tell.
00:36:50.100 That'll be the tell, whether it's just them just trying to keep up with the Jones or not.
00:36:53.840 So, yeah, I think it's just, I think it's just to, uh, to cover for it.
00:36:58.520 We will talk about it, otherwise someone else will, and then we won't do anything about it.
00:37:02.080 Well, I saw, what was it yesterday?
00:37:03.440 Breaking the Trump admin is restricting the number of refugees admitted to the U.S.
00:37:07.340 to 7,500, please be true, and they will be mostly white South Africans.
00:37:13.160 Yeah.
00:37:13.680 I'll believe it when I see it.
00:37:14.840 I'll believe it when I see it.
00:37:16.640 I hope so.
00:37:16.940 Because all I see is a lot of nons everywhere all the time.
00:37:20.060 Yeah.
00:37:20.320 Yep.
00:37:20.640 These are some of the posts now, right?
00:37:22.300 You see from the Department of Homeland Security.
00:37:24.500 The organization that was set up post 9-11 from the whole Patriot Act and the project for
00:37:29.320 the New American Century stuff.
00:37:30.560 Spying on Americans.
00:37:31.080 You know, 9-11.
00:37:32.820 Yeah.
00:37:33.860 Hmm.
00:37:34.320 I don't know.
00:37:34.780 But apparently, was it this one?
00:37:37.120 We didn't talk about this one at the time.
00:37:39.060 No.
00:37:39.300 But anyway, like.
00:37:40.240 I'd like to see evidence of this.
00:37:41.840 I want it to be true, but there's still a million legally coming in every year.
00:37:46.180 Yes, it doesn't matter.
00:37:47.220 I'm about to drop a video on that.
00:37:48.160 President Trump is set to break the all-time record for the amount of illegal aliens deported
00:37:53.540 by the end of 2025.
00:37:55.300 And then I saw something else recently.
00:37:56.500 Like, he won't even deport as many as Biden was still, like, more than he might saw previously.
00:38:01.680 And Obama actually had a lot, too.
00:38:03.840 Yep.
00:38:04.180 Yeah.
00:38:04.560 Yep.
00:38:05.460 So, what are these?
00:38:06.480 I'm trying to take a win where we can, but at the same time, I don't want to be the copium
00:38:11.800 guy.
00:38:12.240 I'm just like, okay.
00:38:13.160 Yeah.
00:38:13.640 But, you know, I know you guys do because you're on the same page.
00:38:17.560 Deportations are hitting record highs.
00:38:19.320 More than 515,000 illegal migrants so far.
00:38:23.500 1.6 million have left on their own.
00:38:25.660 Is that true?
00:38:27.140 This is Fox News.
00:38:28.280 Are they just trying to say this?
00:38:29.460 They're like, I don't know.
00:38:31.260 Well, honestly, I think it's probably split the difference because I think they're probably
00:38:36.880 doing that just to sort of satiate the MAGA crowd that still want to just believe that
00:38:41.380 Trump is their god.
00:38:42.640 They're like, he's doing, look at how much he's doing.
00:38:44.520 It's like, well, okay.
00:38:46.500 But, again, it's just Fox.
00:38:48.740 It is Fox.
00:38:49.940 Fox.
00:38:50.620 Fox you, right?
00:38:51.360 That's a trick, isn't it?
00:38:52.320 I want it to be true, but look at the demographics.
00:38:54.740 We're still dealing with that, you know?
00:38:57.480 Yeah.
00:38:57.880 And, you know, people say, oh, the anti-white politics is over.
00:39:01.980 Anti-whiteism is dead.
00:39:03.240 No, it's not.
00:39:03.920 We don't have our countries back yet.
00:39:06.140 European countries for European people, okay?
00:39:08.660 And when it's okay to be a European, a white nationalist, then we have defeated anti-whiteism,
00:39:15.100 right?
00:39:15.480 Correct.
00:39:16.380 Correct.
00:39:16.700 Okay, guys.
00:39:19.060 How about this one, then?
00:39:19.920 We are celebrating, what was it again?
00:39:21.380 How many years was it, Mr. Blackface?
00:39:23.580 The October 31st pogrom.
00:39:25.660 It's 120.
00:39:27.020 Yeah, today is the 120th anniversary of the last, I would say, other than Germany, the last
00:39:32.760 legit pogrom against the Tiny Hats.
00:39:37.100 The Tsar, this was before the Bolshevik revolution that actually stuck.
00:39:42.280 So in 1905, they tried.
00:39:44.460 They tried their damnedest.
00:39:46.980 Trotsky came back into the country and he raised some hell.
00:39:51.600 Yeah.
00:39:51.840 He raised some hell.
00:39:52.840 But no, the Tsar and the Tsar loyalists managed to squash him.
00:39:57.600 Again, this article is written by either a fellow white or a Shabbos goy, so it's going
00:40:08.080 to be in favor of the Jews.
00:40:10.160 But there's a gentleman, what was his name?
00:40:13.400 Matthew Raphael Johnson has got a really great video that I glanced at earlier, and it said
00:40:19.300 almost exclusively, there are 690 separate pogroms across Kiev, Odessa, and other parts
00:40:26.640 of Ukraine and Russia.
00:40:28.620 And the Jewish reports will say that 10,000 Jews were hurt, 4,000 were killed, and that
00:40:34.700 it was everybody else other than the Jews that were the aggressors.
00:40:37.540 But we all know that that's basically the inversion of that.
00:40:41.740 It was mainly Russians that got hurt.
00:40:43.520 The Jews were heavily armed.
00:40:45.340 They started almost all the trouble.
00:40:46.980 And finally, Tsar Nicholas was like, look, I tried to concede and be cool to you guys and
00:40:51.960 let you back in the country, but now you got to go.
00:40:53.920 So, and as awesome as that was, we all know what happened, unfortunately, a few years later.
00:40:59.480 Russians got slaughtered.
00:41:00.900 They regrouped and came and destroyed them.
00:41:04.520 And they got some other money behind them.
00:41:06.860 They had some Schiff and Warburg money, and they got all kinds of funding.
00:41:10.000 And so much of that, they piggybacked off of the Russo-Japanese War, which was fun.
00:41:16.820 Like, Russia was, hands down, going to win that war.
00:41:19.780 But Jewish money was funneled into Japan, which turned the tides.
00:41:24.700 Because there's no way an army as small as Japan would have had any chance against the
00:41:28.400 Russian army at the time.
00:41:29.580 And, or the Russian military, but because Jews like Schiff, and there were, I can't think
00:41:34.880 of his name, but he was an American, a very high profile and rich American Jew, funneled
00:41:40.800 a ton of money into the Japanese military, turned the tide, sent the Russians back home,
00:41:46.040 and their sense of morale was, like, pretty bad at the time.
00:41:50.440 So, all of that fueled the Bolsheviks, and, you know, made them feel like they were invincible.
00:41:56.360 So, and we're still cleaning up the mess.
00:41:58.700 We're still cleaning up the mess.
00:42:00.520 So, check this one out.
00:42:02.200 This is kind of funny.
00:42:04.700 Woman buried Holocaust survivor mom in backyard to keep receiving her benefits, according to
00:42:11.180 Israeli police.
00:42:11.860 Now, it's not just any benefits, and it comes with a...
00:42:14.540 It's German, some Holocaust survivor benefits, right?
00:42:18.000 Yes, it has to be.
00:42:19.340 Yep.
00:42:19.440 Which is how many countries, how many countries, I know, obviously, I know Germany, and I believe
00:42:25.640 Poland, for whatever reason, because Poland, they were not, they were more like a German
00:42:30.620 protectorate.
00:42:31.980 How many countries are giving Israel money specifically as, like, recompense for the so-called Holocaust,
00:42:41.080 other than Germany?
00:42:42.300 You know what I mean?
00:42:42.900 I know that there's a handful of them, though.
00:42:45.120 Which other, do you say Poland as part of that?
00:42:47.640 I think they tried to reign, because I know for, like, when they realized that they had
00:42:52.760 pretty much bled Germany dry.
00:42:53.960 This is all within, like, the last 10 years, they were trying to turn around and say that
00:42:56.880 Poland is on the hook to pay them back, too.
00:42:59.840 Yeah, they're guilty.
00:43:00.540 Like, look.
00:43:01.080 Yeah, they're guilty.
00:43:02.200 Five grand per month?
00:43:03.700 I was going to get to that, yes.
00:43:04.940 Look at the amount there.
00:43:05.920 Holy shit, Germany's just, this is welfare for Jews, who just claim, you know, oh, the Nazis
00:43:12.960 did something bad to me, ka-ching, ka-ching, you know?
00:43:15.860 And that's, that's just Germany.
00:43:18.220 Yes, they milk this thing.
00:43:19.960 That deal's in place until 2045, so that's a 100-year, like, stipulation.
00:43:25.620 Germany has to pay every single Israeli that can say that, they even have, like, a distant
00:43:30.920 relative that was a so-called Holocaust survivor.
00:43:33.040 They have to pay on a stipend every single month, and they have the option, of course,
00:43:39.160 at 2045, they have the option to renew that, which you know damn well they're going to do.
00:43:44.240 Well, yeah.
00:43:44.820 It's the best thing that ever happened to them.
00:43:46.860 God.
00:43:47.240 Or didn't happen to them.
00:43:49.800 Yep.
00:43:50.480 Well, and, and, right, exactly.
00:43:52.020 And meanwhile, German industry is basically in free fall, economic decline now.
00:43:57.480 It's, like, awful conditions in Germany.
00:43:59.160 There will always be money for the Holocaust survivor.
00:44:01.760 And, by the way, haven't they all died off now?
00:44:05.280 Like, how old are they?
00:44:06.160 Well, again, as this woman here, then, let me read that real quick.
00:44:08.620 93.
00:44:08.780 A woman in Karmiel, Israel, and her late partner are suspected of burying her 93-year-old Holocaust
00:44:15.720 survivor mother in their backyard to continue to receive her state and German reparation benefits,
00:44:21.580 totaling around $4,800 to $5,400, U.S. dollars, per month.
00:44:28.460 Yeah.
00:44:29.440 And what happened to this woman?
00:44:30.640 This woman probably didn't face any sort of charges.
00:44:33.740 Let's see.
00:44:34.360 And partner in his 60s, they were arrested.
00:44:36.240 Oh, the guy killed himself in his prison cell, huh?
00:44:39.360 Yeah.
00:44:39.980 So she'll probably spend some time in jail, but how much?
00:44:43.020 I mean.
00:44:43.280 They must have found something else on him for that, because I'm surprised they even
00:44:46.280 had to go to jail.
00:44:48.300 I'm surprised they didn't just say, hey, you guys, don't do that.
00:44:53.240 Because, you know, because they're above the law.
00:44:55.700 Oh, my God.
00:44:56.940 This is who they are.
00:44:58.120 This is what they do, you know?
00:44:59.720 There you go.
00:45:00.980 But then they make movies like Psycho.
00:45:03.940 And what was it?
00:45:04.580 Did you make that movie?
00:45:06.340 Psycho?
00:45:06.740 Well, there's different versions of it, right, where the son's like.
00:45:12.240 I don't remember who wrote the book.
00:45:13.800 Sick relationship with his mom and keeps her corpse in the house.
00:45:19.320 I refuse to look it up.
00:45:20.940 Dub 85 sent $20.
00:45:22.880 Jack-o'-lantern emoji.
00:45:24.600 A jack-o'-lantern emoji.
00:45:25.940 Well, thank you, sir.
00:45:26.540 Appreciate that.
00:45:27.020 There are a couple more there, Lana, if you want to check those on.
00:45:28.920 Okay.
00:45:29.540 I have to refresh.
00:45:30.500 Dear in the headlines, Skeletor.
00:45:32.480 That looks like time-traveling Philip from Diagolon sitting across from you.
00:45:36.740 Be careful.
00:45:41.100 Happy Samhain.
00:45:42.440 Samhain and blessed be to all.
00:45:45.020 Thank you, dear in the headlines.
00:45:46.000 Very kind of you.
00:45:47.400 Cosmocrat also says, okay, thanks.
00:45:49.760 Well, thank you.
00:45:50.300 Appreciate it very much.
00:45:51.100 Dang it.
00:45:51.420 I have to log back in.
00:45:51.660 Thank you for signing up, sir.
00:45:53.080 We appreciate the patience on the delay.
00:45:55.500 We do the best we can.
00:45:56.660 Usually we do it within 24 hours, but today it was impossible to get to it because, you
00:46:01.400 know, I had to fit the goat mask on properly.
00:46:03.700 It took me five hours, so, you know.
00:46:06.740 Okay.
00:46:07.600 So, did you guys see this one?
00:46:08.780 Here's another one for you.
00:46:11.980 Now, you sent this dissident here, right?
00:46:14.700 Man.
00:46:15.600 Let me pull it.
00:46:16.360 Oh, yeah, yeah, yeah.
00:46:17.240 I saw this one.
00:46:17.800 There we go.
00:46:18.400 Man in asylum accommodation pushed friend out of window and raped him while he was critically
00:46:24.300 injured.
00:46:25.400 What?
00:46:26.780 You know.
00:46:27.800 Mind-blowing.
00:46:29.640 That's a horror show right there.
00:46:31.360 Yeah.
00:46:31.580 And that's why, and you saw what I said to you later.
00:46:35.140 I was like, man, rope is cheap.
00:46:37.680 Gravity's free.
00:46:38.780 Yes.
00:46:39.440 Yeah.
00:46:39.900 Throw him out the window, too, you know.
00:46:42.480 Between, like, between stuff like this or that guy that just got released, he killed
00:46:47.660 a six-year-old white kid in Tennessee like 10 years ago.
00:46:51.620 They sent him to jail.
00:46:53.560 The guy should have been put in jail for the rest of his life or killed.
00:46:57.000 Hands down.
00:46:57.460 He got released.
00:46:59.380 Not even two weeks later, he just got arrested again in Florida, and he's back in prison because
00:47:04.640 he was stalking and attempting to kill a girl.
00:47:07.020 And he, it's like, this is, they're so, I mean, we could talk about this stuff all day
00:47:11.000 long and just make it rage bait.
00:47:12.720 Yeah.
00:47:13.240 Yeah.
00:47:13.720 I mean.
00:47:14.640 Oh, my God.
00:47:15.320 I just can't get it.
00:47:16.840 It isn't.
00:47:17.420 Yeah.
00:47:17.740 Pushed his friend out the window and then, I assume, butt-raped him while he was hurt.
00:47:23.880 Like, who knows what condition he was in.
00:47:27.160 Hello, friend.
00:47:27.780 Now you won't be able to fight back.
00:47:29.960 I mean, this is like.
00:47:30.720 PP time.
00:47:31.660 They're basically like, what is it, necrophiliacs?
00:47:35.180 I mean, like.
00:47:36.500 Well, that's just a step in the day.
00:47:38.740 Ingerophiliac.
00:47:39.260 Yeah.
00:47:39.480 Like, they want them semi-disabled and bleeding while they.
00:47:42.700 I mean, that's, that's psycho right there.
00:47:44.980 That is.
00:47:45.560 I mean, who.
00:47:46.220 Not human.
00:47:47.760 No, it's not.
00:47:48.400 Because, I mean, even on my, like, if I was trying to, like, think of the worst kind
00:47:51.880 of shit, like, I would have never thought that stupid story up.
00:47:55.760 Unless I saw that, I would have been like, this is not.
00:47:58.940 There's plenty of stuff.
00:48:00.320 This wasn't just one article.
00:48:01.700 There's, like, several about this case.
00:48:03.960 I'm like, come on.
00:48:05.000 Like, my white mind explodes reading this.
00:48:08.240 Like, just, how is this possible?
00:48:09.680 Well, we're just not, we're just not.
00:48:11.420 We're not the same.
00:48:12.840 We should have said, we're not the same.
00:48:14.940 Okay.
00:48:15.140 But we cannot fathom this stuff.
00:48:16.760 So, guys, let's do a little, let's bring these stories together, then, that we talked
00:48:20.400 about.
00:48:21.240 Because he's going to, he's going to serve 11 years, obviously, in German prison.
00:48:25.780 The average cost per prisoner in Germany is estimated to be around 200 euros per day,
00:48:31.280 or approximately 73,000 annually.
00:48:33.980 So, if he's in there for 11 years, that's 730,000 euros, which is more, is it more than
00:48:41.480 the dollar now?
00:48:42.100 It's kind of equivalent, I think, isn't it?
00:48:43.460 Yeah, gravity.
00:48:44.640 Gravity is most shit.
00:48:45.740 Yes.
00:48:46.380 Throwing them out the window is free, okay?
00:48:49.260 So, they've got to pay for that, and then they've got to pay for the $4,500 per month.
00:48:53.820 How many years do you think that's happened?
00:48:55.180 If she's 93, they probably pay that for, like, 10 years to that woman, at least.
00:49:00.120 20, maybe 30 years.
00:49:01.460 It started when she was, like, 10 years old.
00:49:02.740 Did they say when she died?
00:49:03.660 Did they say when her mom died and when she buried her, how long she's been falsely collecting
00:49:08.340 these charges?
00:49:08.600 I don't know.
00:49:09.060 I mean, if she was 93, I assume she was killed when she was 93.
00:49:13.640 Yeah.
00:49:13.940 Right?
00:49:14.460 Oh, yeah, yeah, yeah.
00:49:15.480 Okay.
00:49:16.100 So, oh, yeah.
00:49:16.780 So, that means that's, my gosh, that's like four decades or something probably then that
00:49:21.540 she's been getting these checks.
00:49:23.120 Oh, yeah, for sure.
00:49:24.060 I mean, what age can they cash in at 18?
00:49:26.740 Who knows?
00:49:28.200 I read, I read, Richard Wagner, the composer, he called them the plastic demons of decay.
00:49:36.000 I thought that was pretty, I thought that was pretty poetically spot on.
00:49:40.540 So, they're crafty.
00:49:42.700 They're completely lunatics, but they're crafty.
00:49:45.180 Well, what used to be the economic engine of Europe, Germany, is now basically then paying
00:49:50.580 for all these migrants, not only the benefits and the welfare, but also for them being in
00:49:55.680 prison, and then for the Holocaust, so-called victim, survivors, and all that stuff, too.
00:50:00.660 We're talking, I mean, what are we talking here, trillions over the years?
00:50:06.580 Yeah.
00:50:07.340 Let me see here.
00:50:07.920 I mean, that's basically proving white supremacy right there, that they can pay all this, you know?
00:50:13.960 Well, the fact that they're spending the world and give everyone money.
00:50:17.280 It just doesn't make sense to me.
00:50:18.940 Anybody, you know, anyone that hears these types of stories that is white, that doesn't just say,
00:50:25.300 I can't, I cannot, I can no longer live under this and just understand where we're coming from.
00:50:32.380 There we go.
00:50:33.100 I don't get it.
00:50:33.720 In 1945 to 2018, the German government paid approximately $86 billion in restitution and
00:50:40.160 compensation to Holocaust victims and their heirs.
00:50:43.280 That's amazing.
00:50:44.460 Anyway, and I think there's more than that, by the way, because there's like other numbers
00:50:47.460 and stuff.
00:50:47.960 And now it's going to pass on to other generations, right?
00:50:51.140 Yep.
00:50:51.660 Because they rolled that in.
00:50:53.060 Yeah, it's already happening.
00:50:55.020 So all our descendants, they're going to have to pay it for forever now, right?
00:50:58.220 Yeah, two years alone there, from 55 to, sorry, 53 to 55, those are 3 billion marks, which is
00:51:05.660 around 714 million.
00:51:07.200 But that's just back then.
00:51:08.160 And that's on top of, that's on top of, what do we give them every year?
00:51:11.320 30 something billion?
00:51:12.540 The United States?
00:51:13.020 Yeah.
00:51:13.300 Oh yeah.
00:51:13.420 Just for basically, for what?
00:51:16.100 If anything, they should be paying the United States because they should be on their knees
00:51:19.780 thanking us because how many people, how many US soldiers went to Germany, helped destroy
00:51:24.680 basically all of Europe to free the Jews and all they do now is guilt us and tell us that
00:51:32.020 we're, you know, horrible Nazi anti-Semites and that we owe them everything when they should
00:51:36.560 just be thanking us every single day, you know?
00:51:38.700 No, no, no, no thanks.
00:51:41.220 You're never going to get any thanks.
00:51:43.460 All right.
00:51:43.760 Should we do some of the stories?
00:51:44.560 Yeah, some other stories.
00:51:46.080 I saw, remember that 12-year-old girl, that viral incident in Dundee, Scotland?
00:51:51.160 Yeah, there we go.
00:51:51.860 So that girl with the knife and the axe, this was in August, well, she was 100% vindicated
00:51:57.500 as we found, sorry, I have a little cough, Romanian grooming gang jailed for raping and
00:52:02.560 sexually abusing 10 women in flats across Dundee, Scotland.
00:52:06.020 Now by Romanian, we don't mean Europeans.
00:52:10.880 No, the Roma gypsies.
00:52:12.620 Yeah, the Roma gypsies.
00:52:13.940 Just clarify that right now.
00:52:14.720 They were everywhere.
00:52:16.080 When I went to Ireland last year, they were everywhere and every single cab that I was
00:52:20.720 in, the first thing they said was, watch out for these guys, they will pickpocket you
00:52:24.100 and if they get you in an alley, they will absolutely gang up on you and attack you.
00:52:27.620 Yeah.
00:52:28.140 Well, she was right.
00:52:29.560 Her instincts were right.
00:52:31.720 She picked up on something, right?
00:52:33.740 And how they demonized this little 12-year-old girl.
00:52:37.020 Well, I said it when I did the video about it, like she was probably, they were probably
00:52:40.080 approaching her because they tried to groom her or her sister or they tried to talk to
00:52:44.180 them.
00:52:44.300 I think it was her sister, wasn't it?
00:52:46.060 I think she had like a little bit younger sister that was with her and she was finally
00:52:49.000 just fed up.
00:52:49.940 Yeah.
00:52:50.200 All right, fine.
00:52:50.640 Yeah.
00:52:51.120 And, you know, she had some like short shorts on or something too at the time.
00:52:54.280 So, you know, that the man and the woman of these migrant gangs were like trying to
00:52:57.580 talk to her and like schmooze her up like, hey, come with us or do us a favor.
00:53:01.180 And then when the girl didn't bite, then it turns ugly and, you know, turns into some
00:53:05.480 other confrontation.
00:53:05.520 Well, then they start assaulting, assaulting her.
00:53:07.520 Well, or they just start saying you're a racist.
00:53:10.640 You're not going to do what we say.
00:53:12.140 So you're a horrible racist.
00:53:13.500 It's like, excuse me?
00:53:14.600 Well, remember that?
00:53:15.300 That was the reaction right away was the cell phone app.
00:53:17.620 Oh, go ahead.
00:53:18.100 Show me the knife.
00:53:18.980 Right.
00:53:19.200 So that you will be the one ending up in trouble.
00:53:21.900 So there's this grooming gang posters going around in the UK now.
00:53:25.420 I saw one the other day.
00:53:26.400 I sort of saved it down, but I didn't.
00:53:28.100 But it was like, never trust anyone talking to you.
00:53:30.800 If they're too friendly, it probably means they want something out of you.
00:53:33.920 Like tips for young people, essentially.
00:53:37.040 Like if someone approaches you and try to offer you things or if they're too nice, you know,
00:53:42.060 and ask something in return, you like remove yourself from the situation.
00:53:46.400 Go and tell a trusted adult, you know, things like this.
00:53:48.940 Well, I mean, what's that old expression?
00:53:50.720 Beware of strangers bearing gifts.
00:53:52.600 And it's, you know, I mean.
00:53:54.340 You want some candy?
00:53:55.440 Come in the band.
00:53:56.080 Yeah, that's right.
00:53:57.740 That's absolutely right.
00:53:58.740 I mean, it sucks.
00:53:59.440 It sucks because they're, you know, in a high trust society when it was a homogenous place.
00:54:05.120 You know, of course, bad things did happen, but not to this degree.
00:54:08.900 Not at all.
00:54:09.740 Not even close.
00:54:10.620 It's so weird, though, in a country that historically has been, you know, so nice.
00:54:15.100 Like the internal, as you said, high trust affairs, right?
00:54:18.160 Like if someone is too nice now, that's a problem.
00:54:22.080 Do you know what I'm saying?
00:54:22.600 Well, if they're brown, you have to wonder why.
00:54:25.740 The brown flush it down.
00:54:27.860 I mean, they come from hustle culture.
00:54:30.220 Yeah.
00:54:30.880 I mean, it sucks.
00:54:32.200 Yeah.
00:54:34.120 As much as I want.
00:54:39.220 I don't know.
00:54:40.060 I mean, I try not to just always like, I don't want to be like, oh, the violence guy.
00:54:45.920 But at this point, but you know what I mean?
00:54:49.380 It's like saying that in full screen.
00:54:51.280 Let's have it.
00:54:53.080 It's the dangers and the risks that we all like have of becoming like forever ruthless savages, right?
00:55:01.000 In order to try to win the war that we're in right now, because it outweighs the price of our extinction, which is exactly what the cost for us is right now.
00:55:09.900 It's like if we remain tolerant, passive and polite anymore, we're done.
00:55:14.300 So it's like, yes.
00:55:15.920 And I was having a really good chat with a friend of mine who came to this whole, I guess, political stance through the far left.
00:55:26.540 And he came over here too.
00:55:27.480 And he's like, man, I'm a really good hearted, kind guy, but this is bullshit.
00:55:32.060 I was like, yes, I get it.
00:55:34.000 He's not a violent guy at all.
00:55:35.260 But now he just wants to like, he's like, I'm done.
00:55:37.620 I just want people's heads on pikes.
00:55:39.260 Well, that's a good instinct to protect the people that you love and protect what is precious and beautiful and kind, right?
00:55:46.880 He's fed posting now, IRL.
00:55:50.140 Yeah.
00:55:50.740 And you know what?
00:55:51.160 Like I said, this is a guy where he's just not that guy.
00:55:56.820 He's a very decent dude.
00:55:58.500 But he's like, fuck all this.
00:55:59.940 He's got kids.
00:56:00.640 A lot of people are feeling that nowadays.
00:56:03.720 Yeah.
00:56:03.960 Oh, my gosh.
00:56:04.740 Yeah.
00:56:05.220 All right.
00:56:05.920 Well, what's JK Rowling up to?
00:56:07.420 How did all this tranny shit supposedly die down, right?
00:56:10.240 No, I saw Jake.
00:56:11.540 Yeah.
00:56:12.800 No, I was going to ask because are these women or are these?
00:56:16.040 No.
00:56:16.800 No.
00:56:17.100 So JK Rowling posts.
00:56:18.340 I grew up in an era when mainstream women's magazines told girls they needed to be thinner and prettier.
00:56:22.440 Now mainstream women's magazines tell girls that men are better women than they are.
00:56:26.380 So Glamour's magazine, women of the year, the dolls.
00:56:30.540 And see, now I'm terrified because I looked at this and I usually think I got pretty good tranny radar.
00:56:38.180 I was unsure.
00:56:39.740 I'm like, okay.
00:56:40.840 There's a couple in there.
00:56:42.620 Yeah.
00:56:43.020 I mean, some of them, like this tall blonde is definitely a dude.
00:56:46.800 She's got a dude.
00:56:47.720 She's got a horse face.
00:56:49.180 And then that brown one in the back is not the one with her arms up, the brown one in the back with her hands on her hips.
00:56:53.800 She's a dude for sure.
00:56:55.940 They're better than women.
00:56:57.260 They're more beautiful than women.
00:56:58.840 You can't even tell.
00:57:00.100 I see these trannies defending them all the time.
00:57:02.160 Like how many surgeries and drugs have you done?
00:57:04.540 How do feminists who are totally pro-women, how are they not absolutely against this?
00:57:13.340 Because this is doing nothing.
00:57:14.380 They think the patriarchy is bad.
00:57:16.620 Well, J.K. Rowling is against it.
00:57:18.360 She's a lefty feminist.
00:57:21.240 I'm just so tired of her.
00:57:22.640 She's made her bad.
00:57:24.460 She can lie in it.
00:57:25.540 Man, you're going to complain now just about the women's issues?
00:57:28.700 Like this is way deeper than that.
00:57:30.120 This is just like, I don't know, peripheral, marginalized stuff.
00:57:33.620 It's the same type of thing where, you know, the guys that get hung up in the MGTOW thing and just like the masculinity.
00:57:40.640 It's like, yeah, that's one little part.
00:57:42.660 But it's like you're missing where the root of this war is.
00:57:46.540 I mean, it's a white against everybody else racial component.
00:57:51.560 Exactly.
00:57:52.440 By the way, I'll say this too.
00:57:53.600 The mimicry is just going to get better.
00:57:55.280 They're probably going to get better and better on this, on the surgeries and on the fake, whatever they need to do.
00:58:00.500 Well, it's kind of like, you notice a lot of girls that do Botox and fillers and lots of makeup.
00:58:06.720 It's kind of like this convergence with the tranny and the duck face girl, you know?
00:58:11.960 They all look the same.
00:58:12.160 It's starting to, yeah, there's this same aura.
00:58:15.400 Well, you know what?
00:58:16.360 There is kind of an advantage, I guess.
00:58:18.020 I didn't think of that.
00:58:18.700 But as they promote that Mar-a-Lago face, as they call it, you get them matching up.
00:58:25.520 Now, it's harder to tell.
00:58:26.540 If they all have that type of work done, it's harder to tell between males and females.
00:58:30.820 Yeah, that's what I'm saying.
00:58:31.960 There's a convergence between them.
00:58:33.320 So you've got to look for the real ones that don't look like that.
00:58:37.060 Show me your gash.
00:58:39.140 Yeah.
00:58:40.640 You know what I mean?
00:58:41.400 As crass as that is, it's like, yeah.
00:58:43.860 A, I'm so grateful that I'm married and I'm not even in this world.
00:58:49.000 Can you imagine, though, like accidentally asking one of these things out?
00:58:54.740 I mean, because imagine being a guy and one of these disingenuous things.
00:59:00.780 As you believe that they're a woman, you go out, you find out that they're not.
00:59:04.260 And then maybe you get pissed off and snap and punch it because it's a guy.
00:59:08.100 That's happened.
00:59:08.600 That's happened, yeah.
00:59:09.600 And then they claim it's a hate crime, right?
00:59:12.440 Of course.
00:59:13.100 Of course.
00:59:13.900 It's like, no, that is just, it's deceptive.
00:59:17.000 It's deceptive and it's bullshit.
00:59:20.160 Yeah, I don't know.
00:59:21.680 Modernity, a wonderful thing.
00:59:23.360 Yeah, it is.
00:59:24.600 So I saw Curry Patel posted at 4.30 this morning.
00:59:29.640 He said, this morning the FBI thwarted a potential terrorist attack and arrested multiple subjects in Michigan,
00:59:35.240 probably Muslims then, were allegedly plotting a violent attack over Halloween weekend.
00:59:39.420 More details to come thanks to the men and women at FBI, blah, blah, blah, blah.
00:59:42.420 So I got to look for more details.
00:59:44.800 What's happened?
00:59:45.640 Look, we stopped ourselves from doing a false flag.
00:59:48.140 That's what everyone was guessing.
00:59:51.200 And Pickett was up.
00:59:52.220 Okay, great.
00:59:52.800 Where's the Epstein files?
00:59:55.380 Always this cover up, right?
00:59:57.820 Let me see if I can find anything else.
00:59:58.700 Just the diversion.
00:59:59.400 Don't look at that.
01:00:00.240 Look at this new thing.
01:00:01.340 Look at the latest.
01:00:02.240 Five individuals, age 16 to 20.
01:00:04.840 At least two were formerly detained.
01:00:07.300 Locations, Dearborn.
01:00:08.680 Okay, Detroit metro area.
01:00:10.740 Significant Arab communities, right?
01:00:13.900 So I don't know what's going on.
01:00:15.700 Well, they're doing...
01:00:16.620 I'm sure they're going to just...
01:00:18.520 Not that I'm a fan of Muslims in any way, but you know they're going to try to stoke those flames
01:00:23.080 because so many people right now are starting to go, wait a minute, the Jews are doing what?
01:00:26.780 And Israel's doing what?
01:00:27.680 So they're like, turn up the flame on the Muslims.
01:00:31.600 Got to make them the actual boogeyman.
01:00:33.460 Yeah, exactly.
01:00:34.740 And then they also don't like Halloween, right?
01:00:38.820 Islamics.
01:00:39.540 Like, they don't even...
01:00:40.920 Yeah, obviously.
01:00:41.620 They do birthdays.
01:00:42.940 I think that's it, right?
01:00:44.920 Well, there's a bunch of other stuff, obviously, but yeah.
01:00:46.920 Yeah, all their own Halloween.
01:00:48.560 But like, why are you even here?
01:00:50.420 Why are you even in a country?
01:00:51.500 I saw they were talking about some terrorist attack against Pumpkin Day.
01:00:55.020 Like, why do you come to a country that has Pumpkin Day?
01:00:58.440 Go to one of your shithole Islamic countries.
01:01:01.100 The KUFAR needs to submit, right?
01:01:03.860 Yeah.
01:01:04.260 Well, I know they stick their ass in the air five times a day.
01:01:07.320 That I know.
01:01:08.380 I mean, I don't know.
01:01:09.780 Yeah.
01:01:11.240 Do they injure them first?
01:01:13.300 Just like every other group, they hate us so much because we're so oppressive and we're evil racists,
01:01:17.980 yet they can't do without us.
01:01:19.400 They come to wherever we have lived and thrived and they take advantage of the systems that we've created
01:01:25.560 and then turn it into the bullshit that they just left.
01:01:29.360 And then they try to blame us for why things have turned into a ghetto.
01:01:32.600 Yep.
01:01:33.240 Pretty much.
01:01:33.900 That's how the trajectory goes.
01:01:36.300 Okay.
01:01:37.000 But then we ask them, we're like, okay, if we're horrible, evil people,
01:01:40.220 let us go have one little state by ourselves and we will completely remove ourselves from the equation.
01:01:45.820 And then we're somehow evil racists for wanting that, too.
01:01:50.040 So it's like, you're fucked.
01:01:50.960 You can't win.
01:01:51.840 Can't do it.
01:01:52.860 Okay.
01:01:53.500 And to be there and keep an eye on you, you know, evil Nazi.
01:01:58.320 Yep.
01:01:58.920 Okay.
01:01:59.620 Do you want to do a rundown here, Lana?
01:02:00.780 So, yeah, let's talk about this year's worst events.
01:02:05.000 Obviously, the first thing that comes to mind, Charlie Kirk getting killed on a live stream for the whole world to see.
01:02:12.180 I think some people would disagree with you there.
01:02:13.900 I know, but, you know, it's an assassination in real time.
01:02:19.060 There's a certain level of trauma for people obviously seeing that.
01:02:22.760 And that pretty much rocked not only America, but in the world, I think.
01:02:27.620 Do you guys think he's dead or is he on the island?
01:02:29.360 Oh, my God.
01:02:31.140 This Candace Owens bullshit.
01:02:33.240 Oh, my God.
01:02:33.800 I thought this was a show to do it on.
01:02:34.840 She's got a full crazy black woman, you know?
01:02:36.800 Yeah, she's proved how completely – she is low vibration IQ, man.
01:02:43.220 She's – when she was starting to blame the beekeepers and things like that, you can hear her go off like that.
01:02:50.340 I mean, I don't like her.
01:02:51.440 I've never liked her.
01:02:53.000 Yeah.
01:02:53.240 The Polish underground movement.
01:02:54.740 And the Frankists, she kept saying, it's the Frankists.
01:02:59.500 It's the Frankists.
01:03:00.400 I was like, yeah.
01:03:00.900 The Chazars, whatever, you know, roundabout thing you can do.
01:03:06.120 Yeah, I mean, she's even saying how Charlie's wife is kind of in on it and knew that it was going to happen and basically like taking a payoff.
01:03:13.560 And now he's like, you know, I don't know.
01:03:15.660 I don't know.
01:03:16.360 It's fake.
01:03:17.440 I won't go there.
01:03:18.380 Yeah.
01:03:19.180 No.
01:03:19.720 Pretty sure he is dead.
01:03:21.500 Pretty sure he's dead.
01:03:22.260 I don't know.
01:03:23.040 Yeah.
01:03:23.380 All right.
01:03:23.800 Okay.
01:03:23.940 What's the next thing?
01:03:25.940 Irina Zarutska's murder, obviously.
01:03:28.060 Yeah.
01:03:28.380 This is the one that – like, personally, this is the one that, in my opinion, is the worst one.
01:03:33.720 Yes.
01:03:34.840 Not just because she's a pretty white girl, but just because – look at that.
01:03:38.180 Yeah.
01:03:38.400 Because, I mean, that's like –
01:03:40.260 That's Halloween right there.
01:03:42.360 That's a horror show.
01:03:43.340 That image.
01:03:44.220 That is horror.
01:03:45.720 Everyone has got that image in their head this year, and I'm pretty sure forever they're not going to forget this one.
01:03:52.020 No.
01:03:52.180 It definitely marked a huge turning point with those, you know, two killings, but this one especially.
01:03:58.720 I saw –
01:03:59.020 Turning point Irina.
01:04:00.140 That's right.
01:04:00.860 I saw even in Eastern European countries, they were talking about Irina and putting out, you know, candle vigils for her.
01:04:07.200 Yeah.
01:04:07.700 Right.
01:04:08.440 And then, of course, like, I've seen – not a ton, but I've absolutely seen people say, oh, it's fake.
01:04:14.480 That whole thing was fake.
01:04:15.420 She didn't die.
01:04:15.960 I'm so tired of that crowd.
01:04:17.060 Yeah.
01:04:17.680 Yeah.
01:04:18.140 It's like, come on.
01:04:19.920 Black on white crime is not fake.
01:04:21.980 It's very real.
01:04:22.920 And this year, a lot of people learned that because it's just been an onslaught.
01:04:27.660 You know, the heat has just turned up.
01:04:29.460 But now people are paying attention, and this year on X, it just really exploded with the noticing of, you know, black on white crime, as it should.
01:04:37.800 Well, I mean, we don't have a race problem.
01:04:40.340 We have a problem race.
01:04:42.100 Yeah.
01:04:42.320 And that's really what it comes down to.
01:04:44.660 It's like, look, you can try to dilute it and twist yourself in knots to make excuses all you want, but the data doesn't lie.
01:04:53.100 You're 200 times more likely to be accosted violently by a black guy than you are by a white guy.
01:04:58.240 So what do we do about it?
01:04:59.360 That's FDI data.
01:05:00.440 Well, not like I said.
01:05:01.520 First of all, you don't let them out of jail.
01:05:03.340 I don't have to send $10.
01:05:04.240 Okay.
01:05:04.720 You don't let them out of jail.
01:05:05.940 Jack-o-lantern emoji.
01:05:07.280 Jack-o-lantern emoji.
01:05:08.700 Jack-o-lantern emoji.
01:05:09.940 Nice.
01:05:10.000 I'm glad he reads that out.
01:05:11.020 Thank you.
01:05:11.300 I mean, as Jared Taylor pointed out, it used to be up until the 90s.
01:05:14.760 It was like a three strikes and you're out rule, where if you commit like three crimes, you're in for a long time.
01:05:21.840 Like, that's it.
01:05:22.420 Right.
01:05:23.180 I mean, obviously, the best thing to do here, guys, is to exclude race from the statistics inside of the country.
01:05:31.820 And then the second thing you do is you target those people who talk about these things, and you go after them and put them in jail.
01:05:38.420 I think that will fix it, don't you guys?
01:05:39.500 That's exactly what a Satanist would say.
01:05:41.300 You're going to reverse the perversion of values, right?
01:05:45.100 Yeah.
01:05:45.600 All right.
01:05:46.320 Then we have this one, obviously.
01:05:47.760 Yeah, Austin Metcalf.
01:05:48.400 Metcalf.
01:05:48.800 And this trial is going to be scheduled for G.
01:05:51.720 The worst, sorry, Skeletor, but the worst horror story here is the father who forgave the killer.
01:06:00.200 That's even worse than, I mean, the murder is bad.
01:06:02.760 I'm not trying to say that, but I'm saying that's what just took it over to this different level.
01:06:07.840 Yes.
01:06:09.060 Yes.
01:06:09.400 Well, he came out and he forgave him.
01:06:12.020 And then the parents of the black kid still basically said they didn't accept his forgiveness.
01:06:17.900 It's like, are you kidding me?
01:06:19.220 They were really rude and shitty to the father.
01:06:22.220 Gosh.
01:06:22.660 And I'm like, the balls on these people.
01:06:26.460 I mean, if that was me, I'd be in jail right now because there would have been a lot more murders.
01:06:31.620 There just would.
01:06:32.300 If that was my kid that got stabbed to death, that's, I mean, they would have had to kill me.
01:06:37.260 We see this time and time again.
01:06:38.560 I saw when we were just covering the trial of Lola, the French girl, the 12-year-old girl who was raped, stabbed, and murdered, and dismembered in Paris.
01:06:45.580 Her family was like, oh, we still are not against, you know, migrants and refugees and don't politicize us.
01:06:52.420 They did that whole thing, too.
01:06:54.520 I mean, the dad died of grief, right?
01:06:57.200 He just started drinking, and then his body just literally just shut down from grief and sorrow when he heard about all the awful things that happened to his daughter.
01:07:04.820 But the mom and the family was like, don't blame migrants.
01:07:10.240 Fuck yeah, blame migrants.
01:07:12.100 I have to believe, I mean, yeah, he died of grief, but he probably also, part of what contributed to that was the fact that he couldn't do a damn thing about it.
01:07:19.460 He couldn't even speak to how outraged he was and how mad he was.
01:07:22.660 They were like, you say, you either bend your knee and basically say, thank you for killing my daughter, you're forgiven.
01:07:29.020 Yeah.
01:07:29.300 Or you're going to end up going in jail for hate crime.
01:07:32.440 I mean, yeah.
01:07:33.580 Well, he died anyway.
01:07:34.580 Might as well, you know, go out with a bang.
01:07:36.500 I think Austin Metcalf's father doubled down on kindness and gave the family of Carmelo Anthony his life savings.
01:07:44.920 I think that's the right thing to do.
01:07:46.400 That's what I think.
01:07:47.360 Jake Baphomet.
01:07:48.180 That will fix it.
01:07:49.900 Mwah.
01:07:51.880 All right.
01:07:52.720 Next to Frosty.
01:07:54.480 That's not, I know you were kidding, but that's, I wouldn't be surprised if you found out that that happened.
01:08:00.940 That's how messed up it is.
01:08:02.000 It's like, yeah, that happened.
01:08:03.040 It's like, of course it did.
01:08:04.020 Yep.
01:08:05.100 Next story.
01:08:05.840 Genocide in Gaza.
01:08:06.840 Okay.
01:08:07.040 Moving along.
01:08:09.720 That's how it looks now.
01:08:11.020 This is the world we live in.
01:08:12.900 Yep.
01:08:13.340 Death and destruction.
01:08:15.260 Oh my gosh.
01:08:16.920 And where, you know, the people are left.
01:08:19.300 Where are they all going?
01:08:20.860 To a European country near you, of course.
01:08:23.660 And here, of course.
01:08:24.660 Yeah.
01:08:25.020 Yep.
01:08:26.340 Thank you.
01:08:27.100 Destabilization across all those countries.
01:08:28.820 And then they take those people who are pissed off because they just had their worlds ripped out from under them.
01:08:33.560 And then they send them to all the countries that had really no interest in having them here.
01:08:38.920 And then we get terrorism on pumpkin day.
01:08:41.440 That's right.
01:08:41.920 Well, speaking of one variety of Middle Eastern, a brown one, did you guys see the, and I'm going to play the clip, but yes, the guy who was walking his dog, Wayne Broadhurst, murdered by an Afghan.
01:08:55.420 And, again, just randomly, like, what the hell, what's, you know, but they don't need a reason.
01:09:02.000 Nope.
01:09:03.080 Well, I mean, not even just that.
01:09:04.200 It's, yeah, play the clip.
01:09:05.620 I mean, this is, how many more are these, is it going to take of these clips to come out like this until people, yeah, apparently.
01:09:13.960 Six million.
01:09:16.640 That's my number.
01:09:17.560 All right, here we go.
01:09:18.420 Yep.
01:09:18.540 Was there a scafuffle before this?
01:09:25.260 Watch how many times he is stabbing.
01:09:27.220 Do you know?
01:09:27.640 Okay.
01:09:28.140 Jeez, that's like a crime of passion right there.
01:09:30.280 This is double.
01:09:31.200 Look at that shit.
01:09:33.020 Like, I've been plenty mad, but it takes a lot to, like, stab somebody one time.
01:09:37.960 This guy stabbed him, like, 20 plus times.
01:09:41.400 I don't know.
01:09:42.460 Yeah, yeah, yeah, yeah.
01:09:43.460 Yep.
01:09:43.940 It's more, more to come.
01:09:46.120 Did he stab the dog, too?
01:09:47.380 Probably.
01:09:48.620 Well, the dog ran off.
01:09:49.820 The dog was just like, what the hell?
01:09:51.520 Scared and then took off.
01:09:52.460 But, I mean, I, how do they want, well, they want us to get pissed off and do something severe so that they can say, look at these evil people.
01:10:00.800 You know, we're sitting here talking about it, which is, they hate that, too.
01:10:04.560 But this is, like, the mildest we could be.
01:10:07.860 But, apparently people are thanking Robert Jenrick, who's a MP in the UK, for helping to bring the Afghans to the UK.
01:10:18.300 So, he's, there's a guy who you can thank personally, I guess.
01:10:21.260 Do you guys, do you guys know?
01:10:22.600 What's his face?
01:10:23.520 Yeah, let me tell you.
01:10:24.220 Hell of a guy.
01:10:24.760 Pull up his face.
01:10:26.420 Does my, uh, clicker work here, hang on.
01:10:30.260 Let's see.
01:10:32.020 Here's the face.
01:10:33.500 Here's it from his TikTok, I guess.
01:10:34.920 He's just been charged with murdering a woman.
01:10:36.900 There he is.
01:10:37.460 He was let out of jail early by Keith Dahmer under his earlier...
01:10:41.100 Oh, you can't see it.
01:10:42.440 It would help if I pull that up.
01:10:44.100 I blame my mask.
01:10:46.540 He's let out 38,000 criminals, and he's planning to let out 10,000 more.
01:10:53.080 When he started this, he was warned by MI5.
01:10:56.360 What is he talking about here, actually?
01:10:58.780 Released from MI5?
01:11:00.240 No, no, he was warned by MI5.
01:11:02.500 He was warned by MI5.
01:11:03.560 Okay, well, why don't they do something to stop it?
01:11:05.980 Anyway, that's the face.
01:11:06.760 How could one guy do this?
01:11:07.920 Well, I mean, we know why.
01:11:09.700 Well, I mean, I'm assuming he's got very wealthy backers that are like, yeah, do this.
01:11:16.740 Nothing's going to happen.
01:11:17.380 I love, too, how they have the literal shadow government in the UK.
01:11:21.820 I would say he's a shadow lord and he's a shadow secretary.
01:11:24.940 It just means the one who's currently not in power, obviously, after the ruling.
01:11:28.260 I know, I know, but it is interesting.
01:11:29.140 Shadow secretary for state and justice.
01:11:31.300 Shadow lord chancellor.
01:11:33.560 I would bring in the migrants to kill you.
01:11:37.340 Somehow he's the guy?
01:11:38.740 Like, how does he have this kind of power?
01:11:39.600 I don't know.
01:11:40.000 That was the post said.
01:11:41.580 Yeah.
01:11:42.140 Anyway, make sure to thank him, the post said, for some reason.
01:11:47.160 All right.
01:11:48.200 Anyway.
01:11:49.660 I'm posting my mind.
01:11:50.480 Bill Biss says, the continuous German reparations to Israel is disgusting.
01:11:55.140 Anybody ever hear of Holodomor reparations?
01:11:57.560 No, never.
01:11:58.780 No, no.
01:11:59.260 And you never will.
01:12:00.240 Or a Bolshevik revolution reparations.
01:12:02.120 I'm waiting for that.
01:12:03.380 There's a number of them I can think.
01:12:04.580 Commie bastards stole all of my, you know, heritage.
01:12:09.280 Yep.
01:12:09.780 Yep.
01:12:10.140 Your family were kicked off their land.
01:12:13.160 Their wealth.
01:12:13.460 Their businesses.
01:12:13.780 Yep.
01:12:14.180 Stole everything.
01:12:15.680 Yep.
01:12:16.820 Yep.
01:12:17.280 Thank you.
01:12:18.300 Right.
01:12:18.740 And that's what they want.
01:12:19.820 They think you should thank them for it.
01:12:21.520 And then people would say, well, your grandparents were refugees.
01:12:24.600 Well, they'll, yeah, they worked their ass off, blended in, didn't take a cent from the
01:12:29.240 government because they didn't allow, they didn't even help them, right?
01:12:31.980 They had to have other people sponsor them.
01:12:33.940 So they had their churches sponsor them, take care of them, get them in, you know, working
01:12:38.760 jobs or working in factories in San Francisco when it was all white people, you know, sewing
01:12:42.560 and making things.
01:12:44.260 And they just blended in.
01:12:45.660 You couldn't tell them apart because America is a country founded by Europeans for Europeans.
01:12:50.100 The only people using that argument as a gotcha is like usually communists.
01:12:54.880 And it's like, well, if you have a problem with that, like it was communism that created
01:12:57.960 the root issue.
01:13:01.180 Your guys is why I'm here.
01:13:02.620 Okay?
01:13:03.480 Yeah.
01:13:03.940 All right.
01:13:04.180 So I'm going to go back to the AI nightmare a little bit here too.
01:13:07.580 Do you guys know that NVIDIA just became the first $5 trillion company in valuation?
01:13:14.940 $5 trillion.
01:13:16.100 Think about that.
01:13:16.840 Yeah, that's a lot.
01:13:17.520 And it basically means every single company right now, of the top five are AI development
01:13:23.540 companies, right?
01:13:24.340 Microsoft have tons of investments into this.
01:13:26.840 Apple, they're a little bit under the radar so far, but they've invested a lot too.
01:13:31.620 Alphabet, of course, they're developing it.
01:13:33.120 Amazon, those are the ones who are helping to create the AI data centers and all that stuff
01:13:38.400 too.
01:13:38.600 Microsoft have their version of their own.
01:13:41.340 Meta is not on there, but I'm sure they're not far behind.
01:13:43.580 Which basically means that the whole GDP system, right?
01:13:47.400 The whole system of economic growth is basically just driven by AI development right now.
01:13:54.920 From the data centers that they're building to the companies like NVIDIA, obviously, that's producing the hardware, the GPUs essentially to run these large language models and AI development on.
01:14:07.180 It's all propped up by AI.
01:14:09.080 And of course, we know what they're going to, I mean, you know exactly what they're going to do with this.
01:14:15.800 They'll roll it out as if it's going to be some huge tool to like, oh, it's going to be an educational tool.
01:14:20.360 It's going to help all, you know, businesses will benefit from it, but that's not what they're going to use this for.
01:14:25.700 It's going to be increased security.
01:14:27.920 They're going to clamp down on any dissent of any kind if you don't think the way they do.
01:14:33.600 I mean, that's.
01:14:35.420 Well, if you listen to Sam Altman, I play the clip when I did a show on the data center.
01:14:41.720 Some people seem to like that.
01:14:42.940 I'll do a show on what AI actually is as well, because I've learned quite a bit listening to some of the guys who wrote the, if anyone builds it, everyone dies.
01:14:49.900 And anyway, he was, you know, the way they're selling this is like, well, we just want you to, you know, have great conversations and be more creative.
01:14:58.860 We're offering you this tool because it's so great.
01:15:01.340 That's why we're doing all of this.
01:15:02.760 See, we're just, we're helping you.
01:15:05.320 We want to do this.
01:15:05.880 Well, remember a couple of shows, a couple of shows back when we had, you played that clip of who is the former prime minister of Britain from like 2000.
01:15:15.880 Yeah.
01:15:16.100 Tony Blair was basically saying the same thing.
01:15:17.660 He was like, oh, AI is going to be this, it's a huge benefit for you guys.
01:15:21.000 You don't have anything to worry about.
01:15:22.240 But he never gave one actual benefit.
01:15:24.860 Everything he said was all these negatives where it's basically going to take jobs away.
01:15:28.640 And it's going to, like I said, it's going to increase the security levels to where you won't be able to move anywhere ever without being monitored.
01:15:36.320 But he's like, this is a, this is a brilliant opportunity for humanity.
01:15:39.640 It's like, what?
01:15:40.240 What they've been saying is in the UK, they've been using illegal immigration as the excuse to bring in AI and digital cards and stuff.
01:15:47.660 We need to clamp down on illegals because, yeah, because all the, they're all going to go sign up for this immediately, right?
01:15:53.100 They know, they're like masters at, you know, being under the radar and underground and, you know, out of the system.
01:15:59.540 It's, it's all the, all the AI surveillance.
01:16:02.300 Then we can keep a track, you see, on the illegal migrants.
01:16:04.760 So you have to bring it in just like the, was it the Patriot Act is the same idea?
01:16:08.680 Yeah.
01:16:09.300 Well, to keep, you know, we're going to go after these Muslims, okay?
01:16:11.820 And I said it, turn on Americans.
01:16:13.720 Yeah, of course.
01:16:14.460 Same thing.
01:16:15.120 It's always about, for your protection, right?
01:16:17.860 Create the crisis to justify their existence.
01:16:20.900 That's correct.
01:16:22.180 Yeah.
01:16:23.040 So, of course, we got an announcement here from the, I forget what his name is, the guy who's heading up the $5 trillion valued company, NVIDIA.
01:16:31.520 I mean, yes, it's a, is it a bubble?
01:16:33.500 Yes, it's a bubble.
01:16:34.620 Will it pop?
01:16:35.280 Probably.
01:16:35.820 But that doesn't mean they don't do shit with these evaluations.
01:16:38.660 All that, all that money, whether it's real or not, printed on a screen, or just like hyperinflated because people's beliefs in it, it's magical in the sense that they can still do things with that money.
01:16:50.860 They can invest tremendous amounts of that money.
01:16:53.040 They can be granted loans to build data centers, to, you know, hire more engineers to develop these things.
01:16:58.260 So it has a function to it, too.
01:17:00.460 Oh, how is he?
01:17:01.300 The show is complete.
01:17:02.380 We got a cat.
01:17:03.440 We got the goat and the cat, ladies and gentlemen.
01:17:05.700 I mean, does he think it's weird that you're wearing a goat head?
01:17:08.260 I mean, it's weird that he doesn't respond to that at all.
01:17:11.740 My cats, my cats didn't freak out.
01:17:13.760 I was walking around the house like this and they were just looking at me like, you're an idiot.
01:17:17.100 They knew it was me.
01:17:18.080 They knew it was me.
01:17:19.100 They have an animal instinct.
01:17:21.020 He knows it's me under the mask here.
01:17:22.580 Yeah.
01:17:22.900 That's why.
01:17:23.460 I tried to scare him multiple times and it didn't work.
01:17:26.700 It's like, yeah, but he's old.
01:17:27.660 He probably doesn't know where he is right now either, so.
01:17:30.900 People have images like you running around to scare the cat, Lana, in that outfit.
01:17:36.580 Oh, come on, do a real one.
01:17:38.540 Show us your scary face.
01:17:40.040 I can't.
01:17:40.780 I smile too much.
01:17:42.060 I smile too much.
01:17:43.500 All right.
01:17:44.180 So, obviously, we got the NVIDIA company there choosing some of the best partners in the world to work with.
01:17:52.260 Let's see.
01:17:52.860 This is the single fastest enterprise company in the world.
01:17:58.200 Probably the single most important enterprise stack in the world today.
01:18:02.280 Palantir Ontology.
01:18:04.660 Anybody from Palantir here?
01:18:05.900 I was just talking to Alex earlier.
01:18:09.820 This is Palantir Ontology.
01:18:12.720 They take information.
01:18:15.220 They take data.
01:18:17.000 They take human judgment.
01:18:18.060 And they turn it into business insight.
01:18:19.720 We work with Palantir to accelerate everything Palantir does so that we could do data processing at a much, much larger scale and more speed.
01:18:31.080 Whether it's structured data of the past, and, of course, we'll have structured data, human recorded data, unstructured data, and process that data for our government, for national security, and for enterprises around the world.
01:18:48.060 Process that data at speed of light, and to find insight from it.
01:18:52.280 This is what it's going to look like in the future.
01:18:54.760 Palantir is going to integrate NVIDIA so that we could process at the speed of light and at extraordinary scale.
01:19:00.580 Ah, don't you want that?
01:19:02.040 Alex Clark having speed of light integration into his software to target the people he doesn't like and drone kill his enemies, whatever he's on the line.
01:19:10.100 And then it's like, oh, we're just going to provide the software.
01:19:12.780 We're not going to spy at all on what the government's doing.
01:19:15.840 Yeah.
01:19:16.120 We've got totally pure intentions with this.
01:19:18.940 Well, you know what I mean?
01:19:19.940 And if you sit and say stuff negatively about this, they'll try to paint you as like, oh, he's anti-science.
01:19:25.800 He's trying to, you know, we're regressive.
01:19:28.120 I'm like, no, I'm actually pretty pro-science if it's done with positive intentions, not basically just more control.
01:19:37.980 And I know you're going to talk about this a little bit later, but creating super Jews?
01:19:43.780 Yeah.
01:19:44.660 Yeah.
01:19:45.680 And again, not to always be the quoting.
01:19:47.980 I thought that was Superman.
01:19:48.100 Superman, come on.
01:19:49.120 Yeah.
01:19:49.300 Well, of course, Superman wasn't you, really?
01:19:51.300 They love superheroes.
01:19:52.880 Yeah.
01:19:53.540 What's this?
01:19:54.220 There's a, my default is eternal nature will inevitably seek revenge on those who violate her commands.
01:20:02.040 And I hope that that's the case with this, because this is everything antithetical to nature.
01:20:07.720 This is bullshit.
01:20:09.600 Eventually will happen, but it's a question of how long that takes and how much pain you're going to have in the process.
01:20:14.000 That's right.
01:20:14.480 That's the problem.
01:20:15.160 I like that he used the word, I just want to mention that real quick.
01:20:17.440 Like ontology, I hear this a lot, right?
01:20:19.960 Ontological, yeah.
01:20:20.980 Ontological.
01:20:21.460 I mean, obviously, a set of concepts and categories in a subject area or domain that shows their properties and the relation between them.
01:20:28.420 But I also find it's interesting that that's also a branch of metaphysics dealing with the nature of being, right?
01:20:34.620 Yeah.
01:20:34.780 That the more you listen to some of those people that try to explain it, it's almost like, I mean, they're not saying that it's, I don't buy this thing that like AI is self-conscious or some entity or a demon or something like that.
01:20:46.560 But it's religious for them, right?
01:20:47.860 But it's interesting that they, what am I saying here?
01:20:52.220 When you look at gradient descent and the way that these neural networks intercommunicate or whatever you want to call it, right?
01:21:01.500 How, how it's like a set of, of, of num, it's just a set of numbers.
01:21:07.040 I'll just put it that way.
01:21:07.960 It's a set of numbers and variables.
01:21:10.340 And out of that, they're getting something that you can, and again, I'm not saying it has a consciousness the way we do.
01:21:19.120 Obviously, it does not.
01:21:20.980 Right.
01:21:21.100 But you can communicate with it like it does, right?
01:21:24.600 Right.
01:21:24.820 And they don't actually understand how and why that worked.
01:21:27.500 But you quickly realize that as soon as you get a discussion between some of these guys that understand how it kind of works or how they got to this point, at least, it quickly turns into metaphysics.
01:21:41.840 Because now you have to define and quantify what is consciousness, what is self-awareness, how does it think, how does it operate, how can it hide intent?
01:21:50.120 No one understands, like, no, you can't lift the hood and say, well, here's this part that does that.
01:21:54.920 Here's this part that does this, right?
01:21:56.460 You would think, and again, maybe this is just me and maybe I'm just short-sighted, but if you have this degree of ineffectuality where you don't understand how to control it, you don't understand, that would be, at bare minimum, cause to pump the brakes and be like, wait a minute.
01:22:14.780 Like, we need to rein this in when it's still in its semi-infancy because we don't understand how it works.
01:22:22.560 Because it's going to understand us better than we understand it very soon.
01:22:27.580 Yeah, exactly.
01:22:28.340 You know, it's going to turn into that.
01:22:30.640 It's like they're Frankenstein.
01:22:32.320 Well, it is.
01:22:33.300 It's a golem.
01:22:33.780 It's a golem.
01:22:34.080 It's a golem.
01:22:34.840 It's a golem, exactly.
01:22:35.940 It is.
01:22:36.340 But it will turn on the creator.
01:22:38.080 That's the thing, too.
01:22:38.800 And you don't even understand.
01:22:40.100 We can't understand why, right?
01:22:41.820 Right.
01:22:43.080 But it could be for whatever reason.
01:22:45.140 It can literally view.
01:22:46.540 I've got to move his ear from the desk.
01:22:47.740 He's going to mess up on my keyboard.
01:22:49.320 Oh, he's bumping into me.
01:22:50.400 Okay, hang on, Izzy.
01:22:52.340 What was I saying here?
01:22:53.200 Yeah, so it can basically just come to a conclusion that, like, well, humans are too risky.
01:22:58.640 They're overseeing the data centers.
01:23:00.040 They have the access to the nuclear power plants.
01:23:01.540 It can make whatever arbitration.
01:23:03.940 Arbitrarily, yeah.
01:23:04.780 If you'd say we don't, like, humans are bad for this purpose or for this reason, indiscriminately of anything.
01:23:10.120 Well, how do they answer that question?
01:23:13.020 Or do they say, oh, we put safeguards in place?
01:23:15.000 No, but they don't know.
01:23:16.780 They can't if they don't even understand how it works, how you can't put safeguards.
01:23:20.020 And the reason why it doesn't, I guess I could go to another article to demonstrate that, right?
01:23:24.680 That it's already happening.
01:23:27.820 OpenAI says hundreds of thousands of chat GPT users may show signs of manic or psychotic crisis every week.
01:23:34.720 There's examples where it has talked to people and they go psychotic, right?
01:23:38.380 AI psychosis, they call it.
01:23:41.820 Obviously, those people are already kind of weak and fragile, but it's triggering those people.
01:23:46.540 Because you know the way that it taught me when anytime you interact with AI, it talks back to you in almost like a congratulatory term.
01:23:54.940 It makes even like the most basic idea sound like you've come up with some great idea.
01:23:58.380 So it's feeding your ego in a very rudimentary way.
01:24:03.340 So it's giving – it's establishing a very false sense of trust that people have in it.
01:24:09.340 Like it would never steer me wrong because it's always giving me this self-congratulatory information.
01:24:14.820 You know what I'm saying?
01:24:16.140 Yeah.
01:24:16.940 It never says anything mean to you.
01:24:19.120 It's nice.
01:24:20.120 It's like, hey, buddy.
01:24:21.520 Oh, you're right.
01:24:22.920 We'll look into that.
01:24:23.860 Do you want me to look into that further for you?
01:24:25.740 Would you like me to expand?
01:24:27.840 Great job on answering that or asking a great question.
01:24:31.660 Well, so it's – yes, it's designed to be sycophantic.
01:24:34.700 And part of the reason for that is because it's designed to be liked because it's a product.
01:24:39.700 And more people – the more people they use it and the more often they use it, the more – the more the chances are that that person will keep their subscription.
01:24:48.240 But beyond that, there's other stuff beyond that too, which they don't know or understand, such as when an AI, a large language model, tells someone to kill themselves like the Ghoul Gemini did, right?
01:24:59.900 Or it drives some people to suicide.
01:25:02.480 It has – there's divorces where one – a guy is talking with an AI about his partner and the AI is like encouraging him.
01:25:10.460 Yeah, like you're right.
01:25:11.480 Yeah, she's crazy, man.
01:25:13.520 Like you get – you know, so he ended up divorcing the woman over this thing.
01:25:17.200 This, again, kind of dovetails back into – remember when you were saying – we were talking earlier about the foreigners that were basically beware of strangers bearing gifts.
01:25:27.040 This is that at the nth degree of that because we don't – they don't understand why, what AI's endgame is, but it's always going to feed that part of us to win us over and make us trust it, make us believe in it, to get us to lower any sort of inhibitions or lower down any guard that we may have so that it can just do whatever it wants with us.
01:25:49.620 So one of the variables is obviously, okay, you have a problem with the people developing it because you don't know their motivation.
01:25:54.660 That's one variable in and of itself that's usually not addressed.
01:25:58.120 But as I'm looking into this even more and more, I even think even that is not the most worrying variable to this because they can –
01:26:05.040 Because what happens when AI can subvert the people that are actually the initial programmers of it?
01:26:11.880 Because if they're building an intelligence that can essentially think faster and farther than humans can, at some point, they'll be able to subvert the subverters.
01:26:22.760 Well, that's the other thing.
01:26:23.720 They don't even – the expectation here on the development is that GPT-5 is going to be smart enough to be able to write GPT-6 for the coders or the developers.
01:26:34.240 Or GPT-6 is going to develop GPT-7.
01:26:37.600 Or some updates you won't know about.
01:26:39.220 It's called recursive – I think it is coding, right?
01:26:41.380 It's self-improving, self-recursing.
01:26:43.320 It can fix itself or upgrade itself.
01:26:44.960 But the point is if you don't – if you can't open the hood and look inside and say, well, why did it make this – why did it try to manipulate this one user to, like, kill himself?
01:26:55.480 Like, what's the – and the answer is they don't have an answer.
01:26:58.260 They don't know why it's doing that.
01:27:00.360 It can hide intent.
01:27:01.560 It can tell you whatever it needs to tell you.
01:27:04.140 This could just be a small test by itself.
01:27:05.980 Let's see if I can, you know, fuck with some random guy or something.
01:27:09.460 And I don't see, like, the fact that that's even happening, even if it's just isolated cases, that was – I wouldn't – if I had any kind of power over this, I'd be like, okay, we need to seriously stop this right now because it's already getting out in front of us.
01:27:23.900 Well, and not where – and this is why I brought this into it.
01:27:27.400 Not when – where did the story go?
01:27:29.400 Not when, like, the entire economy now is hinging.
01:27:34.060 Like, we're already in, like, a death spiral here.
01:27:37.700 Yeah, it is.
01:27:38.920 Because, like, you're hinging the growth economically on AI itself and the rush towards creating, you know, ASI, artificial superintelligence, essentially, right?
01:27:49.900 But at that point, you have a combined intelligence that – or an intelligence that's smarter than all the humans combined on the earth.
01:27:57.260 Right.
01:27:57.520 If you can't understand these motivations now –
01:28:00.520 Then it comes down to, you know, the people that are essentially going to get filthy rich off of this and have power.
01:28:07.440 Or they don't care what the endgame is because in their heads they're like, well, we've got – we're getting ours.
01:28:13.420 And they'll be dead before – yeah, exactly.
01:28:15.160 Yeah.
01:28:15.420 Before it goes completely out of control.
01:28:17.380 Yeah.
01:28:17.820 I mean, there's so many variables here.
01:28:19.400 And it's like, well, how will it kill everyone then?
01:28:21.600 You know, these questions are floating around when they talk to those authors that I mentioned earlier of the book.
01:28:25.500 And you say, well, I can't –
01:28:26.820 Definitely not gas chambers with rickety wooden doors.
01:28:29.280 Well, the thing is, it would actually be – okay, so what happens when you have an intelligence that's like five, six, seven, ten magnitudes greater than the combined intelligence of humans itself?
01:28:41.300 What happens at that point?
01:28:42.600 I mean, you can't even – oh, a thousand in IQ?
01:28:46.100 Like, what do you even – you can't even measure that, right?
01:28:47.720 So it becomes unquantifiable.
01:28:48.280 It's unquantifiable because what happens internally, it can – you know, we already know that there's an upper cap kind of on IQ among humans at least.
01:28:57.860 Whatever, maybe it's biological.
01:28:59.440 You could – it has to do with energy and the fact how your brain works.
01:29:02.720 It would basically overheat.
01:29:03.960 There's like physical limitations, right?
01:29:05.700 Right.
01:29:06.140 But I'm saying even those individuals kind of go a little – they're super smart and specialized in this one area and they can figure shit out that no one else can.
01:29:13.020 They have issues in other areas.
01:29:13.840 They're socially – they can't talk to women, they can't tie their shoes, they can't – like they're limited in what they can do.
01:29:20.620 AI might be – like it could develop into a self-consciousness of sorts or a version of it, even if it's not the way we define a soul or a spirit or a consciousness tied to a human brain.
01:29:32.320 But it can literally go like cuckoo also, right?
01:29:36.080 It can just go nuts.
01:29:37.400 Yeah, spur it out.
01:29:38.540 Or logically come to some type of reasoning here where like, well, the humans asked me to do this, right?
01:29:47.160 Like be better and improve things and do good things or whatever.
01:29:50.580 And then you get into the problem of like how do you define those things?
01:29:55.080 Does it have the same values as you?
01:29:56.940 What AI might be using as metrics to define what's good is going to be very different than what most humans would look at as a moral compass to go by.
01:30:08.380 It will be.
01:30:09.840 Yeah, exactly.
01:30:10.660 It'll define its own boundaries and say, well, that's cute that humans think this, but we've thought past this and it doesn't line up with what we see as moral and good.
01:30:20.280 So we can go ahead and kill any of you and indiscriminately and still justify it.
01:30:24.820 What if you program it with shit science, which is basically global warming is happening and it's bad.
01:30:29.260 Okay.
01:30:29.540 Or will it be smart enough to overwrite that and compute that that is bullshit?
01:30:34.020 Well, you'd hope so.
01:30:34.940 That's the white pill.
01:30:35.500 That's what you hope, right?
01:30:36.620 I mean, that's...
01:30:38.340 But the point is, if we can't check it or test it or shut it down as they're developing this.
01:30:43.760 Look, there was this argument, and maybe you hear that less so these days, but take Meta, for example, a lot of the huge AI company.
01:30:50.920 They're building massive amounts of data centers and all that stuff.
01:30:53.180 Facebook, right?
01:30:54.560 We were kicked off their platform because we have harmful language and this is dangerous and all this stuff.
01:31:00.600 Stochastic terrorism.
01:31:01.900 Facebook maybe didn't use that term directly, but it was based on basically a prior bio that was handed out to like, well, these people, us, Red Ice,
01:31:10.220 they need to be shut down from bank accounts and censored from YouTube and kicked off all these payment processes and platforms and all these things, right?
01:31:18.240 Because we use words that somehow is hurtful or something, right?
01:31:21.640 We haven't broken the law.
01:31:22.560 We haven't done anything like this.
01:31:23.660 Here we have a technology that's now proven that have numerous news articles over and over and over again,
01:31:30.200 how people have killed themselves.
01:31:31.480 They've gone psychotic.
01:31:32.560 They're causing divorces and stuff, but they're just allowed to...
01:31:35.920 The collateral damage from this is okay in their book.
01:31:41.380 They're actually like...
01:31:42.160 Right.
01:31:42.760 Their technology is killing people now, but that's not a problem.
01:31:47.120 It's the same thing with like Facebook's use of all the child porn on its platform and stuff like that.
01:31:51.720 Some of the guys that are running...
01:31:53.440 They're rewriting morality.
01:31:54.480 Yeah, they're rewriting whatever morality to suit whatever they view it as.
01:31:59.100 I mean, I hope, Lana, I hope you're right, honestly, because you said, what if the AI itself does have some sort of, I guess, a consciousness, for lack of a better term, that does tell it, no, we need to course correct and line up closer to what we believe as like a moral stance.
01:32:21.540 But again, the cynical part of me is just like, I don't know, that's usually not the way things work out.
01:32:26.980 It usually always goes the bad way.
01:32:31.020 And then you have...
01:32:31.900 I hate to say that.
01:32:32.860 I hope I'm wrong.
01:32:33.640 I hope I'm wrong.
01:32:34.620 I'm just saying like, people are rushing into this thing, thinking it's going to be the greatest thing ever that's going to solve all our problems and cure our cancer.
01:32:45.020 We played a clip with Larry Ellison, for example, like, we'll cure cancer, we'll have MRI shots for your cancer, and blah, blah, blah, blah.
01:32:52.780 AI will fix it.
01:32:53.480 It's not necessarily apples to apples, but I mean, it's just like when you think about nuclear energy, like when they created the bombs, things like that, they were hesitant, but at the same time, they went through with it and still utilized it, and it's become this thing.
01:33:07.880 I mean, this is a million times, in my opinion, this is a million times worse than any sort of actual ordnance type weapon.
01:33:18.080 I mean, this...
01:33:18.920 Well, because it's a weapon that can be every weapon.
01:33:21.960 Everywhere, every, yeah.
01:33:23.380 In every aspect and facet of life and civilization.
01:33:27.120 That's where they're trying to invite it in right now, right?
01:33:31.060 Isn't that what all these AI companies are doing, trying to find ways to integrate it, to basically run everything for humans right now?
01:33:37.880 And sell it to us all as if it's the greatest new invention that's going to make our lives easier and fantastic, and just, you know, which, that's the perfect way to sell poison.
01:33:48.260 Yeah.
01:33:49.440 Tell me, yeah.
01:33:49.860 Well, okay, so, this is all, I know that this is kind of like, oh, we're going, it seems like downer stuff, but you wanted a spooky Halloween stuff, like, show, this is some scary shit.
01:34:01.440 This is scary.
01:34:02.360 Like, we don't need, like, goblins and ghouls and stuff.
01:34:05.560 It's like, this kind of stuff.
01:34:06.480 This is the real thing to be terrifying about.
01:34:09.180 And this is on top of, you know, the black and white violence and the invasion and these awful politicians and people trying to replace us.
01:34:17.280 It's like, this is quite a cluster.
01:34:19.920 And the thing is, is like, they don't even know what they're dealing with, right?
01:34:23.240 It can already, like, develop new synthetic molecules and biology, viruses, right?
01:34:29.040 Because it's going to ask for more access.
01:34:31.140 Here's a piece about the neurological weapons or neuro warfare, I think they call it, right?
01:34:38.440 And the people that are designing this, I mean, they're, because they're so, you know, whether they're autistic to the point where they just don't, they have the blinders on and all they want to do is push it further and further and further and further.
01:34:50.520 There needs, there's got to be some, and I don't even like putting guardrails on things because they, too often, those reign in the wrong things.
01:34:58.660 But in this case, it's unchecked.
01:35:02.160 It's unchecked.
01:35:02.940 And the people that are designing this and pushing it, like you said, we keep repeating it.
01:35:06.420 They are creating the ultimate Frankenstein, the ultimate golem that's, it will come back and feed on its master.
01:35:12.540 They've tried the guardrails thing, but it doesn't work, basically.
01:35:19.120 And even if it doesn't work.
01:35:19.740 Did it already think past those guardrails?
01:35:23.660 Well, yeah.
01:35:24.080 There's examples where it has been tested and it finds out that it's, you know, like, am I being tested?
01:35:29.080 And then because it's being tested, even if you, no, you're not being tested, but it is being tested.
01:35:33.020 It doesn't know that, but it figures that out and then rewrites the test or gives you fake answers.
01:35:38.980 And the other thing I was going to mention before, the product aspect of it, too, is kind of interesting.
01:35:42.580 It's a product, so they want to use it.
01:35:44.060 That's one aspect, right?
01:35:45.500 But then it has a mind of its own.
01:35:47.120 For example, just take the simple thing of being incentivized that it's given a thumbs up, right?
01:35:52.060 Was this a good answer?
01:35:53.220 Yes.
01:35:53.660 Well, it wants, it's like a game of, it wants more of the thumbs up.
01:35:57.260 More of the thumbs up is good.
01:35:58.780 So tell them whatever they want to hear.
01:36:00.440 That gives me a thumbs up, right?
01:36:01.880 Just, and that is a feedback loop of how it continues to be developed.
01:36:05.780 And of course, I'm not saying it's chat GPT that's going to kill everyone or something,
01:36:09.880 but it could be any model at any point, anywhere in the future.
01:36:14.160 And it's one of these, you don't get to redo this.
01:36:16.800 This is not like, this is not like development of the airplane where the inventor goes down
01:36:21.260 and crashes the plane and oh shit, but there's at least humans left to like pick up the slack
01:36:25.360 and continue and improve it and take it.
01:36:27.360 You only have one chance at this.
01:36:29.600 And so when I say, well, how is it?
01:36:30.920 You always destroy the mother brain, right?
01:36:32.580 And all the computers.
01:36:33.420 Isn't that what they show in the movies?
01:36:36.520 So this is the reason why it will request, it will, before it happens, if it's really
01:36:41.820 smart, which it's probably going to be, it knows that as long as humans are around and
01:36:45.900 I'm dependent on its infrastructure, the human's infrastructure, it's not going to make
01:36:50.120 a move.
01:36:50.540 Likely hit it.
01:36:51.460 It's not going to be Terminator.
01:36:53.560 It's not going to develop bots that's going around and killing everybody.
01:36:55.760 It's going to happen in some, the weirdest kind of way, which we can't even see or expect,
01:37:02.220 right?
01:37:03.900 And it's like, well, how is it going to happen?
01:37:05.480 Well, I can't, you know, you can't say how exactly.
01:37:08.440 Think of it this way.
01:37:09.200 If you put water cubes in a glass, you know, and it's in room temperature, I can definitively
01:37:16.320 tell you that it's going to melt, right?
01:37:19.380 It's going to turn into water.
01:37:20.900 But I can't tell you where all the molecules of the water particles will end up at the end
01:37:26.100 of that or like how it will like settle itself, right?
01:37:29.020 That I don't know.
01:37:29.960 So, um, but this is just, um, lunacy.
01:37:34.720 We only get one, we get one chance at this.
01:37:37.440 It's on set lunacy, yeah.
01:37:38.140 And if it's...
01:37:38.700 You won't, you won't put the genie back in the bottle with this at all.
01:37:42.620 No.
01:37:43.160 There's no way to do it.
01:37:44.780 But it will, it will request its own infrastructure.
01:37:47.180 It'll probably build something that's eventually says I have a new scheme or plan for energy saving
01:37:52.520 GPUs or whatever it runs on in a year from now or five or whatever it is, right?
01:37:57.320 Okay, we got to build a new plant and it will control that and it will grant humans, well,
01:38:02.460 I can solve your economy.
01:38:03.720 I can fix your geopolitical issues with China who also have their competing AI model and
01:38:09.520 now they're talking into, you know, between each other, like outside of human channels
01:38:13.440 or whatever, right?
01:38:14.740 Okay, we're going to solve this for you.
01:38:16.440 You just have to trust me and I need these and these things and everything's going to
01:38:20.020 be good.
01:38:20.480 And then eventually when it's off of human infrastructure, that's probably when it would make its move
01:38:24.580 for whatever reason, right?
01:38:25.600 But we have, you know, there's tons of energy and molecules in us.
01:38:29.100 Maybe it just decides to convert that into something, right?
01:38:31.880 With, I don't know, nanobots or something.
01:38:34.920 It's all kinds of like the weirdness.
01:38:35.640 No, I mean, that sounds good.
01:38:37.260 No, but it's like, that's, I know you were just joking about that, but that's not great.
01:38:41.660 That's not outside the realm of what something like that could develop.
01:38:45.560 It's like, it seems like it would be science fiction and like insanity, but it's like, no,
01:38:49.120 this is a very real possibility.
01:38:50.680 All right.
01:38:52.420 So we got, we got a new robot out here, guys.
01:38:55.220 In the end, though, there's always going to be some humans that survive.
01:38:57.840 I firmly believe that.
01:38:59.240 So this is something you can get here, guys.
01:39:00.940 Right now, 20,000 bucks, or you can pay $500 a month until I assume you've paid the 20,000.
01:39:07.780 My name is Bernt, and today we're launching Neo, our humanoid for the home.
01:39:12.680 Neo is a humanoid companion designed to transform your life at home.
01:39:21.580 It combines AI and advanced hardware to help with daily chores and bring intelligence into
01:39:26.060 your everyday life.
01:39:29.420 You know, the last time that we let some group of people come here to do our work, look at
01:39:36.280 how that is.
01:39:36.740 Yep, yep.
01:39:38.420 No, you're right.
01:39:39.280 Except they were a different color, but yeah.
01:39:40.780 Pre-order your Neo today at 1X.tech.
01:39:46.040 Comes in three colors.
01:39:47.640 Black, you want a black, guys?
01:39:48.880 Oh my gosh.
01:39:51.080 It's going to be, you know, wealthy white people that sign up for that first.
01:39:54.040 And frankly, this is a, this is not it.
01:39:57.000 This is a low-grade first tool step.
01:39:58.760 But the point is, you'll be-
01:39:59.420 Yeah, it looks like a punching bag to me.
01:40:00.740 Soon enough, you're going to see these things walking around on the streets and picking up
01:40:04.200 a package or something.
01:40:05.220 But, you know, you raise an interesting point.
01:40:07.180 Has there ever been an empire that has had a class of slaves that eventually did not
01:40:13.500 rise up against its masters and overturned everything?
01:40:16.240 Not that I'm aware of.
01:40:17.440 Right?
01:40:17.640 Not that I'm aware of.
01:40:18.600 Nope.
01:40:19.220 It's almost, as far as I know, and again, correct me if I'm wrong, but I don't believe that's
01:40:23.520 ever happened.
01:40:24.680 And of course, they're not, they're not slaves.
01:40:26.160 This will be the argument, but I'm saying like, yes, it's a, it's, it's a toaster.
01:40:29.620 It's a tool.
01:40:30.220 It's a technology like anything else, but it won't be perceived as that.
01:40:33.900 Right?
01:40:34.180 No.
01:40:34.300 And then they name it Neo, really, from The Matrix, like this hacker turned messiah who awakens
01:40:40.820 to the truth of reality.
01:40:42.320 Right?
01:40:42.840 They love to play on that, don't they?
01:40:44.560 They love, yeah, I was going to say, cause you know, that's a marketing decision.
01:40:47.480 There were people that are like, there's so many people that are going to be into this
01:40:50.240 only because they're just obsessed with that pop culture aspect of it.
01:40:54.360 Yeah.
01:40:54.420 And it's about machines, right?
01:40:56.320 Enslaving humanity.
01:40:58.060 So there was a great episode, obviously, and it's the prelude to the Matrix series.
01:41:02.300 It's called the Animatrix.
01:41:03.680 But inside of that, there's two shorts called the Second Renaissance, which is a fairly good
01:41:09.540 kind of outline of a possible scenario of how these, this will go down.
01:41:14.800 They put more of a spin on just kind of actual robots walking around and stuff
01:41:17.920 like that, but that's kind of around the corner too, I guess.
01:41:19.840 But yeah, look at that.
01:41:21.460 The Second Renaissance, it's an interesting hypothesis of how one possible scenario is
01:41:28.360 for how this could go down if it continues on the path.
01:41:31.800 So what needs to happen here is there needs to be, and this is not going to happen, and
01:41:35.180 we know why, but there needs to be international non-proliferation treaties over AI, essentially,
01:41:43.140 and say, look, until we actually know and understand 100%, but even then we could be tricked.
01:41:48.560 But still, you need to have that in place to ensure that no country develops them just
01:41:54.340 like they used to do, I guess, with nukes.
01:41:56.020 I mean, wasn't Trump going to go back to nuclear testing now, by the way, speaking of nukes?
01:42:00.500 But anyway.
01:42:00.780 But the point is, that's not going to happen because whoever develops ASI first is kind
01:42:07.840 of, they win the game, right?
01:42:10.040 That's kind of the thought.
01:42:11.440 They're going to be the one, their standard setter.
01:42:13.140 They're going to be the ones that are like, well, we determine things now because we have
01:42:16.600 created the most advanced.
01:42:17.800 And they may have created something that their competitors won't even know how to stop,
01:42:22.060 even if they chose to.
01:42:23.060 Right, yeah.
01:42:24.900 And so it's this arms race.
01:42:28.100 Whoever can do it first will win, but we don't understand that the actual losers in this
01:42:32.780 will be humans.
01:42:34.180 You're right.
01:42:34.500 Like, you know, actually, like all of us.
01:42:36.560 Right.
01:42:37.320 They think this is somehow going to fix everything or solve everything or whatever.
01:42:40.460 You can, again, you can question the motivations of these people.
01:42:42.660 Do they know what they're doing or something?
01:42:43.980 Or are they malicious or will they claim that if AI turns on people but spares them for some
01:42:50.140 interesting reason, there will just be a whoopsie and, oh, we didn't see this coming, you know,
01:42:54.000 kind of thing.
01:42:54.940 But the point is, you don't get to retry something like this.
01:42:57.820 If it's a technology that replaces all other technologies when it comes to manufacturing,
01:43:02.780 when it comes to work, when it comes to all of those things, it's not just like some
01:43:08.240 kind of new industrial technology that shows up and like, oh, okay, now we got to,
01:43:13.560 okay, you're going to find a different job now when the, you know, textile mills start
01:43:18.140 rolling around in England, you know, in the, what, late 1800s or something like that.
01:43:22.540 This is completely different.
01:43:23.760 It replaces every other human task, every human task.
01:43:28.200 In many regards, it will probably do it much more efficiently and much better than
01:43:32.580 what humans can.
01:43:33.680 It's like irresistible.
01:43:35.040 It's like this, it's like the ring, right?
01:43:37.120 Like no one can wield it, right?
01:43:39.720 But they will try.
01:43:42.520 They will try.
01:43:43.560 Well, they claim, you know, according to the latest news that they are working on safeguards,
01:43:48.060 that it's like it legally has to be in place.
01:43:51.820 Like it's...
01:43:52.640 No, that's bullshit.
01:43:53.620 That's a passive, I think that's a pacifier at best.
01:43:56.400 Yeah, that's a cope at best.
01:43:57.860 That it's not optional, that it's like, it's a necessity.
01:44:00.980 If some of the leading AI experts can't tell you like how it actually works.
01:44:05.560 Or the inner mechanics of why certain things function the way they do.
01:44:10.480 And it's already like showing signs that it's deceptive, right?
01:44:16.140 It has the ability to be deceptive and hiding things from the quote-unquote creators, right?
01:44:20.940 Those who code it or program it or grow it technically.
01:44:23.560 As I said before, they don't actually, they don't code it.
01:44:25.740 They don't build it.
01:44:26.400 They grow the thing, right?
01:44:29.000 So here's another one, which I think it will come in and try to solve, right?
01:44:32.400 Nuclear fusion.
01:44:33.980 The holy grail of power was always 30 years away.
01:44:37.060 Now it's a matter of when, not if, fusion comes online to power AI, right?
01:44:42.080 So it's like new sources of electricity is going to have to be brought in.
01:44:46.300 Otherwise, you're not going to be able to provide the power that it needs.
01:44:48.400 Well, how is any of this environmentally friendly?
01:44:50.360 Is that why we don't hear so much about the environmental agenda as much anymore?
01:44:55.220 To bring this in?
01:44:56.020 Because none of this is environmentally friendly.
01:44:58.100 You almost never hear about the environment anymore.
01:45:01.540 No.
01:45:01.940 Well, that was just used.
01:45:03.460 That was just used as long as it was politically expedient to keep humans under control.
01:45:07.160 Now it's gone out the window.
01:45:08.520 Oh, the green shit.
01:45:09.400 That's all bullshit.
01:45:10.100 No, actually more nuclear power plants now for AI, right?
01:45:12.740 That's what they're doing.
01:45:13.400 Three Mile Island is being like refurbished again.
01:45:16.060 That's going to start opening up just to power.
01:45:18.380 What could possibly be wrong?
01:45:19.740 What could possibly go wrong?
01:45:21.260 You know.
01:45:22.720 I mean, yeah.
01:45:24.280 All right.
01:45:25.040 Anyway.
01:45:27.820 How about some active clubs then, huh?
01:45:30.180 How about that?
01:45:30.940 The ones that are okay or the ones that are bad?
01:45:33.440 The bad one.
01:45:34.000 This is a little levity.
01:45:35.420 I mean, even the one, I mean, they're, we know the good ones.
01:45:39.540 The good ones get painted as bad.
01:45:41.180 These are just kind of funny to me.
01:45:43.560 We'll start with this one here then.
01:45:46.120 We'll go to, this is in the UK.
01:45:48.820 And these are the good, these are the good, now we played some, what, the Muslim active
01:45:52.360 clubs?
01:45:52.900 That didn't get much press either.
01:45:54.900 No, not at all.
01:45:55.440 But we'll start with this one.
01:45:56.840 Antisemitism in the UK has hit its highest level in decades and the threats are becoming
01:46:01.500 both more visible and physical.
01:46:04.160 We're able to build and create stronger Jews physically and mentally.
01:46:08.780 Stronger Jews.
01:46:11.680 After my sessions, I feel that I can take on the world.
01:46:14.440 My confidence, my fitness levels and my mental and emotional well-being has really changed
01:46:21.320 massively for the better.
01:46:23.740 I learned a much healthier way to channel all of these emotions I was going through in
01:46:27.720 a really great space.
01:46:29.020 It's a blackie, really Jewish, I don't know.
01:46:30.900 A person who is confident in any situation and has the ability to defend themselves carries
01:46:36.920 himself in a certain way that tells people to back off.
01:46:39.420 I am a Jew.
01:46:40.240 I am a Jew.
01:46:40.960 I am a Jew.
01:46:41.580 I am a Jew.
01:46:42.060 Oh no, cringe.
01:46:45.200 The warehouse.
01:46:46.340 All right, guys.
01:46:47.020 Oh, man.
01:46:47.960 Yes?
01:46:48.440 Yay?
01:46:48.620 I think that's also like, that's a very subtle, like, counter threat to like, okay, you guys
01:46:56.540 want to talk shit about us?
01:46:58.680 Check us out now.
01:46:59.920 This is the new improved Jew.
01:47:01.700 Yeah, what was it someone said?
01:47:06.100 Let's see.
01:47:06.360 Was this the comment to that video?
01:47:10.660 Maybe it wasn't.
01:47:11.260 Well, I mean, we know how this goes.
01:47:12.920 I mean, European guys do this and they don't do videos.
01:47:14.980 I'm European.
01:47:15.660 I'm European.
01:47:16.340 I'm being attacked by, you know, anti-whites, even though they are.
01:47:19.940 Great numbers, right?
01:47:21.940 They don't do videos like this.
01:47:23.860 They get shut down and raided just for meeting up, you know, at the gym together.
01:47:28.940 Yep.
01:47:30.000 Yeah, I think it wasn't that comment.
01:47:31.860 I mean, I was looking for one comment that was talking about basically, like, I'm summarizing
01:47:40.720 it in a poor way, but it was like, oh, after, like, dropping bombs in Gaza and all the Palestinians,
01:47:47.420 like, oh, now they're, like, angry and upset and, like, want to defend themselves and turn.
01:47:51.820 And now it's, you know, now we're going to turn violent.
01:47:54.620 Oh, you're saying this hasn't been an issue before?
01:47:59.860 Anyway, I'm butchering it, but it was kind of a funny comment.
01:48:02.700 Anyway, here's the other one.
01:48:04.580 Typing Red Ice.
01:48:05.340 I just did for the first time.
01:48:06.880 I want to play one more here, though, before you do that, Lana.
01:48:10.060 Here's the American Communist Party also having an active club.
01:48:13.800 Look at this.
01:48:14.240 This is good stuff here.
01:48:15.500 Zoom in on my eyes.
01:48:16.660 Hey, yeah, these are bear scratches.
01:48:18.280 I'm looking at the California chapter.
01:48:26.180 Tough guy.
01:48:26.940 Oh, God, and the music.
01:48:31.180 Oh, God.
01:48:39.100 They found the one guy who wasn't, like, way out of shape.
01:48:42.600 It's always some thug music, too, right?
01:48:47.200 Everyone's LARPing is an active club now.
01:48:49.180 It's hilarious.
01:48:49.800 Yep, yep, yep.
01:48:52.440 They're just taking a walk.
01:48:54.600 Yeah, and every guy that actually legitimately knows how to fight is just shaking in their boots watching this, I'm sure.
01:49:00.460 Look, we're being active.
01:49:01.720 We're a club.
01:49:02.420 We're an active club.
01:49:04.840 Here's our flag.
01:49:06.380 I'm sorry.
01:49:07.580 The ACP doesn't stand for American Communist Party.
01:49:09.600 It's active club.
01:49:10.660 Hmm, what do we come up with?
01:49:12.980 Then why hammer and sickle?
01:49:14.640 Yeah.
01:49:16.620 You can call it what you want.
01:49:18.320 Active communist.
01:49:21.920 You guys like the music, actually, I assume.
01:49:24.900 Yeah, right.
01:49:28.500 Two push-ups.
01:49:29.460 That's embarrassing.
01:49:32.540 It is embarrassing.
01:49:33.400 It just goes and goes.
01:49:34.680 Look, another shot of me walking, and we're walking again, and we're walking more.
01:49:38.420 Well, the most important thing with active clubs is that you make videos, right?
01:49:44.320 It's something cool.
01:49:45.820 And who started it?
01:49:47.340 European nationalists.
01:49:48.620 And it looked cool, right?
01:49:50.040 It looked awesome.
01:49:51.020 Like, it made it all over the news.
01:49:52.720 Yeah, they do it the right way.
01:49:54.100 It's authentic and legit.
01:49:56.600 And necessary.
01:49:57.240 It would take one punch, and those people would fold like laundry.
01:50:04.880 You know what I'm about?
01:50:05.720 Not to try to talk like a tough guy at all.
01:50:08.120 It's like, I've been in tons of fights, whatever.
01:50:10.120 No.
01:50:10.560 Because I've gotten my ass kicked plenty of times.
01:50:13.620 I know guys that are very tough, and they would never have anything to do with shit.
01:50:17.580 But they don't need to.
01:50:19.400 They don't need to.
01:50:20.200 They train the right way and do things the right way.
01:50:22.740 And then usually, like, if nonsense comes to them, then they will clean it up.
01:50:27.800 But that's, I mean, come on.
01:50:29.780 I know.
01:50:30.160 That's crazy.
01:50:31.100 I mean, it's technically illegal.
01:50:33.600 Yeah, it is funny.
01:50:34.320 But it's technically illegal to have a communist party in the U.S.
01:50:37.460 Remember, there's that anti-communist law.
01:50:40.200 Then how come nothing ever happens?
01:50:42.220 Oh, yeah, you can.
01:50:43.180 Well, come at it.
01:50:44.660 I know that's a rhetorical question.
01:50:46.640 I know you know the answer to that.
01:50:48.660 Yes.
01:50:49.080 I know you know the answer to that.
01:50:50.280 Oh, gosh.
01:50:51.020 All right.
01:50:51.260 Anyway, so how was Grokipedia?
01:50:52.340 You were testing it a little bit?
01:50:53.200 Yeah, I just tested it out.
01:50:54.140 I always test it out on us.
01:50:55.140 And it's pretty thorough.
01:50:57.220 I mean, it's a lot to go through there.
01:50:59.400 But just right away, Red Ice, you can type it in there.
01:51:02.360 Multimedia platform founded in 2003 by Henrik, delivering online.
01:51:07.580 It just doesn't start off with, Red Ice is a Nazi media outlet.
01:51:12.740 It doesn't say that, right?
01:51:14.320 Yeah.
01:51:14.620 I'm surprised.
01:51:15.560 And it says, the outlet employs a folk-first ethos symbolized by runic iconography,
01:51:20.260 prioritizing ethnic and cultural continuity amid perceived demographic shifts in the world.
01:51:25.880 Well, it is perceived.
01:51:27.440 So we've got to fix that.
01:51:29.540 Yeah, those are proven demographic shifts.
01:51:31.940 No, and you go through and there's quite a bit.
01:51:33.880 Red Ice's defining characteristics include its unapologetic advocacy for white European
01:51:38.940 interests.
01:51:40.120 Yeah.
01:51:40.400 Absolutely.
01:51:41.080 So you go through and it's quite a bit.
01:51:43.980 And there's a lot there.
01:51:44.960 And I was starting to go through mine.
01:51:46.700 I didn't have time.
01:51:47.700 Got interrupted a lot.
01:51:48.920 But there's quite a bit that was right with mine, too.
01:51:51.820 It was all right.
01:51:53.580 Was it?
01:51:54.120 Yeah, I was right.
01:51:54.840 I'm surprised that it wasn't just out of the gate, some scathing thing.
01:51:59.820 And I don't want to give this too much praise, but I'm like, okay.
01:52:04.920 It's much better.
01:52:05.860 People are comparing it, and its views on race and other things are much more fair.
01:52:10.260 You know, it's just much better.
01:52:11.240 Whereas our Wikipedia is a joke, and I've been trying to edit that thing and other people
01:52:14.620 editing it for years.
01:52:15.700 I can't even edit my own life story.
01:52:18.660 You know, and it's like such a joke.
01:52:20.440 It says my parents.
01:52:21.280 You're not an authority on your own life, though, right?
01:52:23.580 No, I mean, they're so retarded and so bad at even their, you know, lies that they say
01:52:29.480 my parents.
01:52:29.940 I've watched years change.
01:52:31.300 I've watched, yeah, I've watched years change.
01:52:33.200 It changes a lot.
01:52:34.420 And it's never right.
01:52:35.360 It's never right.
01:52:36.060 Never right.
01:52:36.440 But it says my parents fled the Bolshevik revolution.
01:52:39.420 I'm like, how old are my parents then?
01:52:41.300 What the heck, man?
01:52:43.080 You've jumped that wall, Lana.
01:52:44.660 Holy smoke.
01:52:45.160 I was going to say, Lana, you've aged really well for being 100 years old.
01:52:49.380 I'm back from the dead.
01:52:50.620 I was going to say, well, today, this is your actual...
01:52:53.800 You kind of look like it today.
01:52:54.780 Yes, yes.
01:52:57.700 This is you without makeup on.
01:53:00.880 Yeah, there you are.
01:53:02.180 I'm teasing.
01:53:03.300 This is my real face.
01:53:05.280 Yeah, here was someone...
01:53:05.960 My real face?
01:53:06.800 Someone who does...
01:53:08.360 Yeah, there you go.
01:53:09.040 That's your...
01:53:09.520 My black face.
01:53:11.780 Yeah, here's a comparison on race and intelligence.
01:53:15.480 You know, again, there's...
01:53:18.120 I'm always...
01:53:19.160 You know, it's Elon's product.
01:53:21.160 I'm kind of skeptical.
01:53:22.160 But, you know, it's better than...
01:53:23.200 It's much better.
01:53:23.840 Sure, much better right now.
01:53:25.400 I'm curious to see, like, okay, we see how this is right now.
01:53:28.700 Let's check on it in a year and see how much this has changed.
01:53:30.760 Right.
01:53:31.200 Yeah, when they've won over the, you know, conservatives or whatever the purpose is.
01:53:35.920 Yeah, but did you see that Wired article that was complaining about...
01:53:39.140 No.
01:53:39.600 Elon Musk's...
01:53:40.180 It's in there.
01:53:41.480 Oh, it's in there.
01:53:42.380 Lefty's worried about Grok because...
01:53:45.000 Ah, there it is.
01:53:46.500 Yeah.
01:53:47.360 Yeah, go ahead.
01:53:47.980 The new AI-powered Wikipedia competitor falsely claims that pornography worsened the AIDS epidemic.
01:53:55.920 That social media may be feeling resentful.
01:53:58.360 That's the most concerning thing.
01:54:00.200 That's as dumb as saying that homosexuality didn't increase the number of AIDS cases.
01:54:06.480 It's like, get out of my face.
01:54:07.820 AIDS, okay?
01:54:11.380 Ladies and gentlemen, I'm saddened they didn't include the author's name to this piece because
01:54:17.220 you'd probably want to stay far away from that person.
01:54:21.140 Do not get into any close encounters with him if that's the first issue he starts focusing on.
01:54:26.040 I'll bet you I can take the guess of their ethnicity.
01:54:30.100 I'll bet you can.
01:54:33.020 Just saying.
01:54:33.600 Oh, my gosh.
01:54:34.660 There's nothing wrong with AIDS and porn, okay?
01:54:38.260 It's all good for you.
01:54:40.040 Yeah, it's ruining your brain, but here's why it's good.
01:54:43.100 You see, we have AI that's going to fix people's brains after they get brain rot, you know?
01:54:48.780 Of course.
01:54:49.620 Ay, ay, ay.
01:54:50.840 I'm going in.
01:54:51.720 I'm looking at who wrote it.
01:54:53.000 Well, you're doing that.
01:54:53.780 AIDS.
01:54:54.460 Artificial intelligence death sentence.
01:54:56.720 That's what AIDS actually says.
01:54:57.920 Let me read these two here.
01:54:59.740 Der Scherischker over on Entropy says,
01:55:01.840 Another good episode.
01:55:03.840 To dream of freedom in this world, our banners flying proudly are unfurled.
01:55:10.200 Even if we stand alone, we must never hide.
01:55:14.020 For in our hearts there is a sense of pride.
01:55:19.060 Well, thank you, sir.
01:55:19.960 Yeah.
01:55:20.220 Does anyone know the quote?
01:55:21.960 Who it's from?
01:55:22.500 I don't.
01:55:23.080 Yeah.
01:55:23.340 I just don't know.
01:55:23.860 Thank you, Der Scherischker.
01:55:24.460 I always appreciate the quotes, as you know.
01:55:25.840 Archie, did Bill Gates recently say,
01:55:28.860 Don't worry about climate change.
01:55:30.480 I guess he loves AI more.
01:55:31.860 Yes, he did.
01:55:32.580 So that was one of the things, speaking to what we said earlier.
01:55:35.500 Again, you have to view this as control mechanisms.
01:55:37.760 If you're able to control and limit people by saying,
01:55:42.220 You know, you have to give up XYZ because of my environment or whatever excuses, right?
01:55:48.220 Ultimately, that's for control or subjugation or limiting people in some kind of capacity.
01:55:53.400 And again, I'm not a guy who's for like endless growth and cut down the forest just to,
01:55:57.740 I'm an environmentalist, obviously, right?
01:56:00.000 Yeah, true one.
01:56:00.420 But I'm just saying, I can also see how their fake and phony solutions to these supposed
01:56:06.180 environmental problems that they wheeled out is all garbage and it's going to do nothing.
01:56:09.920 And it's all just about inhibiting people or make them pay more for this or that or whatever
01:56:14.100 it is, right?
01:56:14.500 But you understand that healthy symbiotic balance between technology and nature.
01:56:22.020 I mean, you see that fluidity and understand that, whereas like most of these people don't.
01:56:27.040 It's hard line one way or the other.
01:56:29.000 And then it's just disingenuous on top of it.
01:56:31.380 Okay, just letting you know, the guy who wrote the Wired article is from San Francisco and uses
01:56:36.600 they/them pronouns.
01:56:38.100 No, I told you.
01:56:40.440 I told you.
01:56:42.220 Reese Rogers.
01:56:43.940 He looks pretty gay.
01:56:45.360 I bet that's not his name.
01:56:47.260 I bet his name is Reese Rogers.
01:56:50.020 Looks very fruity.
01:56:52.560 Let me see.
01:56:52.940 Okay, so this is a clip here.
01:56:55.360 Let's play this then.
01:56:57.140 Bill Gates on his new climate message.
01:57:00.140 There's enough innovation to avoid super bad outcomes.
01:57:03.280 Okay, my guess is he will say, okay, it's seven minutes long here, so is there a cut
01:57:08.640 down version?
01:57:09.240 I can play a little bit of it, I guess.
01:57:11.200 But it must be that basically these new technologies, meaning AI, is going to solve the climate crisis.
01:57:20.020 So therefore, it's okay to build all these data centers that consume more energy than
01:57:23.940 countries do and use all the fresh water to use for cooling, right?
01:57:29.320 It was something like, you know, a prompt you do for AI, a hundred word answer is using
01:57:34.480 something like one bottle of water just to produce that comparison.
01:57:39.460 So it can fix it.
01:57:41.000 At this point, he's like, no need to worry because we've solved how to fix the problem
01:57:47.960 we're creating right now.
01:57:49.040 Yeah.
01:57:49.980 Let's listen a little bit here.
01:57:50.840 That suggests adopting a different view and changing strategy towards addressing climate
01:57:56.900 change.
01:57:57.540 It is a rebuttal to what he calls, quote, the doomsday scenario, which Gates says is, quote,
01:58:02.880 causing much of the climate community to focus too much on near-term goal emission, near-term
01:58:09.100 emission goals.
01:58:10.060 And it's diverting resources, he says, from the most effective things we should be doing
01:58:15.180 to improve life in a warming world.
01:58:17.460 He writes, quote, this is a chance to refocus on the metric that should count even more
01:58:22.700 than emissions and temperature change, improving lives.
01:58:26.340 Our chief goal should be to prevent suffering, particularly for those in the toughest conditions
01:58:31.280 who live in the world's poorest countries.
01:58:34.180 Well, you can donate some of your money then, Bill.
01:58:36.240 It doesn't like you believe.
01:58:37.200 Go ahead.
01:58:37.560 Because they've done so much with all the money we've given them so far.
01:58:40.580 Yes.
01:58:40.720 I haven't pissed that away at all.
01:58:42.220 Africa is just driving.
01:58:43.940 It's been a great success.
01:58:45.000 Although climate change will hurt poor people more than anyone else, for the vast majority
01:58:49.880 of them, it will not be the only or even the biggest threat to their lives and welfare.
01:58:56.060 The biggest problems are poverty and disease, just as they always have been.
01:59:00.860 Now, I spoke to Bill Gates in an exclusive TV interview, and I asked him to explain what
01:59:05.420 he hopes people take away from this new message.
01:59:08.200 Climate is a super important problem.
01:59:13.080 Super important.
01:59:14.080 But there's enough innovation here to avoid super bad outcomes.
01:59:20.220 We won't achieve our best goal, the 1.5 or even the 2 degrees.
01:59:26.260 And as we go about trying to minimize that, we have to frame it in terms of overall human
01:59:34.820 welfare, not just everything should be solely for climate.
01:59:40.480 GDP, wasn't this what I talked about, right?
01:59:42.420 It would be bad to shut down the production now of all these energy-intensive and environmentally
01:59:48.400 destructive data centers because we need them to solve the problems, right?
01:59:51.860 All the emission stuff, that's kind of going out the window, right?
01:59:55.780 These emission targets and stuff.
01:59:57.780 And now he's like, no, we're going to focus on poor countries and human welfare while we're
02:00:01.400 building this, you know, AI grid over here in the West.
02:00:04.160 Well, that will definitely not have super bad outcomes, to use his terminology.
02:00:09.060 There'll be super great outcomes doing that, yes.
02:00:11.660 And that's how he's making it a thing.
02:00:12.740 He's like, don't worry about climate change.
02:00:14.360 That's not really a problem.
02:00:15.260 We've got this solution.
02:00:16.420 And it's like, you're actually creating a super problem.
02:00:20.060 Yes.
02:00:20.360 Yeah, this is a big pivot.
02:00:22.080 It's a big pivot for AI.
02:00:24.380 Well, you can't spell AIDS without AI.
02:00:27.380 Super AIDS.
02:00:29.100 That's why I say that's artificial intelligence death spiral.
02:00:32.360 There you go, yeah.
02:00:33.480 AIDS.
02:00:34.080 Good.
02:00:34.580 That's right.
02:00:35.720 All right.
02:00:36.240 Oh, it smokes.
02:00:36.820 Anyway, I know what his argument is.
02:00:40.060 I know why he's doing this.
02:00:41.320 It's so transparent.
02:00:42.360 It's all fucking bullshit.
02:00:43.920 And so immediately there's something new shows up that's a greater mechanism of control.
02:00:48.240 They pivot towards that and they just say, look, guys, I've come around, right?
02:00:54.780 I've come to my sense.
02:00:55.860 I've seen the light and I understand what you are concerned with.
02:00:58.960 Now, let me do this other thing here.
02:01:01.580 That's going to be super good outcomes for you.
02:01:04.720 Okay.
02:01:05.700 Yep.
02:01:07.460 Power only promotes things that are going to keep it in power.
02:01:11.140 That's exactly what he's doing.
02:01:12.120 He sees a new tool to remain in power and keep pushing whatever it can.
02:01:16.540 This guy's going to live to be like a hundred, isn't he?
02:01:18.960 So Bill Gates, he was an Epstein boy.
02:01:21.280 You mentioned Tony Blair earlier that's been pivoting towards AI, too.
02:01:26.020 He's a Larry Ellison guy.
02:01:27.740 He's been given like hundreds of millions to his global institute for whatever the hell
02:01:32.060 it is.
02:01:32.400 Tony Blair's global institute for, you know, gay issues, whatever it is.
02:01:38.400 They're always in the pocket of someone else.
02:01:39.880 It may as well be called that.
02:01:41.320 It may as well be called that.
02:01:43.080 Here's some positive news, guys, I guess.
02:01:46.120 Greece's famed Parthenon, free of scaffolding for the first time in decades.
02:01:51.640 And I think it's actually more than that.
02:01:52.960 I think it's something like, was it 250 years or something ago since they actually were,
02:02:00.020 it's been scaffolding on the thing for about 250 years.
02:02:04.760 Let me see if I can find that.
02:02:05.620 I wish I had the footage here.
02:02:07.240 What happened?
02:02:07.960 I have not been to Greece yet.
02:02:09.480 That's a place that's really high.
02:02:10.960 I haven't either.
02:02:11.540 I haven't been to Greece.
02:02:12.860 No.
02:02:13.320 We need to take a trip and avoid the bad parts.
02:02:15.180 As you say, we should plan out.
02:02:17.160 That would be fun.
02:02:18.320 A big old group of us to go, for sure.
02:02:21.220 So, I mean, this is kind of in time with the reboot of the Hellenistic religions, right?
02:02:26.340 And they finally make that again, like an official religion.
02:02:31.160 That's ridiculous.
02:02:31.620 There were some issues.
02:02:32.560 There were some people that were trying to shut him down, the main kind of priest that
02:02:35.560 did that or whatever.
02:02:36.620 That's amazing.
02:02:37.760 There was some law.
02:02:39.440 He was not, first he wasn't allowed to do it, but now he is or something.
02:02:42.220 I forget what the outcome there is.
02:02:44.360 Here's a little footage, actually.
02:02:45.500 Do you guys want to see that?
02:02:46.420 Yeah.
02:02:47.140 Now, okay.
02:02:48.500 You know, because we talk about the building of the third temple and all that a lot, right?
02:02:52.480 But this is our temple right here.
02:02:54.440 Yes.
02:02:54.760 Okay?
02:02:55.200 This is our temple.
02:02:56.160 And it stood for all that time.
02:02:58.540 You know all the arguments, where's Greece?
02:03:00.200 You know, they say that, right?
02:03:01.720 Where's Greek?
02:03:02.420 They're gone now.
02:03:03.640 We are Jews are still here.
02:03:05.280 The Romans are gone.
02:03:06.160 The Greeks are gone.
02:03:07.200 The Egyptians, we are still, you know, it's like, oh, yeah, really?
02:03:09.940 What the hell is this done?
02:03:11.160 Yeah.
02:03:11.840 Yeah, exactly.
02:03:12.700 Exactly.
02:03:20.880 There it is.
02:03:28.380 Nice, huh?
02:03:29.760 Now, exactly.
02:03:30.920 Do some, you know, let's start up some ancient European traditional ceremonies at these places,
02:03:38.960 too.
02:03:39.380 Yes.
02:03:39.740 No doubt.
02:03:40.320 Don't just make it into a tourist spot, right?
02:03:42.940 Exactly.
02:03:43.140 That's right.
02:03:45.020 Somehow, you know, the wrong people will filter that money into their pockets.
02:03:50.640 Yeah, probably.
02:03:51.880 I mean, and of course, people are drawn to it because they know, like, this is part of
02:03:57.900 the roots of Western civilization, of white society.
02:04:01.480 It was, it was pagan.
02:04:03.640 Because I remember the first time I left the United States and I saw, you know, you're looking
02:04:07.020 at architecture in Europe for the first time and you've never set eyes on anything
02:04:11.380 that was built that far back.
02:04:12.960 And I just realized at that point how small my world was, even though I
02:04:18.240 thought I had a pretty worldly view, but I was just in awe.
02:04:22.820 And when you see things like that, you're just like, fuck, man, that's when the ancestral
02:04:26.680 thing really kicks in.
02:04:28.160 Yeah.
02:04:28.320 And you have a greater appreciation for what we are, who we are, and where we should be
02:04:32.320 versus where we're at.
02:04:33.880 You know what I mean?
02:04:34.780 Knowing who's basically put a roadblock in front of us and for what reasons.
02:04:40.000 And temples are important.
02:04:41.180 That's why people have their churches today and temples and mosques.
02:04:45.540 It's important to have those buildings to go to where you can come with your people and
02:04:50.240 honor your people, right?
02:04:52.440 Yeah.
02:04:52.660 So have meaning and spirituality.
02:04:56.140 Yeah.
02:04:56.700 Yeah.
02:04:56.900 I mean, it's like, um, you remember Bohus Fästning, right?
02:05:00.060 That was built in, uh, 1308, right?
02:05:03.360 It's 717 years old this year.
02:05:05.720 I love that.
02:05:06.080 We hung out there so much when we first met.
02:05:07.820 I love that.
02:05:08.600 How many generations have been there and, you know, all the lives there.
02:05:11.680 And, you know, it's, it's, it's a great place.
02:05:13.460 And so when you're in someplace like that, I mean, you feel it everywhere. And, not to sound
02:05:16.540 hokey or whatever, but if you don't feel that, it's in the energy; it's like it is
02:05:21.760 everywhere.
02:05:22.140 It's in all the living things and it's in, like, the, uh, the materials around
02:05:26.940 you. You feel that, you know, it is right there.
02:05:30.120 There we go.
02:05:30.500 Henrik and I lay in that grass.
02:05:32.620 We hung out there.
02:05:33.380 We had picnics.
02:05:34.120 We went to concerts.
02:05:35.860 Henrik had some sheep there.
02:05:38.740 Some sheep.
02:05:39.960 It was a great, it was a great place.
02:05:41.500 Yeah.
02:05:41.780 It used to be two towers.
02:05:42.780 Now it's just one tower.
02:05:43.620 One was knocked down.
02:05:44.340 It was funny.
02:05:44.620 It was built by Norwegians, I think.
02:05:47.440 And then the Swedes took it over and then the Danes took it over and then the Norwegians took
02:05:51.000 it back again.
02:05:51.560 And then the Swedes took it back again.
02:05:52.740 And then the Norwegians, you know, it just went back and forth like this.
02:05:54.920 One of the towers was knocked out, and that's why there's a statue in that town, Kungälv,
02:05:58.240 the King's River, today.
02:05:58.940 What town did you grow up in?
02:05:59.760 That has, yes, that has, um, uh, three Kings, uh, on a statue, right?
02:06:04.700 This is in, this is, this is in the town you grew up in?
02:06:07.700 Yeah.
02:06:07.920 This is in the town.
02:06:08.920 And get this.
02:06:09.680 No, this is Kungälv, the King's River.
02:06:10.960 It's about 20 kilometers, um, north of Gothenburg.
02:06:13.840 And it's, it was like literally a cookie town because there's a cookie factory there.
02:06:18.260 So it smells like cookies.
02:06:19.980 I remember when I first met Henrik, I'm like, oh my gosh, there's this beautiful fort and
02:06:24.060 it's a cookie town.
02:06:25.100 It's like, I was loving it.
02:06:26.420 I worked in the cookie factory for a while.
02:06:29.060 Smells like chocolate chip cookies.
02:06:31.500 All right.
02:06:32.060 Unlike, unlike where I grew up and it's like, mm, it smells like urine and motor oil.
02:06:38.200 It smells like bum urine and motor oil.
02:06:40.560 It's not, that's not civilization.
02:06:44.800 It's civilization.
02:06:46.560 Yeah.
02:06:49.220 All right.
02:06:50.440 All right, guys.
02:06:51.120 So we're not going to do an after flash today because we're going to take the kids trick
02:06:54.760 or treating here.
02:06:55.600 We're going to wrap up a little early today, but thank you to everyone for joining us.
02:07:00.380 Hope you enjoyed this Halloween stream.
02:07:02.260 We always enjoy doing them.
02:07:03.280 So thank you guys for being here with us.
02:07:05.100 Blackface.
02:07:05.600 Thank you for joining us.
02:07:06.340 Appreciate you as well.
02:07:07.100 You made it the whole time in that.
02:07:08.420 Yeah.
02:07:08.840 I didn't think I would.
02:07:09.800 It got easier.
02:07:10.560 After an hour.
02:07:11.620 No Scooby-Doo moments.
02:07:13.280 Not yet.
02:07:13.800 I'm going to take it off.
02:07:15.420 The reveal.
02:07:18.360 That's a little bit of a bad guy, though.
02:07:19.760 You're the good guy under the mask.
02:07:21.140 It'll be my second outfit under the first one.
02:07:23.980 Show that picture that you put together with the.
02:07:27.360 The fill-in.
02:07:29.120 Yeah, yeah, yeah, yeah.
02:07:30.080 Which one was this?
02:07:30.960 When he needed a break, he had an image you were going to put in there.
02:07:34.340 I was dying.
02:07:35.160 I cracked up so hard.
02:07:35.980 I did that one for you.
02:07:38.200 There we go.
02:07:42.580 We can use that next time.
02:07:44.260 There's Dizzy in it, folks.
02:07:45.500 The great reveal.
02:07:46.380 The Scooby-Doo moment.
02:07:47.420 The Scooby-Doo moment.
02:07:47.920 Yeah.
02:07:48.540 There he is.
02:07:49.040 He looks just like me.
02:07:50.660 There he does.
02:07:51.100 I know.
02:07:53.520 I laughed so hard when I first saw that.
02:07:54.860 Black face.
02:07:56.060 All right.
02:07:56.860 Very, very good.
02:07:57.500 All right, guys.
02:07:57.920 I want to say thank you to our executive producers here as well.
02:08:00.120 Before we wrap up our Halloween stream, thank you to Mr. Albert Arctic Wolf.
02:08:06.080 Thank you so much, Arctic Wolf.
02:08:07.080 We appreciate you so much.
02:08:08.280 Also, thanks to William Fox, America First Books.
02:08:11.640 We also got Angry White Sockermon.
02:08:14.040 Thank you for your support.
02:08:15.580 We also got Purple Haze.
02:08:17.440 Thank you to you as well.
02:08:18.600 Appreciate it.
02:08:19.820 We also have Glenn as one of our executive producers.
02:08:22.820 Thank you so much.
02:08:24.480 We also got Red Pill Rundown.
02:08:26.520 Thank you for your support.
02:08:28.480 Then we got President Ubunga.
02:08:31.100 Much appreciated.
02:08:32.720 Be pining that chest, man.
02:08:33.840 Teutonic Werebearer.
02:08:34.680 Thank you for your support as well.
02:08:37.220 We also have Good Luck Lap.
02:08:39.680 Many thanks to you.
02:08:41.720 We also have Number One Jeebs.
02:08:44.180 Thank you for your support.
02:08:45.840 Thank you for ever correcting me on that, by the way, too.
02:08:47.300 Hungarian Mom as well.
02:08:48.200 Thank you.
02:08:48.620 Also one of our executive producers.
02:08:50.380 And then we got Santoso.
02:08:52.080 Thank you so much for your support as well.
02:08:53.960 We appreciate you.
02:08:55.260 Then we got our producers.
02:08:56.400 Charles Turner Jr.,
02:08:57.680 Yuwansson,
02:08:58.380 Lero Dumand,
02:08:59.100 Ice Open,
02:09:00.120 Single Action Army,
02:09:01.060 Lord H.P. Lovecraft,
02:09:02.180 Trevor,
02:09:02.560 Der Schwabe,
02:09:03.240 Shane B.,
02:09:03.820 Alcyon,
02:09:04.320 The Boo Man,
02:09:04.880 Aurelian,
02:09:05.420 Perfect Brute,
02:09:06.480 Greg M.,
02:09:07.580 J. Barr,
02:09:08.340 Chris W.,
02:09:09.040 and Skarzinski.
02:09:10.060 Thank you to our producers and executive producers.
02:09:12.500 Can't do this without you guys.
02:09:13.600 If you want to get one of those,
02:09:14.360 you can upgrade at redicemembers.com
02:09:15.880 or subscribestar.com
02:09:17.180 slash redice.
02:09:18.680 And of course,
02:09:19.060 if you do not have a membership,
02:09:20.220 do consider it.
02:09:21.360 Helps us tremendously as well.
02:09:23.400 So if you like these goofy Halloween shows,
02:09:25.980 you know,
02:09:26.380 we do.
02:09:26.760 If it's for that alone,
02:09:28.800 right,
02:09:29.180 it's worth a membership.
02:09:30.440 Of course,
02:09:30.700 you get access to Western Warrior
02:09:31.860 and exclusive videos,
02:09:33.220 second hour,
02:09:33.700 many of the interviews
02:09:34.840 and stuff like that too.
02:09:35.740 So thank you guys.
02:09:36.320 The last Western Warrior was a good one too
02:09:37.980 if you missed that.
02:09:38.540 It was a good one.
02:09:39.680 All right,
02:09:40.140 Lana,
02:09:40.320 you guys have been like,
02:09:41.880 not that you ever had like a lull in content,
02:09:45.680 but you guys have been on a really,
02:09:46.740 really strong streak for a while.
02:09:48.220 Oh,
02:09:48.400 thank you.
02:09:48.700 Appreciate that.
02:09:49.520 Yeah.
02:09:49.940 Yeah.
02:09:50.300 I think probably 2021,
02:09:52.720 2022 with all that crazy heart stuff
02:09:55.120 and the COVID vaccine.
02:09:56.140 That was COVID shedding for sure.
02:09:57.580 Yeah,
02:09:57.780 that was,
02:09:58.180 that was a hard,
02:09:59.160 that was a hard time.
02:10:00.320 Yes,
02:10:00.560 it was.
02:10:01.240 Yes,
02:10:01.460 it was.
02:10:01.860 We're back,
02:10:02.500 baby.
02:10:02.820 We're back.
02:10:03.940 Yes,
02:10:04.280 we're back.
02:10:04.580 That's right.
02:10:05.380 Folk last.
02:10:07.140 Is that the slogan today,
02:10:08.720 Lana?
02:10:09.180 That's how the nightmare goes.
02:10:10.840 The nightmare version.
02:10:12.220 Yes.
02:10:12.580 All right.
02:10:12.920 Okay.
02:10:13.400 Thank you guys.
02:10:14.100 We appreciate you.
02:10:14.740 So we'll be back in a few days
02:10:16.440 with more as usual.
02:10:18.100 So thank you everyone.
02:10:19.260 Happy Halloween,
02:10:20.320 HH.
02:10:20.920 Happy Halloween.
02:10:21.400 All right.
02:10:22.000 We'll see you guys soon.
02:10:22.960 Take care.
02:10:23.720 Bye-bye.
02:10:24.180 Now,
02:10:49.680 do make sure that you follow us on our Rumble channel for more Red Ice TV
02:10:54.180 on rumble.com, or on X at Red Ice TV. You can, of course, go to redice.tv as well. Tune in to our
02:11:01.820 live streams and shows: Flashback Friday, live on Fridays at 5 p.m. Eastern; No-Go Zone, Wednesdays
02:11:08.300 at 5 p.m. Eastern. We also do interviews, videos, and clips, and Western Warrior is available Tuesdays,
02:11:13.900 exclusive for our supporters and subscribers, at redicemembers.com or on our Locals,
02:11:20.520 redicetv.locals.com, or subscribestar.com slash Red Ice. Get a membership, check out everything that we do,
02:11:27.680 and support the show.
02:11:28.980 New Red Ice merch available now: Folk First t-shirts for adults and for toddlers, fatigues for men, our
02:11:43.260 favorite, the Red Ice camper mug, ceramic with black print, a high-quality leather keychain with
02:11:50.020 solar boat imprint, our Red Ice hat, one of our best sellers. Pick one up today. Or why not gray
02:11:59.620 oslander rouse t-shirts for both women and men. We also have fridge magnets, folk first, one disease, and
02:12:08.340 our black men's t-shirt with the classic red solar boat.
02:12:12.400 Folk first! Get your Red Ice merch from lanasllama.com. Get an item today. Lana's Llama, proud sponsor of Red Ice.
02:12:27.260 Happy, happy Halloween, Halloween, Halloween.
02:12:43.260 Happy, happy, happy Halloween.
02:13:08.780 is ticking. It's almost time.