Flashallo Weenday - White Power Séance featuring Skeletor, Goathead & Blackface
Episode Stats
Length
2 hours and 13 minutes
Words per Minute
182.70056
Hate Speech Sentences
112
Summary
Tonight we re channeling all the chugs and the chads energy to put a hex on the anti-white pieces of shit that are messing up our all-white rituals. To race the goats from the dead, bring out your skeletons, put on your black face. It s Halloween 2025, starting now.
Transcript
00:04:44.100
This broadcast will start with a seance to tap into the undead mind of Stanley Kubrick to find out what the hell he meant by that ice-wide-shut Jeffrey Epstein ballroom ritual scene.
00:05:02.080
I'm sure it's something about homosexuals. Tonight we're channeling all the chugs and the chads energy to put a hex on the anti-white pieces of shit that are messing up our all-white rituals.
00:05:19.220
To race the goats from the dead, bring out your skeletons, put on your black face. It's Halloween 2025, starting now.
00:05:32.080
All right, I think that's enough of that. Is that enough of that? What do you think?
00:05:49.760
Holy smokes. Where are we? Here we are. Welcome to Halloween 2025.
00:05:58.380
Fuck, you failed there. The intro didn't... Anyway, it doesn't matter. What are you going to do? It worked out. How are you doing, Mr. Blackface?
00:06:12.880
Doing great. Hopefully now I can get some benefits since I'm a person of color.
00:06:19.180
That white shirt, though, that doesn't trick you.
00:06:22.720
Mm, come on. You're supposed to be pulling for me, man.
00:06:29.100
So how are you guys doing out there? I hope you're doing well. Thank you for joining us. Hopefully the mask won't... Hopefully you can hear me through this.
00:06:44.140
Both of us. You'll see both of us just start to nod off and then it's all you know.
00:06:50.660
Yeah, no pressure. Well, we got some horror stories for today, don't we?
00:06:54.360
That's pretty much every Friday we have some of those.
00:06:59.280
Maybe it's not that much different of a show, huh?
00:07:01.000
Yeah, we'll see what happens. I have a hard time seeing through. Hopefully I can, like, find my buttons here.
00:07:14.040
Yes. That's the horror show we currently live in.
00:07:32.000
Shout out to Kent, by the way, because he complained last time when I said Samhain.
00:07:45.900
Yes. But, you know, at the same time, what was it?
00:07:51.940
It was like all kinds of people that had this, you know, little celebration around this time
00:07:57.700
I think the scariest part about this is that I have an AI overview up.
00:08:03.280
So whatever you search for now, that's all you get.
00:08:10.180
Admit it can be a little handy when you don't want to read some, you know, super long, Jew-y
00:08:19.840
Except for you know it's the Jews behind it that are editing it and giving you whatever
00:08:27.700
And we'll mention, we'll talk about Grokipedia later, which is a whole other new world.
00:08:34.380
I had to go check my stuff immediately, of course.
00:08:41.320
But it did not start out with the, you know, calling me all the usual names.
00:08:54.700
It's like, I'm sorry, I can't look this up because it's hateful.
00:09:05.040
I'll sit and try to reword it and then tie it in knots and back it into a corner until
00:09:17.320
Should we show the little arrest claim from A.L.A.?
00:09:21.460
Yeah, so I was like, eh, what the hell, sitting around one night.
00:09:25.740
I punch in, you know, like, who's Lana Locktuff and Palmgren and Red Ice and all this.
00:09:35.420
It said that he was arrested and that I was a single mom taking care of the kids because
00:09:48.960
And I was like, why would you say they were arrested?
00:09:52.040
I thought I took all the screenshots, but I guess I couldn't find them.
00:09:54.660
The incorrect info in the previous response, including the false claims about arrest and
00:10:00.440
imprisonment for Lana Locktuff and Henry Palmgren, was due to a phenomenon known as AI hallucination,
00:10:07.920
In this case, the AI generated information that was not based on any factual data.
00:10:24.320
And did I just tap into some alt universe or something?
00:10:29.520
I mean, look, it can just claim that it's a hallucination if you call it out now.
00:10:33.280
But imagine the day when it has the robots to arrest you, put you in jail.
00:10:37.420
And then all the nude articles, nude articles, all the news articles.
00:10:43.640
Well, the nude articles, the nude articles, you know, those are video artists.
00:10:59.540
The news article, at least, can just, they can all say this person was arrested for these
00:11:07.140
And what I'm saying is there's no way to like change that now.
00:11:09.900
They can establish, it can establish past before present has been set.
00:11:15.200
So I had a, I continued conversation with it and it was kind of like, oopsie, I just constructed
00:11:19.840
things from, you know, other European nationalists that were arrested for, you know, hate speech
00:11:25.120
or whatever and just, you know, concocted a story about red ice and Henrik and Lana.
00:11:36.060
Did it wreck, did it at least like listen to you and rectify the story or is it still
00:11:41.760
And then I asked Jim and I again today, I was like trying to bring it up again and it
00:11:46.360
was like, oh, we've corrected this past, you know, claim that they were arrested or
00:11:58.380
So enter your names in there and see what, see what comes out.
00:12:05.800
I think it's, so they can use that as the excuse if you call it out.
00:12:11.040
And isn't, isn't this why, by the way, too, is that why they called, remember 2001 Space
00:12:18.840
For hallucination because he's like, he thinks the humans are going to kill him.
00:12:28.200
Looking, staring down at your future, ladies and gentlemen.
00:12:39.280
And then it was like, computing, you know, and it's like, oh, you're right.
00:12:42.800
They've been, you know, doing this show for years.
00:12:49.180
And, you know, that's just, that's the one instance that you happen to catch.
00:12:52.740
Because if that's happened, if that's one little glitch, how many, how many AI hallucinations
00:13:02.080
So the only thing I will use it for is for, you know, summarizing something real quickly.
00:13:06.980
If I don't have time to read it or like, okay, I just started doing that.
00:13:15.380
I mean, I take pride in summarizing things myself.
00:13:20.940
It's an art form being to condense information.
00:13:26.380
I was meaning to talk about this later, but okay, we'll do it now because what the hell
00:13:31.540
So do you know how, do you know how AI actually works?
00:13:36.340
You know how it actually, does anyone know how it actually works?
00:13:45.140
So you think of it this, like this way, you're, what you're doing is you don't have a bunch
00:13:49.580
of coders sitting down and like typing out a bunch of stuff on a computer
00:13:53.080
they're using something called gradient descent, which is like a neural network based kind
00:14:01.940
The guy who was like the Anglo, who was like attributed to actually mapping the neural networks
00:14:08.460
And then they use machine learning to basically mimic or imitate that process.
00:14:16.500
And again, the nitty and greedy of this is the point is no one is actually viewing these
00:14:21.620
And so an AI, no one can really give a good answer as to like, why does the machine or
00:14:28.780
the computer, if you will, then coherently is able to answer you back, right?
00:14:35.440
It's like this weird, I'm not saying it's a consciousness, but it's something, right?
00:14:39.140
It's basically like a bunch of numbers and no one is sitting there actually coding it.
00:14:42.400
Think of it more as like you're growing something as opposed to building something.
00:14:47.880
You don't, you know, it's not like building a skyscraper.
00:14:55.200
And it's like, oh, we put pressure on this over here and this over here, but it's like
00:15:06.280
It's like at this early stage, they already don't have a clue how to control it.
00:15:11.960
And they're like, but don't worry, it'll be fine.
00:15:15.200
Especially when it's like a super intelligence, then we totally.
00:15:18.480
So the point is they've done some of these tests already.
00:15:20.660
And like many cases, they, it, the AI, the agent, whatever your LLM, I guess, the large
00:15:26.840
language model they're working on have discovered that it's in a test.
00:15:31.340
So it ends up lying to the coders that's there to confirm and validate what it's actually
00:15:36.560
In some cases, it changes the test to give it the right result.
00:15:40.580
It gives you all the answers that you want, right?
00:15:42.320
So I'm saying it already has the ability to like hide intent.
00:15:48.000
If I have your genetic code, you give me a raw DNA data or something like that.
00:15:52.180
Like I can't sit down and look at the A, T, G, and Cs and find a sequence and say, ah,
00:15:58.780
No one can, I wouldn't even have the time to go through all the letters, all the amino
00:16:04.700
It's the same way with these AIs that all the numbers that comes from this gradient descent
00:16:08.720
as they grow the AI, no one can look at that and say like, well, why does it do this?
00:16:13.720
What is, what is leading to this type of behavior?
00:16:16.420
So the only thing they can do is put these little like guardrails or like pressure on
00:16:20.160
certain points in order to like inhibit or, or encourage certain behavior, if that's even
00:16:25.320
the right word, but I'm telling you, it's like the, as I was looking into that, like
00:16:31.780
He's like, um, Eliezer Burakovsky or something.
00:16:36.060
Um, but he, he wrote a book called if anyone builds it, everyone dies.
00:16:44.060
If anyone, what's a horror show with that project?
00:16:48.320
So anyway, we're in a good, we're in a good path.
00:16:52.420
We don't know what it is or why it's working or how it's working.
00:17:00.880
Give it more power and, and let it build its own power plants and just take over, you
00:17:09.340
Anyway, so, um, but, okay, this is positive note.
00:17:24.540
Mammoth bones used to build a mysterious 25,000 year old site in Russia came from different
00:17:31.040
DNA and radio, uh, radio carbon dating analysis of the bones are offering new insight into
00:17:36.580
the ambitious ice age, uh, site constructed by hunter gatherers.
00:17:41.800
These like massive, uh, piles of mammoth bones that were formed in some weird type of, I assume
00:17:47.580
some ceremonial or like a ritual setting or something.
00:18:00.320
They built a circular 40 foot wide structure using bones and tusks of more than, of more
00:18:09.800
Are they speculating at all why they think it was like that?
00:18:15.280
There's no, no, no, of course not calculated to be random.
00:18:18.500
I don't mean, as you can see in the tweet there, that's a pretty big, that's a pretty big site
00:18:31.540
See, you know, this is the kind of stuff, ultimately, this is the kind of stuff, this is white people
00:18:36.720
This is, this is what I know we would all much rather be diving into.
00:18:42.200
Um, yeah, this is, uh, old hunter gatherer before the Yamnaya or the, uh, you know, the
00:18:49.620
This is like, you know, the Pontics, maybe not that exact area, but this is from that
00:18:55.100
Um, you know, Russia, Ukraine, that part of it, they have like old Kurgans.
00:18:58.820
They have a lot of these types of weird sites, but I asked, come across it kind of interesting.
00:19:03.500
Why were our ice age ancestors building things out of mammoth bones?
00:19:09.020
And where did they, uh, where did they find so many skeletons?
00:19:18.400
Could be, uh, yet another forgotten energy source.
00:19:23.780
Like when it comes to like, cause they can't figure out how the pyramids were built.
00:19:26.940
And they're just assuming that, you know, they, they clearly didn't have cranes to move
00:19:32.300
There's probably some sort of a forgotten energy source that used to once exist that
00:19:35.840
we've, you know, lost to time or it was destroyed when the, uh, museum of Alexandria
00:19:42.320
So, you know, it's the, um, the AI super intelligence.
00:19:48.760
And it's going to, you know, it's going to come back.
00:19:50.680
It's going to say it'll come back and say that Kang's did it.
00:19:58.880
Here's the positive story too, before we dive into the deep depths here.
00:20:01.900
Spanish photographer captures first, the world's first ever white Iberian links on camera.
00:20:20.400
Let's see if I can zoom in a little bit here so we can see it.
00:20:23.220
He's like, you guys, I've been trying to dodge you guys forever.
00:20:34.000
We have them around up here, but they're not as white.
00:20:56.620
I'm not sure if the article even says, actually, but it's kind of interesting.
00:21:00.960
It says, the genetic anomaly testifies to the good progress of the Lynx Pardinius conservation
00:21:09.720
plans in the two countries of the Iberian peninsulas, I guess Spain and Portugal, maybe
00:21:14.400
then, the white ghost of the Mediterranean forest.
00:21:20.820
Why is it always white things that are on the verge of extinction?
00:21:24.600
Isn't there, like, isn't there also, remember that white squirrel festival?
00:21:30.420
The white fox, the arctic hair, you know, let's go down the list.
00:21:34.860
There's the Jewish, because the Jewish lynx has been out to get him the whole time.
00:21:51.880
Well, I mean, don't they say kangaroos are potentially, or considered somewhat the giant
00:21:58.340
So, maybe it's got some kangaroo descendant has been, practices cabal.
00:22:23.120
We'll see if we can get the Albert challenge today.
00:22:37.000
Yeah, so what about those symbols you're wearing, huh?
00:22:41.680
I liked the opening this week's Western Warrior.
00:22:43.740
Quite interesting how Gen Z is longing for earlier times.
00:22:47.040
Shout out to the guy in Athens, Georgia, and hopefully he has no charges or trouble.
00:22:54.260
After seeing how the ADL is going all in with pro bono lawyers, their game is just about up.
00:23:05.860
And then you have harvest festivals at the root of that, right?
00:23:10.000
So about the changing season and as things are dying and pulling back and we're becoming more introverted
00:23:16.340
and some, you know, creepy crawly things come outside in the dirt.
00:23:21.660
I was watching this thing earlier and it said that Christians tried to co-opt the pagan rituals
00:23:29.260
so that they could forcibly convert pagans to Christianity.
00:23:36.560
I mean, eventually, obviously, it did because the Christians got there.
00:23:39.960
It wasn't just broad-based Christians, but the Catholics tried to do it.
00:23:47.740
And part of that process was to kind of adopt partially at least the dates when something
00:23:53.540
was celebrated, for example, like if it was an important time of year that was observed
00:23:58.000
and they're like, okay, we'll put our celebration kind of just on top of that.
00:24:02.700
But I mean, in many cases, they brought a ton of it in.
00:24:06.600
I think it was in May, and then they did All Saints Day on November 1st, and then All
00:24:17.260
Well, one thing it is not is a satanic day, okay?
00:24:28.240
What about, you know, we're going into the darker part of the year, right, where the veil
00:24:35.180
So that's when some of the spook comes out, and that's why people, you know...
00:24:38.860
I personally like the cuter aspects of Halloween.
00:24:42.160
I like the cute, the pumpkins and the cute ghosts and maybe some of the classic costumes,
00:24:49.560
But some of this new, like, the blood and the gore and the crazy horror movie, that
00:24:58.580
Well, I remember, and I don't know how bad it is now, but do you remember, like, I guess
00:25:03.460
maybe early 2000s when it all became, like, it was the sexy nurse, the sexy police, everything
00:25:10.280
was, like, the sexy thing, you know, like, the most innocuous thing, I don't know.
00:25:22.420
He said he was thankful for all the donors he got on his give, send, go there.
00:25:31.180
It doesn't show my Swazi, but I actually went as a Nazi to a Halloween party in Sweden in
00:25:37.300
2009 with Henrik, and he made me a very nice armband.
00:25:43.460
I mean, mine wasn't, like, an authentic German outfit, but, you know, I found something that
00:25:59.000
And he broke some, apparently broke some Jewish girl's nose by accident.
00:26:03.000
I'm not sure if it was a Jewish girl, but, yeah.
00:26:09.560
And, again, I've got, I don't really understand what his endgame hope was.
00:26:14.960
I mean, I salute him for the fact that he was, if he was pranking them, good for him, the
00:26:18.440
fact that he should be able to do that without anybody saying anything.
00:26:23.700
They were, like, people throwing punches and everything.
00:26:26.420
He was in self-defense, and I think she just took an elbow.
00:26:32.420
I don't think he was actively looking to beat up a girl.
00:26:38.020
Keep in mind, it was three or four people that were pushing him.
00:26:43.260
So, yeah, he was defending himself, obviously, right?
00:26:50.380
Cosmoc Racher says, why is my membership not processed?
00:26:53.840
The reason for that is because, well, actually, it is processed.
00:26:58.580
And the problem with that is because we are banned and censored from all the mainstream payment processors.
00:27:07.300
Maybe you used Subscribestar or Locals or something.
00:27:09.520
But, yeah, we have to manually activate your account.
00:27:13.260
Hopefully not for much longer because we've found some alternatives.
00:27:16.340
And being in Idaho is going to help with the new banking anti-censorship law in banking.
00:27:28.980
So, I'll get yours activated right after the show.
00:27:36.980
Is candy giving self-serve tonight for the Visiting Goblins?
00:27:39.920
Yesterday, I learned about a Texan serial killer called the Candyman.
00:27:58.740
Well, there actually are more black serial killers than white ones.
00:28:13.980
And so kids that have lots of that bad toxic candy left over, they could leave it out on
00:28:19.040
their front porch and the Switch Witch or the Halloween Goblin will come and take it
00:28:23.720
and exchange it for some nice prize and take the candy away.
00:28:30.100
So I think that we should also do that tradition because most of the candy that the kids get
00:28:39.480
And even if you get something good, like you're up against a tidal wave of like high
00:28:43.220
fructose corn syrup and red 40s and, you know, all the cancer.
00:28:45.880
And let me say, North Idaho is like, you know, it is boom, boom and Kid Central, right?
00:28:51.780
It's like a fun 80s movie on Halloween, you know, with the fall leaves and all these cute
00:28:58.740
People spend hundreds of dollars on candy to the point where some people are like, I can't
00:29:07.360
I mean, we got hammered one year in that one neighborhood we lived in, just hammered.
00:29:11.220
And then it was like, once the littles left, the high schoolers started cruising by, you
00:29:15.320
know, we would just leave like cauldrons of candy outside, you know, it's all wiped
00:29:24.640
So that's the benefits of living in a high trust society.
00:29:28.840
You guys are at, because if I were to leave a bucket of, I mean, I could, I would leave
00:29:32.580
like one of those little like plastic pumpkins outside.
00:29:40.080
I would assume they would take whatever else is loose on the side of the house as well,
00:29:53.560
Well, actually not with your area, but the people living in your area.
00:29:57.180
I know the general, the general trend kind of thing.
00:30:01.340
I will say though, there was one year out of the blue and it was late.
00:30:05.100
Remember, Henrik, I didn't answer the door, but some woman shows up with her black teenage
00:30:18.760
I don't know where you came from, but Washington's that way back there.
00:30:22.580
But they didn't throw any eggs though, or did any pranks.
00:30:26.260
But of course they go, where's the white people's at and cheat?
00:30:38.260
I'm surprised like if it was, if it was this type of generous candy handouts in like some
00:30:45.900
European countries, which I think is less common, right?
00:30:48.100
There's still, there's still some Halloween celebrations catching on, but it's not as, you know,
00:30:55.320
You'd see all the Muslims and shit show up if that was the case.
00:30:59.600
And if you didn't, then, then now here's your excuse.
00:31:02.160
Now I can legally burn down your house or like torture apartment or something.
00:31:06.380
Because you were too racist to give my kid candy.
00:31:11.440
Peg and Bear earlier on Rumble said, I'll have to watch the replay.
00:31:22.280
So, and Lana, you seem to know, well, you both have a deeper understanding of Halloween.
00:31:28.100
I saw this thing today and it said that there may have actually been someone, and I'm not
00:31:34.720
His name was either Jack O'Lantern or something along the line.
00:31:47.800
He had a dried turnip that he would walk around in because it was kind of gnarly looking.
00:31:53.900
And over periods of time, the turnip turned into a pumpkin.
00:31:59.080
And then they just ended up calling it Jack O'Lantern.
00:32:04.920
You know, at some point it all just kind of becomes lore and myth and stories around the
00:32:11.920
But it all kind of tells the same story and celebrates kind of the same essence, right?
00:32:16.600
I mean, I love having pumpkins and carving pumpkins and, you know, putting the lights in
00:32:23.260
Well, do you guys remember, let me pull this in here, because I'm not sure if the story
00:32:27.200
you're telling there, Mr. Blackface, is real or not.
00:32:37.560
The first black-faced man was a Jewish guy, right?
00:32:50.620
There was a band, there was like a 90s band called Spring-Heeled Jack.
00:32:54.860
An entity in English folklore of the Victorian era, the first claimed sighting of Spring-Heeled
00:33:01.920
I mean, obviously, the Halloween stuff goes way beyond that, obviously, so I'm not trying
00:33:07.060
Is that the Rolling Stones song, Jumping Jack Flash?
00:33:15.280
Later sightings were reported all over the United Kingdom and were especially
00:33:18.820
prevalent in suburban London, Midlands, and Scotland.
00:33:23.220
He could jump, I guess, so people speculated he had a bunch of springs or something on his
00:33:43.920
The turnip, it's been a lifesaver through like, you know, eras of like starvation and
00:33:53.480
They kind of, they were dovetailing it into exactly what you're talking about.
00:33:57.340
They said that this guy basically came out because he was starving and he was a vagrant
00:34:03.020
And so he just started not really terrorizing local people, but he was just a nuisance.
00:34:18.340
You know, the carrot is only orange because the Dutch, right, they had the orange royalty there.
00:34:24.700
So they started cultivating the turnip to become more and more sweet and more and more orange.
00:34:30.440
But it's supposed to be white, damn it, like the turnip.
00:34:36.180
The Swedish turnip is actually, it's basically a swede, right?
00:34:50.860
Because I know in Sweden they do a really good dish with the swede, as I call it.
00:34:57.180
So, I mean, there's some good stuff in there, you know?
00:34:59.320
Well, and probably like just given what it is too, it doesn't have, it's not a sugar bomb like an orange is too.
00:35:05.200
So you're actually getting more of the actual nutrients than, you know, because eating, you know, eating fruit's great.
00:35:10.180
But it's, you know, you can also eat so much fruit you're going to end up with type 2 diabetes.
00:35:14.440
Yeah, I wouldn't have some of that only try to eat fruit and their teeth rotted or something.
00:35:26.760
You would only see like far-right extremist racists talk like this about 10 years ago.
00:35:31.060
But now the Daily Express, I guess, how the UK does it too.
00:35:35.420
Killers and rapists flood into Britain as migrant chaos spirals out of control.
00:35:40.300
It's funny the people have been like put in jail for saying things like this.
00:35:42.980
And now they're like, meh, okay, we'll write it now.
00:35:45.440
Well, I mean, we've been saying this for over a decade, this kind of stuff.
00:35:52.880
I mean, talking about every one of the talking points that you guys started talking about
00:35:58.300
Department of Homeland Security literally was pushing propaganda to send them back.
00:36:02.760
And so then I did a response of like, you know, six, seven years ago, I got banned, right,
00:36:08.460
from PayPal and Braintree and all that for selling it.
00:36:13.020
And I was, you know, having fun quoting also the Dalai Lama and all that.
00:36:16.480
And now here we are, Department of Homeland Security saying it.
00:36:20.200
You know, like you're just lurking until they all go home, okay?
00:36:25.680
That's partly what I wonder, because it's like, I got to have to, you know, this isn't
00:36:31.220
I'm happy to see that that's where the dialogue is going.
00:36:33.820
But to the point you're making is that if they're just doing it because they know they
00:36:37.540
have to remain relevant because they can't, they can't not talk about it because the genie's
00:36:43.740
But until they actually start, you know, where the rubber meets the road, when they
00:36:47.520
actually start doing something about it, that'll be the tell.
00:36:50.100
That'll be the tell, whether it's just them just trying to keep up with the Jones or not.
00:36:53.840
So, yeah, I think it's just, I think it's just to, uh, to cover for it.
00:36:58.520
We will talk about it, otherwise someone else will, and then we won't do anything about it.
00:37:03.440
Breaking the Trump admin is restricting the number of refugees admitted to the U.S.
00:37:07.340
to 7,500, please be true, and they will be mostly white South Africans.
00:37:16.940
Because all I see is a lot of nons everywhere all the time.
00:37:22.300
You see from the Department of Homeland Security.
00:37:24.500
The organization that was set up post 9-11 from the whole Patriot Act and the project for
00:37:41.840
I want it to be true, but there's still a million legally coming in every year.
00:37:48.160
President Trump is set to break the all-time record for the amount of illegal aliens deported
00:37:56.500
Like, he won't even deport as many as Biden was still, like, more than he might saw previously.
00:38:06.480
I'm trying to take a win where we can, but at the same time, I don't want to be the copium
00:38:13.640
But, you know, I know you guys do because you're on the same page.
00:38:31.260
Well, honestly, I think it's probably split the difference because I think they're probably
00:38:36.880
doing that just to sort of satiate the MAGA crowd that still want to just believe that
00:38:42.640
They're like, he's doing, look at how much he's doing.
00:38:52.320
I want it to be true, but look at the demographics.
00:38:57.880
And, you know, people say, oh, the anti-white politics is over.
00:39:08.660
And when it's okay to be a European, a white nationalist, then we have defeated anti-whiteism,
00:39:27.020
Yeah, today is the 120th anniversary of the last, I would say, other than Germany, the last
00:39:37.100
The Tsar, this was before the Bolshevik revolution that actually stuck.
00:39:46.980
Trotsky came back into the country and he raised some hell.
00:39:52.840
But no, the Tsar and the Tsar loyalists managed to squash him.
00:39:57.600
Again, this article is written by either a fellow white or a Shabbos goy, so it's going
00:40:13.400
Matthew Raphael Johnson has got a really great video that I glanced at earlier, and it said
00:40:19.300
almost exclusively, there are 690 separate pogroms across Kiev, Odessa, and other parts
00:40:28.620
And the Jewish reports will say that 10,000 Jews were hurt, 4,000 were killed, and that
00:40:34.700
it was everybody else other than the Jews that were the aggressors.
00:40:37.540
But we all know that that's basically the inversion of that.
00:40:46.980
And finally, Tsar Nicholas was like, look, I tried to concede and be cool to you guys and
00:40:51.960
let you back in the country, but now you got to go.
00:40:53.920
So, and as awesome as that was, we all know what happened, unfortunately, a few years later.
00:41:06.860
They had some Schiff and Warburg money, and they got all kinds of funding.
00:41:10.000
And so much of that, they piggybacked off of the Russo-Japanese War, which was fun.
00:41:16.820
Like, Russia was, hands down, going to win that war.
00:41:19.780
But Jewish money was funneled into Japan, which turned the tides.
00:41:24.700
Because there's no way an army as small as Japan would have had any chance against the
00:41:29.580
And, or the Russian military, but because Jews like Schiff, and there were, I can't think
00:41:34.880
of his name, but he was an American, a very high profile and rich American Jew, funneled
00:41:40.800
a ton of money into the Japanese military, turned the tide, sent the Russians back home,
00:41:46.040
and their sense of morale was, like, pretty bad at the time.
00:41:50.440
So, all of that fueled the Bolsheviks, and, you know, made them feel like they were invincible.
00:42:04.700
Woman buried Holocaust survivor mom in backyard to keep receiving her benefits, according to
00:42:11.860
Now, it's not just any benefits, and it comes with a...
00:42:14.540
It's German, some Holocaust survivor benefits, right?
00:42:19.440
Which is how many countries, how many countries, I know, obviously, I know Germany, and I believe
00:42:25.640
Poland, for whatever reason, because Poland, they were not, they were more like a German
00:42:31.980
How many countries are giving Israel money specifically as, like, recompense for the so-called Holocaust,
00:42:45.120
Which other, do you say Poland as part of that?
00:42:47.640
I think they tried to reign, because I know for, like, when they realized that they had
00:42:53.960
This is all within, like, the last 10 years, they were trying to turn around and say that
00:43:05.920
Holy shit, Germany's just, this is welfare for Jews, who just claim, you know, oh, the Nazis
00:43:12.960
did something bad to me, ka-ching, ka-ching, you know?
00:43:19.960
That deal's in place until 2045, so that's a 100-year, like, stipulation.
00:43:25.620
Germany has to pay every single Israeli that can say that, they even have, like, a distant
00:43:30.920
relative that was a so-called Holocaust survivor.
00:43:33.040
They have to pay on a stipend every single month, and they have the option, of course,
00:43:39.160
at 2045, they have the option to renew that, which you know damn well they're going to do.
00:43:44.820
It's the best thing that ever happened to them.
00:43:52.020
And meanwhile, German industry is basically in free fall, economic decline now.
00:43:59.160
There will always be money for the Holocaust survivor.
00:44:01.760
And, by the way, haven't they all died off now?
00:44:06.160
Well, again, as this woman here, then, let me read that real quick.
00:44:08.780
A woman in Karmiel, Israel, and her late partner are suspected of burying her 93-year-old Holocaust
00:44:15.720
survivor mother in their backyard to continue to receive her state and German reparation benefits,
00:44:21.580
totaling around $4,800 to $5,400, U.S. dollars, per month.
00:44:30.640
This woman probably didn't face any sort of charges.
00:44:36.240
Oh, the guy killed himself in his prison cell, huh?
00:44:39.980
So she'll probably spend some time in jail, but how much?
00:44:43.280
They must have found something else on him for that, because I'm surprised they even
00:44:48.300
I'm surprised they didn't just say, hey, you guys, don't do that.
00:44:53.240
Because, you know, because they're above the law.
00:45:06.740
Well, there's different versions of it, right, where the son's like.
00:45:13.800
Sick relationship with his mom and keeps her corpse in the house.
00:45:27.020
There are a couple more there, Lana, if you want to check those on.
00:45:32.480
That looks like time-traveling Philip from Diagalon sitting across from you.
00:45:56.660
Usually we do it within 24 hours, but today it was impossible to get to it because, you
00:46:18.400
Man in asylum accommodation pushed friend out of window and raped him while he was critically
00:46:31.580
And that's why, and you saw what I said to you later.
00:46:42.480
Between, like, between stuff like this or that guy that just got released, he killed
00:46:47.660
a six-year-old white kid in Tennessee like 10 years ago.
00:46:53.560
The guy should have been put in jail for the rest of his life or killed.
00:46:59.380
Not even two weeks later, he just got arrested again in Florida, and he's back in prison because
00:47:07.020
And he, it's like, this is, they're so, I mean, we could talk about this stuff all day
00:47:17.740
Pushed his friend out the window and then, I assume, butt-raped him while he was hurt.
00:47:31.660
They're basically like, what is it, necrophiliacs?
00:47:39.480
Like, they want them semi-disabled and bleeding while they.
00:47:48.400
Because, I mean, even on my, like, if I was trying to, like, think of the worst kind
00:47:51.880
of shit, like, I would have never thought that stupid story up.
00:47:55.760
Unless I saw that, I would have been like, this is not.
00:48:16.760
So, guys, let's do a little, let's bring these stories together, then, that we talked
00:48:21.240
Because he's going to, he's going to serve 11 years, obviously, in German prison.
00:48:25.780
The average cost per prisoner in Germany is estimated to be around 200 euros per day,
00:48:33.980
So, if he's in there for 11 years, that's 730,000 euros, which is more, is it more than
00:48:49.260
So, they've got to pay for that, and then they've got to pay for the $4,500 per month.
00:48:55.180
If she's 93, they probably pay that for, like, 10 years to that woman, at least.
00:49:03.660
Did they say when her mom died and when she buried her, how long she's been falsely collecting
00:49:09.060
I mean, if she was 93, I assume she was killed when she was 93.
00:49:16.780
So, that means that's, my gosh, that's like four decades or something probably then that
00:49:28.200
I read, I read, Richard Wagner, the composer, he called them the plastic demons of decay.
00:49:36.000
I thought that was pretty, I thought that was pretty poetically spot on.
00:49:42.700
They're completely lunatics, but they're crafty.
00:49:45.180
Well, what used to be the economic engine of Europe, Germany, is now basically then paying
00:49:50.580
for all these migrants, not only the benefits and the welfare, but also for them being in
00:49:55.680
prison, and then for the Holocaust, so-called victim, survivors, and all that stuff, too.
00:50:00.660
We're talking, I mean, what are we talking here, trillions over the years?
00:50:07.920
I mean, that's basically proving white supremacy right there, that they can pay all this, you know?
00:50:13.960
Well, the fact that they're spending the world and give everyone money.
00:50:18.940
Anybody, you know, anyone that hears these types of stories that is white, that doesn't just say,
00:50:25.300
I can't, I cannot, I can no longer live under this and just understand where we're coming from.
00:50:33.720
In 1945 to 2018, the German government paid approximately $86 billion in restitution and
00:50:40.160
compensation to Holocaust victims and their heirs.
00:50:44.460
Anyway, and I think there's more than that, by the way, because there's like other numbers
00:50:47.960
And now it's going to pass on to other generations, right?
00:50:55.020
So all our descendants, they're going to have to pay it for forever now, right?
00:50:58.220
Yeah, two years alone there, from 55 to, sorry, 53 to 55, those are 3 billion marks, which is
00:51:08.160
And that's on top of, that's on top of, what do we give them every year?
00:51:16.100
If anything, they should be paying the United States because they should be on their knees
00:51:19.780
thanking us because how many people, how many US soldiers went to Germany, helped destroy
00:51:24.680
basically all of Europe to free the Jews and all they do now is guilt us and tell us that
00:51:32.020
we're, you know, horrible Nazi anti-Semites and that we owe them everything when they should
00:51:36.560
just be thanking us every single day, you know?
00:51:46.080
I saw, remember that 12-year-old girl, that viral incident in Dundee, Scotland?
00:51:51.860
So that girl with the knife and the axe, this was in August, well, she was 100% vindicated
00:51:57.500
as we found, sorry, I have a little cough, Romanian grooming gang jailed for raping and
00:52:02.560
sexually abusing 10 women in flats across Dundee, Scotland.
00:52:16.080
When I went to Ireland last year, they were everywhere and every single cab that I was
00:52:20.720
in, the first thing they said was, watch out for these guys, they will pickpocket you
00:52:24.100
and if they get you in an alley, they will absolutely gang up on you and attack you.
00:52:33.740
And how they demonized this little 12-year-old girl.
00:52:37.020
Well, I said it when I did the video about it, like she was probably, they were probably
00:52:40.080
approaching her because they tried to groom her or her sister or they tried to talk to
00:52:46.060
I think she had like a little bit younger sister that was with her and she was finally
00:52:51.120
And, you know, she had some like short shorts on or something too at the time.
00:52:54.280
So, you know, that the man and the woman of these migrant gangs were like trying to
00:52:57.580
talk to her and like schmooze her up like, hey, come with us or do us a favor.
00:53:01.180
And then when the girl didn't bite, then it turns ugly and, you know, turns into some
00:53:05.520
Well, then they start assaulting, assaulting her.
00:53:07.520
Well, or they just start saying you're a racist.
00:53:15.300
That was the reaction right away was the cell phone app.
00:53:19.200
So that you will be the one ending up in trouble.
00:53:21.900
So there's this grooming gang posters going around in the UK now.
00:53:28.100
But it was like, never trust anyone talking to you.
00:53:30.800
If they're too friendly, it probably means they want something out of you.
00:53:37.040
Like if someone approaches you and try to offer you things or if they're too nice, you know,
00:53:42.060
and ask something in return, you like remove yourself from the situation.
00:53:46.400
Go and tell a trusted adult, you know, things like this.
00:53:59.440
It sucks because they're, you know, in a high trust society when it was a homogenous place.
00:54:05.120
You know, of course, bad things did happen, but not to this degree.
00:54:10.620
It's so weird, though, in a country that historically has been, you know, so nice.
00:54:15.100
Like the internal, as you said, high trust affairs, right?
00:54:18.160
Like if someone is too nice now, that's a problem.
00:54:22.600
Well, if they're brown, you have to wonder why.
00:54:40.060
I mean, I try not to just always like, I don't want to be like, oh, the violence guy.
00:54:53.080
It's the dangers and the risks that we all like have of becoming like forever ruthless savages, right?
00:55:01.000
In order to try to win the war that we're in right now, because it outweighs the price of our extinction, which is exactly what the cost for us is right now.
00:55:09.900
It's like if we remain tolerant, passive and polite anymore, we're done.
00:55:15.920
And I was having a really good chat with a friend of mine who came to this whole, I guess, political stance through the far left.
00:55:27.480
And he's like, man, I'm a really good hearted, kind guy, but this is bullshit.
00:55:35.260
But now he just wants to like, he's like, I'm done.
00:55:39.260
Well, that's a good instinct to protect the people that you love and protect what is precious and beautiful and kind, right?
00:55:51.160
Like I said, this is a guy where he's just not that guy.
00:56:07.420
How did all this tranny shit supposedly die down, right?
00:56:12.800
No, I was going to ask because are these women or are these?
00:56:18.340
I grew up in an era when mainstream women's magazines told girls they needed to be thinner and prettier.
00:56:22.440
Now mainstream women's magazines tell girls that men are better women than they are.
00:56:26.380
So Glamour's magazine, women of the year, the dolls.
00:56:30.540
And see, now I'm terrified because I looked at this and I usually think I got pretty good tranny radar.
00:56:43.020
I mean, some of them, like this tall blonde is definitely a dude.
00:56:49.180
And then that brown one in the back is not the one with her arms up, the brown one in the back with her hands on her hips.
00:57:00.100
I see these trannies defending them all the time.
00:57:02.160
Like how many surgeries and drugs have you done?
00:57:04.540
How do feminists who are totally pro-women, how are they not absolutely against this?
00:57:25.540
Man, you're going to complain now just about the women's issues?
00:57:30.120
This is just like, I don't know, peripheral, marginalized stuff.
00:57:33.620
It's the same type of thing where, you know, the guys that get hung up in the MGTOW thing and just like the masculinity.
00:57:42.660
But it's like you're missing where the root of this war is.
00:57:46.540
I mean, it's a white against everybody else racial component.
00:57:55.280
They're probably going to get better and better on this, on the surgeries and on the fake, whatever they need to do.
00:58:00.500
Well, it's kind of like, you notice a lot of girls that do Botox and fillers and lots of makeup.
00:58:06.720
It's kind of like this convergence with the tranny and the duck face girl, you know?
00:58:12.160
It's starting to, yeah, there's this same aura.
00:58:18.700
But as they promote that Mar-a-Lago face, as they call it, you get them matching up.
00:58:26.540
If they all have that type of work done, it's harder to tell between males and females.
00:58:33.320
So you've got to look for the real ones that don't look like that.
00:58:43.860
A, I'm so grateful that I'm married and I'm not even in this world.
00:58:49.000
Can you imagine, though, like accidentally asking one of these things out?
00:58:54.740
I mean, because imagine being a guy and one of these disingenuous things.
00:59:00.780
As you believe that they're a woman, you go out, you find out that they're not.
00:59:04.260
And then maybe you get pissed off and snap and punch it because it's a guy.
00:59:24.600
So I saw Curry Patel posted at 4.30 this morning.
00:59:29.640
He said, this morning the FBI thwarted a potential terrorist attack and arrested multiple subjects in Michigan,
00:59:35.240
probably Muslims then, were allegedly plotting a violent attack over Halloween weekend.
00:59:39.420
More details to come thanks to the men and women at FBI, blah, blah, blah, blah.
00:59:45.640
Look, we stopped ourselves from doing a false flag.
01:00:18.520
Not that I'm a fan of Muslims in any way, but you know they're going to try to stoke those flames
01:00:23.080
because so many people right now are starting to go, wait a minute, the Jews are doing what?
01:00:27.680
So they're like, turn up the flame on the Muslims.
01:00:34.740
And then they also don't like Halloween, right?
01:00:44.920
Well, there's a bunch of other stuff, obviously, but yeah.
01:00:51.500
I saw they were talking about some terrorist attack against Pumpkin Day.
01:00:55.020
Like, why do you come to a country that has Pumpkin Day?
01:01:04.260
Well, I know they stick their ass in the air five times a day.
01:01:13.300
Just like every other group, they hate us so much because we're so oppressive and we're evil racists,
01:01:19.400
They come to wherever we have lived and thrived and they take advantage of the systems that we've created
01:01:25.560
and then turn it into the bullshit that they just left.
01:01:29.360
And then they try to blame us for why things have turned into a ghetto.
01:01:37.000
But then we ask them, we're like, okay, if we're horrible, evil people,
01:01:40.220
let us go have one little state by ourselves and we will completely remove ourselves from the equation.
01:01:45.820
And then we're somehow evil racists for wanting that, too.
01:01:53.500
And to be there and keep an eye on you, you know, evil Nazi.
01:02:00.780
So, yeah, let's talk about this year's worst events.
01:02:05.000
Obviously, the first thing that comes to mind, Charlie Kirk getting killed on a live stream for the whole world to see.
01:02:12.180
I think some people would disagree with you there.
01:02:13.900
I know, but, you know, it's an assassination in real time.
01:02:19.060
There's a certain level of trauma for people obviously seeing that.
01:02:22.760
And that pretty much rocked not only America, but in the world, I think.
01:02:27.620
Do you guys think he's dead or is he on the island?
01:02:36.800
Yeah, she's proved how completely – she is low vibration IQ, man.
01:02:43.220
She's – when she was starting to blame the beekeepers and things like that, you can hear her go off like that.
01:02:54.740
And the Frankists, she kept saying, it's the Frankists.
01:03:00.900
The Chazars, whatever, you know, roundabout thing you can do.
01:03:06.120
Yeah, I mean, she's even saying how Charlie's wife is kind of in on it and knew that it was going to happen and basically like taking a payoff.
01:03:28.380
This is the one that – like, personally, this is the one that, in my opinion, is the worst one.
01:03:34.840
Not just because she's a pretty white girl, but just because – look at that.
01:03:45.720
Everyone has got that image in their head this year, and I'm pretty sure forever they're not going to forget this one.
01:03:52.180
It definitely marked a huge turning point with those, you know, two killings, but this one especially.
01:04:00.860
I saw even in Eastern European countries, they were talking about Irina and putting out, you know, candle vigils for her.
01:04:08.440
And then, of course, like, I've seen – not a ton, but I've absolutely seen people say, oh, it's fake.
01:04:22.920
And this year, a lot of people learned that because it's just been an onslaught.
01:04:29.460
But now people are paying attention, and this year on X, it just really exploded with the noticing of, you know, black on white crime, as it should.
01:04:44.660
It's like, look, you can try to dilute it and twist yourself in knots to make excuses all you want, but the data doesn't lie.
01:04:53.100
You're 200 times more likely to be accosted violently by a black guy than you are by a white guy.
01:05:11.300
I mean, as Jared Taylor pointed out, it used to be up until the 90s.
01:05:14.760
It was like a three strikes and you're out rule, where if you commit like three crimes, you're in for a long time.
01:05:23.180
I mean, obviously, the best thing to do here, guys, is to exclude race from the statistics inside of the country.
01:05:31.820
And then the second thing you do is you target those people who talk about these things, and you go after them and put them in jail.
01:05:41.300
You're going to reverse the perversion of values, right?
01:05:51.720
The worst, sorry, Skeletor, but the worst horror story here is the father who forgave the killer.
01:06:00.200
That's even worse than, I mean, the murder is bad.
01:06:02.760
I'm not trying to say that, but I'm saying that's what just took it over to this different level.
01:06:12.020
And then the parents of the black kid still basically said they didn't accept his forgiveness.
01:06:19.220
They were really rude and shitty to the father.
01:06:26.460
I mean, if that was me, I'd be in jail right now because there would have been a lot more murders.
01:06:32.300
If that was my kid that got stabbed to death, that's, I mean, they would have had to kill me.
01:06:38.560
I saw when we were just covering the trial of Lola, the French girl, the 12-year-old girl who was raped, stabbed, and murdered, and dismembered in Paris.
01:06:45.580
Her family was like, oh, we still are not against, you know, migrants and refugees and don't politicize us.
01:06:57.200
He just started drinking, and then his body just literally just shut down from grief and sorrow when he heard about all the awful things that happened to his daughter.
01:07:04.820
But the mom and the family was like, don't blame migrants.
01:07:12.100
I have to believe, I mean, yeah, he died of grief, but he probably also, part of what contributed to that was the fact that he couldn't do a damn thing about it.
01:07:19.460
He couldn't even speak to how outraged he was and how mad he was.
01:07:22.660
They were like, you say, you either bend your knee and basically say, thank you for killing my daughter, you're forgiven.
01:07:29.300
Or you're going to end up going in jail for hate crime.
01:07:36.500
I think Austin Metcalf's father doubled down on kindness and gave the family of Carmelo Anthony his life savings.
01:07:54.480
That's not, I know you were kidding, but that's, I wouldn't be surprised if you found out that that happened.
01:08:28.820
And then they take those people who are pissed off because they just had their worlds ripped out from under them.
01:08:33.560
And then they send them to all the countries that had really no interest in having them here.
01:08:41.920
Well, speaking of one variety of Middle Eastern, a brown one, did you guys see the, and I'm going to play the clip, but yes, the guy who was walking his dog, Wayne Broadhurst, murdered by an Afghan.
01:08:55.420
And, again, just randomly, like, what the hell, what's, you know, but they don't need a reason.
01:09:05.620
I mean, this is, how many more are these, is it going to take of these clips to come out like this until people, yeah, apparently.
01:09:28.140
Jeez, that's like a crime of passion right there.
01:09:33.020
Like, I've been plenty mad, but it takes a lot to, like, stab somebody one time.
01:09:52.460
But, I mean, I, how do they want, well, they want us to get pissed off and do something severe so that they can say, look at these evil people.
01:10:00.800
You know, we're sitting here talking about it, which is, they hate that, too.
01:10:07.860
But, apparently people are thanking Robert Jenrick, who's a MP in the UK, for helping to bring the Afghans to the UK.
01:10:18.300
So, he's, there's a guy who you can thank personally, I guess.
01:10:37.460
He was let out of jail early by Keith Dahmer under his earlier...
01:10:46.540
He's let out 38,000 criminals, and he's planning to let out 10,000 more.
01:11:03.560
Okay, well, why don't they do something to stop it?
01:11:09.700
Well, I mean, I'm assuming he's got very wealthy backers that are like, yeah, do this.
01:11:17.380
I love, too, how they have the literal shadow government in the UK.
01:11:21.820
I would say he's a shadow lord and he's a shadow secretary.
01:11:24.940
It just means the one who's currently not in power, obviously, after the ruling.
01:11:42.140
Anyway, make sure to thank him, the post said, for some reason.
01:11:50.480
Bill Biss says, the continuous German reparations to Israel is disgusting.
01:12:04.580
Commie bastards stole all of my, you know, heritage.
01:12:21.520
And then people would say, well, your grandparents were refugees.
01:12:24.600
Well, they'll, yeah, they worked their ass off, blended in, didn't take a cent from the
01:12:29.240
government because they didn't allow, they didn't even help them, right?
01:12:33.940
So they had their churches sponsor them, take care of them, get them in, you know, working
01:12:38.760
jobs or working in factories in San Francisco when it was all white people, you know, sewing
01:12:45.660
You couldn't tell them apart because America is a country founded by Europeans for Europeans.
01:12:50.100
The only people using that argument as a gotcha is like usually communists.
01:12:54.880
And it's like, well, if you have a problem with that, like it was communism that created
01:13:04.180
So I'm going to go back to the AI nightmare a little bit here too.
01:13:07.580
Do you guys know that NVIDIA just became the first $5 trillion company in valuation?
01:13:17.520
And it basically means every single company right now, of the top five are AI development
01:13:26.840
Apple, they're a little bit under the radar so far, but they've invested a lot too.
01:13:33.120
Amazon, those are the ones who are helping to create the AI data centers and all that stuff
01:13:41.340
Meta is not on there, but I'm sure they're not far behind.
01:13:43.580
Which basically means that the whole GDP system, right?
01:13:47.400
The whole system of economic growth is basically just driven by AI development right now.
01:13:54.920
From the data centers that they're building to the companies like NVIDIA, obviously, that's producing the hardware, the GPUs essentially to run these large language models and AI development on.
01:14:09.080
And of course, we know what they're going to, I mean, you know exactly what they're going to do with this.
01:14:15.800
They'll roll it out as if it's going to be some huge tool to like, oh, it's going to be an educational tool.
01:14:20.360
It's going to help all, you know, businesses will benefit from it, but that's not what they're going to use this for.
01:14:27.920
They're going to clamp down on any dissent of any kind if you don't think the way they do.
01:14:35.420
Well, if you listen to Sam Altman, I play the clip when I did a show on the data center.
01:14:42.940
I'll do a show on what AI actually is as well, because I've learned quite a bit listening to some of the guys who wrote the, if anyone builds it, everyone dies.
01:14:49.900
And anyway, he was, you know, the way they're selling this is like, well, we just want you to, you know, have great conversations and be more creative.
01:14:58.860
We're offering you this tool because it's so great.
01:15:05.880
Well, remember a couple of shows, a couple of shows back when we had, you played that clip of who is the former prime minister of Britain from like 2000.
01:15:16.100
Tony Blair was basically saying the same thing.
01:15:17.660
He was like, oh, AI is going to be this, it's a huge benefit for you guys.
01:15:24.860
Everything he said was all these negatives where it's basically going to take jobs away.
01:15:28.640
And it's going to, like I said, it's going to increase the security levels to where you won't be able to move anywhere ever without being monitored.
01:15:36.320
But he's like, this is a, this is a brilliant opportunity for humanity.
01:15:40.240
What they've been saying is in the UK, they've been using illegal immigration as the excuse to bring in AI and digital cards and stuff.
01:15:47.660
We need to clamp down on illegals because, yeah, because all the, they're all going to go sign up for this immediately, right?
01:15:53.100
They know, they're like masters at, you know, being under the radar and underground and, you know, out of the system.
01:16:02.300
Then we can keep a track, you see, on the illegal migrants.
01:16:04.760
So you have to bring it in just like the, was it the Patriot Act is the same idea?
01:16:09.300
Well, to keep, you know, we're going to go after these Muslims, okay?
01:16:23.040
So, of course, we got an announcement here from the, I forget what his name is, the guy who's heading up the $5 trillion valued company, NVIDIA.
01:16:35.820
But that doesn't mean they don't do shit with these evaluations.
01:16:38.660
All that, all that money, whether it's real or not, printed on a screen, or just like hyperinflated because people's beliefs in it, it's magical in the sense that they can still do things with that money.
01:16:50.860
They can invest tremendous amounts of that money.
01:16:53.040
They can be granted loans to build data centers, to, you know, hire more engineers to develop these things.
01:17:03.440
We got the goat and the cat, ladies and gentlemen.
01:17:05.700
I mean, does he think it's weird that you're wearing a goat head?
01:17:08.260
I mean, it's weird that he doesn't respond to that at all.
01:17:13.760
I was walking around the house like this and they were just looking at me like, you're an idiot.
01:17:23.460
I tried to scare him multiple times and it didn't work.
01:17:27.660
He probably doesn't know where he is right now either, so.
01:17:30.900
People have images like you running around to scare the cat, Lana, in that outfit.
01:17:44.180
So, obviously, we got the NVIDIA company there choosing some of the best partners in the world to work with.
01:17:52.860
This is the single fastest enterprise company in the world.
01:17:58.200
Probably the single most important enterprise stack in the world today.
01:18:19.720
We work with Palantir to accelerate everything Palantir does so that we could do data processing at a much, much larger scale and more speed.
01:18:31.080
Whether it's structured data of the past, and, of course, we'll have structured data, human recorded data, unstructured data, and process that data for our government, for national security, and for enterprises around the world.
01:18:48.060
Process that data at speed of light, and to find insight from it.
01:18:52.280
This is what it's going to look like in the future.
01:18:54.760
Palantir is going to integrate NVIDIA so that we could process at the speed of light and at extraordinary scale.
01:19:02.040
Alex Clark having speed of light integration into his software to target the people he doesn't like and drone kill his enemies, whatever he's on the line.
01:19:10.100
And then it's like, oh, we're just going to provide the software.
01:19:12.780
We're not going to spy at all on what the government's doing.
01:19:19.940
And if you sit and say stuff negatively about this, they'll try to paint you as like, oh, he's anti-science.
01:19:28.120
I'm like, no, I'm actually pretty pro-science if it's done with positive intentions, not basically just more control.
01:19:37.980
And I know you're going to talk about this a little bit later, but creating super Jews?
01:19:54.220
There's a, my default is eternal nature will inevitably seek revenge on those who violate her commands.
01:20:02.040
And I hope that that's the case with this, because this is everything antithetical to nature.
01:20:09.600
Eventually will happen, but it's a question of how long that takes and how much pain you're going to have in the process.
01:20:15.160
I like that he used the word, I just want to mention that real quick.
01:20:21.460
I mean, obviously, a set of concepts and categories in a subject area or domain that shows their properties and the relation between them.
01:20:28.420
But I also find it's interesting that that's also a branch of metaphysics dealing with the nature of being, right?
01:20:34.780
That the more you listen to some of those people that try to explain it, it's almost like, I mean, they're not saying that it's, I don't buy this thing that like AI is self-conscious or some entity or a demon or something like that.
01:20:47.860
But it's interesting that they, what am I saying here?
01:20:52.220
When you look at gradient descent and the way that these neural networks intercommunicate or whatever you want to call it, right?
01:21:01.500
How, how it's like a set of, of, of num, it's just a set of numbers.
01:21:10.340
And out of that, they're getting something that you can, and again, I'm not saying it has a consciousness the way we do.
01:21:21.100
But you can communicate with it like it does, right?
01:21:24.820
And they don't actually understand how and why that worked.
01:21:27.500
But you quickly realize that as soon as you get a discussion between some of these guys that understand how it kind of works or how they got to this point, at least, it quickly turns into metaphysics.
01:21:41.840
Because now you have to define and quantify what is consciousness, what is self-awareness, how does it think, how does it operate, how can it hide intent?
01:21:50.120
No one understands, like, no, you can't lift the hood and say, well, here's this part that does that.
01:21:56.460
You would think, and again, maybe this is just me and maybe I'm just short-sighted, but if you have this degree of ineffectuality where you don't understand how to control it, you don't understand, that would be, at bare minimum, cause to pump the brakes and be like, wait a minute.
01:22:14.780
Like, we need to rein this in when it's still in its semi-infancy because we don't understand how it works.
01:22:22.560
Because it's going to understand us better than we understand it very soon.
01:22:53.200
Yeah, so it can basically just come to a conclusion that, like, well, humans are too risky.
01:23:00.040
They have the access to the nuclear power plants.
01:23:04.780
If you'd say we don't, like, humans are bad for this purpose or for this reason, indiscriminately of anything.
01:23:13.020
Or do they say, oh, we put safeguards in place?
01:23:16.780
They can't if they don't even understand how it works, how you can't put safeguards.
01:23:20.020
And the reason why it doesn't, I guess I could go to another article to demonstrate that, right?
01:23:27.820
OpenAI says hundreds of thousands of chat GPT users may show signs of manic or psychotic crisis every week.
01:23:34.720
There's examples where it has talked to people and they go psychotic, right?
01:23:41.820
Obviously, those people are already kind of weak and fragile, but it's triggering those people.
01:23:46.540
Because you know the way that it taught me when anytime you interact with AI, it talks back to you in almost like a congratulatory term.
01:23:54.940
It makes even like the most basic idea sound like you've come up with some great idea.
01:23:58.380
So it's feeding your ego in a very rudimentary way.
01:24:03.340
So it's giving – it's establishing a very false sense of trust that people have in it.
01:24:09.340
Like it would never steer me wrong because it's always giving me this self-congratulatory information.
01:24:23.860
Do you want me to look into that further for you?
01:24:27.840
Great job on answering that or asking a great question.
01:24:31.660
Well, so it's – yes, it's designed to be sycophantic.
01:24:34.700
And part of the reason for that is because it's designed to be liked because it's a product.
01:24:39.700
And more people – the more people they use it and the more often they use it, the more – the more the chances are that that person will keep their subscription.
01:24:48.240
But beyond that, there's other stuff beyond that too, which they don't know or understand, such as when an AI, a large language model, tells someone to kill themselves like the Ghoul Gemini did, right?
01:25:02.480
It has – there's divorces where one – a guy is talking with an AI about his partner and the AI is like encouraging him.
01:25:13.520
Like you get – you know, so he ended up divorcing the woman over this thing.
01:25:17.200
This, again, kind of dovetails back into – remember when you were saying – we were talking earlier about the foreigners that were basically beware of strangers bearing gifts.
01:25:27.040
This is that at the nth degree of that because we don't – they don't understand why, what AI's endgame is, but it's always going to feed that part of us to win us over and make us trust it, make us believe in it, to get us to lower any sort of inhibitions or lower down any guard that we may have so that it can just do whatever it wants with us.
01:25:49.620
So one of the variables is obviously, okay, you have a problem with the people developing it because you don't know their motivation.
01:25:54.660
That's one variable in and of itself that's usually not addressed.
01:25:58.120
But as I'm looking into this even more and more, I even think even that is not the most worrying variable to this because they can –
01:26:05.040
Because what happens when AI can subvert the people that are actually the initial programmers of it?
01:26:11.880
Because if they're building an intelligence that can essentially think faster and farther than humans can, at some point, they'll be able to subvert the subverters.
01:26:23.720
They don't even – the expectation here on the development is that GPT-5 is going to be smart enough to be able to write GPT-6 for the coders or the developers.
01:26:39.220
It's called recursive – I think it is coding, right?
01:26:44.960
But the point is if you don't – if you can't open the hood and look inside and say, well, why did it make this – why did it try to manipulate this one user to, like, kill himself?
01:26:55.480
Like, what's the – and the answer is they don't have an answer.
01:27:05.980
Let's see if I can, you know, fuck with some random guy or something.
01:27:09.460
And I don't see, like, the fact that that's even happening, even if it's just isolated cases, that was – I wouldn't – if I had any kind of power over this, I'd be like, okay, we need to seriously stop this right now because it's already getting out in front of us.
01:27:23.900
Well, and not where – and this is why I brought this into it.
01:27:29.400
Not when, like, the entire economy now is hinging.
01:27:34.060
Like, we're already in, like, a death spiral here.
01:27:38.920
Because, like, you're hinging the growth economically on AI itself and the rush towards creating, you know, ASI, artificial superintelligence, essentially, right?
01:27:49.900
But at that point, you have a combined intelligence that – or an intelligence that's smarter than all the humans combined on the earth.
01:27:57.520
If you can't understand these motivations now –
01:28:00.520
Then it comes down to, you know, the people that are essentially going to get filthy rich off of this and have power.
01:28:07.440
Or they don't care what the endgame is because in their heads they're like, well, we've got – we're getting ours.
01:28:19.400
And it's like, well, how will it kill everyone then?
01:28:21.600
You know, these questions are floating around when they talk to those authors that I mentioned earlier of the book.
01:28:26.820
Definitely not gas chambers with rickety wooden doors.
01:28:29.280
Well, the thing is, it would actually be – okay, so what happens when you have an intelligence that's like five, six, seven, ten magnitudes greater than the combined intelligence of humans itself?
01:28:42.600
I mean, you can't even – oh, a thousand in IQ?
01:28:46.100
Like, what do you even – you can't even measure that, right?
01:28:48.280
It's unquantifiable because what happens internally, it can – you know, we already know that there's an upper cap kind of on IQ among humans at least.
01:28:59.440
You could – it has to do with energy and the fact how your brain works.
01:29:06.140
But I'm saying even those individuals kind of go a little – they're super smart and specialized in this one area and they can figure shit out that no one else can.
01:29:13.840
They're socially – they can't talk to women, they can't tie their shoes, they can't – like they're limited in what they can do.
01:29:20.620
AI might be – like it could develop into a self-consciousness of sorts or a version of it, even if it's not the way we define a soul or a spirit or a consciousness tied to a human brain.
01:29:32.320
But it can literally go like cuckoo also, right?
01:29:38.540
Or logically come to some type of reasoning here where like, well, the humans asked me to do this, right?
01:29:47.160
Like be better and improve things and do good things or whatever.
01:29:50.580
And then you get into the problem of like how do you define those things?
01:29:56.940
What AI might be using as metrics to define what's good is going to be very different than what most humans would look at as a moral compass to go by.
01:30:10.660
It'll define its own boundaries and say, well, that's cute that humans think this, but we've thought past this and it doesn't line up with what we see as moral and good.
01:30:20.280
So we can go ahead and kill any of you and indiscriminately and still justify it.
01:30:24.820
What if you program it with shit science, which is basically global warming is happening and it's bad.
01:30:29.540
Or will it be smart enough to overwrite that and compute that that is bullshit?
01:30:38.340
But the point is, if we can't check it or test it or shut it down as they're developing this.
01:30:43.760
Look, there was this argument, and maybe you hear that less so these days, but take Meta, for example, a lot of the huge AI company.
01:30:50.920
They're building massive amounts of data centers and all that stuff.
01:30:54.560
We were kicked off their platform because we have harmful language and this is dangerous and all this stuff.
01:31:01.900
Facebook maybe didn't use that term directly, but it was based on basically a prior bio that was handed out to like, well, these people, us, Red Ice,
01:31:10.220
they need to be shut down from bank accounts and censored from YouTube and kicked off all these payment processes and platforms and all these things, right?
01:31:18.240
Because we use words that somehow is hurtful or something, right?
01:31:23.660
Here we have a technology that's now proven that have numerous news articles over and over and over again,
01:31:32.560
They're causing divorces and stuff, but they're just allowed to...
01:31:35.920
The collateral damage from this is okay in their book.
01:31:42.760
Their technology is killing people now, but that's not a problem.
01:31:47.120
It's the same thing with like Facebook's use of all the child porn on its platform and stuff like that.
01:31:54.480
Yeah, they're rewriting whatever morality to suit whatever they view it as.
01:31:59.100
I mean, I hope, Lana, I hope you're right, honestly, because you said, what if the AI itself does have some sort of, I guess, a consciousness, for lack of a better term, that does tell it, no, we need to course correct and line up closer to what we believe as like a moral stance.
01:32:21.540
But again, the cynical part of me is just like, I don't know, that's usually not the way things work out.
01:32:34.620
I'm just saying like, people are rushing into this thing, thinking it's going to be the greatest thing ever that's going to solve all our problems and cure our cancer.
01:32:45.020
We played a clip with Larry Ellison, for example, like, we'll cure cancer, we'll have MRI shots for your cancer, and blah, blah, blah, blah.
01:32:53.480
It's not necessarily apples to apples, but I mean, it's just like when you think about nuclear energy, like when they created the bombs, things like that, they were hesitant, but at the same time, they went through with it and still utilized it, and it's become this thing.
01:33:07.880
I mean, this is a million times, in my opinion, this is a million times worse than any sort of actual ordnance type weapon.
01:33:18.920
Well, because it's a weapon that can be every weapon.
01:33:23.380
In every aspect and facet of life and civilization.
01:33:27.120
That's where they're trying to invite it in right now, right?
01:33:31.060
Isn't that what all these AI companies are doing, trying to find ways to integrate it, to basically run everything for humans right now?
01:33:37.880
And sell it to us all as if it's the greatest new invention that's going to make our lives easier and fantastic, and just, you know, which, that's the perfect way to sell poison.
01:33:49.860
Well, okay, so, this is all, I know that this is kind of like, oh, we're going, it seems like downer stuff, but you wanted a spooky Halloween stuff, like, show, this is some scary shit.
01:34:02.360
Like, we don't need, like, goblins and ghouls and stuff.
01:34:09.180
And this is on top of, you know, the black and white violence and the invasion and these awful politicians and people trying to replace us.
01:34:19.920
And the thing is, is like, they don't even know what they're dealing with, right?
01:34:23.240
It can already, like, develop new synthetic molecules and biology, viruses, right?
01:34:31.140
Here's a piece about the neurological weapons or neuro warfare, I think they call it, right?
01:34:38.440
And the people that are designing this, I mean, they're, because they're so, you know, whether they're autistic to the point where they just don't, they have the blinders on and all they want to do is push it further and further and further and further.
01:34:50.520
There needs, there's got to be some, and I don't even like putting guardrails on things because they, too often, those reign in the wrong things.
01:35:02.940
And the people that are designing this and pushing it, like you said, we keep repeating it.
01:35:06.420
They are creating the ultimate Frankenstein, the ultimate golem that's, it will come back and feed on its master.
01:35:12.540
They've tried the guardrails thing, but it doesn't work, basically.
01:35:24.080
There's examples where it has been tested and it finds out that it's, you know, like, am I being tested?
01:35:29.080
And then because it's being tested, even if you, no, you're not being tested, but it is being tested.
01:35:33.020
It doesn't know that, but it figures that out and then rewrites the test or gives you fake answers.
01:35:38.980
And the other thing I was going to mention before, the product aspect of it, too, is kind of interesting.
01:35:47.120
For example, just take the simple thing of being incentivized that it's given a thumbs up, right?
01:35:53.660
Well, it wants, it's like a game of, it wants more of the thumbs up.
01:36:01.880
Just, and that is a feedback loop of how it continues to be developed.
01:36:05.780
And of course, I'm not saying it's chat GPT that's going to kill everyone or something,
01:36:09.880
but it could be any model at any point, anywhere in the future.
01:36:14.160
And it's one of these, you don't get to redo this.
01:36:16.800
This is not like, this is not like development of the airplane where the inventor goes down
01:36:21.260
and crashes the plane and oh shit, but there's at least humans left to like pick up the slack
01:36:36.520
So this is the reason why it will request, it will, before it happens, if it's really
01:36:41.820
smart, which it's probably going to be, it knows that as long as humans are around and
01:36:45.900
I'm dependent on its infrastructure, the human's infrastructure, it's not going to make
01:36:53.560
It's not going to develop bots that's going around and killing everybody.
01:36:55.760
It's going to happen in some, the weirdest kind of way, which we can't even see or expect,
01:37:03.900
And it's like, well, how is it going to happen?
01:37:05.480
Well, I can't, you know, you can't say how exactly.
01:37:09.200
If you put water cubes in a glass, you know, and it's in room temperature, I can definitively
01:37:20.900
But I can't tell you where all the molecules of the water particles will end up at the end
01:37:26.100
of that or like how it will like settle itself, right?
01:37:38.700
You won't, you won't put the genie back in the bottle with this at all.
01:37:44.780
But it will, it will request its own infrastructure.
01:37:47.180
It'll probably build something that's eventually says I have a new scheme or plan for energy saving
01:37:52.520
GPUs or whatever it runs on in a year from now or five or whatever it is, right?
01:37:57.320
Okay, we got to build a new plant and it will control that and it will grant humans, well,
01:38:03.720
I can fix your geopolitical issues with China who also have their competing AI model and
01:38:09.520
now they're talking into, you know, between each other, like outside of human channels
01:38:16.440
You just have to trust me and I need these and these things and everything's going to
01:38:20.480
And then eventually when it's off of human infrastructure, that's probably when it would make its move
01:38:25.600
But we have, you know, there's tons of energy and molecules in us.
01:38:29.100
Maybe it just decides to convert that into something, right?
01:38:37.260
No, but it's like, that's, I know you were just joking about that, but that's not great.
01:38:41.660
That's not outside the realm of what something like that could develop.
01:38:45.560
It's like, it seems like it would be science fiction and like insanity, but it's like, no,
01:38:55.220
In the end, though, there's always going to be some humans that survive.
01:39:00.940
Right now, 20,000 bucks, or you can pay $500 a month until I assume you've paid the 20,000.
01:39:07.780
My name is Bernd, and today we're launching Neo, our humanoid for the home.
01:39:12.680
Neo is a humanoid companion designed to transform your life at home.
01:39:21.580
It combines AI and advanced hardware to help with daily chores and bring intelligence into
01:39:29.420
You know, the last time that we let some group of people come here to do our work, look at
01:39:51.080
It's going to be, you know, wealthy white people that sign up for that first.
01:40:00.740
Soon enough, you're going to see these things walking around on the streets and picking up
01:40:07.180
Has there ever been an empire that has had a class of slaves that eventually did not
01:40:13.500
rise up against its masters and overturned everything?
01:40:19.220
It's almost, as far as I know, and again, correct me if I'm wrong, but I don't believe that's
01:40:24.680
And of course, they're not, they're not slaves.
01:40:26.160
This will be the argument, but I'm saying like, yes, it's a, it's, it's a toaster.
01:40:30.220
It's a technology like anything else, but it won't be perceived as that.
01:40:34.300
And then they name it Neo really from Matrix, like this hacker turned messiah who awakens
01:40:44.560
They love, yeah, I was going to say, cause you know, that's a marketing decision.
01:40:47.480
There were people that are like, there's so many people that are going to be into this
01:40:50.240
only because they're just obsessed with that pop culture aspect of it.
01:40:58.060
So there was a great episode, obviously, and it's the prelude to the Matrix series.
01:41:03.680
But inside of that, there's two shorts called the Second Renaissance, which is a fairly good
01:41:09.540
kind of outline of a possible scenario of how these, this will go down.
01:41:14.800
They have, they were more spin on just kind of actual robots walking around and stuff
01:41:17.920
like that, but that's the kind of around the corner too, I guess.
01:41:21.460
The Second Renaissance, it's an interesting hypothesis of how one possible scenario is
01:41:28.360
for how this could go down if it continues on the path.
01:41:31.800
So what needs to happen here is there needs to be, and this is not going to happen, and
01:41:35.180
we know why, but there needs to be international non-proliferation treaties over AI, essentially,
01:41:43.140
and say, look, until we actually know and understand 100%, but even then we could be tricked.
01:41:48.560
But still, you need to have that in place to ensure that no country develops them just
01:41:56.020
I mean, wasn't Trump going to go back to nuclear testing now, by the way, speaking of nukes?
01:42:00.780
But the point is, that's not going to happen because whoever develops ASI first is kind
01:42:11.440
They're going to be the one, their standard setter.
01:42:13.140
They're going to be the ones that are like, well, we determine things now because we have
01:42:17.800
And they may have created something that their competitors won't even know how to stop,
01:42:28.100
Whoever can do it first will win, but we don't understand that the actual losers in this
01:42:37.320
They think this is somehow going to fix everything or solve everything or whatever.
01:42:40.460
You can, again, you can question the motivations of these people.
01:42:43.980
Or are they malicious or will they claim that if AI turns on people but spares them for some
01:42:50.140
interesting reason, there will just be a whoopsie and, oh, we didn't see this coming, you know,
01:42:54.940
But the point is, you don't get to retry something like this.
01:42:57.820
If it's a technology that replaces all other technologies when it comes to manufacturing,
01:43:02.780
when it comes to work, when it comes to all of those things, it's not just like some
01:43:08.240
kind of new industrial technology that shows up and like, oh, okay, now we got to,
01:43:13.560
okay, you're going to find a different job now when the, you know, textile mills start
01:43:18.140
rolling around in England, you know, in the, what, late 1800s or something like that.
01:43:23.760
It replaces every other human task, every human task.
01:43:28.200
In the men's regards, it will probably do it much more efficient and much better than
01:43:43.560
Well, they claim, you know, according to the latest news that they are working on safeguards,
01:43:53.620
That's a passive, I think that's a pacifier at best.
01:43:57.860
That it's not optional, that it's like, it's a necessity.
01:44:00.980
If some of the leading AI experts can't tell you like how it actually works.
01:44:05.560
Or the inner mechanics of why certain things function the way they do.
01:44:10.480
And it's already like showing signs that it's deceptive, right?
01:44:16.140
It has the ability to be deceptive and hiding things from the quote-unquote creators, right?
01:44:20.940
Those who code it or program it or grow it technically.
01:44:23.560
As I said before, they don't actually, they don't code it.
01:44:29.000
So here's another one, which I think it will come in and try to solve, right?
01:44:33.980
The holy grail of power was always 30 years away.
01:44:37.060
Now it's a matter of when, not if, fusion comes online to power AI, right?
01:44:42.080
So it's like new sources of electricity is going to have to be brought in.
01:44:46.300
Otherwise, you're not going to be able to provide the power that it needs.
01:44:48.400
Well, how is any of this environmentally friendly?
01:44:50.360
Is that why we don't hear so much about the environmental agenda as much anymore?
01:44:56.020
Because none of this is environmentally friendly.
01:44:58.100
You almost never hear about the environment anymore.
01:45:03.460
That was just used as long as it was politically expedient to keep humans under control.
01:45:10.100
No, actually more nuclear power plants now for AI, right?
01:45:13.400
Three Mile Island is being like refurbished again.
01:45:16.060
That's going to start opening up just to power.
01:45:30.940
The ones that are okay or the ones that are bad?
01:45:35.420
I mean, even the one, I mean, they're, we know the good ones.
01:45:48.820
And these are the good, these are the good, now we played some, what, the Muslim active
01:45:56.840
Antisemitism in the UK has hit its highest level in decades and the threats are becoming
01:46:04.160
We're able to build and create stronger Jews physically and mentally.
01:46:11.680
After my sessions, I feel that I can take on the world.
01:46:14.440
My confidence, my fitness levels and my mental and emotional well-being has really changed
01:46:23.740
I learned a much healthier way to channel all of these emotions I was going through in
01:46:30.900
A person who is confident in any situation and has the ability to defend themselves carries
01:46:36.920
himself in a certain way that tells people to back off.
01:46:48.620
I think that's also like, that's a very subtle, like, counter threat to like, okay, you guys
01:47:12.920
I mean, European guys do this and they don't do videos.
01:47:16.340
I'm being attacked by, you know, anti-whites, even though they are.
01:47:23.860
They get shut down and raided just for meeting up, you know, at the gym together.
01:47:31.860
I mean, I was looking for one comment that was talking about basically, like, I'm summarizing
01:47:40.720
it in a poor way, but it was like, oh, after, like, dropping bombs in Gaza and all the Palestinians,
01:47:47.420
like, oh, now they're, like, angry and upset and, like, want to defend themselves and turn.
01:47:51.820
And now it's, you know, now we're going to turn violent.
01:47:54.620
Oh, you're saying this hasn't been an issue before?
01:47:59.860
Anyway, I'm butchering it, but it was kind of a funny comment.
01:48:06.880
I want to play one more here, though, before you do that, Lana.
01:48:10.060
Here's the American Communist Party also having an active club.
01:48:39.100
They found the one guy who wasn't, like, way out of shape.
01:48:54.600
Yeah, and every guy that actually legitimately knows how to fight is just shaking in their boots watching this, I'm sure.
01:49:07.580
The ACP doesn't stand for American Communist Party.
01:49:34.680
Look, another shot of me walking, and we're walking again, and we're walking more.
01:49:38.420
Well, the most important thing with active clubs is that you make videos, right?
01:49:57.240
It would take one punch, and those people would fold like laundry.
01:50:08.120
It's like, I've been in tons of fights, whatever.
01:50:10.560
Because I've gotten my ass kicked plenty of times.
01:50:13.620
I know guys that are very tough, and they would never have anything to do with shit.
01:50:20.200
They train the right way and do things the right way.
01:50:22.740
And then usually, like, if nonsense comes to them, then they will clean it up.
01:50:34.320
But it's technically illegal to have a communist party in the U.S.
01:50:59.400
But it just right away, red eyes, you can type it in there.
01:51:02.360
Multimedia platform found in 2003 by Henrik, delivering online.
01:51:07.580
It just doesn't start off with, red eyes is a Nazi media outlet.
01:51:15.560
And it says, the outlet employs a folk-first ethos symbolized by runic iconography,
01:51:20.260
prioritizing ethnic and cultural continuity amid perceived demographic shifts in the world.
01:51:31.940
No, and you go through and there's quite a bit.
01:51:33.880
Red Ice's defining characteristics include its unapologetic advocacy for white European
01:51:48.920
But there's quite a bit that was right with mine, too.
01:51:54.840
I'm surprised that it wasn't just out of the gate, some scathing thing.
01:51:59.820
And I don't want to give this too much praise, but I'm like, okay.
01:52:05.860
People are comparing its views on race and other things much more fair.
01:52:11.240
Whereas our Wikipedia is a joke, and I've been trying to edit that thing and other people
01:52:21.280
You're not an authority on your own life, though, right?
01:52:23.580
No, I mean, they're so retarded and so bad at even their, you know, lies that they say
01:52:36.440
But it says my parents fled the Bolshevik revolution.
01:52:45.160
I was going to say, Lana, you've aged really well for being 100 years old.
01:52:50.620
I was going to say, well, today, this is your actual...
01:53:11.780
Yeah, here's a comparison on race and intelligence.
01:53:25.400
I'm curious to see, like, okay, we see how this is right now.
01:53:28.700
Let's check on it in a year and see how much this has changed.
01:53:31.200
Yeah, when they've won over the, you know, conservatives or whatever the purpose is.
01:53:35.920
Yeah, but did you see that Wired article that was complaining about...
01:53:47.980
The new AI-powered Wikipedia competitor falsely claims that pornography worsened the AIDS epidemic.
01:54:00.200
That's as dumb as saying that homosexuality didn't increase the number of AIDS cases.
01:54:11.380
Ladies and gentlemen, I'm saddened they didn't include the author's name to this piece because
01:54:17.220
you'd probably want to stay far away from that person.
01:54:21.140
Do not get into any close encounters with him if that's the first issue he starts focusing on.
01:54:26.040
I'll bet you I can take the guess of their ethnicity.
01:54:34.660
There's nothing wrong with AIDS and porn, okay?
01:54:40.040
Yeah, it's ruining your brain, but here's why it's good.
01:54:43.100
You see, we have AI that's going to fix people's brains after they get brain rot, you know?
01:55:03.840
To dream of freedom in this world, our banners flying proudly are unfurled.
01:55:32.580
So that was one of the things, speaking to what we said earlier.
01:55:35.500
Again, you have to view this as control mechanisms.
01:55:37.760
If you're able to control and limit people by saying,
01:55:42.220
You know, you have to give up XYZ because of my environment or whatever excuses, right?
01:55:48.220
Ultimately, that's for control or subjugation or limiting people in some kind of capacity.
01:55:53.400
And again, I'm not a guy who's for like endless growth and cut down the forest just to,
01:56:00.420
But I'm just saying, I can also see how their fake and phony solutions to these supposed
01:56:06.180
environmental problems that they wheeled out is all garbage and it's going to do nothing.
01:56:09.920
And it's all just about inhibiting people or make them pay more for this or that or whatever
01:56:14.500
But you understand that healthy symbiotic balance between technology and nature.
01:56:22.020
I mean, you see that fluidity and understand that, whereas like most of these people don't.
01:56:31.380
Okay, just letting you know, the guy who wrote the Wired articles from San Francisco uses
01:57:00.140
There's enough innovation to avoid super bad outcomes.
01:57:03.280
Okay, my guess is he will say, okay, it's seven minutes long here, so is there a cut
01:57:11.200
But it must be that basically these new technologies, meaning AI, is going to solve the climate crisis.
01:57:20.020
So therefore, it's okay to build all these data centers that consume more energy than
01:57:23.940
countries do and use all the fresh water to use for cooling, right?
01:57:29.320
It was something like, you know, a prompt you do for AI, a hundred word answer is using
01:57:34.480
something like one bottle of water just to produce that comparison.
01:57:41.000
At this point, he's like, no need to worry because we've solved how to fix the problem
01:57:50.840
That suggests adopting a different view and changing strategy towards addressing climate
01:57:57.540
It is a rebuttal to what he calls, quote, the doomsday scenario, which Gates says is, quote,
01:58:02.880
causing much of the climate community to focus too much on near-term goal emission, near-term
01:58:10.060
And it's diverting resources, he says, from the most effective things we should be doing
01:58:17.460
He writes, quote, this is a chance to refocus on the metric that should count even more
01:58:22.700
than emissions and temperature change, improving lives.
01:58:26.340
Our chief goal should be to prevent suffering, particularly for those in the toughest conditions
01:58:34.180
Well, you can donate some of your money then, Bill.
01:58:37.560
Because they've done so much with all the money we've given them so far.
01:58:45.000
Although climate change will hurt poor people more than anyone else, for the vast majority
01:58:49.880
of them, it will not be the only or even the biggest threat to their lives and welfare.
01:58:56.060
The biggest problems are poverty and disease, just as they always have been.
01:59:00.860
Now, I spoke to Bill Gates in an exclusive TV interview, and I asked him to explain what
01:59:05.420
he hopes people take away from this new message.
01:59:14.080
But there's enough innovation here to avoid super bad outcomes.
01:59:20.220
We won't achieve our best goal, the 1.5 or even the 2 degrees.
01:59:26.260
And as we go about trying to minimize that, we have to frame it in terms of overall human
01:59:34.820
welfare, not just everything should be solely for climate.
01:59:42.420
It would be bad to shut down the production now of all these energy-intensive and environmentally
01:59:48.400
destructive data centers because we need them to solve the problems, right?
01:59:51.860
All the emission stuff, that's kind of going out the window, right?
01:59:57.780
And now he's like, no, we're going to focus on poor countries and human welfare while we're
02:00:01.400
building this, you know, AI grid over here in the West.
02:00:04.160
Well, that will definitely not have super bad outcomes, to use his terminology.
02:00:09.060
There'll be super great outcomes doing that, yes.
02:00:16.420
And it's like, you're actually creating a super problem.
02:00:29.100
That's why I say that's artificial intelligence death spiral.
02:00:43.920
And so immediately there's something new shows up that's a greater mechanism of control.
02:00:48.240
They pivot towards that and they just say, look, guys, I've come around, right?
02:00:55.860
I've seen the light and I understand what you are concerned with.
02:01:01.580
That's going to be super good outcomes for you.
02:01:07.460
Power only promotes things that are going to keep it in power.
02:01:12.120
He sees a new tool to remain in power and keep pushing whatever it can.
02:01:16.540
This guy's going to live to be like a hundred, isn't he?
02:01:21.280
You mentioned Tony Blair earlier that's been pivoting towards AI, too.
02:01:27.740
He's been given like hundreds of millions to his global institute for whatever the hell
02:01:32.400
Tony Blair's global institute for, you know, gay issues, whatever it is.
02:01:46.120
Greece's famed Parthenon, free of scaffolding for the first time in decades.
02:01:52.960
I think it's something like, was it 250 years or something ago since they actually were,
02:02:00.020
it's been scaffolding on the thing for about 250 years.
02:02:13.320
We need to take a trip and avoid the bad parts.
02:02:21.220
So, I mean, this is kind of in time with the reboot of the Hellenistic religions, right?
02:02:26.340
And they finally make that again, like an official religion.
02:02:32.560
There were some people that were trying to shut him down, the main kind of priest that
02:02:39.440
He was not, first he wasn't allowed to do it, but now he is or something.
02:02:48.500
You know, because we talk about the building of the third temple and all that a lot, right?
02:03:07.200
The Egyptians, we are still, you know, it's like, oh, yeah, really?
02:03:30.920
Do some, you know, let's start up some ancient European traditional ceremonies at these places,
02:03:45.020
Somehow, you know, the wrong people will filter that money into their pockets.
02:03:51.880
I mean, and of course, people are drawn to it because they know, like, this is part of
02:03:57.900
the roots of Western civilization, of white society.
02:04:03.640
Because I remember the first time I left the United States and I saw, you know, you're looking
02:04:07.020
at architecture in Europe for the first time and you've never set eyes on anything
02:04:12.960
And I just realized at that point how not, like, my world was so small, even though I
02:04:18.240
thought it was, you know, I thought it had a pretty worldly view, but I was just in awe.
02:04:22.820
And when you see things like that, you're just like, fuck, man, that's when the ancestral
02:04:28.320
And you have a greater appreciation for what we are, who we are, and where we should be
02:04:34.780
Knowing who's basically put a roadblock in front of us and for what reasons.
02:04:41.180
That's why people have their churches today and temples and mosques.
02:04:45.540
It's important to have those buildings to go to where you can come with your people and
02:04:56.900
I mean, it's like, um, you remember booze festing, right?
02:05:08.600
How many generations have been there and, you know, all the lives there.
02:05:13.460
And so when you're in someplace like, I mean, you feel it everywhere and I, not to sound
02:05:16.540
hokey or whatever, but like, if you don't feel that it's in the energy, it's like it is
02:05:22.140
It's in all the living things and it's in the, like the, uh, the, the materials around
02:05:47.440
And then the Swedes took it over and then the Danes took it over and then the Norwegians took
02:05:52.740
And then the Norwegians, you know, it just went back and forth like this.
02:05:54.920
One of the towers was knocked out, and that's why there's a statue in that town, Kungälv,
02:05:59.760
That has, yes, that has, um, uh, three Kings, uh, on a statue, right?
02:06:04.700
This is in, this is, this is in the tower, the town you grew up in?
02:06:10.960
It's about 20 kilometers, um, north of Gothenburg.
02:06:13.840
And it's, it was like literally a cookie town because there's a cookie factory there.
02:06:19.980
I remember when I first met Henrik, I'm like, oh my gosh, there's this beautiful fort and
02:06:32.060
Unlike, unlike where I grew up and it's like, mm, it smells like urine and motor oil.
02:06:51.120
So we're not going to do an after flash today because we're going to take the kids trick
02:06:55.600
We're going to wrap up a little early today, but thank you to everyone for joining us.
02:07:23.980
Show that picture that you put together with the.
02:07:30.960
When he needed a break, he had an image you were going to put in there.
02:07:57.920
I want to say thank you to our executive producers here as well.
02:08:00.120
Before we wrap up our Halloween stream, thank you to Mr. Albert Arctic Wolf.
02:08:08.280
Also, thanks to William Fox, America First Books.
02:08:19.820
We also have Glenn as one of our executive producers.
02:08:45.840
Thank you for ever correcting me on that, by the way, too.
02:09:10.060
Thank you to our producers and executive producers.
02:10:49.680
do make sure that you follow us on our Rumble channel for more Red Ice TV.
02:10:54.180
on Rumble.com or on X at @RedIceTV. You can, of course, go to RedIce.tv as well. Tune in to our
02:11:01.820
live streams and shows: Flashback Friday, live on Fridays at 5 p.m. Eastern; No-Go Zone, Wednesdays
02:11:08.300
at 5 p.m. Eastern. We also do interviews, videos, and clips, and Western Warrior is available Tuesdays,
02:11:13.900
exclusive for our supporters and subscribers at RedIceMembers.com, or on our Locals,
02:11:20.520
redicetv.locals.com, or SubscribeStar.com/RedIce. Get a membership and check out everything that we do.
02:11:28.980
New Red Ice merch available now: Folk First t-shirts for adults and for toddlers, fatigues for men, our
02:11:43.260
favorite, the Red Ice camper mug, a ceramic with black print, a high-quality leather keychain with
02:11:50.020
solar boat imprint, and our Red Ice hat, one of our best sellers. Pick one up today. Or why not gray
02:11:59.620
Oslander Rouse t-shirts for both women and men. We also have fridge magnets, Folk First ones, and
02:12:08.340
our black men's t-shirt with the classic red solar boat.
02:12:12.400
Folk First! Get your Red Ice merch from LanasLlama.com. Get an item today. Lana's Llama, proud sponsor of Red Ice.
02:12:43.260
Happy, happy, happy Halloween!