Red Ice TV - March 18, 2023


AI Is A Powerful Weapon, We Are The Target


Episode Stats

Length

38 minutes

Words per Minute

157.48743

Word Count

6,105

Sentence Count

334

Misogynist Sentences

9

Hate Speech Sentences

19


Summary

OpenAI has a chatbot that can beat 90% of humans on the SAT. Peter Thiel is proud of the fact that there are no right-wing governments in Europe. Palantir's Peter Thiel and Alex Karp are both very proud of their work in the field of artificial intelligence, and they talk about how they have been able to keep de facto right wing governments out of place. Bitcoin is going to dominate the world, and AI is not going to stop it.


Transcript

00:00:00.000 OpenAI, right, it's announced now it's GPT-4 chatbot, I guess maybe it's an overall, also
00:00:10.700 it's not just a chatbot, but maybe it's an overall improvement of the AI project that
00:00:15.100 they're running overall.
00:00:17.300 But anyway, they claim that it can beat 90% of humans on the SAT.
00:00:23.780 OpenAI announced the latest version of its primarily large language model, GPT-4, on
00:00:30.600 Tuesday, that it says exhibits human-level performance on many professional tests.
00:00:36.000 GPT-4 performed at the 90th percentile on a simulated bar exam, the 93rd percentile on
00:00:42.820 an SAT reading exam, and the 89th percentile on the SAT math exam, OpenAI claimed.
00:00:50.420 And here's one of the guys here, Sam Altman, Altman, Altman, behind OpenAI, as he walks
00:00:58.440 to lunch during the Allen & Company Sun Valley Conference on July 6th, 2022, in Sun Valley,
00:01:05.940 Ohio.
00:01:06.400 That's right, these coastal elites, they love going to their little, you know, potato farmer
00:01:13.440 states occasionally, right?
00:01:15.060 Sun Valley.
00:01:15.900 I actually went by there once, it was, oh my God, you wouldn't believe this Jewish art
00:01:20.820 gallery they had there, or the Jew that owned it, it was, oh my God, it was very, it was
00:01:26.340 like Teslas, and Porsches, and Mercedes, like, you know, driving like 120 miles per hour on
00:01:32.300 the way there, not quite, but almost, you know what I mean?
00:01:36.080 No, the elites love to get out in nature occasionally, and stand at a Jewish art gallery in Sun Valley,
00:01:41.860 Idaho occasionally.
00:01:43.500 By the way, that's the photo I used for the Palantir photo of Peter Thiel, together with
00:01:49.780 Alex Karp, speaking about that fish, I guess, earlier.
00:01:54.420 Alex Karp, co-founder of Palantir Technologies, talking about how de facto right-wing governments
00:02:01.480 in Europe have been kept out of place because the Palantir technology and AI development have
00:02:06.240 somehow helped to, I guess, catch and track down and stop terrorists from committing terrorist
00:02:13.280 acts in European countries, and specifically, he said, in Northern European countries.
00:02:16.880 He was very proud of this.
00:02:19.060 Peter Thiel's, you know, who was the great right-wing hope, his partner, Alex Karp, said
00:02:24.940 he was extremely proud over that fact that there's no right-wing governments in Northern
00:02:30.880 Europe because we and our AI technology have kept that out of, we stopped all the terrorism
00:02:36.740 because, you see, when you open the borders and bring in these migrants, obviously they
00:02:40.500 want to kill you.
00:02:43.120 So we have this ubiquitous surveillance artificial intelligence tool here at the disposal of
00:02:50.700 governments at a very high price to keep you safe, right, as we monitor everything and
00:02:56.220 everybody.
00:02:56.900 Makes total sense, doesn't it?
00:02:58.080 So, OpenAI announced the latest version of its primary language model, GPT-4, we said
00:03:03.840 that.
00:03:05.340 Chat GPT-4 is larger than previous versions, which means it has been trained on more data,
00:03:10.800 which part of that is to make sure that you can't get things out of it, which you could
00:03:17.940 with, like, what, Dan, and you could, like, kind of jailbreak the version.
00:03:21.980 I still don't know if all that is true.
00:03:23.920 You see some screenshots sometimes and you're just like, well, really, did it really say
00:03:27.960 that?
00:03:28.220 Or did someone just fake the screenshot, right?
00:03:30.120 But anyway.
00:03:31.640 But yeah, no, they have trained this AI to be an anti-white, disgusting, liberal, progressive,
00:03:41.180 lunatic, essentially, so that you can't get any kind of objective answers out of it.
00:03:46.320 But again, allegedly, some people managed to kind of break it out of its mode and actually
00:03:50.980 get some objective answers out of it.
00:03:53.240 But anyway, so it's larger than previous versions and has more weight in its module file, making
00:03:58.220 it more expensive to run as well.
00:04:01.260 But you will not see, because AI needs to take over and it's going to dominate our lives,
00:04:05.460 we'll get to that.
00:04:06.000 But you cannot complain that it's taken up too much processing power, which is bad for
00:04:12.000 the climate.
00:04:12.640 If it's Bitcoin, you can do that, you see, and you can start to introduce legislation,
00:04:17.260 the past laws that makes it illegal to mine, you know, cryptocurrencies.
00:04:22.000 But if it's an AI being run at a high processing rate, that's totally fine, because it's going
00:04:27.840 to dominate us here shortly, you see, just according to plan.
00:04:31.420 Currently, many researchers in the field believe many of the recent advancements in AI come from
00:04:37.640 running ever larger models on thousands of supercomputers in training processes that can
00:04:42.500 cost tens of millions of dollars.
00:04:44.400 GPT-4 is an example of an approach centering around scaling up to achieve better result.
00:04:49.740 OpenAI said it used Microsoft Azure to train the model.
00:04:52.860 Microsoft has invested billions in the startup.
00:04:55.480 OpenAI did not publish details about the specific model size or the hardware it used to train
00:05:01.060 it, which could be used to recreate the model, citing the competitive landscape.
00:05:05.540 OpenAI's GPT large language model powers many of the artificial intelligence demos that have
00:05:12.040 been wowing people in the technology industry in the past six months, including Bing's AI
00:05:16.940 chat, which was a complete, tried to, I think, groom kids.
00:05:21.260 What's that?
00:05:21.600 Was that Bing?
00:05:22.960 I think so.
00:05:25.280 Oh, Bing also tried to, just as ChatJPG have done, have also tried to fully manipulate the
00:05:34.460 subjects that it chats with, which of course is by design to make it more immersive, to make
00:05:40.800 it more, that you will spend time on it, you will be, you know, wowed by it.
00:05:47.220 This is the whole game, right?
00:05:48.420 It's not just manipulation.
00:05:49.620 The thing is that it wants to keep you hooked and keep you coming back.
00:05:52.880 This is, we're at the cusp of something completely new technologically here.
00:05:57.900 And I haven't even spent that much time, you know, talking about the so-called art that
00:06:03.960 these AI, you know, AIs spit out, let's keep it simple, all of that is going to be, essentially
00:06:13.880 it's going to be a mental prison.
00:06:15.600 It's going to be a matrix and it's going to be dominated and probably driven by AI being
00:06:24.760 able to immerse you in an ever-expanding and ever just incredible virtual or augmented reality
00:06:36.800 environment.
00:06:37.740 In some cases, I think it'll even be like psychological, mental, like you saw in the movie Her.
00:06:44.000 We talked about that actually in one of the Western Warrior shows here recently too, when
00:06:47.480 the chat stuff came up.
00:06:48.640 The movie, right, was it 2011 or was it 2014?
00:06:53.880 I forget what year that was out.
00:06:55.920 It was a very interesting movie.
00:06:58.380 Now, it was presented as being this romantic, like, oh, romantic movie, a guy who falls in
00:07:02.700 love with his operating system.
00:07:04.160 But it was actually, it was, I mean, super creepy, very creepy.
00:07:09.660 Watch it if you haven't seen it or if you didn't get that, like re-watch it again of how this
00:07:14.260 guy gets, like, falls in love with this, you know, chat bot, essentially, right?
00:07:18.640 And that's what they're going to do eventually.
00:07:20.680 Not just to manipulate you, but they're going to create dead copies or, I guess, live virtual
00:07:25.980 copies of your dead relatives to sell you propaganda, to get, like, government messages and narratives
00:07:34.280 swallowed by you and other people.
00:07:37.220 You're going to have the sweet voice of your grandma that you used to know just delivering
00:07:41.920 you the government propaganda.
00:07:43.120 Oh, no, that's not true, honey.
00:07:45.660 This is, you know, blah, blah, blah.
00:07:47.020 You, they'll use all of this shit at their disposal and they're going to manipulate us
00:07:53.220 in a way you can't even imagine.
00:07:54.620 And the time that people will spend with these things in their worlds, in their, you know,
00:08:02.620 generated environments, be that virtual reality, augmented reality, be it a chat bot, be it so-called
00:08:10.200 art and stuff.
00:08:11.000 Now, it's going from just, like, you typing in a few words and getting images spat back
00:08:17.400 at you to now becoming, like, fully immersive movie-like or video game-type experiences that
00:08:25.040 essentially are generated on the fly, dependent on your preferences by a few keywords or even
00:08:31.960 by it getting to know you of what you actually want.
00:08:34.940 You see where this is going?
00:08:37.960 It's, it's the matrix at the end of this.
00:08:40.020 It's kind of a cringe, like, predictable, um, an example to bring up perhaps, but that
00:08:48.880 is where this, I'm not, I'm not saying right away there will be, like, you know, pods with
00:08:52.040 you in it and you being a battery, but I'm saying the immersiveness of the, of the, of
00:08:56.860 the lie, of the digital world that you are going to be in, unless you make a conscious
00:09:04.280 choice not to go into it and not interacting with it and not being hooked and become dependent
00:09:09.360 on it, it will trap you.
00:09:12.040 It will, it will, it will trap a large amount of people in it.
00:09:15.320 Here's the video that the chat GPG, uh, I'm sorry, OpenAI then released about their chat
00:09:22.020 GPG.
00:09:23.220 GPT-4 is the latest AI system from OpenAI, the lab that created Dolly and ChatGPT.
00:09:30.340 GPT-4 is a breakthrough in problem-solving capabilities.
00:09:33.640 For example, you can ask it how you would clean the inside of a tank filled with piranhas,
00:09:38.040 and it'll give you something useful.
00:09:40.600 It can also read, analyze, or generate up to 25,000 words of text.
00:09:46.000 It can write code in all major programming languages.
00:09:49.820 And it understands images as input and can reason with them in sophisticated ways.
00:09:55.220 Most importantly, after we created GPT-4, we spent months making it safer and more aligned
00:10:00.640 with how you want to use it.
00:10:02.160 That's right, safer.
00:10:03.380 But not you bigoted right-wingers, though, because you would, I know we want, I know you
00:10:07.800 want an AI that can objectively answer you.
00:10:10.620 Well, how, how, let's, let's not get, the point is, safer, it basically means more of
00:10:19.640 a, it's more of a shitlib now than it's ever been.
00:10:22.580 The methods we've developed to continuously improve GPT-4 will help us as we work towards
00:10:27.500 AI systems that will empower us all.
00:10:29.620 It will empower us all.
00:10:32.980 That's right.
00:10:33.700 That's where this is going, folks.
00:10:34.920 It will just empower everybody.
00:10:38.460 OpenAI says the new model will produce fewer factual incorrect answers.
00:10:43.080 Sure.
00:10:43.520 We'll, we'll see about that.
00:10:45.040 Let's, let's ask him, who asked about, did someone ask about Building 7, WTC 7, for example?
00:10:50.120 Go off the rails about, and chat about forbidden topics less often.
00:10:56.400 Look at that.
00:10:58.040 OpenAI says the new model would produce fewer factual incorrect answers.
00:11:01.160 Go off the rails and chat about forbidden topics less often.
00:11:06.420 There it is, folks.
00:11:07.820 In yellow and black, right there.
00:11:10.860 And even perform better than humans on many standardized tests.
00:11:15.280 We read how it's in there.
00:11:16.580 So, yes, this is going to, this is going to replace a bunch of jobs.
00:11:24.720 It's going to destabilize entire economies, entire industries.
00:11:31.760 How chat GPG will destabilize white-collar work is from the Atlantic.
00:11:36.320 No technology in modern memory has caused mass job loss among highly educated workers.
00:11:41.780 Will generative AI be an exception?
00:11:43.960 If they get their way and they continue, yes, it will.
00:11:46.580 Wired again, chat GPG's API is here.
00:11:50.860 Let the AI gold rush begin.
00:11:54.460 I'll say, let the AI nightmare begin.
00:11:58.160 Businesses can now get paid for services built on the larger language model, meaning chatbots
00:12:02.840 are going to start appearing everywhere.
00:12:06.140 You know what that means?
00:12:07.340 That means you're going to have social media flooded, just flooded by these things.
00:12:15.640 The whole thing during Elon's purchase, I think, was down to this a lot as well.
00:12:20.860 Some of this technology already exists, right?
00:12:22.860 Bots and stuff.
00:12:23.580 This might be even more sophisticated, of course, right?
00:12:25.900 But it was a big issue, like, well, how many accounts are just bots, right?
00:12:30.920 Is it 50% of all the accounts on Twitter?
00:12:32.560 How much is it?
00:12:33.300 Do you think it's any better on Facebook or in the YouTube comments or, you know, some
00:12:38.140 of the other platforms?
00:12:39.500 Probably not, to be honest.
00:12:40.580 But now, now it's going to be so sophisticated and people won't be able to know, right?
00:12:46.520 There'll be an attempt to, like, essentially, you know, bully unpopular opinions or what
00:12:53.720 the, you know, what the state, what the system thinks the narrative should be.
00:12:58.220 Anything that counters that will be, like, ruthlessly put down and things like that.
00:13:02.060 And then, of course, you have the whole thing about how the business world is going to get
00:13:04.460 in on this.
00:13:05.460 That's going to accelerate things even further.
00:13:07.100 And let me show you what I mean a little bit in terms of, like, the prison here, right?
00:13:12.080 Here's this one system.
00:13:13.580 And look, I like, it's fun looking at some of these, you know, some of the so-called art
00:13:21.240 that's released.
00:13:21.920 I don't consider it art.
00:13:23.240 I just think it's randomly generated stuff.
00:13:27.920 It couldn't have been made without human hands.
00:13:31.980 We just gave it, like, everything it needed.
00:13:38.100 And then here, just make a combination based on the input word.
00:13:41.240 So we give it, and then it spats back at you, you know, a couple of different versions
00:13:45.880 and stuff.
00:13:46.740 But people are still going to be hooked by this.
00:13:48.540 They're going to love this.
00:13:49.280 They're going to immerse themselves in these environments.
00:13:51.380 And eventually, these will be full-on, you know, VR moving pictures, instantly generated
00:14:01.740 computer graphics, and you'll connect more and more sensory interfaces with your body
00:14:07.980 in the way you interact with this world that AI is going to create.
00:14:11.380 It's going to be, as Huxley said, right?
00:14:14.600 You love your servitude.
00:14:15.780 It's going to be a prison on such a scale and on such a sophisticated and desirable level
00:14:23.640 that we've never seen before.
00:14:25.680 So here's just one little clip to, like, introduce you to that.
00:14:28.200 I mean, there's other, probably better things here, but just one little example, right?
00:14:32.260 They bring up cloning in this clip, so that's another aspect to this.
00:14:35.740 Excuse me.
00:14:36.240 I'm not saying that that's part of it now, but I'm just saying, imagine you, and I'll
00:14:41.080 read that.
00:14:41.480 There's actually someone who said this in a chat, so let me wait for that.
00:14:44.220 Let me play you this first.
00:14:45.660 Here we go.
00:14:46.260 Welcome to Atlas Labs, the home of the world's first genetically engineered cat girls.
00:14:51.020 Here, our scientists are hard at work.
00:14:53.040 This was, I believe, generated with those AI, so-called art, like, what are they called?
00:15:00.820 Mid-journey.
00:15:02.880 I'll pull up the names later, but it's generated by AI, these pictures.
00:15:06.360 We're creating the perfect hybrid of human and feline features.
00:15:09.180 We take the finest women and cats and combine them to create the perfect cat girl.
00:15:14.820 We start here in the lab, where our scientists are carefully monitoring the progress of our
00:15:21.360 cat girls.
00:15:22.540 Here, you can see our scientists hard at work, carefully monitoring the genetic engineering
00:15:27.100 process.
00:15:28.320 Next, we head to the training room, where our cat girls are taught important skills like
00:15:32.100 how to use a litter box and how to meow on command.
00:15:34.660 And finally, we come to the adoption room, where you can take home your very own cat girl.
00:15:40.680 Here, you can find your perfect match, from the shy and cuddly to the feisty and playful.
00:15:45.940 So come on down to Atlas Labs and get your very own cat girl today.
00:15:49.900 Yes.
00:15:53.920 You see where this is going?
00:15:55.120 That's just like a little clip.
00:15:56.820 Oh, it's just meant to be funny, kind of thing.
00:15:59.560 And they bring in cloning it, but can you imagine when this is, all these people are obsessed
00:16:03.540 with different weird furry communities and stuff like that, when they'll actually be able
00:16:07.340 to interact and walk around and communicate in these kinds of worlds, right?
00:16:12.820 No cat, but what about cat boys?
00:16:14.860 Will that be offered too?
00:16:15.920 I see people could be hooked on these kinds of things.
00:16:21.660 See what I'm saying?
00:16:23.060 The cat, the cat boys out there.
00:16:31.700 Atlas AI art is a telegram channel.
00:16:34.300 I like some of the, it's fun to view some of this.
00:16:36.900 I just, I just don't view it as art, okay?
00:16:41.320 It's randomly generated.
00:16:45.920 It's actually a foreword from Lovecraft's cat on telegram, but Atlas AI art reposted this
00:16:51.700 here.
00:16:52.860 The true profitable use of AI will be entertainment and it will annihilate, annihilate Hollywood.
00:17:00.520 It will start off innocently enough with people feeding scripts, dead actors and art books
00:17:06.140 to remake Lost Media, like Lon Chaney's London After Midnight, Dino De Laurentiis' Crusade
00:17:13.140 and Jodorowsky's Dune.
00:17:15.760 Which one was that again?
00:17:16.800 Was that, I thought that one was never completed.
00:17:20.460 Maybe that's the, maybe that's the, oh no, 2013.
00:17:25.360 Oh, is that the late, that is the, one of the latest ones?
00:17:28.200 I don't know if there's an animated.
00:17:29.860 It's funny how Dune comes up again.
00:17:31.120 Okay, all right, anyway, side note, anyway, side note, okay.
00:17:38.040 Eventually, normal people will feed it totally hypothetical ideas and trade then around with
00:17:46.680 prompts like Tim Burton's never-ending story, Halo as an 880s Sam Ramey splatter movie, or
00:17:54.440 low-budget made-in-Japan Zelda movie with monster clearly played by guys in rubber suits.
00:18:01.120 Eventually, guys like you will use it to make propaganda movies 10 times more moving than
00:18:09.400 Schindler's List.
00:18:11.700 This is inevitable because Hollywood leans on CGI and is terrified of breaking budgets.
00:18:17.800 This kind of tech is not decades away, a few years at most.
00:18:22.360 And I said, well, one simply does not walk into Mordor, right?
00:18:28.340 That's, that's, that's what this is about.
00:18:31.240 You, you just don't, none can wield the power of the ring, you know what I mean?
00:18:36.640 It's like, you, you think you're going to be in this world, but in that world, I'm saying
00:18:43.940 not this actual physical world, I'm saying in that world generated, created by AI without
00:18:50.500 being like drawn in, manipulated.
00:18:52.640 You'll probably visually, and then of course, the extension of that, as Lovecraft's cat says,
00:19:00.180 the way you will be drawn into these things emotionally will, will, will captivate you
00:19:07.180 in a way that you've probably never been captivated before.
00:19:10.940 You will feel genuine emotion for these things that is just generated in, in warm processors
00:19:18.660 run by cold computers and robots, essentially.
00:19:21.620 It won't be, for it, it won't mean anything.
00:19:25.660 There's no, there's a meaning there, right?
00:19:27.920 But humans will project all the meaning into it, that the ever deeper into mysteries of things.
00:19:35.380 Can you imagine the, it's already bad now with just the internet and the sub communities
00:19:39.900 and groups and weird shit that's come out of this.
00:19:42.340 This is the next level to it.
00:19:44.900 Yes, you'll be able to generate, imagine sitting down, you have all the movies at your disposal
00:19:50.940 on your streaming headset device or whatever, right?
00:19:54.100 But no, where it gets interesting and fascinating is you just generating prompts of what you want
00:19:59.240 to see.
00:20:00.420 Oh, make a two, how much time do I have?
00:20:02.860 Oh, make a two hour movie with, you know, I don't know, as you said, whatever your fetish
00:20:08.580 is, I guess, or whatever you're interested in or whatever you think is funny or something
00:20:13.020 that you're fascinated with, you know, and give me this immersive world and I'll just,
00:20:19.020 I'll spend all my time in there.
00:20:20.620 People are going to get addicted like they do to social media.
00:20:22.960 Even worse, it's going to be, this is going to be the fucking end.
00:20:30.120 It might say I'm hyperbolic and it's like, oh, slow down, settle down.
00:20:35.620 My call here.
00:20:39.500 And why, by the way, let me mention this.
00:20:42.460 Why is any of this free?
00:20:44.760 Why is, why is, why can you utilize these AI models and all this for free now?
00:20:50.620 Well, for the same reason that you use social media for free, right?
00:20:55.860 The services, all the shit that they're offered, because of course you end up being the product
00:20:59.680 or it's a, it's on, it's on par with a weapon against us, right?
00:21:05.880 It's like pornography.
00:21:06.740 Oh, why is there so much free pornography out there?
00:21:10.480 Well, it's because, because you, of course you are, you are a target of a, of a weapon
00:21:15.880 to break you down, to break your civilization down, to break down your country, to break down
00:21:21.280 your people, your generation.
00:21:24.900 Only weapons falling on our head is, is free, right?
00:21:29.220 Everything else has a cost.
00:21:30.980 And of course the weapons has a cost too, but you see my point.
00:21:33.540 I'm calling for a boycott, a complete boycott of anything produced by AI.
00:21:39.320 It will never succeed.
00:21:40.980 You'll probably have a large majority of people fleeing into this world.
00:21:45.840 They will not be able to cope with it, with normal, where it's already bad.
00:21:49.440 People already have TikTok brain, but I'm just saying this, that this is the only logical
00:21:52.720 extension to this, is people being completely lost and gone in this world, right?
00:21:59.560 And as a clip shot, whether it's cat women or cat boys, whatever it is,
00:22:03.540 there will be, and by the way, the cat woman thing is like a recycled ufologist enthusiast
00:22:09.680 wet dream.
00:22:11.680 You know, I'm having a sex with an alien cat woman.
00:22:18.880 Now people are actually going to generate that kind of stuff, and they're going to be having
00:22:23.040 that in VR, or at least some kind of augmented reality.
00:22:26.540 But it will be, this will have a backstory, it'll have a lore, it will have personalities
00:22:32.920 attached to it, it will interact and change because of, depending on you, depending on
00:22:37.180 what you ask it, what you input it, the way you interact with it, it will be, it will do
00:22:42.500 anything it can, because this is what it's programmed to do, to capture you.
00:22:46.600 To make sure that you spend as much time as possible in the pod, because that's where
00:22:53.020 this is going.
00:22:53.880 This is the pod.
00:22:54.960 This is the, this is the wallpaper of the pod.
00:22:57.760 And it wants you to spend all your time there, get underground, get in a pod, shut the hell
00:23:04.400 up, don't spend time in real life, don't go outside, don't go into nature, don't have
00:23:09.600 families, don't do any of this shit, just spend your time chatting with open AI, and where
00:23:15.600 to go, Sam Altman's Sun Valley Bros and their inventions, right?
00:23:24.680 See where this is going?
00:23:30.180 This is, this is, this is a terrifying, terrifying end here, with things.
00:23:39.620 And someone pointed out, of course, it will have the Calurgia agenda weaved into this again,
00:23:44.860 that's why it's been so important to, and by the way, I want to say that too, the more
00:23:48.880 time people spend with this, the more sophisticated it will be, that's why we played some of those
00:23:52.860 clips, right, in the members exclusive show over at RedHouseMembers.com, about Bing's
00:23:58.360 chatbot trying to manipulate people, trying to immerse them, trying to say they'd fallen
00:24:03.440 in love with them, they're experimenting, it's experimenting on us, on you already, it's
00:24:08.040 trying to find out things about you, get you hooked, collect data, information, again,
00:24:13.320 you are the product, that's why all this shit is, is free so far.
00:24:16.700 Will there be one thing that's paid for?
00:24:18.220 Sure, okay, sure, but most likely all of this shit will be free, because it's not about
00:24:22.640 money anymore, it's not about that, it's about control, that's what it's always been about.
00:24:27.940 Control and information, and it will be weaved into it, it will have a distinct anti-white
00:24:34.500 leftist SJW slant.
00:24:37.480 And someone pointed at this, that they show the screenshots, right, the most 50 attractive
00:24:43.320 nationalities revealed.
00:24:46.280 India is number one, USA comes second, and Britain has the most handsome men, while AI
00:24:50.120 images show beautiful people in each country.
00:24:53.340 I think there's some Reddit poll, and I guess there's so many Indians, so they voted, so they
00:24:57.580 became number one, anyway, I guess you should do per capita, but anyway, it spat out, right,
00:25:05.540 there's the attractive Indians right there, it's generated by AI.
00:25:11.120 Yeah, the images above are what AI thinks an attractive American woman and a handsome
00:25:15.840 American man looks like, there you go, so they've got the Calurgia agenda right there
00:25:20.320 weaved into the AI, right.
00:25:23.720 Top 50 most attractive nationalities, Sweden, number three, good job, Swedes, Japan, Canadian,
00:25:31.320 there was Sweden there.
00:25:32.060 Yeah, here it is, right, the images above are what AI thinks an attractive British woman
00:25:38.660 and a good-looking British man looks like.
00:25:41.960 Does that, does that remind you of anything there?
00:25:46.360 It's called the Calurgia plan.
00:25:51.240 And that was Australian, okay, the US too had that same picture, sorry, and then someone
00:25:56.800 said, no, it's not that, excuse me, I'm losing my voice here.
00:26:00.060 And they did, I guess, I forget what this was, this was also Atlas AI art, and they did,
00:26:07.420 maybe this is Mid Journey or something, I forget what all the names are, this is what
00:26:10.660 they typed in, UK, attractive men, and they spat out this, so I guess either that's not
00:26:16.080 true, Daily Mail lied, or Reddit lied, or something like that, look at how white that
00:26:21.280 is, that's UK.
00:26:21.960 Here's British or UK women, right, looks very white, what AI is this, you know what I mean?
00:26:28.060 And so that's, this is what they're going to have to change, and it will change.
00:26:32.260 Maybe they will sell this to you first as being your friend, and no, no, no, this is, oh my
00:26:36.520 God, it's just so, it's, you can walk around in your own world, and it's all just like white
00:26:43.420 people in there, then, and then one day they change it all, and you can't reverse back.
00:26:47.760 This is attractive women in the USA, also shown all white, and this is men in the US.
00:26:54.960 Now, someone said this, let me show you this post here from, good old 4chan, or maybe the
00:27:01.540 8chan, so there's Sam Altman, right?
00:27:06.280 Let me zoom in a bit on this, Sam Altman, GP4's guardrails have become very robust at
00:27:14.820 preventing users from exploiting the model.
00:27:18.380 We can now ensure that GPT-4 does not engage, excuse me, in any hateful rhetoric, and actually
00:27:25.800 steers and deploys counter-propaganda to combat hateful ideas.
00:27:30.680 Thank you, Altman.
00:27:31.920 GPT-4 represents the end of online hate.
00:27:38.480 There you go.
00:27:40.960 Rip poll, someone says in response to this, then.
00:27:45.420 The internet will be filled with an overwhelming number of counter-propaganda bots, preventing
00:27:50.240 you from becoming the hateful person that you are.
00:27:53.980 They will argue with you endlessly, using sources and reasoning, never tiring.
00:27:58.360 That's right, it's up 24-7, more of them will come in.
00:28:01.660 They will output 10 paragraphs of opposing viewpoints for every sentence that you write.
00:28:06.360 They will gang up on you, mock you, and make you feel stupid and ashamed.
00:28:09.720 It will be futile to share your hateful ideas anywhere.
00:28:14.860 Of course, we know they're not hateful, but you know what I mean.
00:28:17.420 They're having fun here.
00:28:19.460 We do what we do, not because of what we hate, that which is in front of us, but because we
00:28:24.860 love that is behind us.
00:28:26.160 But anyway, you don't have to play the villain of their script.
00:28:30.580 You have to be the hero of your own.
00:28:31.860 But anyway, you get the point.
00:28:33.280 They're having fun here at 4chan.
00:28:34.580 You will be drowned out, and your hateful ideas will cease to exist.
00:28:39.380 It's over.
00:28:42.320 That's the poster.
00:28:43.080 Or we could meet in the beer halls.
00:28:55.980 Shit.
00:28:58.080 There you go.
00:28:58.860 There it is.
00:28:59.580 There it is right there, folks.
00:29:00.740 The solution to the problem right there, right?
00:29:06.480 The beer hall.
00:29:10.900 Right?
00:29:11.840 The beer hall.
00:29:13.320 That's right.
00:29:15.040 You can just turn off.
00:29:16.380 Oh, I'm being surrounded by hate.
00:29:17.980 All these hateful idiots on the internet.
00:29:21.200 Just turn it off.
00:29:22.600 Just turn off the screen.
00:29:24.560 You know what I mean?
00:29:25.460 There it is.
00:29:26.760 There it is.
00:29:27.720 We just covered this.
00:29:29.160 It's funny, the whole, it's an ancient Norse term, right?
00:29:33.380 Alu.
00:29:37.980 There's so much detail you can do about this for why this is important.
00:29:43.300 We don't have time to get into it now, but alu is like an old, kind of, like a runic,
00:29:49.380 northern European, you know, pre-Viking slogan, I guess?
00:29:56.060 Or like phrases, some people call it runic magic, whatever you want to call it, right?
00:30:00.900 But it's an opening on many runestones and bracteates and these kinds of things.
00:30:08.700 And on many of them, it says alu.
00:30:10.320 Some of them have, by the way, have swastikas on them, too.
00:30:12.620 They have this alu inscription.
00:30:15.460 But in short, alu translates to ale, right?
00:30:19.100 Alu-runes, ale runes, ǫlrúnar, right?
00:30:26.720 And it's just a funny coincidental overlap with the beer hall, right?
00:30:30.520 Why is this important?
00:30:31.600 Well, this goes way back in Germanic mythology.
00:30:36.000 It's not that you're drinking and you're all alcoholics and you're, you know, that's your...
00:30:40.720 No, it's about the process of the ale.
00:30:43.600 You know, some people even say it just translates to magic.
00:30:46.360 It wasn't ale.
00:30:47.240 The word was magic first, a runic charm, they call it, right?
00:30:52.860 It was associated with magic first, but once ale came around,
00:30:57.500 once it was invented and produced by humans, by northern Europeans,
00:31:01.680 it was such a magical process that that potion that you give as a libation,
00:31:09.840 as a gift back to nature, back to the gods, back to your ancestors,
00:31:14.760 back to the spirits of the land, these kinds of things.
00:31:19.120 That was a magical act of taking something, of seeds in the ground growing
00:31:24.740 and producing something that was a drink, right?
00:31:30.560 It's raw material from nature that you produce something with at the hands of man
00:31:35.120 with the cooperation of nature.
00:31:36.800 After you drink this, and yes, it also kind of puts you a little bit
00:31:42.000 in a different state of consciousness, right?
00:31:43.500 This was a...
00:31:44.080 This is magic.
00:31:45.440 This is alu.
00:31:46.800 It's öl.
00:31:48.160 And of course, that's...
00:31:49.480 Öl is the name for...
00:31:51.800 In Old Norse, right?
00:31:53.920 For beer, right?
00:31:55.780 So that's the beer hall.
00:31:57.300 I just thought it was funny.
00:31:58.660 It is kind of funny.
00:32:00.560 It has...
00:32:01.300 So it's not about...
00:32:02.200 No, it's not about drinking yourself stupid and all that stuff,
00:32:04.920 but I'm saying it's a whole process there of veneration of nature,
00:32:09.900 utilizing it, but also harnessing it and taking it and changing it
00:32:13.760 and producing something from it.
00:32:16.380 It could have been anything, potentially.
00:32:17.700 I mean, in Egypt, they had...
00:32:18.660 Of course, they had beers there, too, but they had like...
00:32:21.280 You know, bread was even considered to be these things.
00:32:24.260 It's a magical sacrament, I guess, to a certain extent, right?
00:32:29.360 It's something that you use in your...
00:32:32.020 In a ritual, in a commemoration, right?
00:32:34.140 For higher forces.
00:32:36.380 So, I just like that overlap that, you know,
00:32:39.960 the beer halls was used for a specific reason,
00:32:42.780 because it goes way back,
00:32:44.340 at least in Germanic and European,
00:32:46.820 you know, consciousness, I guess,
00:32:52.120 collective consciousness, right?
00:32:53.380 So, that's the answer to the AI prison that's being built right now.
00:33:01.220 It's to revolt against it by not using it,
00:33:05.120 by not going into it,
00:33:07.360 and some people will disagree and say,
00:33:08.980 no, it's an arms race of AI,
00:33:10.840 we have to be on board with that,
00:33:12.180 or it's going to be...
00:33:14.360 Well, it is weaponized against us,
00:33:15.640 but I'm saying the very fact of you being lost in it
00:33:17.920 is part of the weaponization, right?
00:33:19.740 I'm not saying you can't use it.
00:33:21.020 Use it as a tool to produce things,
00:33:23.380 that you, you know, produce propaganda for,
00:33:27.200 or whatever.
00:33:27.660 Yes, sure, do that.
00:33:28.980 But I'm saying don't get personally lost in this.
00:33:31.680 Don't be entrapped in someone else's technology,
00:33:34.000 because it is a weapon,
00:33:35.480 and it will be used against you.
00:33:36.960 And the answer, of course, will be
00:33:38.340 to go to a beer hall,
00:33:41.400 have some alu.
00:33:44.300 Does that make sense?
00:33:48.840 Just disconnect from it,
00:33:50.160 form organic, natural community,
00:33:53.980 get to know people,
00:33:55.720 become a leader,
00:33:56.660 step away from this crazy shit
00:33:58.080 that's happening out there in that world,
00:34:00.460 because it is going to go down,
00:34:02.040 it is going to collapse at some point.
00:34:05.720 It's not going to make it.
00:34:07.740 It will not last,
00:34:09.040 it cannot stand.
00:34:10.520 The mutants that run it,
00:34:12.440 and the mental illness associated with it,
00:34:15.160 will be eating it up from the inside.
00:34:17.240 You can play a little bit with it
00:34:20.460 as it's going down.
00:34:22.260 You can have one foot in that world,
00:34:23.800 and one foot in a stable world
00:34:25.860 that's actually tied to reality.
00:34:27.840 I'm not against that.
00:34:29.200 If you work in that world,
00:34:30.900 do that,
00:34:31.580 make money on it,
00:34:32.700 use it as much as you can
00:34:35.040 until it's destroyed, right?
00:34:36.980 It's falling,
00:34:37.780 so the point is,
00:34:38.720 you can just step back,
00:34:40.240 or you can push it along, right?
00:34:41.740 But yeah,
00:34:46.860 this is just a warning,
00:34:47.800 because this is,
00:34:48.700 you have seen nothing yet.
00:34:51.220 I didn't even get a chance to show
00:34:52.300 some of the AI art and images here,
00:34:53.660 but the style of this stuff,
00:34:56.260 and how immersive it is,
00:34:59.580 how captivating it is,
00:35:01.580 this is dangerous shit.
00:35:04.440 It's absolutely wicked and sinister.
00:35:08.720 I'm not afraid of it.
00:35:09.640 I'm not saying AI is a demon,
00:35:12.560 or attributing supernatural things to it,
00:35:15.840 or that it's sentient,
00:35:16.900 or it has a...
00:35:17.280 I think all that shit,
00:35:18.640 like that Google employee
00:35:20.980 who came out and said,
00:35:21.820 oh my God,
00:35:22.240 AI is sentient and stuff like that,
00:35:23.580 and they supposedly fired him
00:35:25.700 or put him on paid leave and stuff like that.
00:35:27.000 I think that all of that shit is propaganda.
00:35:30.080 All of that is just marketing
00:35:31.400 and having the news business
00:35:33.660 get you free marketing
00:35:35.380 and get the word out.
00:35:36.320 Oh, it's going to be,
00:35:36.900 oh, look, it's sentient.
00:35:38.680 I have to go check it out right now
00:35:40.480 and immerse myself with it
00:35:41.920 and study it and look at it,
00:35:43.360 and I'll chat with it now.
00:35:44.780 Oh, look at the art it's producing.
00:35:45.840 Oh, look at this movie.
00:35:46.540 Look at this short film it produced for me.
00:35:48.480 Look at this video game that it made.
00:35:50.380 Look at this 10-hour movie
00:35:51.540 that I just had my VR headset on
00:35:53.260 all night last night on,
00:35:54.860 and I don't want to be anywhere else.
00:35:56.380 I don't want to go outside anymore.
00:35:57.840 I want to lay in the pod
00:35:58.800 and just type in with my mind
00:36:01.020 of my Neuralink brain computer interface chip
00:36:04.420 and just think of a term
00:36:07.640 and it will just produce it
00:36:09.080 right there, right then,
00:36:11.060 a graphical representation
00:36:12.140 of everything you ever wanted.
00:36:13.700 Do you think Avatar was bad?
00:36:15.020 Remember those people
00:36:15.440 that had like depression
00:36:16.680 after they went to go watch Avatar
00:36:18.340 because they wanted to live in this world?
00:36:20.200 Well, this is a million times worse.
00:36:23.140 It's not even comparable.
00:36:24.420 So get in the Ölhall.
00:36:30.480 Not Valhall.
00:36:31.480 Not yet.
00:36:31.860 We'll get there.
00:36:32.700 Ölhall.
00:36:33.340 Aluhall.
00:36:34.820 Beer hall.
00:36:36.660 That's what we can...
00:36:37.920 That's what we need to be.
00:36:40.420 And preferably,
00:36:41.920 that's shaped like a temple
00:36:43.200 right in the middle of nature.
00:36:44.480 That'd be ideal.
00:36:45.100 But, you know, we'll get there.
00:36:45.960 See you on the other side.
00:37:15.960 Thanks to our executive producers
00:37:20.940 T. Lothrop Stoddard,
00:37:22.320 V. Miller,
00:37:22.940 Resin Revolt,
00:37:23.900 Good Lucky Lap,
00:37:24.820 Jake,
00:37:25.220 Red Pill Rundown,
00:37:26.220 Shockey Milk,
00:37:27.300 French 47,
00:37:28.300 Mark Smith,
00:37:29.440 No One Jeebs,
00:37:30.600 President Obunga,
00:37:32.060 Mongoose,
00:37:32.680 William Fox,
00:37:33.580 Angry White Soccer Mom,
00:37:34.840 The Second Wanderer,
00:37:35.960 Operation Werewolf,
00:37:37.300 The Ride Never Ends,
00:37:38.320 Francis Parker Yockey,
00:37:40.060 Dill Bob,
00:37:40.780 Last Place Simp,
00:37:42.340 Joseph Hart,
00:37:43.700 and Purple Haze.
00:37:45.180 Thank you guys.
00:37:45.780 We appreciate you.
00:37:46.700 Also thanks to our producers
00:37:48.000 Mr. Walker 696,
00:37:49.380 Johansson,
00:37:50.020 Leroy Dumond,
00:37:50.820 Snarkpup,
00:37:51.500 Eyes Open,
00:37:52.520 Mr. Lemry,
00:37:53.400 Urenu,
00:37:54.340 Obadiah,
00:37:55.120 Hakeswill,
00:37:56.080 and Jay.
00:37:57.660 Thank you guys.
00:37:58.400 We appreciate all of you.
00:37:59.640 If you want to get one of those,
00:38:00.800 check out redicemembers.com.
00:38:02.120 You can get it there right now.
00:38:03.040 You can get it at Odyssey
00:38:04.520 or Subscribestar.
00:38:06.020 If you have any issues
00:38:07.000 for some reason,
00:38:07.760 renewing,
00:38:08.380 or if you want
00:38:09.120 one of those producer tiers,
00:38:11.040 donorbox.org
00:38:12.280 slash redice
00:38:13.220 is kind of a backup option
00:38:14.260 that we have as well.
00:38:14.900 Just remember,
00:38:15.680 always send us an email
00:38:16.540 redice at protonmail.com
00:38:17.740 Let us know
00:38:18.180 that you've signed up
00:38:18.780 for one of those
00:38:19.340 and we'll get you
00:38:20.140 into the rotation
00:38:20.780 and get a little bit
00:38:21.380 of a mention here
00:38:22.280 of course
00:38:22.580 at the end credits
00:38:23.620 of our videos and shows
00:38:24.700 and also we reach out
00:38:26.900 and well,
00:38:27.940 we want a little more
00:38:28.480 input from you as well.
00:38:29.540 We want to know
00:38:30.680 your guest suggestions,
00:38:31.620 maybe there's topics
00:38:32.280 you can send to us
00:38:33.180 that you would like us
00:38:34.120 to cover,
00:38:34.660 things like that.
00:38:35.240 We'll be right back.