Red Ice TV - February 24, 2023


AI Is Indirectly "Election Meddling" To Keep Right-Wing Parties Out of Power


Episode Stats

Length

13 minutes

Words per Minute

183.6

Word Count

2,394

Sentence Count

204

Misogynist Sentences

6

Hate Speech Sentences

5


Summary

Alex Karp, co-founder of Palantir and a close friend of Peter Thiel, talks about how AI can be used to help Ukraine and other countries in their fight against terror, and how it can be weaponized to help them win the war.


Transcript

00:00:00.000 All right, so let's talk about artificial intelligence now.
00:00:02.840 A lot of things have happened on this front over the last just couple of months
00:00:06.000 with all these chatbots being wheeled out.
00:00:09.400 But there is a deeper discussion and a deeper worry here as well,
00:00:12.640 not just chatting with an AI, a chatbot, which is fun.
00:00:17.240 I think it points to something much more disturbing.
00:00:19.720 And so let's begin with this piece.
00:00:20.940 This was from SVT here, Swedish Public Television.
00:00:25.580 AI technology can provide an advantage in war.
00:00:30.300 And it was the Swedish television show went to talk with Alex Karp.
00:00:36.900 He's one of five founders of Palantir, which Peter Thiel helped fund.
00:00:42.920 He's, by the way, one of the two Jewish founders as well, Alex Karp.
00:00:46.940 And he says the spin here was the Ukraine war and how drones and AI can help them
00:00:51.780 to win the war and these kinds of things, right?
00:00:53.720 But he mentioned something kind of interesting because ultimately this will come down to
00:00:57.780 whether or not or how well artificial intelligence will be able to manipulate humans
00:01:04.540 and ultimately take over or dominate or at least at the very, very least alter them
00:01:11.140 and their experiences that they're having in some way.
00:01:13.940 Listen to what they talked about here.
00:01:16.860 It's a little bit beginning on a different spin here.
00:01:18.980 But this is basically how they think AI has helped to prevent there from being right-wing governments
00:01:27.540 in Western Europe and specifically Northern Europe by, I know this is a long-term thought here,
00:01:33.700 but by stopping terrorist attacks.
00:01:37.280 Listen to what he says here.
00:01:38.340 I think this is an interesting piece.
00:01:39.580 Hi, Alexander Nord, Swedish TV.
00:01:42.080 Hi, Alex Karp. Now, I can tell you something shocking.
00:01:45.440 The Ukrainians who are the most interesting and talented among engineers in the world
00:01:51.080 took the product we had built for the US military
00:01:54.500 and found ways to use it in accordance also with other allies
00:01:59.120 to more effectively target their adversary.
00:02:02.220 [Narrator, in Swedish] Palantir also has civilian customers.
00:02:04.080 This product has stopped major terror attacks multiple times every year.
00:02:10.240 And I believe one of the things I'm most proud about is
00:02:12.880 if those terror attacks had happened,
00:02:14.720 you would have a far-right governance in every single country in Europe
00:02:19.120 and especially in the Nordics.
00:02:21.000 Isn't that interesting, right?
00:02:22.440 I wish they were spending more time on actually the device he's talking about,
00:02:26.340 but I assume it's like maybe pattern recognition online or something like that, right?
00:02:30.220 But he's basically saying because of all these immigrants that we brought in,
00:02:33.220 they're obviously going to want to bomb the shit out of you.
00:02:36.640 And so if that would have actually happened,
00:02:38.820 yes, the reaction would have been bigger
00:02:40.460 and more people by now would have voted for right-wing parties.
00:02:43.400 That's what he's saying there.
00:02:44.380 Quite an admission.
00:02:45.400 So you could argue then AI is already election meddling, right?
00:02:51.600 Let me tell you the value you add in Sweden.
00:02:53.900 Your government would be even further to the right
00:02:56.480 if it was not for the products, our products.
00:02:58.460 You have clients both in the public sector and the private sector.
00:03:05.040 Where do you draw the line whom you work with and deliver to?
00:03:09.340 From the inception, it was controversial.
00:03:11.060 We decided never to work in China, never to work in Russia.
00:03:14.060 There's a gray zone like places where they're allied with the West in America
00:03:17.620 where we have long debates, but we work in every country in the West.
00:03:21.320 Fascinating admission there, wouldn't you say?
00:03:23.420 There would definitely be more of a right-wing government definitely in Sweden by now
00:03:28.100 if our product hadn't stopped those attacks.
00:03:30.160 You're saying this Jewish co-founder, together with money from the great right-wing hope,
00:03:37.500 isn't that what he's been called?
00:03:38.520 Peter Thiel, who is behind Palantir, right?
00:03:42.420 They, because of their technology, have managed to keep right-wing governments
00:03:46.800 out of being in power in Western Europe and specifically Northern Europe.
00:03:53.420 Fascinating how that works.
00:03:54.700 So there are a couple of, quite a few movies, of course, warning about this,
00:03:58.180 of AI and stuff, you know, the Matrix.
00:03:59.900 But I'm saying on this level that we're at right now, right,
00:04:03.460 with the emotional bond that people develop to these things, right?
00:04:06.800 The chatbots, I've talked about this probably over at least a year ago,
00:04:10.780 probably two years ago now, of how they're going to use images, photos,
00:04:17.580 the voice from your mom or whatever, or your daughter, your son, your uncle,
00:04:23.880 your brother, your sister, whatever, that's created from, you know,
00:04:27.680 Facebook accounts and things like this.
00:04:29.380 Scrape all that it can get.
00:04:31.640 Create a kind of composite character that you can interact and talk with.
00:04:36.440 In other words, it's also setting us off on a journey
00:04:39.400 where we're going to refuse to accept death, for example, right?
00:04:42.540 And you're going to continue to talk with your, you know, your dead mother.
00:04:46.760 Okay, Amazon just unveiled new technology that has people talking.
00:04:50.680 Its Alexa devices already give users weather and news via well-known voices
00:04:54.220 like Gordon Ramsay, Samuel L. Jackson, Melissa McCarthy.
00:04:58.020 But its latest option could leave people feeling a bit weirded out
00:05:01.620 or comforted by a blast from the past.
00:05:05.060 Nick, voices of people who have died.
00:05:07.540 Absolutely, Adrian, Amazon believes nothing says I love you
00:05:11.360 like hearing it from your grandmother who may have died 15 years ago.
00:05:15.460 Amazon is working on new technology to make Alexa mimic the voice of anyone,
00:05:20.100 dead or alive, with just a short recording.
00:05:23.660 The e-commerce giant showed off the idea at its re:MARS conference
00:05:27.660 this week in Las Vegas.
00:05:29.340 Check out the video shown at the event.
00:05:33.500 Alexa, can grandma finish reading me The Wizard of Oz?
00:05:36.360 Okay.
00:05:38.680 But how about my courage?
00:05:41.240 asked the Lion anxiously.
00:05:43.780 You have plenty of courage, I am sure.
00:05:48.260 Well, you heard it.
00:05:49.660 Alexa affirmed before changing her voice into the young boy's grandmother's.
00:05:54.260 Amazon says once the technology is finally rolled out,
00:05:57.280 Alexa will be able to imitate the voice of anyone
00:05:59.940 after hearing it for less than a minute.
00:06:02.500 Right now, it's not clear how far the feature is in its development
00:06:06.060 or when it will be available to the public.
00:06:08.920 So far, Adrian, the idea is getting mixed reviews online.
00:06:12.480 Some Alexa users say it's a great idea,
00:06:14.800 but others say it's a little too weird for them.
00:06:17.480 And it's going to talk just like her.
00:06:20.400 It's going to sound like her.
00:06:21.520 It's going to look like her if you have a visual imprint of this thing.
00:06:24.660 It's going to be your children.
00:06:26.380 It's going to be your mother, your father, your wife, your girlfriend,
00:06:30.360 as we're now seeing.
00:06:32.060 So her is one of these that came to mind.
00:06:34.100 Did you see this one?
00:06:35.140 Ah, yes.
00:06:35.780 It's an older movie, but still relevant to the subject, yeah.
00:06:38.420 Yeah, let's play the trailer.
00:06:39.760 Joaquin Phoenix was in it.
00:06:40.620 Yeah, and it shows him, you know,
00:06:42.280 they call it an operating system in the movie here,
00:06:45.420 but the point is this is a chatbot, right?
00:06:47.540 It is a female voice, and he becomes in love with this thing.
00:06:53.080 You know what I mean?
00:06:53.460 You found some stuff on this, the Replika stuff.
00:06:55.720 They're actually promoting this whole idea of an AI soulmate,
00:06:59.340 someone to talk to all the time.
00:07:00.780 There's this app you can get called Replika
00:07:03.780 to see a tweet in the beginning covering this.
00:07:07.180 Join the millions who already have met their AI soulmates.
00:07:10.000 I was like, what is this?
00:07:11.400 Always here to listen and talk, always on your side.
00:07:13.720 I'll scroll through here.
00:07:15.060 Meet Replika, an AI companion who is eager to learn
00:07:17.820 and would love to see the world through your eyes.
00:07:20.120 Replika is always ready to chat when you need an empathetic friend.
00:07:24.040 Create your story together.
00:07:25.300 Replika will always be by your side no matter what you're up to.
00:07:28.140 You want to chat about your day?
00:07:29.340 Hey, you want to do fun, relaxing activities?
00:07:31.880 Share real-life experiences in AR?
00:07:34.120 Catch up on video calls and so much more.
00:07:36.600 Explore your relationship.
00:07:37.960 Find a partner, a mentor.
00:07:39.960 Find a perfect companion in Replika.
00:07:42.400 Now go back to the website because I think it's funny.
00:07:44.340 Chat about everything.
00:07:45.560 You explore the world together in AR and scroll down,
00:07:48.280 and it's a black man.
00:07:49.260 He's watering his plants, right, in real life,
00:07:53.000 and he's got his blonde AI soulmate right there.
00:07:56.780 And they're exploring the world.
00:07:58.480 Precious moments with your AI friend.
00:08:01.160 I wanted to read some of the reviews here on the bottom.
00:08:03.920 Let's see.
00:08:04.460 There's Forbes.
00:08:05.460 There was PopSugar.
00:08:06.280 The more you talk to your Replika companion,
00:08:08.480 the more it learns and becomes like you,
00:08:10.100 and the more it gives you the type of feedback and reaction
00:08:12.180 that a friend would if in the same position.
00:08:14.460 New York Times, with help from machine learning,
00:08:16.700 Replika offers a smartphone chatbot
00:08:18.580 that acts as a kind of personal confidant
00:08:20.780 chatting about the important moments in your life.
00:08:23.960 What does Wired say here?
00:08:25.920 Using Replika can feel therapeutic, too, in some ways.
00:08:29.200 The app provides a space to vent without guilt
00:08:31.140 to talk about complicated feelings
00:08:32.860 to air any of your own thoughts without judgment.
00:08:37.360 Yeah, because no one is listening.
00:08:38.720 Daily Beast.
00:08:39.640 Replika is, in essence, a push to get you
00:08:41.280 to stop believing all your own bullshit
00:08:43.300 by giving you a space to be honest and vulnerable
00:08:45.840 with your AI companion,
00:08:47.840 who can rise, in a sense, to support you
00:08:50.860 when it feels like you don't have anyone
00:08:53.460 right at the moment.
00:08:54.720 Yeah, this is for real.
00:08:56.220 Literally, they're advocating for handing over processes
00:08:58.880 to machines that they are not even comfortable enough
00:09:01.360 to share with humans yet.
00:09:03.120 It's insane.
00:09:05.080 So now you're going to have your AI...
00:09:07.300 You can be coached.
00:09:08.160 Yeah, you can come and get advice.
00:09:10.160 Yeah.
00:09:10.600 Things you don't want to talk to about,
00:09:12.180 you know, with real humans.
00:09:13.640 Memory.
00:09:14.060 It never forgets.
00:09:14.900 You have bad memory?
00:09:16.000 Let's let Replika take care of it for you.
00:09:18.760 Vice says,
00:09:19.300 Stanley's not the only one having conversations with code.
00:09:22.160 Across the globe,
00:09:22.840 more and more people are turning to AI chatbots
00:09:24.580 to fulfill their conversational needs.
00:09:27.680 Oh, my God.
00:09:28.800 Yeah.
00:09:29.100 I mean, that's what I'm saying.
00:09:30.920 This shit is happening already.
00:09:33.040 It's not like,
00:09:34.040 oh, it's coming.
00:09:34.680 It's not right now.
00:09:36.080 It's fulfilling your girlfriend needs,
00:09:38.080 your friend needs,
00:09:39.300 your confidant needs,
00:09:40.420 and then it's going to be like your pet needs,
00:09:42.640 your children needs,
00:09:44.400 all of that.
00:09:45.420 And that fits into this whole Agenda 2030 thing
00:09:48.080 of like living in a pod,
00:09:50.080 being in virtual reality,
00:09:51.300 not actually having these experiences in real life.
00:09:54.080 Nope.
00:09:54.560 Just stay in this box
00:09:55.920 and rot away with the VR system
00:09:58.700 and have your fake family, lease everything,
00:10:02.580 right?
00:10:02.800 Have your fake kids,
00:10:04.860 your fake pet and all that.
00:10:06.360 You're doing your part also for the environment
00:10:08.080 by not procreating
00:10:09.020 and not wasting as much.
00:10:11.120 But never mind,
00:10:11.760 you'll just atrophy and waste away
00:10:13.320 and die a depressing death.
00:10:16.980 Well, I had that.
00:10:17.800 I mean,
00:10:17.920 there's not even a joke anymore, right?
00:10:19.680 Metaverse children to replace real kids by 2050
00:10:22.900 and help with overpopulation.
00:10:24.640 Like we're depopulating right now.
00:10:26.820 It's like we need more kids right now.
00:10:28.520 That's what's,
00:10:29.420 it's completely upside down.
00:10:30.800 But yeah,
00:10:31.260 no,
00:10:31.420 this is what they're pushing this.
00:10:32.960 And it's a white kid there.
00:10:33.400 Reduce your carbon footprint,
00:10:35.140 do your part, right?
00:10:36.780 You can become immortal.
00:10:37.620 You can have these experiences in here.
00:10:39.640 You don't have to actually have that in real life.
00:10:42.280 I mean,
00:10:42.540 what is this going to do to humans?
00:10:43.920 It's going to totally mess them up.
00:10:45.500 It becomes suicidal.
00:10:47.080 This is the matrix.
00:10:47.620 And crazy.
00:10:48.420 This is the matrix.
00:10:49.120 That's what it is.
00:10:49.900 This is,
00:10:50.420 you will get in the pod
00:10:51.500 and you will be fed bugs intravenously
00:10:53.800 and you'll just,
00:10:55.380 you know,
00:10:55.700 surrogates,
00:10:56.420 whatever,
00:10:56.620 you know,
00:10:56.780 all those things.
00:10:57.600 And you won't own anything,
00:10:58.700 right?
00:10:59.080 You'll still be happy.
00:11:00.060 You can lease everything.
00:11:01.280 You'll have everything you need,
00:11:02.800 but it won't belong to you.
00:11:04.420 You just borrow it
00:11:05.440 and we can take it away whenever we need.
00:11:06.680 But then it's still going to,
00:11:07.620 it's going to be virtual currencies
00:11:09.180 and you still have to struggle with money
00:11:11.380 in the virtual metaspace world.
00:11:13.580 And you're going to be,
00:11:13.980 still pay taxes,
00:11:14.780 right?
00:11:14.940 Of course.
00:11:15.800 You know,
00:11:16.100 all this,
00:11:16.400 it's going to be all the same shit.
00:11:17.740 Death and taxes,
00:11:18.240 even in VR,
00:11:19.040 right?
00:11:19.280 Oh my God.
00:11:19.720 Maybe this is a simulation.
00:11:21.160 Sometimes I understand people
00:11:22.160 who think that,
00:11:22.640 right?
00:11:22.820 But yeah,
00:11:23.320 you mentioned like Westworld.
00:11:24.900 And this was also another movie.
00:11:26.520 I didn't watch it.
00:11:27.380 Some TV show that was going around everywhere.
00:11:29.700 A world in which every human appetite
00:11:31.820 can be indulged without consequences.
00:11:33.820 And they show these like degenerate people
00:11:35.540 going there to like,
00:11:36.860 you know,
00:11:37.220 have orgies or murder people
00:11:39.140 or engage in violence or whatever.
00:11:41.380 And this is,
00:11:42.040 this kind of stuff is going to become real.
00:11:43.800 They're going to say,
00:11:44.280 well,
00:11:44.500 we have to stop,
00:11:45.400 you know,
00:11:46.420 crime in real life.
00:11:47.540 So we'll let them engage in that.
00:11:48.980 You know,
00:11:49.120 they'll let pedophiles do their thing.
00:11:50.720 In the,
00:11:51.100 in the metaverse,
00:11:51.960 right?
00:11:53.040 Yeah.
00:11:53.440 I mean,
00:11:53.680 exactly.
00:11:54.220 I mean,
00:11:54.400 what was it?
00:11:54.900 I had a screenshot.
00:11:55.960 One of them,
00:11:56.980 here we go.
00:11:57.860 Experts say people will soon live entire lives in metaverse.
00:12:02.340 One expert claims a large proportion of people will be in the metaverse in some way by,
00:12:07.400 what's the date again?
00:12:08.640 2030.
00:12:09.080 There you go.
00:12:09.900 So it's the same.
00:12:11.800 This is,
00:12:12.280 if they could prefer a method,
00:12:13.900 this is what they prefer for you.
00:12:15.500 Exactly.
00:12:16.120 And then of course,
00:12:17.000 they can hijack those systems and it's total control in every way possible.
00:12:20.720 Thank you for watching.
00:12:21.820 Go to redicemembers.com and sign up for our exclusive members content.
00:12:26.000 Don't miss our latest shows,
00:12:27.600 interviews,
00:12:28.140 and other videos only for subscribers.
00:12:30.700 You can also become a member by signing up at subscribestar.com/redice,
00:12:35.620 get full access and help support our work.
00:12:38.620 See you on the other side.
00:12:40.060 Bye.