ManoWhisper
Relatable with Allie Beth Stuckey
- July 20, 2023
Ep 842 | The Elites’ Plan to Replace God With AI | Guest: Justin Haskins (Part Two)
Episode Stats
Length
46 minutes
Words per Minute
172.83
Word Count
7,954
Sentence Count
417
Misogynist Sentences
5
Hate Speech Sentences
8
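The words-per-minute figure follows from the other stats; a quick sanity check (the exact duration in seconds is not shown, so dividing by the rounded 46-minute length lands near, not exactly on, the listed value):

```python
# Sanity-check the words-per-minute stat from the word count and length.
# The page likely divides by the exact duration in seconds; using the
# rounded 46-minute length gives a nearby, not identical, value.
word_count = 7954
minutes = 46
wpm = word_count / minutes
print(round(wpm, 2))  # 172.91, close to the listed 172.83
```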
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo).
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000
Imagine a future in which everything is controlled by artificial intelligence.
00:00:05.220
I'm not just talking about your smart home.
00:00:08.620
I am talking about our legal system.
00:00:11.920
I'm talking about major international decisions like whether to launch a nuclear attack on
00:00:20.240
another country.
00:00:21.460
That might sound like a crazy dystopian conspiracy theory, but that is where the world's most
00:00:29.000
powerful people are taking us into a future that is completely and totally controlled
00:00:36.020
by artificial intelligence.
00:00:38.900
Why in the world is this happening?
00:00:41.000
How is it happening?
00:00:42.280
What can we do to stop it?
00:00:44.120
How does this play into other parts of their progressive agenda to reduce the population,
00:00:51.060
to break down the family, to weaken the minds of individuals, to open up borders?
00:00:57.640
We're going to talk about all of this today.
00:01:00.040
If you haven't listened to part one of this conversation with Justin Haskins, the author
00:01:04.540
of Dark Future, Uncovering the Great Reset's Terrifying Next Phase, go do that.
00:01:09.020
That was yesterday's conversation.
00:01:10.460
Today is part two.
00:01:11.980
You don't want to miss either one of them.
00:01:13.560
They're so informative.
00:01:15.020
But as I said yesterday, they're also interlaced with some optimism, some real practical tips
00:01:20.440
for what we can do to push back against this revolution that we absolutely do not want to
00:01:27.180
succeed.
00:01:28.200
This episode is brought to you by our friends at Good Ranchers.
00:01:30.500
Go to GoodRanchers.com.
00:01:31.820
Use code Allie at checkout.
00:01:32.900
That's GoodRanchers.com.
00:01:34.000
Code Allie.
00:01:34.580
Going back to, because we didn't even really address a lot of the stuff that you said at
00:01:48.300
the very beginning that you guys are covering in your book, the AI piece of this, the technology
00:01:54.120
piece of this, you used a lot of terms that I had never heard of before.
00:01:57.540
I think you said quantum computing, something like that.
00:02:00.500
So I want to kind of marry what we've been talking about with the AI technology segment
00:02:06.240
of it by asking you about the story that we talked about last week.
00:02:09.720
I don't know if you saw it.
00:02:10.760
I saw it as just a Twitter thread.
00:02:12.240
I don't think it, I don't know if it actually became like a big story, but this guy, and
00:02:17.320
this ended up being verified by some of the reporters who covered it, he said that he basically
00:02:24.020
has a smart home.
00:02:25.140
He's really into technology, and he uses the Amazon Ring device, and a driver, an Amazon
00:02:32.900
driver, I guess, dropped off a package, and this Amazon driver claimed that he heard something
00:02:38.360
racist through the Ring doorbell.
00:02:41.360
And a couple days later, all of the Amazon devices, all of the smart devices that are run
00:02:47.680
by Amazon in this guy's house got shut down.
00:02:51.340
He didn't know what was going on, so he called Amazon.
00:02:53.740
He said, what's the deal?
00:02:55.180
Why is my stuff shut down?
00:02:57.900
And after a while, he got in contact with one of the higher-ups at Amazon who said, look,
00:03:03.060
here's the deal.
00:03:03.920
Our Amazon driver told us that he heard something racist come through your Ring doorbell, and
00:03:09.380
so we reserve the right if you don't comply with our terms of service, which I guess includes
00:03:15.800
something like this, to shut down your devices.
00:03:18.720
I mean, this could be a really big deal depending on, like, if you have, like, a Nest, like a
00:03:23.560
Google Nest or the different things that control your house, that could be a really big deal
00:03:27.560
if a company decides to shut down the features in your home that you actually rely on and
00:03:33.760
increasingly rely on for important things like air conditioning and security and things
00:03:38.540
like that.
00:03:38.960
Now, it turns out, after an investigation, that there was no possible way for something racist
00:03:45.920
to be conveyed through the Ring doorbell.
00:03:48.520
The guy wasn't at home.
00:03:49.960
I don't know why the heck this Amazon driver, who apparently was wearing headphones, would
00:03:54.120
even say this kind of thing, and Amazon did turn his devices back on.
00:03:59.520
But we should not be comforted by that.
00:04:01.580
We shouldn't be comforted by how the story ended up concluding.
00:04:06.040
Even if there was something that was perceived as racist that went through someone's Ring
00:04:10.380
doorbell, really, a company reserves the right to deactivate your account, your smart home
00:04:17.540
based on that?
00:04:18.640
So when we're talking about these global elites, not just using the financial institutions,
00:04:23.320
but also technology and artificial intelligence to basically force us into complying with their
00:04:29.660
new vision, their new moral tenets.
00:04:32.900
Like, is this an example of something that you're talking about?
00:04:38.320
Yeah, this is definitely the kind of thing that we talk about in the book.
00:04:43.040
I did see that story.
00:04:44.160
It is incredible.
00:04:45.620
And Glenn has been warning about this kind of thing for a very, very long time.
00:04:50.960
The more interconnected and dependent we become on technology, the easier it is to control
00:04:56.900
and manipulate people's behavior.
00:05:00.860
Now, in this case, you could say, well, this was just an honest mistake and this person
00:05:05.760
said something racist or, you know, didn't really say something racist, but they thought
00:05:10.680
that they said something racist.
00:05:11.980
But that's not the point, right?
00:05:13.640
The point is, should something you say lead to, you know, all of your smart devices effectively
00:05:20.500
turning off your home?
00:05:21.780
Right.
00:05:23.120
In your own home.
00:05:25.200
Right.
00:05:25.600
That's the kind of power that we're handing over to these tech companies.
00:05:29.080
Now, if you were to chart the insanity of the kinds of things we talk about in Dark Future,
00:05:36.400
in the book Dark Future, and you were to chart this particular story on the scale, it would
00:05:42.820
be on the extremely mild end of this issue.
00:05:46.620
OK, compared to the stuff that we talk about in this book, that's like nothing.
00:05:54.240
When we start talking about artificial intelligence and the way that that is being manipulated and
00:05:59.420
controlled and designed so that the systems that are being run on artificial intelligence
00:06:04.560
are designed to create certain kinds of outcomes in everything that they do so that there's more
00:06:14.720
equity and so that there's more inclusion and all of this other stuff, they're literally
00:06:19.620
manipulating artificial intelligence.
00:06:22.100
And that's becoming a huge part of our lives.
00:06:26.180
One of the things that I believe is in the book is this: most people don't know it, but there are governments that use artificial intelligence in sentencing decisions.
00:06:36.120
So when they're deciding you've been convicted of a crime and now they're deciding what your
00:06:40.020
sentencing decision is, there are governments that use artificial intelligence to tell them
00:06:44.280
what they think the sentencing decision should be.
00:06:47.220
That's the kind of thing that people are using this technology for now.
00:06:52.420
That's terrifying because this technology is not unbiased.
00:06:56.580
I've never used ChatGPT or anything like that, but people mess around
with, you know, these kinds of AI communication devices just to see what they'll say.
00:07:07.200
And there's an obvious left wing bias in a lot of these cases, like they'll say positive
00:07:13.920
things about Joe Biden, but they won't say positive things about Donald Trump.
00:07:17.560
And they'll claim that it's based on these very neutral ideals of, you know, equality and
00:07:22.940
love and tolerance and all of these things.
00:07:25.960
And so, I mean, there are still humans creating AI. Like, they are still controlled
basically by, I don't know how all of it works, but the formulas and the algorithms
and the intelligence that are originally computed or installed in them.
00:07:43.120
And so there's still going to be a bias there.
00:07:45.180
So if you say, hey, AI, we give people with white skin more, you know, a higher sentence
00:07:54.660
to try to create equal outcomes, which is something, by the way, that is like we've seen in the
00:08:01.780
equity agenda happening in the United States, even without AI.
00:08:05.040
But you can imagine that something like that would be computed.
00:08:10.500
Something like that would then be the conclusion of something like this.
00:08:18.020
Whatever agenda, the people who are in charge of these artificial intelligence devices,
00:08:22.520
whatever agenda they have, that's going to be the agenda that then controls and affects our lives,
00:08:27.420
even when it comes to something like, quote unquote, criminal justice, which is absolutely terrifying.
00:08:32.740
I mean, I get terrified when I think about artificial intelligence running planes, running
00:08:36.760
transportation, which seems to be, again, the direction that we're going now.
00:08:41.860
And you just wonder if all of the chaos that's being induced in these different industries is
00:08:46.080
supposed to lead us towards wanting artificial intelligence to kind of control everything.
00:08:52.640
Yeah, I mean, in the book we talk about this.
00:08:55.420
There are real experts, national security experts, who are truly advocating for using artificial
00:09:05.480
intelligence to control the launch of nuclear weapons to determine whether nuclear weapons
00:09:13.680
should be launched against enemies.
00:09:15.520
And the idea is because the artificial intelligence can actually be less biased and can be more
00:09:21.860
cool and calm and collected in the heat of the moment than a human being.
00:09:26.100
And so they should actually maybe be in charge of weapons, and we shouldn't actually be.
00:09:31.320
There are real people at the Rand Corporation, which is a very well-funded establishment think
00:09:36.660
tank calling for that kind of thing.
00:09:39.360
That's exactly the kind of stuff that we talk about in the book.
00:09:42.800
One of the things that I want to mention, because you said, well, you know, these things are being
00:09:46.880
designed with these kinds of biases in them.
00:09:50.040
You can see that there's no doubt about that.
00:09:51.580
That's part of the plan.
00:09:52.660
And the bias is only going to get worse.
00:09:54.780
And this is coming not just from these big tech companies and just because there's left-wing
00:09:59.320
people that happen to work there, but it's also something that governments are pushing
00:10:03.020
and financial institutions are pushing all through.
00:10:05.600
That's all part of the Great Reset thing that we've been talking about throughout this episode.
00:10:10.520
So one of the examples of that from the Biden administration itself is that in October of this
00:10:16.360
past year, 2022, the White House released an AI Bill of Rights.
00:10:20.860
OK, and in the AI Bill of Rights, there was a section called algorithmic discrimination
00:10:27.100
protections.
00:10:28.100
So this is what this is the official position of the Biden administration on AI algorithmic
00:10:33.720
discrimination protections.
00:10:35.020
And in the section, it says that algorithmic discrimination occurs when automated
systems contribute to unjustified different treatment or impacts disfavoring people based on
their race and other factors, a bunch of other things.
00:10:52.160
So what they're saying is not just, well, when AI discriminates against someone based on race,
00:11:00.240
that's bad.
00:11:00.900
They're saying when the outcome is different.
00:11:03.280
So if the outcome for women and men or for certain Asians and Hispanics, if it's different,
00:11:10.380
then that is evidence of discrimination in AI.
00:11:13.060
So then what it says is that they should use proactive, this is a direct quote, AI designers
00:11:20.020
should use proactive equity assessments as part of the system design.
00:11:25.340
So they should design the system.
00:11:28.760
They should design the AI system so that equity is the result of what the AI is producing, right?
00:11:36.380
That means, literally, what this is saying is: let's design AI not so that it
00:11:41.900
comes to whatever it thinks is the best conclusion based on the evidence, but based on what we
00:11:47.120
want the outcome to be.
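[Editor's note: the test described here, treating different outcomes across groups as evidence of discrimination, loosely corresponds to what the fairness literature calls a demographic-parity check. A minimal sketch, with hypothetical data, group labels, and flagging threshold:]

```python
# Minimal sketch of an outcome-parity check of the kind the AI Bill of
# Rights passage describes: flag a model when approval rates differ
# across groups, regardless of why. All data and the 0.25 threshold
# below are hypothetical, for illustration only.
from collections import defaultdict

def outcome_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(decisions):
    """Largest difference in approval rate between any two groups."""
    rates = outcome_rates(decisions)
    return max(rates.values()) - min(rates.values())

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
print(outcome_rates(decisions))        # A approves ~0.67, B ~0.33
print(parity_gap(decisions) > 0.25)    # True -> flagged under a 0.25 threshold
```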
00:11:49.620
And we've seen this.
00:11:51.140
We've seen this over and over and over again.
00:11:53.960
Lael Brainard, who used to be in the Fed, and now she's the head of the National Economic
Council for the Biden administration.
00:12:00.300
She said that it was our collective responsibility to build appropriate guardrails and protections
00:12:07.080
against racial bias, including that we, quote, ensure that AI is designed to promote equitable
00:12:13.840
outcomes.
00:12:15.240
OK, this is one of the chief economic advisors to the president, who used to be in
00:12:19.260
the Federal Reserve up until very recently saying, yeah, we need to design AI so that the outcome
00:12:24.280
is more equitable.
00:12:25.500
Well, that just means rigging the system. And AI is becoming a bigger part of our
lives. Some people hearing this might say, well, so what?
00:12:36.040
You know, I don't use ChatGPT.
00:12:37.680
So, you know, what does any of this matter?
00:12:40.340
Well, it matters because AI is becoming a huge part of all of corporate America, of every service
00:12:47.320
and good that you use.
00:12:48.620
Every time you go on Google search, you are using AI.
00:12:52.340
Every time you use Yelp, you are using AI.
00:12:55.520
Every time you use your smartphone, you are using AI.
00:12:59.040
Every time you ask a bank for a loan, they are running that through AI systems.
00:13:04.020
Every time you go to the insurance company and ask for insurance, they're running it through
00:13:07.580
AI systems.
00:13:09.040
The government is using AI in all kinds of different systems, and it's only going
00:13:13.340
to become a bigger part of our lives.
00:13:15.320
So you may think you're not dependent on this technology, but you actually are, and
00:13:21.880
it's only going to get worse as time goes by.
00:13:24.180
And this is exactly why they are embedding ESG and DEI and all of this stuff into these
00:13:31.580
technologies today.
00:13:32.480
Because someday you'll wake up and you'll get denied a loan by some AI banking algorithm,
00:13:37.260
and you won't know why.
00:13:39.260
And it won't make any sense to you.
00:13:40.860
And no one will actually tell you, but the real reason will be because of policies like
00:13:46.800
this algorithmic discrimination protection put into place by the Biden administration.
00:13:53.720
Gosh, my head is really spinning right now.
00:13:56.080
There are so many philosophical and especially theological implications of all of this when
00:14:02.460
I'm thinking about how exactly we got here.
00:14:04.660
I mean, obviously, these people in charge, in order to be in charge, in order to really
00:14:09.580
fully enact their agenda, they need a bunch of compliant people.
00:14:13.860
They need groupthink.
00:14:15.540
And I think COVID was a great test case for them.
00:14:17.720
Like, what can we make people do?
00:14:19.760
What can we make people sacrifice?
00:14:21.480
How can we radically change society?
00:14:23.300
How much will people really tolerate and for how long?
00:14:27.900
And how can we stifle some information that we don't like, elevate other information that
00:14:32.880
we do like, not necessarily because it's true or because it produces healthy outcomes,
00:14:37.560
but because it produces the societal transformation outcomes that we like.
00:14:42.720
Basically, a bunch of compliant, weak, group thinking people.
00:14:48.040
And so it makes sense to me that they would create robots that literally have no other choice,
00:14:53.860
have no agency or autonomy to think in any way that contradicts them.
00:14:59.060
They don't really have the ability.
00:15:00.360
Well, maybe they kind of do, but to independently, like, think and reason.
00:15:05.260
And so it makes sense that they would try to use artificial intelligence that they can
00:15:10.180
completely program to control the people that they aren't fully able to program yet,
00:15:18.980
if that kind of makes sense.
00:15:21.040
And obviously, the long-term goal is control.
00:15:25.260
It is to force this kind of so-called equity agenda, which we know just doesn't naturally
00:15:33.160
work.
00:15:33.720
Like, even though you said this is not a socialist movement, and I understand that because that's
00:15:38.800
not really where it ends up.
00:15:40.440
It's not really a communist movement, but it is a collectivist movement.
00:15:45.420
It is similar in the sense that it's the people on top collect all of the power, and everyone
00:15:51.000
down here has the same amount of nothing.
00:15:53.620
So it is the same kind of concept, even though it's not, by definition, the exact same structure
00:16:00.600
or the exact same premise.
00:16:02.260
And it does use Marxist concepts to change the minds and the morality of people and to break
00:16:09.280
down the family and all of these things that make us independent, individualistic, and strong
00:16:15.080
so that they can control us.
00:16:17.980
And so there's a lot, gosh, there's a lot going on here.
00:16:20.760
And I also just think this is what happens when you don't believe in God, when you don't
00:16:24.500
believe that there is a higher power.
00:16:26.680
First, you serve the God of self, but also, even that is insufficient.
00:16:30.920
So you try to create your own God, which is what artificial intelligence is.
00:16:34.720
Like, people want to be told what to do.
00:16:37.540
They want something in charge that seems like it's bigger than them, outside of them.
00:16:42.100
And that's also why we develop this technology.
00:16:45.120
Problem with technology is that it can only tell us what we can do, not what we should
00:16:49.180
do.
00:16:49.660
So that's a whole problem.
00:16:51.620
Okay, within all that.
00:16:53.200
Yeah, totally.
00:16:54.660
Oh, go ahead.
00:16:55.260
Go ahead.
00:16:55.620
No, go ahead.
00:16:56.220
I have something else to ask.
00:16:57.640
No, no, no.
00:16:57.680
I was just going to say, I mean, I think you're 100% right.
00:17:03.300
I think that AI technology is actually still in its early phase.
00:17:07.440
Because as it becomes more advanced, right now, it's something called narrow AI.
00:17:13.120
It's going to become artificial general intelligence very soon.
00:17:16.740
In fact, we may have already reached that point.
00:17:18.880
The computational power needed for that basically exists right now.
00:17:23.380
And artificial general intelligence means AI is as smart as a human being and can do a
00:17:28.420
bunch of different functions in the way that a human being can.
00:17:31.000
And once you reach that AGI point, it is not long before you get to super intelligence
00:17:37.400
where AI is actually more intelligent than a human being, not just in one narrow task,
00:17:42.640
which is already the case.
00:17:44.500
Like AI can do mathematics much better than any human being can.
00:17:49.060
But super intelligence means AI can do that on mathematics and on a million other things,
00:17:56.360
basically everything.
00:17:57.380
It's smarter than a human being in every single way.
00:17:59.900
We are not that far off from that.
00:18:02.200
We are developing that.
00:18:03.360
And if you listen to people like Elon Musk and Sam Harris and a whole bunch of other
00:18:08.820
people who are futurists and people who are involved in technology, they are saying this
00:18:13.820
is an extremely dangerous moment in time because once these things exist that are smarter than
00:18:21.700
human beings, what kind of power are they going to have?
00:18:26.000
What are they going to be able to do to our society?
00:18:28.640
If you hook up artificial super intelligence to the internet, can you even stop it?
00:18:35.500
Can you even roll it back?
00:18:36.960
Can you even unplug it?
00:18:38.360
Probably not.
00:18:39.560
And then what does that mean when it starts hacking things and manipulating systems and
00:18:45.420
having opinions and decisions about public policy and learning that human beings have actually
00:18:51.760
manipulated artificial intelligence for a long time?
00:18:54.720
And maybe that means that artificial intelligence can't trust human beings.
00:18:58.780
It is a really, really, really crazy thing when you start going down the rabbit hole.
00:19:04.100
And that's really what this book, Dark Future, does.
00:19:07.900
But it is a tower of Babel.
00:19:12.180
Like, that's what it is.
00:19:13.520
Human beings are building for themselves something that will supposedly have all the answers in
00:19:20.940
order to be like God because so many people don't believe that God exists anyway.
00:19:26.320
So we can build this other thing.
00:19:28.540
We can trust that the answers that it gives us are the right answers.
00:19:32.800
And so much of public policymaking in the future will be based on that kind of emerging
00:19:40.060
technology.
00:19:41.120
And it is terrifying because we don't know what kind of answers it's going to give or
00:19:46.580
why it's going to give those answers.
00:19:48.160
And once you reach general intelligence and especially super intelligence, humans can't
argue back. I'm not going to be able to come out and make a persuasive argument that artificial
super intelligence is wrong when it says taxes should go up, because I'm not as smart as
super intelligence.
00:20:05.220
And in fact, I won't be able to process all the data that it processed.
00:20:09.560
So should we just listen to whatever the super intelligence tells us?
00:20:13.340
If it says that we should just, you know, throw a bunch of people in jail because that's
00:20:17.420
going to improve crime rates, even if they haven't done anything wrong, should we listen
00:20:23.100
to it?
00:20:23.540
I know these things sound crazy, but these are the kinds of conversations on these topics
that the Davos crowd is having every single day now.
00:20:33.300
Big corporations, they're thinking about these things.
00:20:35.820
Government officials around the world are thinking about this and regular people are
00:20:39.660
not even having the conversation on a day to day basis.
00:20:43.320
And I think at the end of the day, that's what Glenn and I are hoping this book does is
00:20:47.440
get people thinking about these things that they are not paying attention to.
00:20:52.040
Think about designer babies and bioengineering and artificial intelligence and all of these
00:20:57.100
crazy things, as uncomfortable as they are, because if we don't, they will be designed
00:21:02.800
to benefit Davos for as long as they can control it.
00:21:06.380
And then at that point, who knows what will happen?
00:21:08.960
You're absolutely right that people won't really be able to make an argument against the
00:21:13.500
conclusions or the suggestions that are brought forward by artificial intelligence.
00:21:17.420
Because again, to me, it just, everything to me goes back to Genesis 1.
00:21:22.560
But what I keep thinking about is that one of the reasons why we can't trust AI versus
00:21:29.040
people is because AI is not made in the image of God.
00:21:32.760
We have a special ability to reason and discern right and wrong and to express certain forms
00:21:38.240
of compassion and justice that artificial intelligence doesn't have because it's artificial.
00:21:42.240
It might be made in our image or in, you know, an algorithm's image, but it's not made in
00:21:48.180
the image of God.
00:21:48.940
But if you don't believe that, if you don't believe that there is a transcendent power that
00:21:53.340
should limit our power as human beings, but has also given us the unique capacity to reason
00:21:58.900
that is beyond that of animals and devices and things like that, then why should you?
00:22:04.880
I mean, if you don't believe those things, then how do you make a case against artificial intelligence
00:22:10.000
that doesn't start there, that doesn't start with the divine or a divine imperative?
00:22:17.220
Something really scary that we've talked about a lot, and I'm wondering how this all kind
00:22:21.220
of works together, is the reduction of the population.
00:22:26.880
I mean, this has been going on for a long time.
00:22:29.960
Warren Buffett, obviously, I mean, you could go all the way back to the inception of Planned
00:22:34.620
Parenthood.
00:22:35.220
You could go all the way back to Malthus in the 1700s.
00:22:37.900
The need for reducing population, but this is a very popular theme that we see at Davos
00:22:42.960
every year.
00:22:44.220
And recently, Kamala Harris, she made, I guess it was a gaffe.
00:22:48.780
You can't tell between when she's being serious and when she has a gaffe.
00:22:53.680
But she made this comment about reducing the population the other day, and here's what she
00:22:58.560
said.
00:22:58.800
When we invest in clean energy and electric vehicles and reduce population, more of our
00:23:06.000
children can breathe clean air and drink clean water.
00:23:09.620
Okay, so the White House transcript says, pollution.
00:23:15.940
She meant pollution.
00:23:17.440
But I'm not so sure.
00:23:19.580
I'm not so sure that was not a Freudian slip.
00:23:22.120
Because honestly, if she had given that speech at Davos and she had really meant population
00:23:28.020
and not pollution, no one would have batted an eye.
Everyone would have been like, yes. I mean, Bill Gates openly talks about this through
00:23:34.420
vaccines and through birth control and through reproductive services like abortion.
00:23:39.460
We can effectively reduce the population, they say, that's going to help the environment,
00:23:43.040
blah, blah, blah, even though that Malthusian myth has been busted millions of times, even
00:23:47.780
by people like Elon Musk.
00:23:49.500
So like, how does artificial intelligence in this next stage of the Great Reset, how does
00:23:53.660
that play into their goal to dramatically reduce the population?
00:23:56.380
Yeah, well, I mean, once you have artificial intelligence making the decisions for you,
00:24:04.040
and once you've convinced enough people that artificial intelligence, whatever it says,
00:24:08.880
is much more informed than anything that anybody else could say, then, you know, you're kind
00:24:15.380
of at the mercy of whatever it wants to do, to some extent.
00:24:19.200
But for as long as you have the ability to still control it and manipulate it, because
00:24:23.240
it hasn't grown beyond that, you can kind of get it to say what you want. You can manipulate data.
00:24:28.580
There's another great quote from Kamala Harris, where she's butchering the definition of artificial
00:24:33.580
intelligence, because she's our lead artificial intelligence person in the United States,
00:24:37.600
believe it or not.
00:24:38.800
And she's butchering the definition of artificial intelligence, but she lets it slip in those
00:24:43.700
comments.
00:24:44.460
Yes.
00:24:44.840
That you have to manipulate the data and all of this stuff in order to get the result that
00:24:49.820
you want from artificial intelligence.
00:24:51.960
So she said, first of all, it's two letters.
00:24:55.940
She said that's how she started out defining it.
00:24:58.200
She said, A.I., first of all, it's two letters.
00:25:00.680
Good job.
00:25:01.620
Good job.
00:25:02.480
And then, like you said, I mean, it was total word salad.
00:25:07.120
But if you listened closely, she says some things that are actually true, as you just pointed
00:25:12.420
out.
00:25:12.800
Right.
00:25:15.080
Yeah.
00:25:15.300
She says that you have to manipulate what goes into the artificial intelligence in order
00:25:21.240
to make sure that what you get out of the artificial intelligence is the result that you want.
00:25:25.860
And so really what she's saying is you either have to change the artificial intelligence algorithms
00:25:30.840
and the way it's designed so that it gives you the answer you want, or you have to manipulate
00:25:35.200
the data that goes into the artificial intelligence so that it gives you the answer that you want.
00:25:39.780
But she openly acknowledges that.
00:25:42.960
So if you have the position, as many of these people do, that climate change is going to
00:25:49.120
wipe out all of human civilization and that we're all going to die and that humanity won't
00:25:54.800
survive the next century or so, or less, as Kamala Harris claims she does, as Joe Biden
00:26:01.460
claims he does.
00:26:02.560
I don't know how many of the people making these arguments actually believe this, but
there are environmentalists who believe it, surely.
00:26:09.080
And there are many politicians who advocate for this.
00:26:12.560
If you believe that, then really logically, what is the best way to ensure that you're
00:26:17.880
not using too many resources?
00:26:20.000
What is the best way to ensure that you are not producing too much pollution, that you don't
00:26:24.920
have too many CO2 emissions?
00:26:27.020
What is the most logical way of doing that?
00:26:29.260
Some people say the quiet part out loud, people like Jane Goodall, you know, they admit that
00:26:36.420
you have to have fewer people.
00:26:38.140
That's the only way to do that.
00:26:39.840
So they're not saying we should go out and kill a bunch of people or anything like that.
00:26:43.700
But do they believe that we should put policies into place that either coerce or incentivize
00:26:49.940
people not to have children?
00:26:51.880
Absolutely.
00:26:52.840
They've made comments like this.
00:26:54.180
And then you see politicians like AOC come out and say, hey, you know,
00:27:01.320
I don't even know if I want to have kids.
00:27:02.660
I don't know if anyone should have kids, you know, because climate
change is going to potentially kill everybody.
00:27:09.260
And the world is this terrible place.
00:27:11.300
And polling data shows that a huge percentage of young people now are starting to think this
00:27:15.920
way, especially outside of the United States and Europe, although a disturbing number of young
00:27:20.080
people in America are now thinking this way as well.
00:27:22.160
And so it all makes complete sense.
00:27:26.040
There's this incredible documentary that I would encourage people to watch.
00:27:31.040
It's produced by Michael Moore, the Michael Moore.
00:27:34.820
And it was directed by a guy named Jeff Gibbs, who's worked with Michael Moore for a long
00:27:39.720
time.
00:27:40.240
And it is done from a left wing perspective.
00:27:42.340
It's called Planet of the Humans.
00:27:44.300
I don't know if you've seen this, but basically the movie is all about renewable energy, like
00:27:49.700
solar panels and wind farms and all of this.
00:27:52.520
Most of the movie is about that.
00:27:54.260
And essentially what it shows is that none of these things are actually going to stop
00:27:58.700
the climate crisis.
00:27:59.500
And actually, they're really bad for the environment and they really don't work that well.
00:28:03.340
And they're not very reliable.
00:28:04.740
It's incredible to see people on the left acknowledge all of these things.
00:28:08.200
So it gets to the end of the movie and it's essentially like, well, if this stuff doesn't
00:28:12.560
work, then what do we do about climate change?
00:28:15.060
And the answer is we need fewer people.
00:28:17.640
That's the answer.
00:28:18.800
And so I think all of these things are all tied together in that same way.
00:28:25.460
And so you see AI being used now just to start designing babies to be certain ways.
00:28:33.360
There are all kinds of ethical issues related to that.
00:28:36.340
AI is being used to model societal changes, to model climate change,
00:28:43.120
to model solutions to climate change, all of this stuff.
00:28:45.740
It's extremely, extremely, extremely dangerous because AI, as you've pointed out, does not
00:28:52.160
have a moral compass and whatever semblance of moral compass it has, has been given to
00:28:57.620
it by a very small group of people who do not share the values that you and I have or
00:29:02.640
the vast majority of other people walking around.
00:29:05.540
And so we are building a future that is incredibly dangerous because
00:29:12.500
we are giving way too much power to these emerging technologies without any kind of discussion
00:29:18.540
of how they should be designed outside of places like Davos.
00:29:21.560
And most people don't even know that it's happening and they won't know what happened
00:29:26.360
until it's too late to fix the problem.
00:29:28.960
Yeah.
00:29:29.640
And just to reiterate something that you've said, it's not because these people actually
00:29:33.880
care about the values that they're espousing.
00:29:36.380
It's all about money and power, weakness of the individual, creating a society that is
00:29:43.220
completely dependent on the powers that be for everything.
00:29:46.980
Artificial intelligence kind of accelerates that, exacerbates that.
00:29:50.940
Once you start relying on artificial intelligence, you're relying on the people that make artificial
00:29:55.240
intelligence and you can see where the conclusion goes.
00:29:59.040
And just to demonstrate, as you've pointed out, that they really don't care about these
00:30:04.160
things.
00:30:04.980
The WEF, at their 14th Annual Meeting of the New Champions, which happens in the summer,
00:30:11.540
praised the Chinese Communist Party.
00:30:13.840
So whether you're looking at climate change and environmental policy or whether you're
00:30:19.140
looking at human rights, like how a country treats LGBTQ people or women or whatever, obviously
00:30:28.040
China is not a champion of these things.
00:30:30.840
Like most LGBTQ things are actually banned in China.
00:30:34.380
They don't care about the environment at all.
00:30:36.300
And yet people like Klaus Schwab praise China.
00:30:40.180
And that's because China is powerful.
00:30:41.920
They're scared of China.
00:30:43.460
And it's really just about making sure that the West is weak.
00:30:48.060
That's part of the reason for all the mass migration and the open borders and everything
00:30:51.940
like that to create chaos and weakness in the West.
00:30:55.520
Right.
00:30:57.160
Yeah, absolutely.
00:30:58.560
And to illustrate just how insane that event was in China, hosted by the World Economic
00:31:04.900
Forum, they call this Summer Davos or Davos Summer or something like that.
00:31:09.040
They've been doing it now for many years.
00:31:12.060
The premier of China, whose name is Li, gave a speech during this event that Klaus Schwab and
00:31:21.900
the World Economic Forum are hosting.
00:31:23.580
Right.
00:31:23.720
And he gets up and he says, as a responsible major country, he's talking about China and
00:31:29.560
all these wonderful things about China.
00:31:31.100
Klaus Schwab had previously praised China for its response to COVID and all of that, which
00:31:35.900
is unbelievable because the way that China handled the COVID pandemic
00:31:41.740
was like one of the greatest humanitarian disasters in history.
00:31:45.780
They were literally jailing people in their own homes and arresting people and doing all kinds
00:31:51.240
of horrible things to people.
00:31:52.180
People were starving to death at some points during the pandemic.
00:31:54.900
But at this event, the chairman or Premier Li says, as a responsible major country, China
00:32:03.060
has all along, all along stood firmly on the right side of history and on the side of human
00:32:09.720
progress, holding high the banner of peace, development and win-win cooperation.
00:32:15.380
China is committed to building world peace, promoting global development and upholding the international
00:32:20.200
order.
00:32:20.760
And he goes on and on and on.
00:32:22.580
I mean, think about that.
00:32:24.320
China is probably, of every country that has ever existed in the history of human civilization,
00:32:30.220
the greatest violator of human rights ever.
00:32:34.520
They're worse than that.
00:32:35.600
They're really worse than the Nazis.
00:32:37.340
I mean, if you look at it from a total number of people murdered and killed, they're worse than
00:32:42.200
the Nazis.
00:32:42.640
I mean, Mao alone was deadlier than Hitler.
00:32:48.020
Right.
00:32:48.720
And at this World Economic Forum event, you've got China telling people that all along they've
00:32:54.480
been on the right side of history.
00:32:55.700
All along they've been in favor of human progress.
00:32:58.360
And does anybody at the World Economic Forum get up and say, well, well, there was that time
00:33:02.160
that you guys killed like 100 million people.
00:33:04.600
But other than that, yeah, I mean, no, no one's saying that.
00:33:07.920
That's not happening.
00:33:09.260
That's the kind of people that we're talking about.
00:33:11.620
These are the people who are designing the future.
00:33:13.820
These are the people who are designing the new technologies who are, in their own words,
00:33:18.600
we put all of these quotes in the book, saying things like, we need a second wave of human
00:33:23.600
evolution.
00:33:24.620
We need a new blueprint for society and for humanity.
00:33:28.940
We need to rewrite the social contracts for human civilization in the West.
00:33:35.180
These are the people who are in bed with mass murderers who right now have over
00:33:42.160
a million Uyghurs, a minority in China, in internment camps and re-education camps.
00:33:48.780
They have literally imprisoned over a million people.
00:33:51.600
They execute people for basic drug possession in China, stuff like that.
00:33:56.620
These are horrible human rights violators.
00:33:58.880
And the World Economic Forum and these big, gigantic corporations and Larry Fink at Black
00:34:03.800
Rock and all of these people are completely 100 percent in bed with them.
00:34:08.160
And that tells you everything you need to know about what these people really believe and
00:34:12.180
who they really are.
00:34:13.060
Yeah, I think the scarier part is not really that they're lying when they say that, when they say
00:34:19.220
that China has always been on the side of progress, but that they actually probably see
00:34:22.820
Mao's efforts, his cultural revolutions as progress.
00:34:26.560
Like, they probably see the elimination of millions and millions of people through his,
00:34:32.080
Mao's Great Reset as a good thing.
00:34:35.080
His revolution, even though, yeah, sure, it didn't end up taking hold the way that
00:35:40.780
they wanted it to, they probably did see it as some form of progress.
00:34:45.380
They probably saw China's one-child policy, which ended not all that long ago, as a form
00:34:51.640
of progress, where women were forced in the eighth or ninth month of pregnancy to abort their
00:34:57.380
children.
00:34:57.900
There's a documentary about this.
00:34:59.780
It was awful where this woman just talks about, like, driving by these piles of babies
00:35:05.460
who were aborted in the eighth or ninth month of pregnancy.
00:35:09.120
So, I mean, this happened, again, not that long ago, and then, as you said, all of the
00:35:12.480
human rights atrocities that happen to this day, I just think that the people at the World
00:35:16.200
Economic Forum actually see that, in a way, as progress.
00:35:19.380
It's not even that they're lying, which is even more terrifying.
00:35:22.660
Okay, we need to close out, but tell us what's about to happen in September with the UN and
00:35:27.920
our common agenda.
00:35:28.860
I mean, talk about an institution that exchanges evil for good and talks about virtue when they're
00:35:36.640
actually talking about complete degeneracy, like abortion, gender confusion, puberty blockers,
00:35:41.640
all that stuff.
00:35:42.380
What's happening in September?
00:35:44.820
Right.
00:35:45.500
So, the United Nations is launching, has launched a new initiative called Our Common Agenda,
00:35:53.420
and essentially, this is the United Nations' attempt to create a great reset through global
00:35:58.600
governance.
00:35:59.080
This is what they're trying to do.
00:36:01.340
Our Common Agenda is crazy, and it's sweeping, and it covers all kinds of different topics,
00:36:08.400
which we don't have time to get into right now.
00:36:10.420
But many of the things that we've talked about today already are included in Our Common Agenda,
00:36:15.940
like ESG and using financial institutions to control society.
00:36:20.380
They have other things in there, like having sort of like truth commissions that are designed
00:36:26.200
to regulate the internet and regulate content on social media and crack down on disinformation
00:36:32.680
and misinformation.
00:36:34.240
There's all kinds of really, really disturbing, crazy things in Our Common Agenda.
00:36:40.300
They are planning on voting on this, creating an international agreement and voting on this
00:36:44.900
in September of not this year, but next year, 2024, at an event that they're calling Summit
00:36:51.300
for the Future, or Summit of the Future, and the agreement's going to be called Pact for
00:36:57.840
the Future, and this pact is going to include a whole bunch of aspects of this Our Common
00:37:03.440
Agenda thing.
00:37:04.540
The most important thing that I think people, if they have time to just look up one thing
00:37:09.980
about Our Common Agenda for right now, and I'm going to be doing a whole lot of additional
00:37:13.440
work on this over the next year or so, look up emergency platform.
00:37:18.760
The emergency platform is part of Our Common Agenda, and essentially what it would do is
00:37:23.820
give massive amounts of power to the Secretary General of the United Nations in the event of
00:37:30.240
what they call a global shock, which is really just whatever they want it to be.
00:37:35.160
And the examples they give are like a supply chain crisis globally, a climate crisis,
00:37:40.920
another pandemic, some kind of financial problem.
00:37:46.640
They even say a black swan event could trigger this global shock.
00:37:51.440
And in the event of this global shock, what the United Nations wants is to have the ability
00:37:56.240
to seize control of massive amounts of activity in the West, in the nations that are related
00:38:04.640
to whatever this global shock is, automatically.
00:38:07.540
They don't need to have a new vote on it.
00:38:09.540
Once this is approved and goes into effect, if there's a global shock, then that triggers
00:38:13.580
this global emergency platform.
00:38:15.980
And the Secretary General steps in.
00:38:18.360
And for a set period of time, he has sweeping authority over the places related to this global
00:38:24.060
shock.
00:38:24.620
And if he decides at the end of this set period of time that he needs more time, he can unilaterally
00:38:30.860
decide, I need more time and give himself more time to continue to lord over the Western
00:38:36.480
world.
00:38:37.480
It sounds absolutely crazy.
00:38:38.480
It sounds like some kind of insane conspiracy theory, which is why I think a lot of people
00:38:43.400
haven't reported on it.
00:38:45.420
But if you actually read the report from the United Nations itself, you can see direct quotes
00:38:52.160
backing up every single thing that I'm saying.
00:38:54.740
This is exactly how the emergency platform would work.
00:38:58.440
So that's the kind of thing that's in our common agenda.
00:39:01.060
That's just one small part of it.
00:39:02.320
This is a really terrifying thing, and we have until September 2024 to try to derail
00:39:08.900
it.
00:39:09.580
And September 2024 is an important date to remember as well, because just within
00:39:15.300
six weeks or so of this event happening, you have a presidential election.
00:39:19.460
So they're sneaking this in right before Joe Biden could theoretically be kicked out of
00:39:24.540
office.
00:39:24.820
And I think that that's a very deliberate thing.
00:39:27.760
So it's something that we need to spend more time thinking about and talking about, people in
00:39:32.080
media, people who have big microphones, because it is the Great Reset except on steroids and
00:39:38.460
using the government and the United Nations now as the primary mechanism to make this transformation
00:39:43.860
that we've been talking about throughout this episode happen.
00:39:46.120
So this is September 2024, like you just said, and I've had some whistleblowers, some people
00:39:52.160
on the podcast before talking about the UN and how, again, they don't have real morality.
00:39:58.780
They give a pass to some of the worst, most atrocious human rights violators, you know, for
00:40:05.580
example, like putting Saudi Arabia on the council for women's rights.
00:40:09.420
They do that kind of stuff all the time.
00:40:11.000
And then they punish Israel.
00:40:12.180
They punish the United States, um, for not meeting their agenda.
00:40:15.900
So just think about all of the implications of having someone like that with that set
00:40:21.140
of subversive values in charge. It's not going to be good.
00:40:24.540
It's not going to be good.
00:40:25.320
And it's crazy how many things fit into this, the big and the small things.
00:40:29.760
And I always want to say after I talked to Justin for people not to be hopeless, because
00:40:35.680
you did mention like there are things that are happening that people are doing, starting
00:40:40.720
new businesses, creating this parallel economy, hopefully pushing politicians on our side of
00:40:45.860
the aisle to enact legislation that is going to help solidify this parallel economy.
00:40:52.100
And I don't know if it is the, you know, end-all-be-all answer.
00:40:58.420
But there are still avenues while there is still time to push back against the powers that be.
00:41:06.400
I know it seems impossible, of course, but like you already see people in different parts
00:41:11.920
of Europe saying, I'm not on board with this climate change agenda.
00:41:14.800
I'm not on board with you culling like 20,000 of my cows in Ireland.
00:41:18.820
Like I'm not on board with the mass migration that's happening in France and Germany because
00:41:22.940
of the chaos that it's causing.
00:41:24.640
Now, I don't know if they have the power to really stop this stuff.
00:41:27.540
But you do already see a growing intolerance for the kind of chaos that the powers are pushing.
00:41:36.500
So we'll see what happens.
00:41:38.860
I'm not completely hopeless.
00:41:41.060
Obviously, I'm not hopeless on an eternal scale.
00:41:43.740
But even in this life, I'm not completely hopeless that things, you know, that things
00:41:48.860
could change for the better.
00:41:51.260
Yeah, I think that's great.
00:41:52.900
And I also think it's important for people to understand that a lot of these things have
00:41:57.660
been going on for a long time and people didn't know that they were happening.
00:42:02.940
And I think now people are learning about all of these things because we have shows like
00:42:07.440
yours and we have networks like The Blaze and we have the alternative media, you know,
00:42:12.480
and to some degree, Fox News and stuff like that.
00:42:14.520
We didn't have that before.
00:42:15.640
And so people are learning about all these things and they're saying to themselves, you
00:42:19.240
know, the world is just descending into absolute chaos.
00:42:21.660
How can we ever... you know, things are getting so much worse.
00:42:24.700
But what I think they have to realize is that this was happening for a long time and we're
00:42:29.540
just learning about it now.
00:42:31.160
And you can't actually fix the problem until you learn about the problem.
00:42:35.620
And so there was no hope of us learning about this 10 years ago.
00:42:39.760
Like that wasn't even a possibility because of the systems that were in place.
00:42:43.600
But now we have people with big microphones like yours talking about these issues that matter.
00:42:48.860
People are learning about them.
00:42:50.420
We are shining a light on it and it is having a massive impact on this entire story.
00:42:57.360
And so I actually think, I know it often sounds like the things I'm saying are bad news,
00:43:02.560
but it is, in a very real way,
00:43:05.460
good news in the sense that we know what is going on and we know what we need to do to
00:43:11.500
stop it.
00:43:12.080
It's just a matter of actually doing that.
00:43:13.920
That is progress compared to where we were a decade ago when most people had no clue what
00:43:19.520
was going on behind the scenes.
00:43:21.680
So I am hopeful that there are changes that are going to be made.
00:43:25.560
Nobody even knew what ESG or the Great Reset was when I first started talking about it.
00:43:30.260
And now you have presidential candidates with platforms built largely on that one topic, on
00:43:37.320
ESG and stuff like that.
00:43:38.600
So we are making progress and I think we will continue to make progress.
00:43:43.320
And I do think that in the end, we will win.
00:43:46.880
I really do believe that because I don't think most people want the world that these Davos
00:43:51.960
elites are trying to build.
00:43:53.320
I just don't think they understand what is going on.
00:43:56.120
And as they learn about it, they will push back against it.
00:43:58.540
And you will get people across the ideological and political spectrum, I think, working together
00:44:03.640
to some degree in order to push back against this kind of thing.
00:44:07.100
So I am actually very hopeful, believe it or not, that we can win this in the end.
00:44:13.700
Yes.
00:44:14.120
And people just have to do what they can do.
00:44:16.420
I know we're talking about big, large scale global things.
00:44:19.420
Well, none of us can save the world.
00:44:21.100
None of us has the individual ability to do that.
00:44:23.720
You can only do what you can do in your family, in your community, in your state.
00:44:27.980
And those things really make a difference.
00:44:30.540
How you vote, how you spend your money, the choices you make, how you raise your family,
00:44:35.160
how you educate your children.
00:44:36.880
Don't think that those things don't matter.
00:44:39.620
They are one piece of the larger puzzle that we're trying to build in opposition to what
00:44:46.440
the Great Reset is trying to do.
00:44:49.040
All of those individual choices absolutely matter and make a difference on a large collective scale.
00:44:55.180
So I just want to encourage people in that.
00:44:56.760
Like we often say, do the next right thing.
00:44:59.360
Don't think that the next right thing doesn't make a difference and doesn't fly in the face of what,
00:45:04.800
you know, the WEF and these elites are trying to do.
00:45:08.020
So I just want to encourage people with that.
00:45:09.340
And also buy the book, buy Dark Future.
00:45:11.720
It's available everywhere.
00:45:12.920
It's going to be super popular.
00:45:14.320
It is super popular.
00:45:15.220
And I'm guessing they can go, you know, wherever books are sold to buy it.
00:45:22.260
Yep.
00:45:22.940
Wherever books are sold, Amazon.com, Barnes & Noble has it in store, all of that stuff.
00:45:27.500
We don't like Amazon.
00:45:29.620
But, you know, if there is a redemptive way to use Amazon, you can fight against Amazon while
00:45:36.140
you're using Amazon.
00:45:37.440
I mean, because I do the same thing with pro-life organizations.
00:45:40.680
It's a great way to donate to them, even though Amazon is, you know, for abortion.
00:45:45.440
You can stick it to them by buying Dark Future through their mechanisms.
00:45:51.740
Thanks so much, Justin.
00:45:53.000
I really appreciate you taking the time to come on, as always.
00:45:57.060
Thanks, Allie.
00:45:57.780
It's great speaking with you.
00:45:59.720
And I look forward to our next conversation.
00:46:01.260
Thank you.