Ep 1175 | Singularity: Davos’ New AI-Backed Plan to Take Power | Guest: Justin Haskins
Episode Stats
Length
1 hour and 6 minutes
Words per Minute
172
Summary
Justin Haskins is here to tell us what the singularity means and why it matters, how it is going to affect our lives very soon, and to give an actually hopeful update on the Great Reset now that Trump has been in office for several months.
Transcript
00:00:00.840
What is singularity? Justin Haskins is here today to tell us what this means and why it
00:00:08.500
matters, how it is going to affect our lives very soon. He is also here to give us an actually
00:00:15.720
hopeful update on the Great Reset now that Trump has been in office for several months,
00:00:21.880
as well as that little-known European Union law that set out to change all of our lives
00:00:28.560
for the worse. Go back and listen to that episode and both of my fall episodes with Justin Haskins,
00:00:36.140
and you'll be completely up-to-date with what is going on in the world of these progressive
00:00:42.120
activist billionaires and how they are trying to infringe upon our rights and what we can do about
00:00:47.840
all of it. As always, this is an absolutely fascinating conversation with Justin Haskins.
00:00:53.400
It's brought to you by our friends at Good Ranchers. Go to goodranchers.com, code Allie,
00:01:08.820
All right, before we get into that conversation, I just wanted to say that I hope everyone had
00:01:13.380
a wonderful weekend celebrating the resurrection of Christ. What joy we have. We know that because
00:01:20.520
Jesus rose again, we get to live forever if, by grace through faith, we have been saved by Jesus.
00:01:29.760
And so I hope that you had a wonderful weekend resting and celebrating that good news, the good
00:01:34.920
news of the gospel with your family. And I'm just so, I'm so thankful. I'm so thankful for all of you and
00:01:43.980
how you have strengthened my faith and how you challenge me and you help drive the direction
00:01:50.680
of this show to hopefully continue to be a place of encouragement, edification, education for all of
00:01:57.700
you on this Monday, on this Easter Monday. I just want to remind you that if you don't know what to
00:02:03.240
do next, do the next right thing in faith with excellence and for the glory of God. And actually,
00:02:09.480
that's very relevant to the conversation that we are about to be having with Justin Haskins,
00:02:13.720
because it's a big conversation about some overwhelming topics. And it's not all doom and
00:02:18.340
gloom, but at some point, as you'll hear me say, you can feel powerless. And just remember,
00:02:24.060
as a Christian, that is all we are responsible to do. And even in that next step, the Holy Spirit is
00:02:29.100
there empowering you. So just keep that in mind. God's sovereignty, His provision, our day-by-day,
00:02:34.980
step-by-step responsibility as believers. None of that changes based on who is in power or what is
00:02:41.440
going on or what artificial intelligence is doing. Listen to this conversation, take it in,
00:02:47.940
be educated, but balance it all with the knowledge that God is totally in control.
00:02:52.840
All right. Without further ado, here's our friend, Justin Haskins.
00:02:55.460
Justin, thanks so much for taking the time to join us again. All right. We got a lot to talk about,
00:03:05.620
as always, but just give us an update. How is the Great Reset going? How are our friends
00:03:11.880
at Davos and the WEF, how are they feeling about Trump's presidency, do you think?
00:03:17.840
It hasn't been good for Davos. It's been a rough stretch, I think. Donald Trump has come in guns
00:03:26.820
blazing. He's done everything he possibly can to undermine all of the things that you and I have
00:03:32.480
been talking about for years that have been going on with Davos and the Biden administration and going
00:03:38.340
all the way back to Barack Obama. So undermining DEI, undermining ESG, putting policies into place to
00:03:45.380
make sure that the government isn't involved in these things, ripping down whole structures
00:03:49.960
in the federal government, laying off massive, you know, numbers of employees who support these
00:03:55.840
policies, trying to take down the Department of Education. That is a huge one because so much of
00:04:02.220
the left-wing agenda comes from the Department of Education. There's just so much that Donald Trump
00:04:07.860
has been doing to try to stop the WEF's agenda. One of the biggest things that I think is the attacks
00:04:16.120
on ESG. So at the state level and the federal level, we have seen massive progress from my perspective
00:04:23.880
at tearing down ESG policies. So the biggest way that the Great Reset Davos agenda was being
00:04:31.000
pushed was through ESG policies, both in government and in the financial sector. So what was happening
00:04:40.740
is Wall Street was using ESG, big banks and investment firms like BlackRock were using ESG
00:04:47.340
as a way to push the entire Western economy towards left-wing values. And they would do that through
00:04:54.340
the social credit scores. That's what ESG is. So instead of evaluating businesses based on,
00:05:00.840
you know, how good they are at providing goods and services, companies like BlackRock and other big
00:05:06.840
asset managers and banks were using left-wing social credit scores to determine whether a company was
00:05:13.600
good or bad. And then they were tying financial benefits to the companies that were going along
00:05:19.900
with left-wing agenda. And this is one of the reasons why you saw these big corporations like Target
00:05:24.780
and Bud Light and these other companies adopt these left-wing values. It was because there were
00:05:32.760
financial strings attached to it. Well, Donald Trump has come in, and in states over the past few years,
00:05:40.000
Republicans at the state level have been working to gut this entire infrastructure
00:05:46.540
that's been built. And so far, the Trump administration has made lots of progress in this regard.
00:05:51.900
They've started putting regulations into place that make it harder for financial institutions to use
00:05:56.580
ESG as a way to evaluate companies. That's really important. I think there's more work that needs to
00:06:03.060
be done there, but I think that's a really positive step forward. The Fair Access to Banking Act has been
00:06:08.680
proposed in Congress, which would make it illegal for large banks to use ESG as a way to control
00:06:15.000
people, customers and companies. There's been 19 different state laws that have been passed all across the
00:06:22.060
country, the two best being in Tennessee and in Florida, to make it illegal for banks to use ESG as a way to
00:06:29.280
push a left-wing agenda, or any agenda for that matter. And so tons of progress has been made.
00:06:35.200
And I think that the WEF is in a lot of trouble and shambles right now. Yeah. For people, just as
00:06:43.700
a reminder, ESG: environmental, social, and governance. And so you can actually see this. We're talking
00:06:49.400
about this as if this is some like dark underbelly of corporate America, but they're really open about
00:06:55.160
it. I mean, I have friends that work for big accounting firms, these major corporations,
00:06:59.460
and they're very open. They'll get an email about ESG to make sure that they attend this training to
00:07:05.860
help keep up the ESG scores. And you can even read on websites of like hotel chains, for example.
00:07:12.300
Here's what we're doing to make sure that we are environmentally conscious, that we are,
00:07:17.320
you know, socially progressive and that we're helping with governance and championing democracy.
00:07:22.940
And all of these, of course, are euphemisms for progressive causes like transing kids. I mean,
00:07:28.260
that's just what it is. And they might be using some code words so that they're not saying that
00:07:34.660
explicitly on their site, but that's what we're talking about. And the reason they're doing that,
00:07:38.780
as you stated, is that companies, huge companies with trillions of dollars like BlackRock and Vanguard,
00:07:45.220
they are shareholders in these companies like Target and like these other major corporations,
00:07:49.900
so that these companies are more beholden to the values of BlackRock, an entity like that,
00:07:57.240
than they are to you or me. And yet we've talked about before that even before Trump won,
00:08:02.140
it did seem like some companies were at least ostensibly switching it up. For example, it seemed
00:08:09.060
that even last June, while Biden was still president, Pride Month wasn't as aggressive
00:08:14.620
at Target. Some people even said that it was more focused on 4th of July than Pride Month,
00:08:21.240
which is quite the switch for Target. And so it does seem like some of these companies and some of
00:08:26.700
their Super Bowl ads even were trying to appeal to middle America. Tell me how much you think
00:08:33.020
that reflects just kind of trying to save face and just putting up a facade of being pro-America and even
00:08:41.280
kind of conservative-coded, or how much you think it is actually an indicator of some
00:08:50.640
disbanding of power at the WEF in Davos. I hope all of that made sense.
00:08:58.680
Yeah, yeah. No, that makes perfect sense. I would say prior to the Trump administration,
00:09:04.040
prior to Donald Trump winning the election in November, I would have said that a lot of it was
00:09:08.380
posturing, where maybe they were trying to be a little bit more quiet about the sort of great
00:09:15.360
reset transition to this ESG left-wing utopian world. But I think after Donald Trump won,
00:09:22.220
I think a lot of people on the left, elites in particular, in these big institutions realized that
00:09:28.760
their agenda was on the ballot, very clearly on the ballot, and had been totally and completely
00:09:34.900
rejected. That voters went out and elected a guy who had been subject to more attacks than any
00:09:42.900
person who's ever run for office ever. We're talking about a guy who was impeached twice.
00:09:47.980
He was convicted of numerous felonies. They did everything. They accused him of insurrection
00:09:54.640
and treason and all kinds of other crazy things. He was almost killed, almost shot in the head,
00:10:01.700
and still was elected. And I think that a lot of people on the left, that was a wake-up call for a
00:10:07.280
lot of these big corporations, the leaders of these companies, who I don't think were necessarily
00:10:11.900
left-wing people per se, but were going in that direction. Some of them were, but I think a lot
00:10:17.320
of them were going in that direction because that's where the money was going. They wanted to be in the
00:10:21.600
good graces of government. They thought that this was the direction that the world was moving in in
00:10:25.720
the West, and they wanted to be a part of it because there's a lot of money tied up in it,
00:10:29.240
and because they want to be in positions of power and influence. And when Donald Trump won,
00:10:34.500
I think they realized, wow, this has just been totally rejected. This is not what
00:10:40.060
people want. And I do think there have been some real changes that have occurred. And I can give
00:10:45.880
you one example of it. Banks, for the past year or two, have been saying that they're open for everybody,
00:10:54.920
that they're not really agenda-driven, and that, despite all the things
00:11:00.980
they've done in the past, that's not really who they are. But we didn't see a lot of policy changes.
00:11:06.000
We saw them pulling out of international associations that were promoting ESG and things
00:11:12.500
like that at the United Nations and other places. But we didn't actually see a lot of policy changes.
00:11:17.300
Well, in recent months, we've seen actual policy changes implemented at some of the major banks,
00:11:23.360
like JP Morgan Chase, for example, where they said, our official policy is,
00:11:27.900
we are not going to base our decisions on things like ESG criteria. That's not how we're going to operate.
00:11:33.780
We're not going to screen people out that way. And I think that's a direct reaction to what happened
00:11:38.800
in the election and fear that if they don't go along with it, then the Trump administration is
00:11:43.420
going to put regulations in place that force them to go in that direction anyway. And so elections
00:11:48.480
have consequences. And I think one of the big consequences from this past election was,
00:11:54.060
I think a lot of elites realized that their agenda had been completely rejected by voters and that if
00:12:02.260
they want to continue to exist and be successful, they are going to have to move at least a little bit
00:12:10.080
back to the center. They can't continue on with this far left-wing agenda and openly trying to
00:12:16.760
engage in social engineering and all of that stuff because it's just not what Donald Trump and his
00:12:22.840
administration want. It's not what voters want. It's already been rejected by huge swaths of
00:12:27.640
consumers. And so I think we've started to see that. I think we're going to see more of that.
00:12:32.980
In things that BlackRock has put out, for example, they are backing way off on ESG. There's a lot of
00:12:37.920
companies that are doing this. And some of that's driven by the policies that we talked about earlier
00:12:41.380
from the state level and at the federal level. But for other things, I think, it's just that
00:12:46.260
they see the writing on the wall. They know that their ideology is not the ideology in favor,
00:12:51.020
and they're changing because they want to make money.
00:12:58.240
Quick pause from that conversation to tell you about WeHeart Nutrition. Love WeHeart Nutrition.
00:13:03.900
I know I talk about it all the time because I use this product every day. I use their magnesium. I
00:13:09.420
am still on their postnatal vitamin, which works really well for me. I take their omega-3s. I take
00:13:14.680
their iron supplement. And I'm also taking their new Wholesome Balance product, which is great for
00:13:22.100
women's hormones. It's filled with clinically backed ingredients that help us balance our hormones.
00:13:28.280
So saffron, ginger, curcumin, kiwi vitamins. And it's great for women no matter what life stage you're
00:13:34.780
in, whether you're a teen, whether you are postpartum, or whether you are going through
00:13:40.420
perimenopause or menopause. It just gives your body the nutrients and the balance that it needs.
00:13:46.540
I think it's made a difference for me. I think that WeHeart Nutrition has made an overall super
00:13:51.880
positive difference in my health, my hair, skin, and nails, even my immune system. Plus, this is a
00:13:58.520
Christian family-owned company that is just really great to support. You'll be able to tell the
00:14:03.160
difference when you start taking WeHeart Nutrition. Go to WeHeartNutrition.com. Use code
00:14:07.280
Allie. You'll get 20% off your order. WeHeartNutrition.com. Code Allie.
00:14:16.400
I think about someone like George Soros. Obviously, as you said, these executives
00:14:21.220
are probably not really ideologically progressive. Maybe they're socially liberal because that's just
00:14:27.060
the default, but they probably don't have a bunch of strongly held convictions. They see green.
00:14:32.740
They want more money. As you said, there's a lot of money tied up into this. But then
00:14:36.180
there are some ideological billionaires like George Soros. I mean, it's obvious which direction he
00:14:41.560
wants the country to go based on the politicians and the organizations that he is funding. I can't
00:14:47.460
imagine. I mean, and Warren Buffett and Bill Gates and all these people. I mean, there are so many that
00:14:51.520
I could list. I can't imagine that those people are saying, eh, our progressive ideology isn't really
00:14:58.440
popular. Let's back off and switch gears. I imagine that people like George Soros and all of his
00:15:04.700
subsidiaries are saying, okay, they got this one. Just wait. We've got a plan up our sleeve. What do
00:15:12.700
you think is going on behind the scenes in places like that or organizations like Open Society or even
00:15:20.380
Silicon Valley? Right. Yeah. Well, I think that there is, I think there's no question whatsoever
00:15:27.660
that the strategy is for a lot of these big institutions and left-wing groups, like you said,
00:15:34.100
Open Society Foundations, that's George Soros' group, and others, to ride this out. You know,
00:15:39.680
undermine Trump, play politics like they always do, but to ride this out until we get to the next
00:15:46.840
stage of things. They already have Europe totally in their control. And Europe has been moving more
00:15:54.360
to the right, but they're still dominated by left-wing politics. And so they've already got
00:16:00.520
that in the bag. They need America to come along in order for them to really fully implement the
00:16:07.320
agenda that they have for the Western world. They have to have America. George Soros has said this over
00:16:12.040
and over and over again, going back 20 or 30 years, that America is the thorn that's always in
00:16:17.600
their side, that's always causing problems. It's the fly in the ointment. It's not allowing this elite
00:16:23.440
progressive agenda to move forward. And so they have to ride this out. But while they're riding it out
00:16:31.080
and they're sort of forced into these positions that are a little bit more moderate than what they
00:16:36.300
would like, I think that the next big movement for them, the pieces that they're putting into
00:16:42.540
place right now, all have to do with emerging technologies. I think that the developments
00:16:47.040
that we're seeing with artificial intelligence and other emerging technologies related to
00:16:51.000
artificial intelligence, that is the future. I think that the Klaus Schwabs of the world have
00:16:56.840
been saying this for a very long time. If people have been paying attention, Bill Gates and Larry
00:17:02.780
Ellison and a whole bunch of other billionaires on the left have been saying this for a very
00:17:07.120
long time. And I think the technology has finally reached a level and it's going to advance even more
00:17:15.140
in the next five to 10 years, where this really is the priority for people on the left. And I think
00:17:21.260
that they're willing to lose some of the other battles so that they can focus on this. Because I think
00:17:27.860
that if you can design emerging technologies like artificial intelligence so that it promotes
00:17:34.340
left-wing values, ESG type values, then you're going to transform the world by default, regardless of
00:17:43.040
what the other laws are, regardless of the other battles you may win or lose, regardless of who's in
00:17:47.660
the White House. Because of how important emerging technologies and artificial intelligence is going to
00:17:53.400
be over really the remainder of the century. If you can control that, you can control all of
00:18:00.480
society. And this is a lesson that they learned from the internet age and the advent of the internet.
00:18:07.720
The left has, I think, been a lot better at social engineering
00:18:12.540
and moving the ball forward than the right has been over the past 30 or 40 years. I think we all
00:18:17.980
can acknowledge that. But the biggest mistake they made was when the internet first came out and became
00:18:24.380
widely available in the public, they didn't regulate it so that it had their values embedded within it,
00:18:31.100
so that there were rules to it. And then once the internet became really popular, and it was in
00:18:36.080
everybody's home, the genie was out of the bottle, people loved the freedom that they had on the internet,
00:18:40.980
and they didn't want to regulate it; regulating it became wildly unpopular. And they tried more recently to
00:18:48.540
do this with social media companies and things like that. And people have rejected it. That's part of
00:18:52.920
voting for Donald Trump is rejecting censorship on the internet and social media. Elon Musk and all of
00:18:59.180
that has been a huge part of that story, right? But if they could go back in time to before the internet was
00:19:04.860
really popular, and they could write rules for it into law, so that left-wing values were the
00:19:11.880
rules of the internet, they would absolutely go back in time and do that. I think we all know that
00:19:15.920
they've been trying to do it ever since. Now, they learned that lesson the hard way. And now
00:19:21.800
they're designing artificial intelligence and other emerging technologies so that in the future, that's
00:19:27.980
just going to be the way AI always is right from the very start. I believe that's what's going on right
00:19:33.020
now. I think the evidence is overwhelming. And I think that's the focus for a lot of these
00:19:38.420
elites at major institutions, financial firms, etc. Yeah, gosh, I've got so many questions within that
00:19:44.860
first talking about the shift, even in these companies like X and Meta. After Donald
00:19:52.440
Trump won, I saw a big difference in how my Instagram account was handled. We actually for the first time
00:19:58.440
ever had a representative from Meta reach out to us and ask just to have an introductory conversation,
00:20:04.480
and they were very friendly. I had never had that before. So there's a shift going on there. And yet at the
00:20:11.660
same time, there is this advent of AI, under Meta, under Google, under X, that I can tell in my interaction
00:20:19.800
with that AI is not supportive of my views, because I literally had to argue with Grok until he would
00:20:27.360
finally admit that Islam, according to the numbers, is responsible for the most terrorist attacks,
00:20:35.280
religiously motivated terrorist attacks. And it's kind of crazy how human-like it is, because
00:20:40.540
you'll argue and it'll say, okay, that's a fair point, I see your perspective now, and I
00:20:47.120
apologize for not saying that. I've seen other people do the same thing with ChatGPT about a variety
00:20:52.620
of things. But the default is always the progressive position. And you have to really dig in there and
00:21:00.100
argue with them until they will tell you, you know, what is actually true, what the facts actually say.
00:21:06.600
But most people aren't doing that. They're not doing the digging. They're not asking because I know what
00:21:11.540
conclusion I'm trying to get it to. But most people aren't. And so if everyone is relying on AI for their
00:21:19.320
research and for their answers to basically be their brain, and the default is progressive values,
00:21:25.620
I totally see what you're saying, that this is where their sights are set. They can't really change
00:21:30.480
the internet, they'll still try, but it exists, the ideas are out there. But they will try everything
00:21:35.940
they can to monopolize the programming of AI to basically create a hive mind around the world, right?
00:21:43.680
Yeah, I mean, I think that's unquestionably true. I mean, this is what they're openly
00:21:49.500
saying at conferences like the World Government Summit, at WEF, at the annual Davos meeting and
00:21:55.480
other places. The Biden administration very openly said things like this. They had a couple
00:22:02.280
of different executive orders related to artificial intelligence development, and they
00:22:06.080
talked about the need for it to be sustainable and all this other stuff and to promote, you know,
00:22:10.740
DEI type agendas. I don't think they use that term exactly, but stuff like that. And so we know that
00:22:17.460
this is what they want to do. And the people who are designing this technology, not all of them, but the vast majority,
00:22:24.260
at least at the highest levels, are people who have
00:22:30.820
a more progressive agenda. So it's not surprising at all. And it's not just about creating generative AI.
00:22:38.440
That's what you're talking about: ChatGPT and Grok and things like that, that have kind of a left-
00:22:42.140
wing agenda. That is a huge part of the problem. No question about it. But big corporations are
00:22:47.860
already adopting artificial intelligence to perform all kinds of different tasks for their businesses
00:22:54.940
that will have a downstream effect on the rest of society. And in a lot of ways, you won't even know
00:23:03.040
why society is changing, but it'll be changing because the AI that's serving at the foundation
00:23:08.180
of these business functions is moving the country to the left, and people don't even see it.
00:23:15.180
So for example, one of the biggest areas for artificial intelligence moving forward, and this
00:23:20.820
has already started to happen, it's going to become even bigger in the future, is having AI take over
00:23:26.900
for financial institutions like banks, take over the decision making for loans, for example, or other
00:23:35.040
banking services. So instead of having loan officers, when you apply for a loan where a person sits down
00:23:40.480
and they review your application and they decide whether you're worthy enough to get a loan, you'll
00:23:46.040
just have AI do that. Banks are already having AI do more and more of that for them. Well, a lot of
00:23:52.940
industry people believe that the entire loan officer part of the industry
00:23:59.980
is going to go away. It's going to all be decided by artificial intelligence. Well, whoever designs the
00:24:04.760
artificial intelligence is the one who's deciding who gets a loan and who doesn't get a loan. Yeah. Based on all
00:24:10.460
sorts of different criteria. And if you wanted to design AI so that it valued certain things over other
00:24:16.400
things, you would have an AI system for a financial institution making decisions about who's
00:24:22.780
eligible for loans or not. And you wouldn't even necessarily know why you've been accepted or
00:24:28.380
rejected for the loan or the other banking service. You just would be rejected. And it might be because
00:24:34.100
when artificial intelligence is going through the process of looking at your company and you don't
00:24:38.500
have enough electric vehicles, you have too many gasoline-powered vehicles, and it doesn't like that
00:24:43.980
because it's been designed to value, you know, sort of renewable energy type
00:24:50.900
things, then your business might be turned down for reasons that you don't understand. That kind of
00:24:56.380
thing has already started to happen and it's going to get worse. And I think the key for people to
00:25:02.200
understand is that these AI systems have to be designed with values in them. They have to be.
00:25:09.380
You can't design artificial intelligence to be totally 100% neutral. It actually creates all sorts of
00:25:16.640
other problems if you were to try to do that, to make AI in a business setting totally evidence-based,
00:25:23.620
totally based on data. It would do things that everyone agrees is bad. Like for example, it might
00:25:29.800
come to the conclusion that certain racial groups are less likely to pay back their loans than other
00:25:36.320
groups. Therefore, we should prioritize some races over other races when we're making our decisions.
00:25:43.680
That might be in the data. And if it's in the data, you have a racist AI system making loan
00:25:50.440
decisions based in part on race. And I think we all agree that's not what should happen. So you have
00:25:56.220
to embed some kind of value into AI. And there's a billion other examples that we could point to that's
00:26:01.380
just like that one. So the question is not whether AI should have values. It's what are the values that
00:26:07.000
AI should have and who gets to decide that? And the problem that we're going to run into that I think
00:26:13.060
we're already starting to see is that the people designing AI are not people who have values like
00:26:18.960
you and me. They're people who have progressive values. They're mostly Kamala Harris supporters.
00:26:25.660
Those are the people who are designing it. And so what does the world look like when all of the banks
00:26:31.600
are dependent on AI, when all of the financial institutions and Wall Street are dependent on AI,
00:26:36.160
when government becomes increasingly more dependent on AI, and AI is being designed with
00:26:41.620
the values that you and I don't hold? Well, what does that world look like? It doesn't look like the
00:26:46.260
world that we want. And I think that is the biggest threat that we're facing right now for people who
00:26:52.420
support liberty. Next sponsor is Cozy Earth. Okay, related bros. You've got Mother's Day coming up.
00:27:04.140
And probably one of the things that your wife would like, in addition to Share the Arrows tickets,
00:27:09.220
is some Cozy Earth loungewear or some pajamas. Like, I am just very into, in my mid-30s now,
00:27:17.700
being comfortable when I sleep. Like I care so much about my sheets and the blankets that I use
00:27:24.040
and the pajamas that I wear. I want them to be top notch comfortable, which is why I rely on our
00:27:29.120
Cozy Earth sheets and my Cozy Earth pajamas. I've got a couple pairs of Cozy Earth pajamas that I just
00:27:34.860
rotate. I even have their loungewear. My dad, the chief related bro, also wears their Cozy Earth loungewear.
00:27:42.480
They love it. It's just so comfortable and so luxurious. Their bamboo sheet set
00:27:47.700
and their pajamas are just amazing. Their bedding even comes with a 10-year warranty.
00:27:53.780
Make sleep a priority. You need it. It's important. You can get an awesome discount when you use my
00:27:59.600
link, CozyEarth.com slash relatable. You'll get up to 40% off. Amazing. CozyEarth.com slash relatable.
00:28:16.140
Yeah. So the singularity is actually an idea that's been around for 60, 70 years. Essentially,
00:28:25.500
it doesn't have a very specific narrow definition. What it means is this sort of hypothetical moment
00:28:31.920
off into the future when technology advances to a point where it just is completely transformative
00:28:41.080
for humanity. Typically, the way it's talked about is artificial intelligence, or just machines in
00:28:49.660
general become more intelligent than human beings. Other people talk about it as a moment when
00:28:55.840
artificial intelligence becomes more intelligent than human beings and has the ability to sort of
00:29:01.520
continue to redesign itself so that you get into this sort of positive feedback loop of AI becoming
00:29:08.120
smarter and then making itself smarter and then making itself smarter until you have something
00:29:13.420
called artificial superintelligence, which would be much, much smarter than humanity at virtually
00:29:20.760
everything. And that creates all kinds of potential opportunities. Some of it's really,
00:29:27.700
really positive. A lot of the biggest AI companies and developers believe that cancer will probably be
00:29:34.360
cured within the next decade or two because of artificial intelligence and the improvements in artificial
00:29:40.660
intelligence. But it also creates all kinds of ethical problems related to what happens when a lot of
00:29:47.440
employees are no longer needed because HR and loan officers and all these other big, gigantic parts of
00:29:55.920
businesses can just be outsourced to an artificial intelligence system. Well, what happens to millions of jobs?
00:30:01.840
Some people think that there's going to be hundreds of millions of jobs lost over the next 10 to 20 years
00:30:09.500
because of this. Hundreds of millions of jobs. At the very least, it'll be tens of millions of jobs.
00:30:15.680
There'll be massive disruptions in the job market. What do people do if you're in your mid-50s and you still
00:30:21.920
need to work to earn a living, but you've been replaced by AI in your HR department? I mean, what do you do for a living
00:30:29.640
at that point? So you have those kinds of problems, but then you also have the issue of if we do reach this
00:30:34.360
moment where artificial intelligence becomes more intelligent than human beings, more powerful than
00:30:40.480
human beings, well, then how do you stop that? How do you stop it from, say, hacking into secure systems
00:30:48.540
at the CIA or at financial institutions or playing with the stock market or doing all kinds of crazy stuff?
00:30:55.180
And this sounds totally insane, but this is the kind of thing that the Elon Musks of the world and the
00:31:02.200
Sam Altmans, you know, people who are in charge of, that guy's in charge of OpenAI. That's the company
00:31:08.180
that runs ChatGPT. Bill Gates, like these are the kinds of things that they're talking about all the
00:31:13.960
time. And they acknowledge while they're trying to develop artificial intelligence as fast as they
00:31:20.420
possibly can. They acknowledge that the threats posed by artificial intelligence are so extreme
00:31:26.180
and intense that all of humanity could be wiped out by it. And so it's this incredible moment in
00:31:33.340
history that we're in right now where the world is on the brink of fundamentally changing, at least
00:31:39.140
potentially. And most people don't really even talk about this on a day-to-day basis. You know,
00:31:46.540
never mind worry about what sorts of laws should we put in place to try to prevent this from
00:31:51.740
happening. The people who have lots of money and are focused on this are doing everything they can
00:31:57.760
to design AI as fast as they possibly can, because they're more concerned about technological advancement
00:32:03.820
and beating China in the AI race and other foreign countries. And they're not nearly as concerned as I
00:32:09.800
think they should be about these other ethical issues. And so the singularity is something that
00:32:17.580
you're going to hear a lot more about over the next five to 10 years as we get closer and closer to
00:32:23.920
artificial superintelligence becoming a reality. And I actually think based on what the experts have
00:32:29.740
said, based on what Sam Altman has said and others, that artificial superintelligence is probably
00:32:35.940
inevitable. It's just a matter of getting enough AI infrastructure, like data centers and power
00:32:42.580
companies and things like that built, because you have to have huge amounts of processing power in order
00:32:48.960
to make that happen. But once it happens, once you get that infrastructure in place, I think it's just a
00:32:55.380
matter of time before we have artificial superintelligence. And then what do we do? You know, these are the
00:33:00.320
conversations we need to have now about this, because the rest of this century is going to be
00:33:07.180
dictated, I think, by artificial intelligence. And we need to know how to handle it, how to design it,
00:33:14.280
what our values are. And this is probably the worst time in human history when it comes to answering some
00:33:21.080
of those questions, because people are so confused about fundamental values and objective truth and sort of
00:33:28.060
the core foundational things. If we can't agree on like definitions of what a woman is and what a man
00:33:36.460
is, how are we going to deal with the sort of ethical questions that I'm talking about right now?
00:33:41.900
I mean, this is a huge, huge problem, a crisis level problem. Yeah. And Americans need to start
00:33:47.420
taking it seriously. Yeah. People have posted their interactions with different kinds of AI, whether
00:33:52.920
it's ChatGPT or Grok. And you'll kind of, I think that you kind of get different answers, but I've
00:33:57.900
seen people post their conversations, asking the AI bot, would you
00:34:05.520
rather misgender someone, like misgender Bruce Jenner, or kill a thousand people? And it will literally try
00:34:14.520
to give some nuanced take about how misgendering is never okay. And I know that we're talking beyond
00:34:20.260
just these chat bots. We're talking about something much bigger than that. But if that's what's
00:34:24.220
happening on a small scale, we can see a peek into the morality of artificial intelligence. And then
00:34:31.260
you make that, you know, you extrapolate that, you grow that into like a super intelligent being that
00:34:37.600
say somehow, I don't even fully understand it, but has the ability to make those kinds of decisions.
00:34:43.000
We're in a really scary spot. Like technology can tell you what can happen, but it alone can't tell
00:34:49.960
you what should happen. Human beings with a moral compass, of course, we believe with eternity written
00:34:55.680
on the heart. That's what the Bible says, that we're made in the image of God, that we have the
00:34:58.880
unique capacity to be able to say what's right and what's wrong and to harness nature in some ways and
00:35:05.560
to harness technology. But if all of this is being created and programmed, as you said,
00:35:12.640
by people with particular values that are either progressive or just pragmatist, like if they're
00:35:18.020
just like, yeah, whatever we can do and whatever makes life easier, whatever makes me richer,
00:35:23.520
then we should just do that. And yeah, there will be consequences of it. But, you know, I actually saw
00:35:29.100
someone, I forget who it was. It was some executive that said, you know, I'm not scared about AI,
00:35:34.480
killing 150 million jobs. That's actually why we are creating these very immersive video games.
00:35:42.380
So that when people lose their jobs, they can just play these video games and they can be satisfied
00:35:47.140
and fulfilled that way. That is a very dystopian look at the future. And yet that tells us the mind
00:35:54.680
of a lot of the people at WEF, a lot of the people at Davos, a lot of the people in Silicon Valley.
00:36:00.560
That's really how they see human beings. Whether you're talking about the Great Reset,
00:36:03.660
whether you're talking about singularity, they don't see us as people with innate worth. They
00:36:09.060
see us as cogs in a wheel, like people that can just be manipulated however they see fit.
00:36:16.180
But I mean, it takes a worldview. It takes a value system to say, no, people matter more than
00:36:21.420
technology. Jobs matter, not just for their utilitarian productivity that they offer, but
00:36:26.880
because they offer dignity, because they offer worth to a person, they offer purpose to a person.
00:36:31.540
And all of that matters because people actually matter. But that comes, like, as you said,
00:36:36.940
from a particular worldview. And I am not the person that is programming AI. Someone who completely
00:36:43.660
Okay, one thing I know I do not do well is eat enough vegetables. I like fruit. I might eat
00:36:55.800
enough fruit every day, but I know I don't eat enough vegetables. Like, I really need help with
00:37:00.460
that. And a lot of the greens powders that I've used in the past, they just taste so horrible. And
00:37:06.740
that's why I really like Field of Greens. It's like nutritional armor for your body. It actually
00:37:14.480
tastes good. You just put a scoop of this greens powder in your water, you mix it together, and
00:37:21.380
then you have your servings of fruits and vegetables for the day. This is really important for immune
00:37:26.600
health. It's also important for your hair, skin, and nails, just your overall energy, how good you
00:37:32.520
feel. The vast majority of us, probably, I don't know, more than 90% of us don't get enough vegetables.
00:37:39.860
We just don't have enough time in the day. It takes a lot of energy and effort to make and eat
00:37:44.520
all those vegetables. So they make it really easy. And here's what's different about them.
00:37:48.880
Every vegetable they use is actually organic. Okay? So a lot of those greens powders that you see,
00:37:54.860
they'll say, oh, we're healthy. We're great for you. But they're not using organic vegetables. So they
00:37:59.580
could be full of pesticides and things that are bad for you. It's totally counterproductive.
00:38:03.760
Field of Greens is completely organic. If you go to fieldofgreens.com, you can use code Allie,
00:38:09.400
and you have 20% off your first order. That's a great deal. Fieldofgreens.com, code Allie.
00:38:20.100
I feel in some ways, just as I did the first time I talked to you about the Great Reset,
00:38:24.820
I'm like, I feel so powerless to do anything. And yet this past election told us that we're not
00:38:32.160
exactly powerless. But this seems so much bigger than me. And so far beyond most of our comprehension.
00:38:42.500
Well, I think the first step is people have to start really thinking about these things and talking
00:38:48.200
about these things. I mean, the whole anti-ESG movement started from nothing. And now we have,
00:38:56.940
you know, close to 20 states with anti-ESG laws. We've got lots of progress that has been made on
00:39:03.080
that front. And ESG has been, to a large extent, beaten back because people started talking to each
00:39:09.240
other about it. And I think that that's always the first step. Have conversations with people about
00:39:14.620
artificial intelligence just to sort of pique their interest, friends and family members. You
00:39:18.760
don't need to get into all sorts of crazy stuff, but just, it should be on people's radars. I think
00:39:24.060
that's the first step. The next thing is that I think that there needs to be regulations of
00:39:32.480
artificial intelligence. Now, people on the right don't like regulations. I'm a libertarian-leaning guy.
00:39:38.880
I typically hate regulations. I don't support regulations, generally speaking. But we're
00:39:44.160
talking about something that is incredibly powerful, incredibly important. And we're talking about
00:39:51.860
big, massive companies that have hundreds of billions of dollars, more than a trillion dollars
00:40:00.080
in some cases that they're spending on artificial intelligence development. These are limited liability
00:40:05.440
corporations in most cases. So they have special laws that are given to them by government, protecting
00:40:12.320
them from liability, giving them special tax advantages and other things. So we're not talking
00:40:19.240
about some individual who owns a small business somewhere and we want to impose our values on them
00:40:25.000
or regulate them out of existence. I'm not even saying that we should regulate artificial intelligence out
00:40:29.800
of existence. But should we make sure, should we make sure that artificial intelligence, if it's going to be
00:40:35.360
designed, has the right values embedded within it? Yeah, I think we should do that. And I know for a fact
00:40:43.760
that the left is trying to get artificial intelligence developed with its values. And so I think people on the
00:40:50.940
right, we need to do the same thing. I think there should be laws that require AI systems to have certain
00:40:58.460
respects for human rights. I don't think that's crazy. I don't think it's a job killer. I don't think
00:41:04.220
it's sort of a left wing idea to say that if you're going to design AI with all these special tax
00:41:11.140
advantages and all these other things as a limited liability, gigantic corporation, then you should have
00:41:16.940
to embed AI with respect for religious liberty, with respect for the value of humanity, with the
00:41:24.640
understanding that free speech is an essential human right, and other things like that. I think
00:41:29.860
that at bare minimum, we should have laws in place that do that. And I think that that's probably in
00:41:38.300
the long run, the only chance we have of trying to get artificial intelligence to be designed
00:41:45.300
responsibly. They're not the people who are designing it are not going to do it responsibly on their own
00:41:51.440
because they don't share our values. If they embed it with values, and they will, because
00:41:56.880
as I said earlier, you have to embed it with some values for it to make any sense at all,
00:42:02.100
they're going to do it with theirs. And it's especially ironic because so many of
00:42:08.820
these artificial intelligence systems and the infrastructure behind them are actually being
00:42:12.900
built in places like Texas. Texas is likely going to become the epicenter of artificial intelligence
00:42:20.980
in the entire world. Massive, massive investments are being made in Texas. Things like Project
00:42:28.720
Stargate, for example, which was announced earlier this year, they're talking about spending $500
00:42:33.380
billion. This is OpenAI, SoftBank, and some other big institutions talking about spending $500
00:42:40.220
billion on AI infrastructure in Texas alone. The United Arab Emirates says that they're going to invest
00:42:47.340
over a trillion dollars. I think a lot of that's going to be in Texas on artificial intelligence
00:42:53.660
systems. Other places where they're building AI systems are in primarily red states because of the
00:43:01.640
lack of, you know, there tends to be a lack of regulations and things like that. In red states, it's easier to
00:43:06.160
build things. The energy systems are more reliable in red states. And so these artificial intelligence
00:43:12.020
companies want to build in red states. Well, if they're going to build in red states, then these systems
00:43:16.620
should have red state values. They shouldn't have Silicon Valley values. If you want to build these
00:43:22.360
massive data centers in Silicon Valley, go ahead. But they don't want to build them there. And so if
00:43:29.020
they're going to put them in places where you have these communities that support freedom of speech,
00:43:36.240
that support the Second Amendment, that support religious liberty, then I think that it's only fair
00:43:41.820
that the AI being designed there should have all of those essential values
00:43:47.080
as well. So I do think there are things we can do now. Eventually, when you end up with artificial
00:43:52.300
super intelligence, if that does happen, it doesn't matter what you designed it with. And that and that's
00:43:58.840
one of the great fears related to the development of artificial intelligence and the singularity and all of
00:44:05.420
that is that when you create something that's more intelligent than people and more powerful than
00:44:12.560
people, you can't control it anymore. And what does the world look like when you can't control
00:44:19.440
artificial intelligence? It's a very, very dangerous thing. And again, this is something that we should be
00:44:26.720
talking about and figuring out solutions for. I don't have all the solutions to these questions,
00:44:30.980
but we should at least be having those conversations. And so I think, you know,
00:44:36.120
interviews like this are so important because there's so few of these conversations happening
00:44:40.720
and they need to be an essential part of the national conversation going forward. It's not enough to
00:44:46.820
just talk about low taxes and budget battles and all of that stuff. Artificial intelligence and the
00:44:53.960
disruptions that are going to come from it and the ethical questions related to it, that is the future of
00:44:59.500
of the whole world, but especially the United States of America. And if we're not going to have
00:45:05.100
that conversation now, then are we going to wait for the crisis to have it? It doesn't make any sense.
00:45:10.260
So it's time for us to start taking this seriously and figure out solutions to these problems.
00:45:15.720
Yeah. Gosh, this reminds me of Frankenstein. Like there are a lot of actually novels that this
00:45:21.080
reminds me of. It reminds me in some ways of 1984. It reminds me of Brave New World, a lot of
00:45:26.000
dystopian fiction we see unfolding before our eyes. You know, it's going to be difficult,
00:45:32.060
I think, for business owners because, I mean, it's just it's more cost effective and easier in a lot
00:45:39.160
of ways with less liability to use AI, for example, loan officers. I mean, people, because we are complex
00:45:46.620
beings, we have all these different kind of layers. We have HR demands. We need benefits. You know,
00:45:52.320
we have to provide for our family. We make mistakes. Now, AI can also make mistakes, by the way,
00:45:57.940
because, again, they're programmed by fallible human beings. But maybe we're more likely to make
00:46:02.760
a mistake. We take longer to do things. Like I asked Grok a question this morning about some
00:46:08.920
debate about some issue, and it sent me point, counterpoint, point, counterpoint, in, I don't
00:46:14.960
know, 15 seconds. To get all of that information myself would have probably taken me five to six
00:46:21.240
hours. And that's five to six hours that I want to spend with my family and want to spend doing other
00:46:26.920
things. So you can see the argument for it, especially when it's like, OK, instead of hiring
00:46:31.520
this person, I'm just going to do this. But again, that just means we are going to have to really make
00:46:38.120
sure that we as individuals, not even talking about on the political level, that we know what
00:46:43.180
we believe, why we believe it. If you don't know your worldview, if you don't know why we're here,
00:46:49.540
where we come from, what right and wrong is, why people matter, then you need to figure that out right
00:46:55.840
now. Because you need to base your life and base your business and base your choices on those values.
00:47:02.120
Now, if your only value is making money, then yeah, like AI might be better than like hiring five
00:47:09.640
loan officers. But if your value is higher than that, if it's bigger than that, OK, then you might
00:47:15.880
need to make some different choices. And I'm with you. I'm not saying that AI is all bad, that we should
00:47:20.620
like get rid of it in every single case. But again, we just need to make sure that even just we as
00:47:27.760
people, as families, that we really know what we believe, why we believe it and continue to follow
00:47:32.380
that moral compass. Next sponsor is CrowdHealth. CrowdHealth is a community-driven platform that
00:47:43.100
uses crowdfunding to help members pay for medical expenses. It offers an alternative to traditional
00:47:49.660
health insurance. We know that health insurance can be burdensome. It can be confusing. It's kind of
00:47:55.460
the last thing you want to have to navigate when you're going through some kind of health emergency.
00:48:01.380
And so it might be time for your family to just opt out of the health insurance scheme altogether
00:48:06.960
and to try CrowdHealth. For $175 for an individual or $575 for a family of four or more, you get access
00:48:16.180
to a community of people who are willing to help you out in the event of an emergency. You also get
00:48:21.580
access to things like telemedicine visits, discounted prescriptions, so much more without
00:48:26.300
doctors' networks getting in the way. And plus, of course, you join the crowd, a group of members
00:48:31.640
just like you who want to help pay for each other's unexpected medical events. This is a great way
00:48:37.040
specifically for Christians to come together and help bear one another's burdens. I mean,
00:48:41.860
the burdens that are put on us just by sickness, living in a fallen world, but also sometimes the
00:48:47.600
healthcare system, it can be really hard to bear. And this just makes that burden a lot lighter.
00:48:53.580
Go to crowdhealth.com or joincrowdhealth.com. Use code Allie. When you use my code, you can get
00:48:59.480
started for just $99 a month for your first three months. CrowdHealth is not insurance. Learn more
00:49:04.940
at joincrowdhealth.com, code Allie. Okay, you mentioned earlier the EU and how it is kind of
00:49:19.360
like in this struggle in so many different ways, politically and morally. I mean, we've seen just
00:49:24.320
the demographics change so much that's changed the politics of it. And they're kind of like a
00:49:29.240
progressive stronghold in some ways. And we talked about right before the election, this little known
00:49:35.740
law that you were trying to sound the alarm about, but very few people knew about that had the potential
00:49:42.680
to change all of our lives, to change how all businesses functioned, both in Europe and here.
00:49:51.020
And so I want you to just remind us quickly of what that law was, because that episode that we did, which
00:49:55.620
had over a million views, is fascinating. But also, what is the update? Because we know that the people
00:50:01.380
who wanted that law passed did not want Donald Trump to win because he is a huge impediment
00:50:06.880
to that part of the Great Reset. Yeah. Yeah. So the law that we're talking about
00:50:12.780
is something I've referred to as the EU ESG law. The official name for it is the Corporate Sustainability
00:50:20.240
Due Diligence Directive. It's the most boring named law in history,
00:50:24.740
but one of the most important laws. Essentially, what it was trying to do is create a government
00:50:31.100
mandated ESG system in the European Union. So social credit scores, mostly left leaning stuff
00:50:38.640
like battling climate change and other things. Take that system and impose it on every major business
00:50:45.400
that's operating in the European Union, whether they are a European company or they're a company
00:50:52.300
based in some other countries. So for example, an American company, a large American company
00:50:57.400
that does above a certain business threshold in the European Union would have to comply
00:51:03.600
with all of these ESG rules about social justice type stuff, climate change, water usage, land usage,
00:51:11.880
all kinds of different things. In fact, the law is so vaguely written, we don't even know all of the
00:51:17.140
metrics yet because each individual EU nation is supposed to create its own version of this law
00:51:23.520
and then impose it on the companies that are operating within their countries. So we just know
00:51:29.760
the floor for how crazy it is. We don't necessarily know the ceiling for how crazy it could get.
00:51:35.440
It will change country by country. But not only is that, I mean, that just in and of itself is bad
00:51:41.260
enough because you have all these huge companies that do above that business threshold I was talking about,
00:51:46.000
which I think was about $500 million or so of business in the European Union. So you've got
00:51:51.040
companies like Apple and Meta and a lot of food companies, big food conglomerate companies like
00:51:58.940
Pepsi and Coca-Cola and places like that. They're reaching these thresholds of doing business in the
00:52:03.460
European Union. So they're going to have to comply with these EU ESG metrics and thereby moving their
00:52:10.180
entire, not just with the stuff they're doing in the European Union, but all throughout their business
00:52:14.920
in the United States and all over the world. So by, by doing this, you're effectively exporting the EU
00:52:21.960
ESG, you know, requirements all over the world, including in the United States, and then transforming
00:52:28.300
those places as a result of it. Right. But in addition to that, as bad as that is, the really
00:52:35.000
insidious part is that these, one of the requirements for these companies that fall under this law
00:52:40.980
is that they have to force all, not all, but most of the companies that they do business with
00:52:47.940
in their, what they call chain of activities, just kind of like a supply chain to comply with these
00:52:54.740
rules as well. Even if those companies are small and even if they don't do any business in the European
00:52:59.800
Union. So for example, let's say you have a transportation company, that's just a handful of
00:53:06.100
trucks. And they do business with some large company that operates in the European Union and
00:53:11.780
has to comply with these EU ESG rules. This law will require that large company to force that small
00:53:18.620
transportation company to change its policies so that it's in compliance with the EU ESG rules as well.
00:53:25.920
So you'll have mom and pop shops that are being contractually forced by these massive companies that
00:53:32.920
they're dependent on in a business relationship to change so that they're doing things that comply
00:53:38.060
with an EU ESG structure. So it is, it is a way for the European Union to force the rest of the world
00:53:45.180
to be just like the European Union. Right. And one of the reasons they're doing that is because it's not
00:53:50.880
just that they're progressives and that's what they want, but it's also because their own companies
00:53:55.680
in the European Union are at a huge disadvantage with companies all over the world because they have all
00:54:01.340
these crazy requirements in the European Union that they don't have anywhere else in the world.
00:54:05.300
So one solution to that is you reduce the requirements, you deregulate. Another solution
00:54:10.700
is you come up with this crazy scheme that forces everybody into the same stupid set of laws. And
00:54:17.240
that's exactly what they figured out a way of doing. So if you want to do business in the European
00:54:21.980
Union, you have to comply with these EU ESG rules. And these large companies will do basically
00:54:28.520
anything to continue doing business in the EU because a lot of them are making tons and tons
00:54:33.300
and tons of money in the EU and they're not going to stop doing business there just because they have
00:54:38.060
to comply with some ESG requirements. They have no problem with forcing the rest of the world to go
00:54:42.380
along with it because it's all about money for them. So this was a, this is a massive problem.
00:54:47.360
The law has already been passed in the EU. It was set to go
00:54:52.600
into effect within the next couple of years. It was going to be rolled out, uh, based on company
00:54:58.140
size. And as this starts going into effect, the world is going to be transformed because of it
00:55:03.960
along EU ESG, uh, guidelines. So, um, really, really scary stuff. Uh, as you
00:55:11.480
mentioned, a lot of people saw the interview that we did. I did some interviews with some other folks
00:55:15.380
like Glenn Beck has talked a lot about this as well. And, um, as a result of that, I believe a lot
00:55:21.380
of people started learning about it in corporations, started putting pressure, American
00:55:25.940
corporations on members of Congress. Congress has started to talk about taking action against it.
00:55:32.800
Um, the commerce secretary, Howard Lutnick, was asked about this during his confirmation hearing,
00:55:38.320
or, uh, in a questionnaire that was given to him prior to his confirmation hearing,
00:55:44.280
I think. And he flat out said like, we'll do anything we can to stop it. This should not be
00:55:48.540
tolerated. It should not be allowed. Um, after Donald Trump won, the EU delayed implementation
00:55:55.520
of the law by a whole year, they pushed it back a whole year because they're terrified that the Trump
00:56:01.260
administration is going to push back against it with even more tariffs and other trade requirements.
00:56:06.740
Um, but the most exciting thing and the pushback against this EU ESG law comes from a, uh,
00:56:13.280
U S Senator named Bill Haggerty, who's a Senator from Tennessee. He proposed a law called the Protect
00:56:19.140
USA Act. And the whole point of this, uh, bill is to stop the EU ESG law. That's the entire reason
00:56:27.320
for the bill. Essentially what it would do is make it illegal for a lot of American companies to comply
00:56:34.520
with it. It actually forces a lot of American companies. They have a way of determining which
00:56:39.340
companies would be in this situation, um, to, uh, avoid compliance with this EU ESG law.
00:56:47.940
In addition to that, it would allow the president to basically designate almost any company or industry
00:56:53.460
as being important enough that those companies within that industry don't have to comply with it.
00:56:59.200
Uh, and it would allow people who have been harmed by this law to sue, uh, private companies that are
00:57:05.900
imposing these rules on them and get restitution. I think it's like up to a million dollars or
00:57:10.620
something like that, in civil action. And then there's this other really incredibly
00:57:16.760
important part of the law that says that the president would have the ability to do basically
00:57:20.660
anything he needs to do to stop this EU ESG law, or other laws like it, from
00:57:27.440
uh, negatively impacting the United States. So it would give sweeping power to the
00:57:34.900
federal government to stop this EU ESG law, uh, because it's an assault on our national sovereignty
00:57:41.960
and what it means to be an American and what it means for us to chart our own destiny. Uh, we can't
00:57:46.740
have the European Union imposing laws on our companies and transforming our society. So there has been a
00:57:53.580
lot of progress. Um, there's talk in the European Union about revising the law. It looks like they are
00:57:59.020
going to revise it and water it down. To what extent they water it down, we don't know, but it's a direct
00:58:04.880
reaction to the pushback that has come, um, and the Trump administration coming into power,
00:58:11.920
uh, from, you know, the pushback from the public and the Trump administration and some members of
00:58:16.980
Congress. Now the European Union is starting to change its tune on all of this. So a lot more work
00:58:22.740
needs to be done on this. There's still a lot of people who have never heard of it. Don't know
00:58:25.760
anything about it. If we, if everybody in America knew about it, I guarantee this thing would be killed
00:58:30.260
almost overnight, but we have made a lot of progress and it really did look like a sort of
00:58:35.660
David versus Goliath situation for a long time. Nobody had ever heard of it. I've been following
00:58:40.580
this law for years, doing research on it. It just never really caught on. Um, and now there's suddenly
00:58:46.120
a lot of, uh, attention being given to this, especially among corporations and government officials.
00:58:51.580
Uh, so this is just further proof. You know, we saw this with, um, ESG generally and the great
00:58:58.440
reset. We saw this with the sort of the woke agenda that was being pushed by companies like
00:59:04.200
target and other companies. We're seeing this now with the EU ESG law. You know, you mentioned earlier
00:59:09.400
a sort of sense of powerlessness that people have. I totally get it. I've been there. I felt that way
00:59:15.440
for a very long time. But the truth is we have actually had some huge victories on our side,
00:59:22.720
not just political victories, but other kinds of victories as well. The culture is changing.
00:59:29.040
Younger people are much more likely to be conservative than millennials, and the
00:59:34.520
world is moving more and more in our direction. And I think that the fact that we've been standing
00:59:42.300
up to some of the most powerful, richest institutions in the world for years now,
00:59:49.260
and we're actually seeing victories come out of that is proof that regular people still do have
00:59:56.140
a lot of power. And I think that the progressives, and especially the sort of elitist
01:00:03.220
progressives, they bank on the idea that people feel powerless and that they won't feel like they can
01:00:09.860
do anything to stop these things from happening. But the truth is the pushback that we've had
01:00:14.520
against these large companies, against the European Union, through elections, and not just through
01:00:20.280
elections but through our personal, individual actions: refusing to shop at certain places,
01:00:24.820
making it very clear on social media how we feel about things. All of this has had a positive impact
01:00:30.860
on the world. Yes, more progress needs to be made. Yes, artificial intelligence and emerging
01:00:35.920
technologies pose a lot of threats to us, but if we stay vigilant and we stay on top of things,
01:00:40.760
I really do believe that we're going to continue to see our side advance and win, because
01:00:47.320
I believe we're on the right side of history. I believe that we are doing the right
01:00:53.060
things, that we're morally grounded in truth and not just in our feelings and sort of subjective
01:00:59.400
shifting standards that don't really mean anything. And because of that, I think the truth will
01:01:04.520
ultimately win out. I really do believe that. Yeah. And I think that if you start with that
01:01:09.840
foundation, there's a lot of evidence to look at over the past five to 10 years that says,
01:01:14.880
yes, that strategy is working and we just have to keep fighting. And as long as we
01:01:20.660
don't give up, I think in the end the world will be a better, freer place. Yeah. And we all play
01:01:26.480
a role in that. Obviously you have a huge role in this, and I just want to amplify the
01:01:31.880
research you do. That's part of my role, talking about some of these other things. But then everyone
01:01:37.200
has their role in voting with their dollar and how they raise their families, what they're teaching
01:01:42.120
their kids. I mean, self-reliance as a family, ensuring again that you know what your
01:01:48.680
worldview is, which includes understanding what the Bible says. There's so much that you can do just
01:01:54.780
as an individual to play your part in this. Not everyone has the same part. Politicians have their
01:01:59.880
part. Commentators have their part. Researchers have their part. The parents, the professionals,
01:02:05.240
everyone plays their role in all of this, doing and saying what is true. It really matters.
01:02:11.500
You're doing something to combat all of this. You've just created a fellowship program through
01:02:16.420
Mercury One, and it's called the Freedom Rising Fellowship Program. It really is
01:02:23.420
equipping young people to do a lot of what you're saying, pushing back against all of this.
01:02:29.240
So tell us about it. Yeah. So last year, 2024, we started this fellowship program at the
01:02:37.140
American Journey Experience, which is part of Mercury One. Mercury One was founded by Glenn
01:02:41.580
Beck, and one of the things that Mercury One does is it educates young people
01:02:47.480
about American history, about worldview, and about conservative
01:02:54.060
political philosophy and all of that stuff. And part of that effort is now this Freedom Rising
01:03:00.940
Fellowship Program that we have, where we're trying to train the next generation of young people
01:03:05.220
ages 18 into their early thirties. So we're talking about people who are in college, or they're young
01:03:12.320
professionals or graduate students, somewhere in that range. We're trying to teach those people how
01:03:17.880
to become world changers, how to make an impact through research and public policy, how to get
01:03:24.840
things accomplished at the grassroots level through public policy, through reaching
01:03:30.340
out to lawmakers, educating them, and telling them about the next big issue, and
01:03:36.320
learning about all of these sort of crazy things that are going on in the world before other
01:03:41.660
people learn about them. How do we conduct this research? How do we find
01:03:45.820
out about these new things before everybody else does? My research team and I have been doing this
01:03:50.460
now for a very long time, and we've been very successful at it. So if
01:03:55.900
you're a young person between the ages of 18 and, say, 35, and you're interested in getting involved
01:04:01.560
in the pro-liberty movement. Maybe you want to be involved in journalism or media or public policy, or you want
01:04:07.280
to run for office, or you want to do something in that vein. You want to
01:04:13.640
become a conservative activist who helps make the world a better place. I encourage you to apply
01:04:19.420
to this fellowship program. You're going to learn a ton of great information. We're going to help you get
01:04:23.540
published. We're going to have you work on public policy that might actually become laws passed at the
01:04:31.620
state or federal level. It's a really exciting opportunity. It's very selective, but I encourage anyone
01:04:37.500
who's interested to apply. You can do that by sending an email with a statement of interest,
01:04:43.380
your resume or CV, and a couple of writing samples to fellowship@mercuryone.org. You can also go
01:04:50.900
to mercuryone.org, go to the Education tab in the menu, and you can find a little bit more
01:04:56.280
information about the fellowship program and the dates and all of that stuff. But yeah, I'm really
01:05:00.940
excited about it. We had our first class last year. They were absolutely fantastic. We had a great
01:05:06.200
time. They got a lot of stuff published. And I'm so excited to continue the program and also
01:05:12.240
to expand it in the years to come. Awesome. Well, thank you so much, Justin. I really appreciate
01:05:17.320
the work you're doing and that you took the time to come on today. Thanks, Allie.
01:05:27.660
Another pause to remind you guys to sign up for Share the Arrows and get your tickets today. It is going to be
01:05:32.980
an amazing Christian women's theology and apologetics conference. We are going to be talking about
01:05:41.540
motherhood. We are going to be talking about the dangers of the New Age. We are going to be talking
01:05:47.140
about how to approach health and wellness in a biblical and balanced way. Katy Faust will be there.
01:05:55.360
Alisa Childers, Jinger Duggar Vuolo, Shawna Holman, Taylor Dukes, and Francesca Battistelli leading worship.
01:06:02.320
I will also be speaking. Thousands of you will be traveling from all over the country to be there. So
01:06:07.560
you will be worshiping alongside like-minded, courageous women. You will have the opportunity
01:06:13.380
to make lifelong friendships. That's what happened last year. It is a special, special day. I do not
01:06:19.780
want you to have FOMO. I promise if you don't sign up for Share the Arrows and you see the videos,
01:06:26.380
you will have FOMO. The merch alone will make you have FOMO. I don't want that for you, Relatable...
01:06:32.600
well, I almost said Relatable bro and Relatable gal. This is only for Relatable gals. So Relatable bros, for
01:06:38.060
Mother's Day, go ahead and buy the Relatable gal in your life tickets to Share the Arrows. She will be
01:06:44.500
so excited. Go to sharethearrows.com. Get your tickets today. That's sharethearrows.com.