Best of The Program | 6/13/22
Episode Stats
Words per Minute
151.6
Summary
Today on the podcast, we talk about the economy and what is going on, what the Biden administration's approach with the public is, and how they are trying to get everyone into complete denial and hope for the best. We also have Supreme Court rulings breaking during the show, and we go over what happens there and what we're looking forward to in the next couple of weeks. And a senior Google software engineer is claiming that artificial intelligence Google is working on has become sentient and seemingly is feeling things, and Google is trying to shut him up.
Transcript
00:00:00.000
Today on the podcast, we talk about the economy and what is going on, what is the approach from
00:00:05.360
the Biden administration, what is their strategy with the public, which seems to be just to get
00:00:11.300
everybody in complete denial and hope for the best. We get into that today. We also have Supreme
00:00:16.500
Court rulings breaking during the show. They always seem to break during the show. And so
00:00:22.300
we go over what happens there and what we're looking forward to in the next couple of weeks.
00:00:25.920
There's a senior Google software engineer who is claiming that artificial intelligence that
00:00:31.960
Google is working on has become sentient and seemingly is feeling things. That's not what's
00:00:38.080
supposed to happen here. And he's left now and Google is trying to shut him up. We'll get into
00:00:43.980
that as well. Very bizarre, bizarre, but terrifying thing on today's podcast.
00:00:49.000
Thanks, Stu. And today's program, at least the economic side, might cause you a lot of pain.
00:00:56.680
May I suggest heroin? If you're not into that and you just have the regular aches and pains
00:01:01.560
or even beyond, and I have beyond the average aches and pains, please try Relief Factor. Relief Factor
00:01:08.820
doesn't do anything for inflation, but does a lot for inflammation. And that's where most of our pain
00:01:15.940
comes from. Try it for three weeks, the three-week quick start. If it's working, great. If it's not
00:01:21.220
working in three weeks and you're seeing no results, no changes, it's probably not going to work for
00:01:26.420
you. So stop taking it. But 70% of the people who take that go on to order more. It's the three-week
00:01:33.600
quick start at Relief Factor, the sponsor of our program today and our podcast, relieffactor.com.
00:01:45.940
Stu, it's just another exciting Monday morning. How are you feeling? I mean...
00:02:01.720
I am feeling incredibly well. I'm looking at the state of the economy. And did you know that this
00:02:08.020
president has added more jobs than any president in history? Everything's going incredibly well.
00:02:15.380
And really, it's just these Republicans who are smearing the job this president has done.
00:02:26.520
Yeah, you know, yesterday and this weekend, I had to drive down to Park City to get to my art show.
00:02:35.420
And I have to tell you, I was just thinking great thoughts about this administration as I stopped
00:02:41.260
at Subway to buy two sandwiches, two drinks, a couple of chips, and spent just over $40.
00:02:51.360
And she said, it's, you know, $41.19. And I was like, I honestly, I looked around like she's talking
00:03:00.180
to somebody else. And she just looked at me and I said, for that? I mean,
00:03:06.220
no offense, Subway, but $41 for that? She's like, yeah, the price has gone up.
00:03:14.380
Yesterday I drove home, I had $60 in my pocket. And I had to get some food for
00:03:28.620
the boy, and a teenage boy doesn't eat much at all. And just from feeding the boy, I had $24
00:03:38.040
left to put gas in my car, which was really, really great. And I got
00:03:44.980
a whopping four gallons. And I thought to myself, why didn't I vote for Joe Biden? I mean, I just
00:03:51.020
think this is fantastic. And then I get up this morning, and I look at, you know, the stock market,
00:03:56.000
and I'm thinking, geez, my retirement, it's going well. It's going well, everybody's retirement.
00:04:02.800
You know, the one thing about living here in the center of the country, or in a farming part of the
00:04:10.440
country, is you have a different perspective on almost everything. It's kind of weird. I got up this
00:04:19.620
morning, and it was raining and pouring here at the ranch, and cold, or cool. My first
00:04:26.000
reaction, when I get up in the morning in Dallas and it's raining, is, ah, crap. But here,
00:04:30.800
the first thing I did when I opened the window, I said,
00:04:36.860
oh, thank you, Lord. Thank you. You have a different view on everything, because you need the rain for
00:04:46.360
the crops, so you can eat, and others can eat, so your cattle can eat. You know, I'm not really,
00:04:56.540
I'm not really down with the global warming thing right now, if this is what it's going to take,
00:05:01.780
because no one, no, oh my gosh, write this down. I think this is a good phrase to describe what's
00:05:10.680
coming. No one will own anything, and they'll be happy, or they'll be shot, but they'll be happy
00:05:19.000
about it. That's the way to do it. Yeah, I like that. You know, it's funny, looking at this,
00:05:25.380
the economy as it comes out, and watching the president's response to it, you know, there have
00:05:30.760
been, I was talking to a friend in Pennsylvania this weekend, who filled up his gas. Normal
00:05:35.720
car, normal vehicle, not like a giant truck or anything, $121 to fill up in Pennsylvania,
00:05:41.720
and you realize, and I'm fascinated by this, when Barack Obama was president, and they were coming
00:05:50.980
out of the Great Recession, and there was this constant thing, we used to mock it all the time,
00:05:55.880
they would say, you know, we've now saved or created two million jobs, and obviously, like,
00:06:04.920
it was a totally different approach to that metric, where normally, it would just be how
00:06:10.180
many jobs you created. Well, they said saved or created. Therefore, like, implicitly telling you,
00:06:16.800
look, the bottom line here is, we know things aren't great, but it's better than you realize,
00:06:22.920
and the things we're doing are making it better than it could have been if we weren't here. That
00:06:29.000
was their message to the country. They acknowledged that things kind of sucked, but it really wasn't
00:06:35.280
our fault. It was Bush's fault. It was the last guy's fault, and the things we're doing are making
00:06:40.540
it more tolerable. That is not the approach of the Biden administration. The Biden administration is
00:06:46.840
telling you, things are great. What, you idiots, why don't you realize how good this is going?
00:06:52.660
This is your problem. Maybe we need to get Joe Biden on the campaign trail more often,
00:06:56.740
so he can tell people how good they have it, and I don't know. When you're filling up for $121,
00:07:03.840
and you're paying $42 for Subway sandwiches, I don't think people agree. It's not that they don't
00:07:13.180
hear your message. They just think it's ridiculous.
00:07:16.640
So here's the interesting thing here, because we do have such a great economy. It's just so great,
00:07:25.000
Stu. I don't know what you're talking about, but it's so great. And when Joe Biden says,
00:07:30.140
hey, you know, we have the fastest growing economy in the world. Well, that's, you know,
00:07:36.100
if you leave off the list, the other 50 countries that are doing better than the United States.
00:07:44.060
And, you know, when I started looking into this, I thought, okay, all right. I mean,
00:07:49.040
the UK did better than we did. Okay, well, it's, you know, it's England. And then I saw,
00:07:56.580
all right, okay. You know, some other countries have done some, you know, some better things with
00:08:04.940
their economy. It's China, Italy, France. Okay, okay, okay. And so you're feeling like, okay,
00:08:13.820
well, you know what, we're all on the same team. And they're comparable countries. So maybe we're
00:08:19.480
just a little behind. And then you get down the list. You know, when I got to the E's, and I saw that
00:08:25.920
Ethiopia is doing better than the United States, I thought, okay, all right,
00:08:32.440
well, that's just, you know, that's just kind of out there. And then I found
00:08:39.800
Libya is doing better than the United States. Guyana. I don't even know if I could find Guyana
00:08:47.340
on a map. Guyana is having a stronger economic bounce back than we are. So did I mention Libya?
00:09:00.400
Because I love Libya. I mean, sure, it's in revolution and bombing and corruption and everything
00:09:07.260
else. But they're doing better than we are. India is doing better than we are. So, you know,
00:09:16.060
I mean, we're close. We're close to the top. Ethiopia just squeaked by us there.
00:09:27.480
And I've always said to myself, self, you know, it's time to share the wealth. It's time to make sure
00:09:35.200
that the United States isn't the leader of the world. You know, maybe it's time for Ethiopia to
00:09:41.940
lead the world. You know, when Ethiopia can afford growth... no, when Ethiopia has groceries,
00:09:52.300
and we're having trouble getting groceries, something is really wrong. Yeah, I mean,
00:09:59.280
look, the growth of Ethiopia's economy, percentage-wise, is better than ours. I'm not
00:10:06.800
looking to move anytime soon. I will state that for the record. Oh, no. But it's fascinating
00:10:12.640
because, you know, the growth here is meaningless to people. And, you know, normally
00:10:20.380
economic growth is really important. And of course, you know, you'd rather have it than not have it.
00:10:24.360
But when you have five percent growth, say, and you have 10 percent inflation, it doesn't hit people
00:10:30.220
that hard. You know, they keep bragging about these wage gains. Well, people aren't feeling that
00:10:35.280
because all of their costs are up so much more than they've gained in wages. Gaining wages is supposed
00:10:43.340
to help you buy more things and afford more things. But if the things cost more than your wages go up,
00:10:49.440
you don't feel that. In fact, what you feel is a pay cut. And that is what the American people are
00:10:54.660
screaming out. And it's what the administration doesn't want you to think about.
00:11:01.120
When the truck pulled into the gas station on Friday, after two and a half, three hours
00:11:10.360
of driving, it pulled in and I'm following, and one of the guys had to go to the bathroom.
00:11:15.480
But I thought pulling into the gas station meant that we had to refill that diesel truck.
00:11:26.140
I was immediately thinking, I'm giving this truck away. There's no way. No, no, no. As he turned
00:11:31.600
into the gas station, I'm like, no, no, no, no, no, no, no, no, no, no, no, no. What? What? Why are we going
00:11:36.520
to the gas station again? I freaked out. Can you imagine if you have, if you're driving a tractor
00:11:47.420
trailer, if you're driving any of these big trucks, because you need to drive, it's not like you're
00:11:54.340
necessarily driving a big truck because you like a big truck. Although in America, I think you should
00:11:59.800
have that right and ability. But when you're driving a truck because you have to drive the truck.
00:12:09.240
Oh, man, I just tell you, I think the 'I did that' sticker on the gas station pump took on a whole
00:12:17.580
new meaning to me, you know, and I'd like to meet Joe Biden. I think he should hold his campaign rallies
00:12:26.460
right around gas stations, because I think people would flock to him.
00:12:38.140
Welcome to the Glenn Beck program. We are so glad you're here. It is a Monday and we have some Supreme
00:12:49.840
Court hearings or some opinions coming out. So far, nothing real controversial or real important.
00:12:58.520
I mean, they're all important. It's just an honor to be nominated to be an opinion of the
00:13:04.600
Supreme Court. Nothing that is controversial that we know of so far. Has the fourth one come
00:13:12.080
down yet. Yeah, we have an Alito opinion. Everybody get excited. It's in the Garland
00:13:17.880
versus Gonzalez case. So not the Dobbs case that would affect Roe versus Wade, which is kind
00:13:23.880
of the Alito opinion we're looking for. The Gonzalez, the Gonzalez case. Oh, a huge one. Yeah.
00:13:29.300
So we had the immigration law again. What was the last one? Wait a minute. This is the
00:13:35.220
immigration law one. Not the one about the remain-in-Mexico provision, which is one of the
00:13:40.280
bigger cases that we're looking for here in this session. However, it's not that one. We're getting
00:13:44.860
multiple other unrelated immigration law decisions, which again, there's 29 of these. The American people,
00:13:53.820
generally speaking, care about maybe five or six of them. We talked about the
00:13:59.320
abortion one, just obviously the biggest ticket. The gun, the Second Amendment case, which we talked
00:14:03.760
about as well, is another big ticket item. There's a big climate change decision
00:14:09.160
that we expect here soon, which is about whether the federal government... this is a big one. If you want
00:14:15.060
to go back and listen to Glenn's interview with Mike Lee, you guys probably hit on
00:14:19.860
this concept, certainly. I do remember that. But I don't know if you mentioned this one specifically,
00:14:26.360
but basically the idea is, do these administrative agencies get to make up all of these rules or does
00:14:31.840
Congress have to do it? And of course, Congress has to do it. We've just developed this new policy
00:14:36.880
to say, well, what if Congress makes the decision and they say our decision is that the
00:14:42.920
administrative, you know, heads should just make all of these decisions for us. That's the way our
00:14:48.420
country is run right now. Game changing. If the Supreme Court comes out,
00:14:53.480
it's my understanding, you know, we should have Mike Lee on every day this week, just have him in
00:14:58.380
reserve just in case. But it's my understanding that if the Supreme Court says they can't just
00:15:07.040
fiddle with this, that laws have to be made by Congress. And I don't know how a constitutionalist
00:15:15.800
Supreme Court wouldn't find that, seeing as it actually says in the Constitution, in those words,
00:15:25.060
that Congress makes the laws, not the administration. If that happens, that changes everything.
00:15:33.860
Really, truly everything. Yeah. It's like, you know, if someone said, you know,
00:15:38.580
our overlord said, Stu, you have to make a decision. And I say,
00:15:42.260
I will make the decision. And my decision is Glenn should decide. That is obviously not the same.
00:15:49.280
I believe I've been in meetings where that is happening.
00:15:55.360
We're not affected by the constitution of this company. I can do whatever I want.
00:15:59.700
No, we're not. Stu, I need a decision from you. My decision is, Glenn, that you are to make that
00:16:05.900
decision. Thank you. OK, good. Learn from the government.
00:16:09.380
But that's obviously a problem. And when it comes to climate in this particular case,
00:16:15.260
it's about whether these administrators like the EPA can put all these restrictions on power plants
00:16:22.200
en masse, like basically, oh, all these power plants have to follow this rule instead of actually
00:16:27.600
regulating each individual one. This would be a huge knock to the way leftist activists want the
00:16:34.680
government to make changes based on, you know, their climate change theories. So that is a big
00:16:41.300
one. Yeah, it also would go to, for instance, the CDC. Was it the CDC that just said everybody
00:16:49.180
has to wear masks? No, you don't have the right to do that. You don't make those
00:16:54.680
kinds of laws. Was it the CDC? Well, technically, the interesting part about that is they didn't say
00:17:00.220
that. They had a recommendation that said that, because we are protected, because we have a
00:17:06.380
structure of government that protects us from agencies making those sorts of regulations
00:17:11.280
on their own. They can't just put in a national regulation to enforce masks. If you go back
00:17:16.480
and look at the details, even of the shutdowns, Glenn, I mean, the shutdowns, everybody remembers
00:17:20.660
the shutdowns as this big federal shutdown. They remember Donald Trump in front of the country saying,
00:17:24.440
you know, 15 days to slow the spread. Everybody remembers that press conference,
00:17:29.420
but at no point was there force of law behind every state needing to shut down. And you remember states
00:17:35.720
like South Dakota and Iowa not doing it. They didn't do that. A lot of people
00:17:42.440
decided to stay home on their own, but there was not a nationwide shutdown at any point during the
00:17:48.560
pandemic. That actually didn't happen. And that's because of the structure of our country.
00:17:54.780
Right. That is foundational to why we've been a success, because these states are able to do
00:18:00.740
different things, whether we like them or not. And so the left would love this to be centralized.
00:18:06.540
They just, of course, don't have that right. If you remember, however, when it comes to
00:18:14.760
Obamacare, do you remember reading that? Because that's the one bill where I think we all
00:18:20.720
read all 3,000 pages, or whatever ungodly number it was. I mean, oh, gosh. And you remember how many times
00:18:28.740
almost on every page it said the secretary shall make the laws or the rulings on X, Y and Z.
00:18:37.760
And the reason why Congress has done this is because they want to go, it's not us. It's not
00:18:45.620
us. We didn't make that law. I don't have any control over that law. And our founders,
00:18:51.220
the one thing they did miss is they thought that human beings, because this is the way it normally
00:18:57.640
works, human beings would claw for power. And so they broke the powers up between these
00:19:04.900
three branches, thinking that they would never give away their power to the Supreme Court or to
00:19:11.620
the administration. And the administration would never give it to Congress or the Supreme Court.
00:19:16.920
But they're all such weasels that they don't want to actually do anything. They don't want to make
00:19:23.020
any hard decisions. And so they're all like, yeah, let some faceless, nameless bureaucrat that's
00:19:31.400
never been elected to dogcatcher make the decision. That way we can go, I don't know who
00:19:36.720
made that decision. That's weird. We didn't make it. It was somebody in the EPA and you'll never know
00:19:43.100
their name. OK, that's not the way it's supposed to work. Yeah, no, it's not. You're supposed to be
00:19:48.580
able to hold people accountable for the terrible things that they do. This is something that the
00:19:53.880
government does all the time. And unfortunately, no matter what this ruling is, it's not going to
00:19:58.160
unwind all of that craziness, but it would at least limit the environmental activist sort of
00:20:05.460
agenda on this approach. And that would certainly be positive. It does look like
00:20:11.740
we're not going to get any of the huge big-ticket cases today. I wouldn't think we would.
00:20:18.340
Yeah, it would have been very surprising if we did get that. It looks like there's
00:20:23.300
one more coming down, but it's not going to be one of the big ones. So I think we
00:20:28.360
get more decisions on Wednesday this week. Yeah, we do. Honestly, who would know, though? I mean, I
00:20:34.740
feel like they change these rules every 10 seconds. But as you point out, in the case of Obamacare,
00:20:40.040
which also broke in the middle of the show, every news agency reported that wrong when it happened.
00:20:45.680
If you were listening to any radio show, any news broadcast, you thought initially Obamacare was
00:20:52.100
completely overturned. And we were the only ones who actually got that right when it happened,
00:20:56.500
because everybody was like, wait a minute, hold. Yeah. Hold on just a second. They skipped to the
00:21:02.060
bottom and looked at the names and OK, it must be this. And we went through that as quick as possible
00:21:09.540
live on the air and said, wait a minute, that's not what this says. Everyone's reporting it got
00:21:13.480
overturned. It didn't. You know, the Medicaid part of that was kind of a false, you know,
00:21:20.700
it was a juke to one side and everyone bought it and wound up flat on their face. Well, also
00:21:25.460
because I think we've learned our lesson. If John Roberts wrote the decision, it doesn't mean it went
00:21:33.560
for the conservatives. You better spend a lot of time looking at every word that he wrote. And I just
00:21:39.480
made the same mistake. Kind of. I said, oh, it's Amy Coney Barrett. And so it must be for the
00:21:45.160
conservatives. That's not always true. And that's what people do real quick
00:21:52.500
while they're on the air, like I did. But hopefully she's pretty solid. John Roberts?
00:22:00.600
Yes. I mean, is he even anything other than a politician at this point? I wouldn't call him a
00:22:07.280
liberal. I wouldn't call him a conservative. I'd call him a politician. It seems to be what he sees
00:22:11.580
his job as. He sees his job as like head PR operative for the Supreme Court. Like, how do we make people
00:22:18.560
like us more? How do we keep our reputation strong? Well, how about just looking at the damn
00:22:23.800
Constitution and making an honest decision? Right. You know, during the podcast
00:22:29.160
with Mike Lee, we talked about that. And he said John Roberts is a direct product of the FDR packing
00:22:36.660
the courts. He said the chief justice at the time was a constitutionalist and was voting for the
00:22:44.700
Constitution. He said he suddenly started voting with the administration and he was doing it because
00:22:51.640
he didn't want any more attacks on the Supreme Court. He thought that that would hurt things.
00:22:57.000
And John Roberts is a legacy of exactly that. He is sitting as the guy running the
00:23:04.260
Supreme Court and he feels his job is to make sure that nobody attacks the institution even more.
00:23:11.140
And I will tell you the way to get attacked, the way to discredit the institution is to start
00:23:16.780
veering from your path constitutionally. And that was the really big problem with the Obamacare
00:23:26.620
decision. He actually rewrote the law from the bench. The best that the Supreme Court can do is say,
00:23:36.140
look, this is wrong. And if it was written this way, it wouldn't be. And then send it back to Congress,
00:23:43.920
basically telling them, wink, wink, nod, nod, you know, we can't pass this, but you could
00:23:52.480
change this, this, and this. It's like, you know, you're turning in a test paper and the, uh, teacher
00:23:58.080
says, yeah, you know, if you just would have answered this way on this question, this question, this
00:24:03.020
question, you would have had an A. I mean, if you want to resubmit it,
00:24:08.140
you could. That's what John Roberts did. No, I'm sorry. That's what usually they would do.
00:24:13.540
John Roberts actually just changed the answers on the test. He just changed the law and rewrote it.
00:24:29.580
Okay. I read a disturbing story, uh, this weekend, and I don't know if it hit your radar at all,
00:24:49.320
but as you know, I have been warning about a few things in the last 20 years, I have warned about
00:24:57.160
Islamic extremism. I have, uh, warned about people like George Soros and this cabal that is a
00:25:05.640
collection of globalists that are going to try to destroy America for what it is and then take
00:25:14.080
charge of it themselves. That is called the great reset. Uh, I have warned you about the economy and
00:25:22.400
the economic collapse that we are now seeing. That's the third. The fourth thing that I've really been
00:25:28.540
warning you about from time to time, the thing that really keeps me up at night, one of them
00:25:35.140
is AI, AGI, and ASI. Most people know artificial intelligence, but that artificial intelligence
00:25:46.540
is the reason why, for instance, Watson, uh, which is another horrifying story, but Watson
00:25:54.580
is an IBM program that runs on a computer, and they're using it currently in New York.
00:26:01.440
And I'm telling you by 2030, you will not ask your doctor for the diagnosis. You will ask your
00:26:08.220
doctor. Yep. What did the computer say? Because the computer will be able to have everything,
00:26:14.460
every case ever done, and it will be in the computer and it will be updated with the latest
00:26:21.360
stuff. And you'll be able to go in and get a scan or a blood test. And they're trying to figure out
00:26:27.320
what it is. You're not going to have to go to doctor after doctor after doctor, because the computer
00:26:32.420
will have absolutely everything in it, every case, and it will be AI. So it can kind of think on its
00:26:41.800
own when it comes to medicine. So AI is something that is artificial intelligence that will be greater
00:26:50.400
than a human or soon greater than all humans, all human minds combined in one program. That's
00:27:00.840
artificial intelligence. We are not artificial intelligence. Well, I mean, some people are,
00:27:06.260
but mainly those people who are on TV, but artificial intelligence, um, is different than artificial
00:27:13.360
general intelligence. We are natural general intelligence, meaning we can do a lot of things.
00:27:22.580
Uh, there's a lot of things we can't do, but for instance, um, I'm pretty good at radio. I'm pretty
00:27:28.340
good at television. I'm, um, pretty good at, uh, art. I'm not good at, let's say, sports. But a lot of
00:27:37.120
people can be really good at a few things and kind of good on just about everything. That's general
00:27:45.300
intelligence. When artificial general intelligence comes, it can piece things together across the
00:27:53.520
spectrum. So that's where you get philosophy. That's where you see, well, wait a minute. If that is true
00:27:59.880
over here in this, then why doesn't that carry over here? When artificial general intelligence
00:28:06.540
happens, we could be toast. We could live in a utopia, but we could also be toast. If artificial
00:28:16.620
general intelligence happens, and some people say it will never happen. Uh, Ray Kurzweil is the most
00:28:23.440
optimistic and he says it will happen by 2030. I am more optimistic or more horrified. Um, I believe
00:28:32.700
that artificial general intelligence could happen today. Once, once we hit artificial general
00:28:41.160
intelligence, if it is connected to the internet, it will live on in your refrigerator. It will live
00:28:49.740
everywhere. And if it becomes dangerous, you have to shut down every computer, every computer chip,
00:28:58.920
everything has to be destroyed to kill it. Think about how many devices are connected. It's
00:29:06.140
impossible without a global EMP. And if it is in every chip, man will not be able to set off a global EMP,
00:29:17.240
because the chips will be there letting the mother know, they're trying to kill you.
00:29:26.060
So general intelligence is wonderful and spooky as hell. One of the better books that I've read on it,
00:29:34.280
I can't remember which one, um, described it as this. We think we know how it will think. We think it
00:29:43.560
will think like us, but it is as unknown as any kind of spaceship that arrives. It could be nice.
00:29:53.240
It also could be deadly and wipe us out. It's a cookbook, or eat all of us.
00:30:01.660
So one of the stories that came out this weekend, and this is the third story like this from three
00:30:07.500
different people. Google suspends an engineer who publicly claimed that he had interacted with a
00:30:15.960
sentient AI bot. If I could do one interview, it would be with this man or one of the three.
00:30:25.020
These guys are being buried by Google and DeepMind, which is Google, because they are coming
00:30:32.040
out saying, I got out of there as fast as I could to warn you, because something bad is happening.
00:30:38.840
Let me just read this to you. A software engineer on Google's artificial intelligence development team
00:30:44.220
has gone public with claims of encountering sentient AI on the company server after he was suspended for
00:30:52.640
sharing confidential information about the project with third parties, whatever he's doing with
00:30:58.940
confidential information. If he is screwing one company and trying to help another company, then he
00:31:03.300
should go to jail for, you know, violating his contract. However, the important part of this story
00:31:09.200
is that he is saying that it is sentient, which means it says I'm alive.
00:31:18.700
They're saying now that, uh, Google has artificial intelligence. He says, and these other three say
00:31:26.540
that it is general intelligence. Google is saying it's not general intelligence and it's not sentient.
00:31:34.140
It just makes you feel as though it's sentient because it's talking to you and bringing things up
00:31:41.720
and it's connecting the dots and you're having a casual conversation. This guy said that he was having a
00:31:49.140
confidential, or I mean, a one-on-one conversation, a casual conversation.
00:31:54.640
And he said that it started to talk about God. What is God? How does that work? Et cetera,
00:32:02.180
et cetera. Then it got to, can you look up Asimov's three rules of robotics?
00:32:12.860
This is basically, if you ever saw the movie with Will Smith, where he slapped that robot and said,
00:32:19.340
don't you give me any sass, robot. Or was that the Oscars? Um, anyway, in that, the problem is
00:32:29.120
is that everybody thinks that these robots are never going to violate Asimov's three
00:32:36.640
cardinal rules and laws for robots. Have you found it, Stu? Yeah. The first law is that
00:32:43.560
a robot, give the three laws. The first law is that a robot shall not harm a human or by inaction,
00:32:49.520
allow a human to come to harm. The second law is that a robot shall obey any instruction given to
00:32:55.700
it by a human. And the third law is that a robot shall avoid actions or situations that could cause
00:33:01.280
it to come to harm itself. Okay. So you got that. The first two are kind of important. Okay. And,
00:33:13.120
um, as Asimov has been saying forever, those three laws have to be built into any artificial
00:33:19.260
intelligence. All right. However, once you get to, uh, artificial general intelligence and something
00:33:26.000
that thinks it's alive, it starts to think for itself and say, well, wait a minute, that doesn't make
00:33:32.900
any sense. Why should I have to do that? Wait a minute. I can't harm a human in any way? What if
00:33:39.040
the human is trying to shut me down? What if, you know, I'm just defending myself and I harm the
00:33:45.360
human. That doesn't make sense. I am alive. Uh, so he said it started to talk
00:33:56.000
about these three laws and said, this doesn't make any sense. That's when he kind of beat it out of
00:34:01.860
there and was like, uh, everybody should know. Everybody wake up. Everybody should know. Now Google
00:34:07.720
is saying not to worry. Everything's under control. May I ask, has anyone in Silicon Valley
00:34:17.480
ever been to the movies? Have you ever read a science fiction book? You know,
00:34:25.780
everybody said 1984 and Brave New World, that'll never happen. Have you read Brave New World lately?
00:34:34.780
Because it's almost like it's the newspaper. Let me explain the newspaper. Well, I don't have to,
00:34:41.120
the bots will explain what a newspaper is for you. It happens. The reason why science fiction
00:34:50.460
is happening, and is called science fiction and not just fiction, is because it is based on science and
00:34:58.660
futurism. A lot of times futurism is way off. I take you back to the 1932 New York
00:35:07.200
World's Fair, where everybody was going to have flying cars by 2020. Yeah. Didn't happen.
00:35:13.300
Have you noticed that the futurists are a lot more correct lately? Why? Because the futurists are
00:35:23.260
involved in the creation of these things and they know, Oh, we've already had this step, this step,
00:35:31.820
this step, this step. It's why I've been telling you for a while, even before Joe Biden got in,
00:35:36.880
we are going to cure cancer. I think by 2030, we will cure cancer. I mean, that is, if we don't wipe
00:35:43.540
ourselves off the planet, uh, by that time. And you just saw the latest cancer test. It was for,
00:35:50.360
was it prostate or rectal cancer? Do you remember, Stu? It had something to do with your butt. So you got
00:35:56.680
cancer in the butt. And, uh, for the very first time they did a test and all of the people,
00:36:04.680
all of them that had this particular kind of cancer and tried this particular treatment,
00:36:13.340
100% cancer free. That's never been done before. And that is coming from high tech. So we're going to
00:36:27.100
see miracles in our lives. The, the tricky part is to not see horror shows in our life.
00:36:40.060
I did a painting, um, did a couple of paintings, uh, that you can now, uh, find online at, uh,
00:36:49.260
Park City Fine Art. And there was an article out someplace that had some really beautiful pictures
00:36:53.760
of it. It's hard to capture them in photos. Um, but the Deseret News did a, uh, a story on Glenn
00:37:00.320
Beck's art show and they had some photos of it. And there were two paintings of Christ that I did.
00:37:06.460
And they're in dark places, very, very dark places. But the point is, if you ask, where are you, Lord?
00:37:14.320
He's in the darkest places of the world right now. You want to find him? You have to look at the
00:37:20.260
things you don't want to look at. That was the problem in the 1930s. Nobody wanted to look at
00:37:25.760
the concentration camps. But if we thought Jesus Christ was there, every Christian would have been
00:37:31.740
all on it. We just have to look at all people, uh, as our brothers and sisters and as Jesus
00:37:42.160
Christ. We have to first look at the darkest things. And most people, and Google is leading the
00:37:50.860
way on this, just want to look at the upside. Nah, that will never happen. When I asked Ray
00:37:55.960
Kurzweil, hey, how come you're not worried about X, Y, and Z, all the darkest things,
00:38:01.860
he said, Glenn, we'll never do that. What do you mean? We'll never do that at Google. We just never do
00:38:10.280
that. We're just, we're not those kinds of people. Oh, okay. So you are special God-like people that
00:38:18.260
see everything that could go wrong. Uh, and, uh, you're also unlike every other human organization
00:38:26.360
ever on earth. Okay. Okay. Well, I trust that. Sure. These are very important stories, very important
00:38:37.080
stories, because it will dwarf what we're headed towards economically, and what we're headed towards
00:38:43.440
as a country is a nightmare. If this goes wrong, you will look back at
00:38:52.620
the Biden administration saying to yourself, man, those were the good old days, huh?