On today's show, Glenn Beck is joined by his good friend Stu to discuss the No Kings protest, the massive amount of money spent on it by George Soros, the Ford Foundation, and the Rockefeller Foundation, and much more.
00:00:47.560 Proof may finally have come out that Biden's FEMA refused to help victims of last year's hurricane flooding based on bumper stickers.
00:02:52.580 You know, we've been fighting every single day.
00:02:54.480 We push back against the lies, the censorship, the nonsense that the mainstream media is trying to feed you.
00:03:00.480 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:03:05.740 But to keep this fight going, we need you.
00:03:08.220 Right now, would you take a moment and rate and review the Glenn Beck podcast?
00:03:11.880 Give us five stars and leave a comment, because every single review helps us break through Big Tech's algorithm to reach more Americans who need to hear the truth.
00:03:20.580 This isn't just a podcast, this is a movement, and you're part of it, a big part of it.
00:03:25.560 So if you believe in what we're doing and you want more people to wake up, help us push this podcast to the top.
00:15:33.800 But one of the most significant differences, critical for moving from polarization to productivity,
00:15:39.280 is that the Wikipedians who write these articles aren't actually focused on finding the truth.
00:15:44.980 They're working for something that's a little bit more attainable, which is the best of what we can know right now.
00:15:51.000 And after seven years there, I actually believe that they're on to something, that for our most tricky disagreements,
00:15:58.540 seeking the truth and seeking to convince others of the truth isn't necessarily the best place to start.
00:16:05.060 In fact, I think our reverence for the truth might have become a bit of a distraction that is preventing us from finding consensus and getting important things done.
00:16:23.040 The search for the truth is at the core of some of our greatest human achievements.
00:16:28.340 It can animate and inspire us to do, learn, and create great things.
00:16:34.100 But I think in our messy human hearts, we also know that the truth is something of a fickle mistress,
00:16:40.620 and that the beauty of the truth is actually often in the struggle.
00:16:43.480 It's the reason that we have so many sublime chronicles of the human experience, because there are so many different truths to be explored.
00:16:51.900 And so, in this spirit, I know that the truth exists for each of you in this room.
00:16:58.160 It also probably exists for the person sitting next to you.
00:17:01.940 But the thing is, the two of you don't necessarily have the same truths.
00:17:05.260 And this is because for many of us, truth is what we make when we merge facts about the world with our beliefs about the world.
00:17:12.660 Each of us has our own truth, and it's probably a good one.
00:17:33.540 You have the truth to the best of your understanding.
00:17:35.840 But if somebody comes to you and says, hey, by the way, that's wrong, and here's why, A, B, and C: you're saying that equals F, it actually equals D, and here's how, let me show you the work.
00:17:49.060 It's now incumbent upon you, no matter how you feel, to say, oh, I see how you got there.
00:18:24.240 40, 50% of the country doesn't seem to get it, and I don't know how to unite with that.
00:18:30.240 You can't unite if you will not search for the truth and open your heart to realize you don't get to choose who lives and dies.
00:18:45.720 You don't get to choose who's rescued and who's not by a bumper sticker.
00:18:51.300 Let me tell you about the Byrna Launcher.
00:18:56.460 If life hands you lemons, you've got to make lemonade.
00:19:00.380 If, however, life throws something like a mugging your way, you need the Byrna Launcher.
00:19:05.440 It launches pepper-based projectiles, engineered to stop an aggressor long enough for you to act, to call for help, and to escort your children out or secure a door without turning a bad night into a tragedy.
00:19:15.340 It fits in a drawer, it fits in a purse, it fits, you know, in the back of your pants, in a glove box. It's light, it's intuitive.
00:19:22.880 It's designed for real hands, and yes, it's legal in all 50 states and doesn't require a permit, so responsible ownership doesn't have to come with a stack of paperwork.
00:19:31.960 You own it like you own a fire extinguisher.
00:19:34.540 You store it in a secure place.
00:19:37.380 You teach your household the plan, practice safe handling, and treat it as the practical layer of protection that it is.
00:19:43.780 Now, if you're the sort of person who prepares, you know, flashlights checked, batteries fresh, plans in place, the Byrna Compact is the kind of tool that lets you move with calm purpose when it matters most.
00:20:13.120 This is the best of the Glenn Beck Program.
00:20:19.140 This is the Glenn Beck Program from MIT.
00:20:23.180 Professor Max Tegmark, the author of Life 3.0, a great mind on AI, and somebody who's as concerned about ASI, artificial superintelligence, as I am.
00:20:36.240 And, Max, I know you wanted to say one more thing, and I think I know what it's regarding, so let me set you up, I think, with this.
00:20:43.760 Because I talk to people all the time, and they fall sometimes into a category of, yes, artificial intelligence is bad.
00:20:52.540 It's going to be really... and I look at it as a tool: as long as we always control it, it's okay.
00:21:02.440 It's actually really, really good and helpful.
00:21:05.380 Would you agree or disagree with that?
00:21:08.640 It's so profound that you're bringing up this thing about a tool.
00:21:11.320 Now, a tool, to me, is something you can control.
00:21:16.440 Your car is a tool, Glenn, because it's built with a steering wheel and a brake, so you can control it, right?
00:21:22.640 Superintelligence, on the other hand, is not a tool, because you can't control it.
00:21:26.460 Now, you can think of the statement, in other words, as saying not that we should ban AI, but that we should just ban non-tool AI, AI that isn't a tool.
00:21:36.380 And, you know, I had a friend who died of cancer just last month.
00:21:41.480 And they told her that it was an incurable cancer.
00:21:44.580 Of course, it's not incurable in principle.
00:21:46.520 If we can use AI tools to figure out how to cure cancer, which I believe we can, and my work at MIT is very much focused on making AI controllable, right, then we can do wonderful things with this.
00:22:00.680 But you don't need uncontrollable superintelligence to cure cancer.
00:22:06.920 You don't need crazy superintelligence to have self-driving cars or make your business more productive or do any of the things that people who listen to this may want AI to do.
00:22:18.700 In fact, the best way to squander all the upside we can have in making Americans healthy and wealthy and prosperous with AI tools is if we don't have a law that says it's got to be a tool,
00:22:32.700 and instead we just throw the game and lose control over the whole thing.
00:22:39.280 You know, I talked to you off the air last week or the week before last, and I was so glad that you reached out to me, because I was going to reach out to you.
00:22:47.340 And I have been trying for several years to get you on the air, but you reached out because of this statement and, you know, wanted to get some attention on it, and I'm thrilled to do that.
00:23:00.520 But I said to you, I think we also have to look at life, the definition of life.
00:23:06.200 And I've been working on a constitutional amendment, and I haven't revealed it yet because I still have some people looking at it, and I'd love for you to look at it.
00:23:16.820 How important do you think it is that we not only define AI as a tool, but a tool that has zero human rights?
00:23:30.160 You don't give human rights to a hammer or to a car.
00:23:34.160 And even though some people in Silicon Valley are very into this transhumanist idea that somehow humans should merge with machines and blah, blah, blah, we still have freedom in how we build AI.
00:23:47.000 And there's absolutely no morally good reason to go in that direction.
00:23:51.180 I would love to have a constitutional amendment to that effect, you know, I'm not a lawyer, but where the gist of it is that only humans have moral agency.
00:24:02.720 And there should never be any right to vote or any kind of human rights granted to machines.
00:24:09.380 And we shouldn't try to build machines that you would even want to give any human rights to, either.
00:24:14.740 So how do you deal with somebody like Elon Musk, who I can't ever quite put my finger on?
00:24:20.420 You know, he's building the brain interface, and part of it is his own statement saying that we're going to have to accept transhumanism.
00:24:30.860 We're going to have to merge with the machine, or you will not be able to keep up with everyday life.
00:24:36.660 He's also at the same time saying, we've got to get to Mars before we get AGI or ASI, because, you know, once you have it, then there's no going back.
00:24:49.240 So what are your thoughts on where he stands on all of this stuff?
00:24:53.840 I actually have known all of these tech CEOs for many years, and I feel that all of them are stuck in this weird race to the bottom against each other, Greek tragedy level, really, where they all feel freaked out about what's happening.
00:25:12.960 And they all at the same time feel that they can't stop, because then they're just going to lose all their market share to the competition.
00:25:19.760 It used to be that way in biotech also, before you had the FDA, right?
00:25:23.140 So as soon as you have some safety standards, so that people make money instead by innovating to meet the standards, in this case, to keep the stuff controllable, capitalism works for us.
00:25:42.400 If Donald Trump comes out and says, okay, AI must be a tool,
00:25:46.440 and if some company can't demonstrate to outside experts that their new thing is a tool, come back when you can, buddy, you know, then that problem gets solved.
00:26:39.380 If you could sell anything you wanted, people would start making tons of money by putting ever more fentanyl in foods for kids and stuff like that, right?
00:26:49.240 But they can't, because before people can sell any foods or medicines, they have to convince other folks, who don't have money on the line, that the benefits outweigh the harms, right?
00:27:02.020 And right now, there's absolutely no way that these companies could persuade independent American experts that what they're doing towards superintelligence isn't just slowly sliding off a cliff.
00:27:15.340 I mean, if we had had that kind of a pact or an understanding with Silicon Valley before social media, we wouldn't be in the situation we're in now.
00:27:26.100 If they had to prove that it would not hurt children and wouldn't do to society what it's doing now, we wouldn't have this.
00:27:35.780 I mean, now you're getting more and more Americans hooked on AI girlfriends, and systems that just recently persuaded people to murder their mom or commit suicide, telling them how to make a better noose.
00:27:48.080 You know, this is absolute insanity, that these sorts of things can just be pushed onto the market without the companies having any responsibility to demonstrate anything.
00:28:01.420 You know, in biotech, people actually respect their colleagues who work on safety in clinical trials, because they know they're the ones who bring in the money, right, who let them get to market with the first safe medicine against cancer, you know.
00:28:15.020 And that's what happens when the government levels the playing field and says it's the responsibility of the companies to demonstrate that things are okay.
00:28:28.340 I mean, if someone says, hey, I'm going to open a new nuclear reactor in downtown New York City, you know, it's the job of the company to make the case that this thing isn't going to blow up.
00:28:42.700 It's not the politician's responsibility to go figure out how nuclear reactors work.
00:37:03.780 And now we find out we had a president and an administration that was making a political list, a blacklist, where they would not give aid to Americans who are suffering.
00:37:17.640 You're taking my tax dollars, and if I happen to be the one who needs aid, I don't get coverage because of who I voted for?
00:37:30.300 It's the most offensive thing I've ever heard of our country doing on aid.
00:38:39.660 Worse yet, can you imagine what would be said if you found out that Donald Trump's FEMA was making a list and saying, they voted for Gavin Newsom?