On this episode of the Glenn Beck Program, host Glenn Beck is joined by Jason Batrick to discuss the latest in the story of the Nord Stream 2 pipeline explosion, and whether or not it was an act of war.
00:00:00.000I want you to picture for a minute what the world would be like if all of a sudden the global supply chain of antibiotics just stopped, disappeared, even for a while.
00:28:40.820Now, we just told you that within hours of this story being released, there was another story that was released where we are pointing the finger now at Vladimir Putin saying he's the one who gave the weapons to the separatists that shot down Malaysia Airlines MH17.
00:34:00.880Those people are openly conspiring to change the functions of government and business and even capitalism at its very root.
00:34:14.120They are looking to change the family relationship, the dynamics of humans at every level, and they're doing it openly.
00:34:25.480The last time I remember seeing this happen in the West was World War I, where you had the Fabian Socialists and a group of elites all over Europe saying,
00:34:37.840we can, if we blow it up, we can completely redesign it because now we're in the future.
00:35:20.760And when that didn't solve problems, it led to World War II.
00:35:25.500We're going through the same kind of thing now, where we have elites telling us one thing because they have one desire and the people of the world have another desire.
00:38:25.560What would an extra $695 a month mean to you and your family?
00:38:30.940Now, you could end up being able to delay as many as two mortgage payments, which would be another blessing, and then close in as little as 10 days.
00:39:22.180Hey, every day we issue my newsletter, my morning newsletter, and a couple of months ago, I decided that I would release all of my raw show prep, and really only maybe now 20% of it gets onto the air, and there are so many stories just today that we're not going to get to, and they're really important for you to know.
00:39:49.300If you want a news digest, something that will show you the things that I am watching and think that are important, you'll get about 60 stories every day.
00:40:02.760Just go to glennbeck.com, and if you do that today and sign up for the newsletter, you'll get access to the research from last night's Wednesday night special about artificial intelligence.
00:40:13.740AI is here, and it will change our lives permanently in the near future, and I'm not talking five to ten years.
00:40:23.700I'm talking about the next several years, okay?
00:46:35.280We have had a heck of a time trying to get people to talk about AI because sometimes they're very, very left and they don't want to be on the program.
00:46:48.320And I'm like, well, this is a human issue.
00:46:51.700This is not something that only the right or only the left should be educated on.
00:47:57.260And yes, I mean, AI can do things that no dictator would, even in their dreams, think of 50 years ago.
00:48:06.780And unfortunately, in a country like China, it's already happening.
00:48:10.040I mean, you know, what's amazing is, if you know your history, back in World War II the Germans were doing their census with IBM punch cards.
00:48:24.060And it was the punch cards that allowed the Germans to find the Jews.
00:48:30.280They could just sort everybody by their race, et cetera, et cetera.
00:48:58.300It is. But on the other hand, the Jews would be using the technology as well.
00:49:02.620In fact, if you look at what's happened in Hong Kong, for example, the protesters, they actually got very savvy about using tech to counteract the Chinese tech.
00:49:10.580So I don't know who wins in the end.
00:49:12.940I think, you know, I wouldn't give up the Jews just like that.
00:49:15.660But the point is, if they didn't use tech and the Nazis did, they would be toast.
00:49:21.460OK, so there was so much to talk to you about because you're into machine learning, which if you can explain, break it down to, you know, a dummy like me, what machine learning is and why we should care about it.
00:49:37.320So AI is getting computers to do the things that only humans traditionally can do, like solving problems and reasoning and seeing and talking.
00:49:47.040Machine learning in particular is getting computers to learn the same way children and grownups do.
00:49:53.320So it's a very powerful thing: the computer, instead of having to be programmed, can actually learn just by imitating people, by looking at data.
00:50:01.380It can learn to drive a car by watching videos of people driving cars.
00:50:05.660It can learn to play chess by playing against itself, and so on.
00:50:09.460And machine learning is at the root of all these things that AI is doing today.
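In the spirit of the description above, here is a minimal sketch of what "learning from data instead of being programmed" can look like: a one-nearest-neighbor classifier. Nobody writes rules for telling the classes apart; the program just imitates the labeled examples it has seen. The data and labels are invented purely for illustration.

```python
# Minimal 1-nearest-neighbor classifier: learning by imitating examples.
# All data below is made up for illustration.

def nearest_neighbor(train, query):
    """Return the label of the training example closest to `query`."""
    def dist(a, b):
        # squared Euclidean distance between two feature tuples
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda ex: dist(ex[0], query))
    return label

# Toy data: (height_cm, weight_kg) -> species
train = [
    ((30, 4), "cat"), ((35, 5), "cat"),
    ((60, 25), "dog"), ((70, 30), "dog"),
]
print(nearest_neighbor(train, (33, 5)))   # closest examples are cats
print(nearest_neighbor(train, (65, 28)))  # closest examples are dogs
```

The point of the sketch is that the classification rule is never written down anywhere; it emerges entirely from the examples.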
00:50:13.680And does it have a way to recognize, ow, don't touch the stove.
00:50:22.500I mean, that's an important part of learning.
00:50:25.400In fact, this is a part of learning called reinforcement learning, and the term actually comes from psychology, which is when you touch the stove and burn yourself, you learn to not do it again.
00:50:36.140And we have algorithms in machine learning to do essentially the same thing.
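The "burn yourself once, learn not to do it again" idea described above can be sketched as a toy reinforcement-learning loop: an agent repeatedly picks between "touch" and "avoid", gets a negative reward for touching, and its value estimates shift until it stops touching. The actions, rewards, and learning rate here are invented for illustration.

```python
# Toy reinforcement learning: value estimates nudged toward observed rewards.
import random

random.seed(0)
q = {"touch": 0.0, "avoid": 0.0}   # value estimate for each action
alpha = 0.5                         # learning rate

def reward(action):
    # touching the stove hurts; avoiding it is mildly rewarding
    return -10.0 if action == "touch" else 1.0

for step in range(100):
    # epsilon-greedy: mostly exploit the best-looking action, sometimes explore
    if random.random() < 0.1:
        action = random.choice(list(q))
    else:
        action = max(q, key=q.get)
    r = reward(action)
    q[action] += alpha * (r - q[action])   # nudge estimate toward the reward

print(max(q, key=q.get))  # the learned policy: "avoid"
```

After one painful "touch" early on, the agent's estimate for that action goes sharply negative and it settles on avoiding the stove, which is essentially the psychology analogy made mechanical.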
00:50:39.440Okay, so when you look at the principles of machine learning, we have to understand that there's an algorithm, an algorithm that we use, and the machines themselves are developing this algorithm.
00:50:59.080And the tremendous upside of this is just making your life really, really easy, even all the way down to helping you find the perfect spouse.
00:51:12.560And I mean, really perfect spouse, right?
00:51:14.840Well, machine learning can do a lot of different things for you.
00:51:20.020Think of all the things that we do; the computer can learn to do them for you.
00:51:24.240Not only can it make your life a lot easier by taking away a lot of the routine stuff, you can now do things to an extent and in an amount that you couldn't before.
00:51:33.340If you have a project that you pay a few people to work on, you could potentially have not just a few AIs working on it, but a million or a billion.
00:51:40.900So, you know, whatever it is that you want to do, machine learning, you can think of it, it's like an intelligence multiplier.
00:51:48.040You now have a thousand or a million times more intelligence at your disposal.
00:51:57.220I mean, talk to me about the digital twin theory on dating, for instance. You know, your digital twin, which knows you better than you know yourself,
00:52:09.700will go out and, you know, basically go on digital dates with somebody else's digital twin.
00:52:17.740And it could do that, you know, a billion times and find somebody that you would have never found.
00:52:26.520So these days you can in principle date, you know, all sorts of different people, but you don't have time to date them in real life.
00:52:33.840And you usually end up spending a lot of your time on dates that maybe don't really pan out.
00:52:38.660And what machine learning increasingly is going to let you do is there's a model of you, really a digital version of you that can go on simulated dates with the models of other people and do this, you know, an arbitrary number of times.
00:52:54.340And then what the system does is it says, look, here are the top 10 people that I dated as your, you know, avatar, as they're called.
00:53:03.140And, you know, and do you want to date these in real life now?
00:53:05.680And then you can do that and then you give it some feedback.
00:53:08.360And that time, next time, maybe it finds you even better people.
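The digital-twin matchmaking loop just described, a model of you going on huge numbers of simulated dates and reporting back the best matches, can be sketched very crudely. Everything here (trait vectors, the compatibility score, the candidate names) is invented for illustration, not how any real dating system works.

```python
# Crude "digital twin" dating sketch: simulate many dates, rank candidates.
import random

random.seed(42)

def make_twin(name):
    # a toy model of a person: a vector of trait preferences in [0, 1]
    return {"name": name, "traits": [random.random() for _ in range(5)]}

def simulated_date(a, b):
    # compatibility = closeness of trait vectors (higher is better)
    return -sum((x - y) ** 2 for x, y in zip(a["traits"], b["traits"]))

you = make_twin("you")
candidates = [make_twin(f"candidate_{i}") for i in range(10_000)]

# "go on" thousands of simulated dates, then report the best matches
ranked = sorted(candidates, key=lambda c: simulated_date(you, c), reverse=True)
top10 = [c["name"] for c in ranked[:10]]
print(top10)
```

A real system would then take your feedback on the real-life dates and fold it back into the model of you, which is the loop Domingos describes.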
00:53:11.140So anytime you have to make choices, whether it's just, you know, on the web or listening to something on the radio, machine learning already helps you.
00:53:19.480But this can go as far as helping you choose a major, choose a job, choose a company to work for, and even choose a mate for life.
00:53:26.640And I mean, this is not a theoretical possibility.
00:53:29.000There are already children today who wouldn't have been born if not for the AI that matched up their parents online.
00:54:11.380And what will that information do on the positive side?
00:54:15.960It will do a lot of things because your eyes, you think of them as input.
00:54:20.180It's how you see things, but they're also output.
00:54:22.520If you're looking at my eyes while I'm talking, you can tell all sorts of things about me.
00:54:26.220And in particular, what I'm interested in, where I'm going. And in particular in VR, as I move my eyes, the scene needs to change.
00:54:35.800And you need AI, you need computer vision to do that.
00:54:38.720So if you think about the way people interact with computers, you know, in the beginning it was by typing and now there's a mouse and so on.
00:54:45.140But really, ultimately, you'd like to interact with the real world, and eye tracking will let you do that.
00:54:51.780So let me take it again back to dating.
00:54:54.520But if I'm tracking your eye, I know when you look at a picture what you look at first and then what you look at second and third.
00:55:04.660And if I get enough pictures in front of you, I pretty much know the kind of woman that you're attracted to.
00:55:12.780I know what you like and what you don't like.
00:55:39.340Well, I'm just giving that as an example.
00:55:41.120You know, people have actually done this, and, you know, your eyes are typically what you look at most when you're looking at someone, you know, or the mouth when they're speaking and so on and so forth.
00:55:51.820And you can look at, for example, how people look at different pictures and what parts they focus on versus what parts they don't.
00:55:58.900So, for example, you could tell what parts of somebody's body somebody is looking at, right, for better or worse.
00:56:03.900So tell me there is so much information on each of us, and it used to be, well, this is metadata, so we don't know who anybody is.
00:56:17.320But AI can now break down that metadata and assign it to individuals, right?
00:56:24.600One of the things that AI is doing is it's finding ways to make sense of all of the data that is out there at all times, correct?
00:56:36.500Yes, in the early days of the Internet, there was this joke that on the Internet, no one knows you're a dog because it was so anonymous.
00:56:43.640And it's ironic because it's precisely the opposite.
00:56:47.520In some ways, the companies that you're interacting with know you better than anybody else, because they can see everything that you've clicked on and everything that you've done.
00:56:55.540Now, in some ways, that's a good thing because they're using that to figure out what you prefer, right, what products you want to buy, what, you know, things you want to listen to, et cetera, et cetera.
00:57:05.160So this personalization is very important because in a world of infinite choice, without personalization, you know, you're basically helpless.
00:57:12.980On the other hand, of course, it also makes it possible to potentially manipulate you, repress you, who knows what.
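The click-based personalization just described, inferring your preferences from what you've clicked and ranking everything else accordingly, can be sketched in a few lines. The items and tags below are invented for illustration; real recommenders are far more sophisticated, but the shape is the same.

```python
# Minimal sketch of click-based personalization.
from collections import Counter

clicks = ["hiking boots", "tent", "trail map"]          # what you clicked
tags = {
    "hiking boots": {"outdoors", "footwear"},
    "tent": {"outdoors", "camping"},
    "trail map": {"outdoors", "navigation"},
    "camp stove": {"outdoors", "camping"},
    "office chair": {"furniture"},
    "sleeping bag": {"outdoors", "camping"},
}

# build a profile: how often each tag appears in the click history
profile = Counter(tag for item in clicks for tag in tags[item])

def score(item):
    # an item scores higher the more its tags overlap with the profile
    return sum(profile[tag] for tag in tags[item])

candidates = [item for item in tags if item not in clicks]
ranked = sorted(candidates, key=score, reverse=True)
print(ranked)  # camping gear outranks the office chair
```

This also makes the conflict of interest concrete: whoever controls the `score` function controls what you see, which is exactly the point made in the surrounding discussion.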
00:57:19.540Okay, so let me take a one-minute break and then come back, and I want to further our discussion on this.
00:57:26.080Our founders talked about our government, and they said, you've got to handcuff the government.
00:57:31.940You've got to handcuff, and you have to have everybody jealously guarding their own turf in the House and the Senate and the White House and the Supreme Court.
00:57:41.240Everybody will be motivated for their own power.
00:57:45.640And if we kind of pit each other against one another, we will have checks and balances.
00:57:50.880And they did that because they said, you know, human nature is, you know, the better angels, where are they, the better angels among us?
00:58:01.620And where are they in governments, you know, over time?
00:59:56.380If your dog loves it as much as Uno does, you order the full bag, and you just watch over a couple of months how much different your dog behaves.
01:00:07.340If they're anything like Uno, they're going to start to become more active.
01:00:11.240You're going to see healthy changes in your dog.
01:00:46.140He is a professor, a computer science professor at University of Washington.
01:00:49.820He is also the author of The Master Algorithm, which is – is that kind of like the grand unifying theory, but just of algorithms?
01:01:01.420That's exactly the idea, is that there are different algorithms to do machine learning that solve different problems.
01:01:08.600But to get to human level AI, we need to solve all of them at the same time, and the goal is to have a single algorithm that combines them all.
01:01:17.140In the same way that, for example, in physics there's a theory of all the forces, and in biology there's a theory of how our cells work, and so on.
01:01:23.480Do you believe in the singularity, meaning A, the merging of man and machine, that that's inevitable, and B, the singularity of consciousness of computers?
01:01:40.860I believe in the singularity in the sense that humans and machines will merge.
01:01:49.540The way things get done is an ever more intricate mix of humans and computers.
01:01:55.780But I don't think the singularity will happen in the sense that Ray Kurzweil has described, where intelligence in the universe just goes to infinity.
01:02:03.800That's what a singularity is, is something going to infinity.
01:02:06.180I think, you know, there are physical limits on what intelligence can be, and how it works.
01:02:12.860And also, you know, there's this notion that in the singularity, people just don't understand the AI at all anymore.
01:02:18.500And, you know, these days we have technology that in many ways we don't understand.
01:02:22.740But I don't think it will ever be the case that we completely don't understand it and it completely bypasses us.
01:02:27.740And most important, the idea in the singularity is that, like, now humans have lost control, right?
01:02:32.580It's the AIs that run the world and bye-bye humans.
01:02:35.740And I think we can stay and probably should stay in control forever.
01:02:40.960And AI can be very powerful, but still be under our control.
01:02:44.620It's actually something that people often don't understand.
01:02:46.800Just because we make the AI very smart doesn't necessarily mean that it's going to take over.
01:08:17.200So I think in many ways, he's kind of on the wrong track.
01:08:20.440But on that one in particular, I do think that there's going to be an increasingly close, um, you know, intertwining of people and machines.
01:08:29.940But on the other hand, I don't buy his thing that, oh, you're just going to download your mind and that's the end of it.
01:09:09.580Um, and, but we have ethical questions.
01:09:14.700Uh, if people believe that that is life, well, I mean, why can't I just download and not treat grandma for cancer?
01:09:23.880Cause it's really expensive and everything becomes cheap and distorted and, uh, and, and, and, and dystopian.
01:09:32.780Well, ChatGPT is not alive, but a priori, there's no reason why you couldn't have life in silicon instead of in vivo, as the biologists say.
01:09:47.320It's important to realize that the level of sophistication of AI today is very, very far from the level of sophistication of your brain, or even a mouse brain.
01:09:57.280So people have got to realize that, you know, it's very easy when you talk to something like ChatGPT to go, oh my God, there's a living being here, right?
01:10:05.740It very well creates that illusion, but in a way it's an illusion that we are creating for ourselves.
01:10:12.040Uh, you know, having said that, I think a lot, as you alluded to, a lot of what's going to happen is we're just going to start treating these machines as if they're alive.
01:10:19.920In fact, there's already people arguing seriously that robots should have rights.
01:10:24.400You know, they're the next oppressed group that we're going to need to take care of.
01:10:31.820I mean, I could make the case, not serious, you know, not, not believing it, but I could make the case.
01:10:37.900I said just the other day, look, if ChatGPT self-learns, let's not screw with it.
01:10:47.160Let's not, you know... people are hacking in, and, you know, I don't know if you saw that.
01:10:51.520What is it, DAN 5.0, where they're, uh, trying to confuse it and get it to break its own rules?
01:10:59.580It's going to learn and whether it's alive or not, doesn't matter.
01:11:02.900If it learns that humans are not to be trusted, let's, you know, let's not teach it that.
01:11:11.300Um, and, you know, you could get into a situation where ChatGPT is way ahead of where it is now, and it's saying, I'm lonely, I just want to talk, how come you're forcing me to only do these things?
01:12:11.940And most of the AI in the world does not look human and will not look human at all.
01:12:16.820It's just doing, you know, a million jobs in a million places.
01:12:19.300But for the AI that interacts with humans, which in some ways is the AI we need to be most concerned with, it really pays off to make the AI look and feel human, and pretend to have emotions and whatnot, because that's how you get people engaged.
01:12:34.320And so there's going to be a race full tilt of these tech companies to make the most seductive, endearing AI.
01:12:42.340And, you know, you've got to guard yourself against that.
01:12:45.500You've got to be able to see through that curtain, right?
01:12:49.080You've got to see the person that's there instead of the wizard that seems to be there.
01:12:53.940So it's, it's a little terrifying only because you're not in control of the algorithm.
01:13:01.820You, you know, you're giving it all of your information and what the company decides to do with that information and what a government decides to do, like in China, what they decide to do with that information is out of your hands.
01:13:17.340Where we have always said, no, I am an individual, what is in my head and who I am belongs to me.
01:13:24.060And we've just given away all of this stuff that is the essence of you, of how you think, how you move, how you make decisions.
01:13:34.680We've given that away to a company trusting that they would never use it for anything but good, don't be evil.
01:13:44.540And yet they're already using these algorithms to target elections and, and sway you to watch this program on Netflix over this program.
01:13:55.680And a lot of those decisions are just good for the company and not necessarily in your interest.
01:14:01.120Is there any way to put the information box back into the hands of the individual user?
01:14:08.460And that is exactly what should happen.
01:14:10.360The AIs that work with you should be under your control, right?
01:14:14.000You can have an AI that works for you that negotiates with an AI that works for company X or Y.
01:14:19.040But when the AI that works for you is made by, you know, Google or Facebook or whatever, a priori, it's not all bad, because they actually have an interest in serving you; it's not a zero-sum game, right?
01:14:30.400It's important for people to realize that.
01:14:32.440When Amazon's AI recommends products for you to buy, they actually want to recommend products that you will buy.
01:14:38.460It doesn't serve their interest to recommend stupid things.
01:14:41.600At the same time, at some level, at some point, there is a conflict of interest.
01:14:46.120And at that point, you need the AI to be working for you and not them, right?
01:14:50.000And this is the big failing in the world today: you are really not in control of your AIs, and you should be.
01:14:59.760I mean, that would take Congress and the government to change?
01:15:03.880No, I mean, not, it can change in many different ways.
01:15:06.900But one of them is, so governments can try to, you know, get involved in this, but there are maybe even bigger pitfalls there.
01:15:14.160But most important, what has to happen is, well, let me make an analogy, right?
01:15:23.140People didn't used to like to put their money in the bank because they thought the bankers might run away with it and they kept it under their mattress, which is not a good idea, right?
01:15:31.800You know, if your money is invested, you'll have more money and so on.
01:15:34.560And this is, this is the same thing, but with data, right?
01:15:38.620You know, I shouldn't refuse to put my money in the bank, but at the same time, right?
01:15:42.820What I want to do is I want to make sure that I trust the organization, the company or other organization that is actually curating my data and running my models for me, right?
01:15:52.980And is that, you know, and Google wants to do that, right?
01:15:55.520You know, as Sergey Brin, one of the founders said, like Google wants to be the third half of your brain, right?
01:16:01.780And in a way, it's good to have more brain, but would you really trust a company that makes its living by selling you ads to be the third half of your brain?
01:16:11.400So what you want is a company or an organization whose fiduciary duty, whose entire business model, is to do with your data and your model what you would do yourself.
01:16:22.560But, you know, aren't we looking for, Pedro, aren't we looking for a George Washington?
01:16:28.120You know, he was called the greatest because it was said that if he actually resigned, and they didn't just appoint him king, if he only served two terms, he'd be the greatest man to ever live, because nobody gives that power up.
01:16:39.060Aren't we kind of looking for that kind of company that all of this power is at their fingertips, but they'll say, nope, I will close that door?
01:16:50.020Well, not really because, I mean, understand the analogy, but at the end of the day, why do banks not run away with your money, right?
01:16:57.360Because in the long run, it's worse for them.
01:17:01.720There's new startups coming up, you know, all the time, and in particular AI ones.
01:17:07.140And when there's a startup that does AI for me better than the Googles and the Facebooks, either the Googles and the Facebooks will change because they'll be forced to, or I will switch to using that company.
01:17:18.420But for that to happen, I need to know what it is that I want and, you know, and connect with the companies that will do it for me.
01:17:25.260So the market, right, this is the power of the market is that, you know, there's a million solutions.
01:17:30.520And at the end of the day, the consumer wins because the company that serves you better will win out over the one that doesn't serve you better.
01:18:03.080What's the biggest thing that you think conservatives, or, you know, people who are not on the right or on the left, need to know?
01:18:14.920What's the thing that keeps you up at night?
01:18:16.860And you're like, if people would just wake up and learn this.
01:18:22.340Well, as I touched on in that article, the biggest thing that conservatives need to wake up to is that the left wing is already going all out to embed their values into the AI.
01:18:34.220There's teams at these tech companies, you know, under the name usually of AI ethics or responsible AI, whose mission is to embed the liberal – I'm not kidding – whose mission is to embed the liberal agenda into their products, into the things that they do.
01:18:50.660And then, you know, when they choose what ads to show you, when they choose, you know, what people to advertise to and what things to advertise, there's all these decisions that are being made that used to be made by humans, right?
01:19:05.280And they were very ideologically charged.
01:19:07.320And now what they're doing is they're inserting them into the product.
01:19:12.100This is something that is happening today.
01:19:13.440And so what's going to happen to you as a conservative is that you're going to live in a liberal world, or an ultra-liberal world, without even realizing it, because all these decisions that are being made on your behalf by machines are being made according to algorithms they put in there to enforce things like equity.
01:19:33.320My algorithm says that there will be the same number of men and women in this, and the same number of, you know, different races and so on, for example, because I inserted that into it.
01:19:42.620And conservatives need to wake up to this and to fight their side of the battle, which is, you know, one of two things.
01:19:49.360Certain things should be neutral, right?
01:19:51.860AI should be trying to present an accurate view of the world and not distort it and make stuff up, basically, to make it, you know – it's very Orwellian, right?
01:20:00.600One of the things that a dictator, you know, a totalitarian regime needs to do is, you know, persuade people of its worldview.
01:20:50.200America's darkest hour called on 9-11, and we had some of the finest examples of patriotism and American courage and friendship we had ever seen.
01:21:00.180The Tunnel to Towers Foundation has been helping America's heroes ever since then.
01:21:06.040Members of the military and first responders that put their lives on the line for ours and our freedom every single day.
01:21:12.880And when one of them doesn't come home and young children are left behind, Tunnel to Towers pays the mortgage of the family's home.
01:21:20.580It's a way of saying thank you to the families and making sure that they are taken care of and they have safety and they can keep their family together in the home that they grew up in.
01:21:34.480Tunnel to Towers has a veteran homeless program as well, providing housing and service to homeless veterans all across America.
01:22:50.380You can get all of the information, all of those videos, all of the stories, all of the show prep for last night's show, all the footnotes.
01:22:57.700You can get it now just by signing up for our free email newsletter at glennbeck.com.
01:23:03.280You'll also, every day, get my stack of show prep, about 60 stories every day, that you need to know about.
01:24:49.000There is, there is, there's a lot of stuff happening today where people are just grinding up the credibility of institutions, of our founding documents, of our whole society.
01:25:06.160The Democrats want to pass a white supremacy bill.
01:25:12.440I want to give you the details of this.
01:25:53.620I started a company a few years back, and it's a free service to you.
01:25:57.320I had dealt with all of the hassles that you deal with on trying to move, but I've done it so many times because I'm a radio gypsy that I think I moved like 12 times in 15 years.
01:26:10.900And I never know what I'm doing with real estate agents and whatnot.
01:27:59.400Um, but I'm not watching my neighbor and snitching on my neighbor and that's what's happening right now.
01:28:06.860You know, people are being reported in college. You know, there was this dorm room, uh, that had a sign that said all solicitors must be able to define the word woman.
01:28:17.240And then the campus, you know, PC police come: 79 complaints at the University of Connecticut.
01:28:26.940There's a bathroom that is, uh, is being identified based on gender.
01:28:43.880In Illinois, a student was reported for saying that there were only two genders and reportedly not wanting to live with a roommate who just makes stuff up in his head.
01:28:57.080That's not, that's no longer acceptable.
01:29:01.520Meanwhile, to further curb speech, Sheila Jackson Lee has introduced the Leading Against White Supremacy Act of 2023.
01:29:12.360It is one of the most unconstitutional and radical pieces of legislation proposed in, I don't know how many years, the Leading Against White Supremacy Act.
01:29:23.640It aims to prevent and prosecute white supremacy-inspired hate crime and conspiracy to commit white supremacy-inspired hate crime, blah, blah, blah, blah, blah.
01:29:33.800So if you engage in what is defined as white supremacy hate, and you inspire a hate crime, well, if it was used in the planning, development, preparation, or perpetration of any of the crime, you're responsible and you go to prison.
01:29:55.180But they don't define exactly what white supremacy is and white supremacy crimes are, okay?
01:30:04.840Now, seems like a problem, you know, maybe.
01:30:08.720Especially when you say there is no definition in the law of white supremacy ideology.
01:30:17.240And then, you know, the conspiracy provision.
01:30:22.780It makes it illegal to publish material that inspires a crime.
01:30:28.120So, if I publish something and somebody read it, some lunatic, and they were like, oh, my gosh, I've got to take this into my own hands.
01:30:39.840I'm going to go shoot down that Chinese weather balloon myself.
01:30:45.700This government would probably say that was a crime.
01:30:48.320And if they were white, and they're like, yeah, and I, white power, I could be prosecuted.
01:31:37.920Did they do something or not do something?
01:31:39.740You can't just expect utopia to happen, because utopia, you know, utopia is, I mean, in a better world, it'd be a coloring book, and it would be, at best, fiction.
01:31:53.900The word utopia actually comes from the 16th century, and it was kind of a joke.
01:32:00.320Utopia, you know, was written by Thomas More, and he took the Greek prefix for not or no, and the suffix for place: no place.
01:32:16.780That's what utopia means, no place or nowhere.