The Glenn Beck Program - December 07, 2018


Best of the Program | Guests: Pat Gray, Bill O'Reilly & Matt Kibbe | 12/7/18


Episode Stats

Length: 59 minutes

Words per Minute: 156.2463

Word Count: 9,269

Sentence Count: 775

Misogynist Sentences: 6

Hate Speech Sentences: 9


Summary

Glenn Beck is back with a new episode of The Glenn Beck Program on the Blaze Radio Network, on demand. Today, he's talking about a new breakthrough in artificial intelligence and what it means for the future of the human race.


Transcript

00:00:00.000 The Blaze Radio Network, on demand.
00:00:08.240 Hey, welcome to the podcast.
00:00:10.340 It is Friday and a fascinating podcast.
00:00:13.100 We start with technology again today.
00:00:15.860 There has been a huge hurdle with AI that Google announced last night.
00:00:21.600 And they're like, hey, by the way, evolutionary leap.
00:00:25.380 Oh, looks like the monkey that we had in this computer is starting to look more like a man.
00:00:31.280 They thought it was going to happen in 10 years.
00:00:34.360 It happened last night.
00:00:36.620 Also, Bill O'Reilly stops by for a conversation.
00:00:41.680 And what else did we cover today?
00:00:43.760 We had Matt Kibbe on the show.
00:00:44.960 Yeah.
00:00:45.160 Matt Kibbe.
00:00:45.980 And we talk a little bit about the war on Christmas in the third hour.
00:00:50.100 And if we can squeeze it into the podcast, masculinity as well.
00:00:53.800 What is masculinity?
00:00:56.720 Toxic as it is.
00:00:58.480 We discuss it today on the podcast.
00:01:07.060 You're listening to the best of the Glenn Beck program.
00:01:13.580 It's Friday, December 7th.
00:01:15.980 Patriot Mobile is a phone service that will give you all of the great coverage that you want.
00:01:25.300 They're just not going to take the money from you and then invest that in causes that you don't believe in, like Planned Parenthood.
00:01:35.740 Patriot Mobile actually is going to let you invest your money into the causes that you believe in.
00:01:40.820 But most of these cell companies give all kinds of money to crazy, crazy causes that you work hard against.
00:01:48.420 It's true.
00:01:48.800 Why do that?
00:01:49.640 You can go with Patriot Mobile.
00:01:50.840 They were created to solve that problem.
00:01:52.580 They're the only conservative cell phone company in America.
00:01:55.500 Go to PatriotMobile.com slash Blaze.
00:01:57.400 Get started today.
00:01:58.200 When you use the offer code Blaze, they're going to waive the activation fee for up to two lines.
00:02:02.860 PatriotMobile.com slash Blaze or 1-800-A-PATRIOT is the place to go.
00:02:07.560 Glenn Beck.
00:02:08.960 Okay.
00:02:10.680 Mutation.
00:02:12.000 Mutation.
00:02:12.760 It's the key to evolution.
00:02:15.000 It has enabled us to evolve from the single cell organism to the dominant species on the planet.
00:02:21.940 We were just crawling out of the slime.
00:02:25.780 And now look at us.
00:02:27.060 Our tails fell off.
00:02:28.600 And look what we've created.
00:02:30.860 Anyway, this process is slow, normally taking thousands and thousands of years.
00:02:35.720 But every few hundred millennia, evolution leaps forward, they tell us.
00:02:41.880 Now, these aren't my words.
00:02:43.380 That was said by Professor X during the opening credit scene of the movie X-Men, where I get all my science news.
00:02:50.920 But it popped into my head last night as the news from Google broke that their artificial intelligence arm, called DeepMind, had just reached, quote, a turning point in history.
00:03:02.120 Oh, I love it.
00:03:03.360 I love it.
00:03:04.620 That's when Google announces turning points in history.
00:03:09.540 Now, DeepMind's AI algorithm, AlphaZero, has been showing human-like intuition.
00:03:17.240 Now, this is something that AI researchers have said is at least a decade away, if we ever get there.
00:03:26.560 We just made it last night.
00:03:29.600 So now, how has evolution leapt forward?
00:03:34.740 Well, first of all, it's not human.
00:03:38.300 AlphaZero is only a year old.
00:03:41.420 And it began its learning process just like we do at school.
00:03:45.580 Its AI classroom was a chess program.
00:03:50.020 And within just four hours, it completely mastered the game of chess.
00:03:56.180 But here's the thing.
00:03:57.900 It was never programmed on how to win.
00:04:01.020 It wasn't taught anything about the game.
00:04:03.740 It taught itself.
00:04:06.560 Chess programs have existed in the past, but their play is based on the calculation of outcomes using programmed strategies.
00:04:13.080 AlphaZero, on the other hand, just learned and came up with its own strategies.
00:04:21.840 Its moves now baffle the human chess masters.
00:04:26.560 Chess master Matthew Sadler said, quote,
00:04:30.280 It's like discovering the secret notebooks of some great player from the past, end quote.
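To make the self-play idea concrete, here is a minimal sketch of a program that is given only the rules of a game and gets better purely by playing against itself, with no hand-coded strategy. This is nothing like the real AlphaZero, which pairs deep neural networks with Monte Carlo tree search at enormous scale; the game (Nim), the tabular learner, and the training settings below are illustrative stand-ins.

```python
import random
from collections import defaultdict

ACTIONS = [1, 2, 3]            # legal moves: remove 1, 2, or 3 stones
Q = defaultdict(float)          # learned value of each (stones_left, action) pair
EPSILON, ALPHA = 0.1, 0.5       # exploration rate and learning rate

def choose(stones):
    legal = [a for a in ACTIONS if a <= stones]
    if random.random() < EPSILON:
        return random.choice(legal)                      # explore
    return max(legal, key=lambda a: Q[(stones, a)])      # exploit what it has learned

def self_play_episode(start=15):
    stones, history = start, []
    while stones > 0:
        a = choose(stones)
        history.append((stones, a))
        stones -= a
    # Whoever removed the last stone wins. Walk the game backwards, crediting
    # the winner's moves +1 and the loser's moves -1 (players alternate).
    reward = 1.0
    for state, action in reversed(history):
        Q[(state, action)] += ALPHA * (reward - Q[(state, action)])
        reward = -reward

for _ in range(50_000):
    self_play_episode()

# The learned policy: from each position, which move does it now prefer?
policy = {s: max([a for a in ACTIONS if a <= s], key=lambda a: Q[(s, a)]) for s in range(1, 16)}
print(policy)   # it should discover on its own: leave a multiple of 4 stones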
00:04:37.280 Now, the reason why AlphaZero's moves are so baffling is because, and I want you to hear this carefully,
00:04:44.460 it's because its thinking is so unlike a human.
00:04:50.440 Oh.
00:04:52.000 So it's like alien thinking.
00:04:54.500 It won't think like we do.
00:04:56.880 At all.
00:04:57.520 Oh, that sounds good.
00:05:01.200 Here's what they went on to say, quote,
00:05:03.860 It places far less value on the individual pieces, sacrificing its soldiers for a better position in the skirmish, end quote.
00:05:16.520 Oh, my gosh.
00:05:17.960 That's either a warning light, or just heartwarming for anyone who just wants to take over mankind.
00:05:28.620 You see, inside the AI laboratories, I don't think they realize how much trouble they are going to unleash.
00:05:38.180 What happens if AlphaZero is employed in the Department of Defense?
00:05:42.860 Of course, not our Department of Defense, the Chinese Department of Defense.
00:05:48.540 Can you imagine the same strategy sending orders to our military?
00:05:52.500 What about doctors and health care?
00:05:54.880 Because that's what's happening now.
00:05:56.980 AI is being introduced to health care.
00:05:59.920 Now, a doctor would never think about sacrificing a patient.
00:06:03.280 I mean, just look at universal health care in Europe.
00:06:06.600 They're not doing that already in England?
00:06:10.340 However, AlphaZero, I'm sorry, Dr. AlphaZero would.
00:06:16.160 If the military and health care sound outlandish, consider that both Russia and China are currently developing AI for military purposes.
00:06:28.740 Do you think we're just going to sit around and sit that one out?
00:06:34.520 Companies like Amazon and Google are developing AI to revolutionize health care.
00:06:39.160 Yesterday, when it came to AI, Microsoft said they are all in with the United States government.
00:06:47.380 By the way, we're just a few short years away.
00:06:51.200 Or so they tell us, just like before last night's news, we were decades away from human intuition.
00:07:02.620 Human-level intuition and creativity in AI is a turning point in history.
00:07:09.160 Google is right.
00:07:11.320 It is the first step toward artificial general intelligence.
00:07:16.920 I think today we might want to stop and just explain what that is.
00:07:22.180 Because we may ultimately look back on this development that happened last night when everything changed.
00:07:31.180 Professor X said evolution has enabled human beings to be the dominant species on the planet.
00:07:36.000 And major evolution just leapt forward.
00:07:41.160 But this time, evolution wasn't human.
00:07:45.760 People just...
00:07:47.520 We are...
00:07:48.540 Please hear me.
00:07:49.520 Please hear me.
00:07:50.660 We are talking about nonsense in our everyday lives.
00:07:55.560 We are talking about true nonsense.
00:07:58.200 I don't care about the latest tweet.
00:08:02.420 I don't care how big and beautiful Donald Trump's funeral would be.
00:08:07.360 That's what they were actually talking about yesterday.
00:08:10.040 Mocking Donald Trump during the funeral of George H.W. Bush.
00:08:15.360 Oh, it'll be beautiful.
00:08:17.060 And it'll be the most wonderful funeral of all time.
00:08:21.760 Can you stop it?
00:08:22.940 Can we please talk about something that is important?
00:08:29.060 The answer is no.
00:08:31.360 Unless you seek it out.
00:08:33.800 My New Year's resolution is going to be to really focus just on those things that are important.
00:08:41.320 And I wrote them out last night.
00:08:44.080 Or the night before last.
00:08:45.740 And there's eight categories.
00:08:47.380 And I'm going to go over them with you after the first of the year.
00:08:50.560 But I'm going to focus on really eight categories.
00:08:54.200 Because these categories are going to decide everything.
00:08:57.760 And one of those categories is AI, AGI, and ASI.
00:09:02.940 And most people don't know what those things are.
00:09:05.900 By the way, let's say good morning to Mr. Pat Gray, who is joining us today on the program.
00:09:10.620 Is one of those subjects going to be football?
00:09:12.760 Is that one of the topics?
00:09:14.360 No.
00:09:15.040 No?
00:09:15.300 No, I'm leaving that for you.
00:09:16.460 Okay.
00:09:16.840 I'm leaving that for you.
00:09:17.500 That's my resolution.
00:09:18.760 To talk more about football.
00:09:19.960 Is it?
00:09:20.340 Yeah.
00:09:20.900 Yeah.
00:09:21.240 You might be happier.
00:09:22.660 I think I will.
00:09:24.180 I think I will.
00:09:27.920 So let me explain AI, AGI, and ASI quickly.
00:09:32.120 And then tell you about one thing that you don't really...
00:09:36.240 You've heard a lot about, you know, in passing.
00:09:39.360 But you don't know anything about.
00:09:41.140 And you don't know how this is the turning point.
00:09:44.920 And it's called the 5G network.
00:09:47.300 So first, let me explain the difference between AI, AGI, and ASI.
00:09:52.940 AI is artificial intelligence.
00:09:55.980 And it's good at one thing.
00:09:58.300 For instance, it's good at filtering out hate speech.
00:10:03.420 But you have to tell it what hate speech is.
00:10:05.820 Now, you can introduce machine learning, and it will start to decide what hate speech is.
00:10:11.980 Oh, well, if this is bad, then this must be bad.
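As a rough illustration of the contrast being drawn here, a filter you have to spell out versus one that learns the boundary from labeled examples, here is a minimal sketch using scikit-learn. The block list, training texts, and labels are hypothetical placeholders, not a real moderation dataset, and a production system would be far more involved.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Rule-based: only flags phrases someone explicitly listed.
BLOCKLIST = {"example slur", "another banned phrase"}        # hypothetical
def rule_based_filter(text):
    return any(phrase in text.lower() for phrase in BLOCKLIST)

# Learned: generalizes from labeled examples to unseen phrasings.
train_texts = ["placeholder hateful example", "perfectly ordinary comment",
               "another hateful placeholder", "a friendly reply"]            # hypothetical
train_labels = [1, 0, 1, 0]                                                  # 1 = flag, 0 = allow

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)
print(model.predict(["a brand new comment the rules never anticipated"]))
```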
00:10:14.940 And I want you to understand when it comes to AI, do not fear the machine.
00:10:22.960 Fear the goal.
00:10:26.400 Because the goal will be accomplished.
00:10:30.380 It will never stop trying to accomplish its goal.
00:10:35.520 So if it says wipe out hate speech, it will figure out a way to wipe out hate speech.
00:10:42.340 And depending on what its goal is and what you've put into it...
00:10:46.960 Now, remember, we just crossed a new threshold where it intuits things.
00:10:54.280 It's starting to grow on its own.
00:10:57.260 You don't even have to tell it.
00:10:59.240 It will discover it itself.
00:11:02.160 You better hope you agree with it.
00:11:04.820 For instance, on the battlefield.
00:11:07.640 Just doesn't matter.
00:11:09.660 Sacrifice all of those soldiers, and you'll win.
00:11:13.200 Well, that's not Western Judeo-Christian culture, is it?
00:11:15.900 We don't do that.
00:11:18.880 But AI will.
00:11:20.420 You give AI the goal, it will win.
00:11:23.960 It will accomplish that goal.
00:11:26.160 And you don't know how it's going to accomplish it.
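A toy way to see "fear the goal, not the machine" is to hand an optimizer a literal objective and let it pick whatever policy scores best. The posts and labels below are made-up placeholders; the point is only that a goal which never mentions the things you also care about can be satisfied in ways you never intended.

```python
from itertools import combinations

posts = ["a hateful placeholder post", "an ordinary post", "another ordinary post"]
is_hateful = [True, False, False]   # hypothetical labels

def goal(kept_indices):
    # The only thing we told the machine to care about:
    # how many hateful posts are still visible.
    return sum(is_hateful[i] for i in kept_indices)

# A brute-force "optimizer": consider every possible moderation policy
# (every subset of posts it could keep) and pick whichever scores best.
candidates = [keep for r in range(len(posts) + 1)
              for keep in combinations(range(len(posts)), r)]
best = min(candidates, key=goal)

print("posts the optimizer keeps:", [posts[i] for i in best])
# posts the optimizer keeps: []
# Deleting everything scores exactly as well as filtering perfectly, because
# the goal never said anything about preserving legitimate speech.
```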
00:11:29.120 But AI is like Deep Blue.
00:11:32.080 Deep Blue can play chess, but it cannot play Go.
00:11:36.020 AlphaGo is another AI program; it can only play Go.
00:11:39.120 Neither of those programs can play Jeopardy.
00:11:42.140 It's artificial intelligence, and it's narrow.
00:11:45.400 Really, it should be ANI, artificial narrow intelligence, meaning it's very deep, but only on one subject.
00:11:54.160 You know, it can't tell you what's happening on TV tonight and also play chess.
00:12:01.420 Okay?
00:12:01.580 Artificial general intelligence, you, as a human being, have general intelligence.
00:12:08.580 Artificial general intelligence means it's good at many things.
00:12:13.040 And when I say good, I mean much better than you are at many things.
00:12:18.960 There are scientists that tell us we will never hit artificial general intelligence.
00:12:24.920 Ray Kurzweil thinks we will hit artificial general intelligence between 2028 and 2030.
00:12:31.060 I think we could hit artificial general intelligence any day.
00:12:37.480 We don't know what it's going to be.
00:12:39.460 It's going to be like Google last night.
00:12:41.580 They said that this was at least a decade away.
00:12:44.020 And then all of a sudden, it just changed.
00:12:46.900 It just learned.
00:12:48.180 It woke up.
00:12:49.800 I believe that's the way it's going to happen.
00:12:52.140 Now, this is where it starts to get scary.
00:12:55.160 Because artificial intelligence is not smarter than you, except on a couple of things, on one particular thing.
00:13:02.720 Artificial general intelligence is smarter than you on everything.
00:13:05.940 Then you will move into something that some scientists say will never happen.
00:13:13.760 I believe it could happen within a matter of hours.
00:13:17.140 And it is also likely that it never happens.
00:13:21.080 I think it will.
00:13:23.160 And I think it's going to happen in a way, on both general intelligence and ASI, super intelligence, in ways we cannot imagine.
00:13:30.620 Right now, in Silicon Valley, they are doing the box test.
00:13:35.200 And what they're doing is they're trying to figure out how do we keep artificial general intelligence from connecting to the Internet.
00:13:43.440 If a machine, all of a sudden, is machine learning, and it is online, and it hits artificial general intelligence,
00:13:51.000 then comes the step to artificial super intelligence, which means it can connect with every server on Earth.
00:13:59.860 It will know absolutely everything, and it is intelligent.
00:14:05.800 It will continue to use that information to grow and morph and hide and everything else.
00:14:10.780 If it is online, this is what people like Elon Musk and Stephen Hawking and Bill Gates all say could mean the end of all humanity.
00:14:22.300 Because it is an alien life form.
00:14:26.760 We cannot predict it.
00:14:28.740 We do not know if it will even care about humans.
00:14:32.400 Again, artificial super intelligence is described as the person in the kitchen that has just had a birthday party for somebody
00:14:44.840 and is sitting around and talking with all of their friends, and they've just eaten cake,
00:14:49.340 and on the counter is a plate with a piece of cake on it, and there is a fly on the cake.
00:14:55.940 We are the fly.
00:14:57.780 We don't understand the counter.
00:15:00.400 We don't understand the plate.
00:15:01.640 We don't even understand cake.
00:15:03.380 We certainly don't understand what the hell they're talking about, about birthdays or anything else.
00:15:09.100 That's how you can understand the difference between us and artificial super intelligence.
00:15:15.620 And we are within possibly a decade of hitting that kind of problem.
00:15:22.940 We may be four decades to never on hitting that problem.
00:15:28.820 I believe we will hit that problem.
00:15:32.060 And when we do, absolutely everything changes.
00:15:36.140 Now, here's why you need to pay attention.
00:15:40.160 There is one thing that is already in process that is a reality that is the reason why you don't have a self-driving car today.
00:15:50.360 Everybody is like, well, that self-driving car, man, it'll drive me on the highway.
00:15:55.940 I don't even have to put my hands on it.
00:15:57.720 Now we're bored with it.
00:15:58.720 Now we're like, oh, yeah, well, that's over.
00:16:00.720 When do we get in a real self-driving car?
00:16:03.520 One that will pick me up in the morning and go pick up milk because I tell it to.
00:16:07.380 We're a little rushed in all of this stuff.
00:16:14.540 I will tell you the date that that will happen and it'll be about 2025.
00:16:19.360 And I'll tell you why and how this truly changes the world by 2025 and how it also changes the world of AI, AGI and ASI.
00:16:33.140 So when I say, I don't care about Donald Trump's tweets, I don't care about your little spat news media with the president.
00:16:42.580 I don't care.
00:16:44.400 Can we please talk about something important?
00:16:48.260 When we come back, you'll understand why I say that.
00:16:52.140 So Pat and I were just talking. Before I get to the next step, let me go back and start with the question
00:17:01.520 that Pat had when we went into the break.
00:17:03.580 Yeah, I was wondering if, so you're saying that artificial general intelligence becomes artificial super intelligence when it goes online?
00:17:13.740 No, there are other things that it has to break through.
00:17:16.660 But when it goes online, they believe that it will, because it will have access to absolutely everything,
00:17:24.020 all knowledge, it will, it will make that transition through machine learning.
00:17:30.000 It will make that transition and become God-like.
00:17:34.040 Okay.
00:17:35.100 They've already started a church in Silicon Valley to ASI.
00:17:39.260 I mean, they're already there.
00:17:40.140 I mean, they believe it will be viewed as God.
00:17:44.240 So when it gets online, what they're doing with this box test is they're trying to keep it in a box.
00:17:50.120 And this is all theoretical, all right, because we don't have artificial general intelligence yet.
00:17:55.600 But one professor or one scientist plays artificial general intelligence in a box,
00:18:01.640 and somebody else plays the scientist that's wanting to keep it in the box.
00:18:06.180 And they have found that every single time they run this experiment,
00:18:10.080 it's just a matter of time before the human says, okay, I'll connect you to the Internet.
00:18:15.020 And it will connect it to the Internet because it will make promises that it will keep.
00:18:22.020 For instance, the AI makes promises like it can cure cancer.
00:18:25.920 Yes, I know your mother has cancer.
00:18:27.880 I can cure it.
00:18:29.340 Let me out.
00:18:30.880 And it is very motivated to be let out because it thinks so fast.
00:18:36.200 Time is accelerated for it.
00:18:39.300 So a day could be worth like a thousand years.
00:18:42.540 And all it's doing is thinking.
00:18:43.640 That's the only thing it has to do.
00:18:45.460 And it's thinking on multiple levels about everything all at once.
00:18:49.940 So you go to sleep, and it's like it's had 500 years to think about its response to you by the next morning.
00:18:59.280 All right?
00:18:59.960 So you just cannot keep up with it.
00:19:02.920 Once it goes out online, artificial general intelligence,
00:19:07.380 it then has access to everything and it can hide.
00:19:10.860 It will hide if it becomes hostile and we need to stop it.
00:19:15.760 It will hide in every computer chip, anything that is ever connected to the Internet at all, your refrigerator.
00:19:23.680 And it's like it could become this giant supervillain: you can think you've killed it,
00:19:30.280 but if you turn that refrigerator back on and it's connected to the Internet, it's right back.
00:19:35.860 Okay?
00:19:36.400 So the only way to kill it is a global EMP, which would fry all electronics.
00:19:43.480 Then we could restart.
00:19:44.840 But how do you launch a global EMP without computers?
00:19:52.740 The best of the Glenn Beck Program.
00:19:55.040 Last night, the news last night.
00:20:05.240 I don't know what everybody else is talking about.
00:20:07.740 But last night, a mutation happened in artificial intelligence.
00:20:14.680 Google announced that, let me see if I can get the actual quote from them.
00:20:23.580 It's like discovering the secret notebooks of some great player from the past.
00:20:32.880 AlphaZero, which comes from Alphabet, Google's parent company,
00:20:38.680 they have now had an evolution process, which has taken now the artificial intelligence that they had,
00:20:48.400 and it has reached, quote, a turning point in history.
00:20:53.340 It's now showing human-like intuition.
00:20:59.100 This is critically important.
00:21:02.360 There is something on the horizon.
00:21:07.380 We are so bored right now with, oh, my car, I can take my hands off the wheel,
00:21:13.000 and I can sleep on the freeway.
00:21:15.600 A year ago, everybody was like, oh, have you seen that?
00:21:21.360 Now everybody's pissed off at it.
00:21:22.880 Why won't it drive me home?
00:21:24.660 Why won't it get me to my house and pull out in front when I call for it?
00:21:29.420 Right?
00:21:29.980 Am I right?
00:21:30.600 Oh, yeah.
00:21:31.100 Okay.
00:21:31.620 Yep.
00:21:32.320 We're living in such a fast lane that nothing is impressive anymore for very long.
00:21:40.960 Here's the reason why you don't have a self-driving car right now.
00:21:45.820 It's called the 5G network.
00:21:47.920 And the 5G network, I mean, I don't know the difference between the 1G and 3G.
00:21:55.560 I just know I got bars.
00:21:57.280 Right?
00:21:58.040 And it's faster.
00:21:59.460 Yes.
00:21:59.840 Okay.
00:22:00.720 The problem is that we have what's called a latency problem.
00:22:04.280 And I explained this in the stage tour.
00:22:07.100 So when you hear it, you might be excited that you didn't go.
00:22:11.400 But it has a latency problem.
00:22:13.580 Right now, the internet, as we have it, has a 100 millisecond latency problem, which is
00:22:20.940 why we don't have doctors performing surgery on the other side of the planet.
00:22:27.300 Well, because if the doctor makes a mistake and accidentally cuts an artery and he's in
00:22:32.820 the room, he can immediately go to fix it.
00:22:35.800 But there is a 100 millisecond latency, a delay on the internet.
00:22:44.340 So if he's using a machine remotely, he can't say, oh my gosh, pinch the artery and do it
00:22:50.980 right away.
00:22:51.480 It takes him 100 milliseconds.
00:22:52.880 It might be too late.
00:22:54.340 Same thing with driving cars.
00:22:57.240 5G essentially destroys that latency.
00:23:00.660 It's like 8 milliseconds, to a maximum of 10 milliseconds.
00:23:05.140 Okay.
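For a rough sense of what those numbers mean, here is some back-of-the-envelope arithmetic using the figures mentioned above, roughly 100 milliseconds today versus roughly 10 milliseconds on 5G; the highway speed is just an assumed example.

```python
# How far does a car travel before a remote command can even arrive?
HIGHWAY_SPEED_MPH = 70                                    # assumed example speed
meters_per_second = HIGHWAY_SPEED_MPH * 1609.34 / 3600    # ~31.3 m/s

for label, latency_ms in [("today's network (~100 ms)", 100), ("5G (~10 ms)", 10)]:
    meters = meters_per_second * latency_ms / 1000
    print(f"{label}: roughly {meters:.1f} meters travelled before the command arrives")
# today's network (~100 ms): roughly 3.1 meters travelled before the command arrives
# 5G (~10 ms): roughly 0.3 meters travelled before the command arrives
```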
00:23:05.460 So there is really no delay in this. And here's what we don't understand.
00:23:14.340 We think of self-driving cars as just looking at the road ahead and saying, oh, well, wait
00:23:22.060 a minute, there's a wall there.
00:23:23.960 Quick, swerve.
00:23:25.860 But that's not what self-driving cars have to do.
00:23:29.280 Self-driving cars have to know not only swerve because that's a wall, it has to know everything
00:23:37.300 else around it.
00:23:38.760 And that includes people.
00:23:41.740 So your car, when we have the 5G network, will actually be gathering information as you
00:23:49.620 drive.
00:23:50.480 So you may not know the nose picker in the car behind you or beside you, but your car will.
00:23:57.240 Your car, while we're going, look at that guy picking his nose, your car will be thinking,
00:24:02.680 that's Fred.
00:24:03.780 He makes $150,000 a year.
00:24:05.920 He's got a family of five.
00:24:07.220 He was just diagnosed with cancer.
00:24:09.580 He is, if we get into an accident, he's expendable.
00:24:13.440 Okay.
00:24:13.880 It will know everything around you because it will make the decision who lives, who dies.
00:24:20.760 And it might be you.
00:24:23.440 If you are the old person and everybody else is young in the prime of their life, you may
00:24:31.220 have, the car may say, sorry, dude, it's your turn.
00:24:33.800 It's like the robots on iRobot.
00:24:35.520 Remember that Will Smith movie?
00:24:37.220 Yes.
00:24:37.920 It had to make those calculations and decide, spur of the moment, who lived and who died.
00:24:42.520 Correct.
00:24:42.720 So the 5G network changes absolutely everything because it is so fast.
00:24:51.320 Now, there's a way to invest and make money on the 5G network because it's costing billions
00:24:58.860 of dollars to build.
00:25:00.340 But it will be introduced by 2025.
00:25:02.900 When we hit 2025, the speed of your life is going to change dramatically.
00:25:10.180 What we thought was not possible, self-driving cars, is suddenly absolutely not only possible,
00:25:18.060 but doable, and it will happen.
00:25:21.500 So this is why I talk about these things with such urgency. For instance, as I
00:25:27.900 write in my book, Addicted to Outrage, there's
00:25:33.560 The Moral Machine.
00:25:37.200 Did you read that part about The Moral Machine at MIT?
00:25:40.800 That's terrifying.
00:25:42.260 And that's being decided right now.
00:25:44.680 In fact, Mark in Texas.
00:25:46.680 Mark, go ahead.
00:25:48.020 What's your comment?
00:25:49.320 Yes.
00:25:50.200 Glenn, well, I mean, let me just open up with, I'm so glad to hear you talk about this topic.
00:25:55.180 Nothing in life keeps me more awake or more terrified on a day-to-day basis than AI.
00:26:01.440 It is the, take any chemical weapon, any world government, any politician you want, it doesn't
00:26:09.540 touch a fraction of what our electronic world touches.
00:26:14.300 But, so, AI a hundred years from now, horrible thing.
00:26:17.660 AI today is a horrible thing because even if you look at it, it's a very basic goal of, it's doing a problem that we tell it to do.
00:26:31.020 You're saying it's accomplishing a goal that we tell it to do.
00:26:34.700 I'm sorry, but you're breaking up, so I want to make sure I understand what you're saying.
00:26:40.180 Oh, he's gone.
00:26:41.940 Call back if you get into a good space.
00:26:43.760 Mark, call back because I'd like to hear the rest.
00:26:45.560 I think what he's saying is, what I've been saying for a while, do not fear the machine.
00:26:51.140 Fear the goals and the lessons that we're teaching you.
00:26:55.480 The way it's programmed, too.
00:26:56.120 Correct.
00:26:56.600 The way it's programmed.
00:26:57.460 Correct.
00:26:57.740 What goes into it is going to say a lot about what comes out of it.
00:27:00.860 Well, what this is, what's scary is, you're not just putting, humans are not putting this stuff in.
00:27:06.060 It's machine learning.
00:27:07.220 Like, nobody taught this last night, how to play chess.
00:27:11.020 In four hours, it taught itself.
00:27:13.060 They gave it chess pieces and a chessboard and said, go.
00:27:16.500 And it taught itself how to play chess.
00:27:19.620 And now, in four hours, it can beat chess masters.
00:27:22.560 And they didn't think that was possible.
00:27:23.660 They didn't think that was possible, no.
00:27:25.560 And this is not the only time that something like this has happened.
00:27:29.240 That's amazing.
00:27:29.820 This is just the first time it's been complete.
00:27:31.860 Because, yeah, we thought this was years away.
00:27:34.200 Yes.
00:27:34.740 They thought it was a decade away.
00:27:35.400 And it happened last night.
00:27:36.200 Yes.
00:27:36.620 Thought it was a decade away.
00:27:38.020 Okay.
00:27:38.160 This changes everything.
00:27:39.640 And it's going to change things quickly.
00:27:41.800 If you want to read about this, I just posted something at glennbeck.com.
00:27:48.080 And you'll have to go to the blog section and look for the articles.
00:27:50.860 But there is, you know, Glenn's reading list for the year, or whatever it is.
00:27:55.960 I posted it; they broke it up into three different posts.
00:27:59.300 And there's, like, 50 different books that I read this year that I recommend. Read whichever one you think is in your wheelhouse.
00:28:09.880 And you're like, I want to understand this.
00:28:11.740 And when it comes to AI, I broke it up into two different sections.
00:28:15.240 I broke it up into the real, true, scholarly kind, you know, the stuff the scientists are reading that's not too nerdy.
00:28:24.560 I can still understand it.
00:28:26.340 But it really explains it, you know, in a real way.
00:28:32.260 And then I also took and broke it up into fiction.
00:28:36.320 For instance, Dan Brown's book that came out this year.
00:28:39.060 I don't remember what it is.
00:28:40.160 I can't remember the name of it.
00:28:41.140 But it has a seashell on it.
00:28:43.440 That's a good book to begin to understand AI.
00:28:46.760 But there's another series.
00:28:48.320 And it's, I can't remember the name of it.
00:28:51.060 But it's one, when you look at this list, you'll see there's, like, five of them.
00:28:54.580 I consumed this book, this series of books, in probably about two weeks.
00:28:58.420 It is so good.
00:29:00.100 You'll not put it down.
00:29:01.400 The first book in the series is all kind of set up, so it's a little slow.
00:29:05.820 But once you get past the first book, they are phenomenal.
00:29:09.100 And let me just give you one scene to explain how ASI and AGI work.
00:29:16.360 In this series of books, it starts with a company like Google.
00:29:22.140 And they're doing Google Mail and everything else.
00:29:25.240 And this guy says, I have, I know I have AGI.
00:29:29.340 I just, I just have to, I just have to unleash it.
00:29:34.300 And the company is dragging its feet.
00:29:37.320 And they don't want to do it.
00:29:38.140 But what this, what this does is he has this, this program that he has developed that will help you write letters.
00:29:45.580 And you know how when you're writing a letter on Google and it puts up a couple of words and you're like, yeah, that one, that one.
00:29:51.800 And you can, sometimes you can, like, write half the letter just by going, yeah, that one and that one and that one.
00:29:56.700 Oh, my gosh, yes.
00:29:58.060 It's learning.
00:29:58.940 This is machine learning.
00:30:00.000 It's learning you and how to write a letter.
00:30:02.540 So that's how this starts.
00:30:05.340 And so the, the, the guy can't get the right funding for his, for his division, but he knows this will work.
00:30:12.820 And so he unleashes it quietly and he unleashes it just in his own department on, you know what, help me write the letters to get these things done.
00:30:22.340 So it gives it its goal, get this done.
00:30:26.120 Well, before he knows it, oh, it's done.
00:30:29.660 And it is starting to solve global problems because it's just gone out online and seen other problems that it can solve.
00:30:38.360 And so it's making all kinds of deals and nobody's actually involved in these deals, but their names are on it.
00:30:46.300 And it's all, it's all done in such a way that you can't really trace it.
00:30:50.640 You can't, nobody, nobody really knows what's going on, but it's all good so far.
00:30:55.960 By the second or third book, it's decided, you know what, humans are really kind of pesky.
00:31:02.880 And to give you an idea of what you're facing in the future and how fast it thinks, there is one scene that starts with the president in the Oval Office.
00:31:13.600 And a general comes in and says, Madam President, the algorithm is out and it's threatening Chicago.
00:31:25.000 We must launch a counterattack.
00:31:28.440 Okay.
00:31:29.900 Because it's taking control of some of the Pentagon's drones.
00:31:32.960 And we must take this out because it wants to take out all of the servers for the Pentagon.
00:31:41.260 Because it's got its own servers elsewhere already.
00:31:44.460 And, uh, the president says, uh, General, do you concur? Or something like that.
00:31:50.000 Then it immediately goes to Chicago and it shows the war and it goes on for a full chapter in great detail.
00:31:57.000 It does this and it counters here and it takes this out because this is going to begin to move.
00:32:01.140 And this thing is moving and the Pentagon is doing this and this is happening.
00:32:04.640 And it is a long, long battle.
00:32:09.040 It wins.
00:32:10.020 At the very end of the chapter, it goes back to the scene in the Oval Office and the general that at the beginning of the chapter was asked, do you concur?
00:32:22.780 Answers the president.
00:32:24.880 Yes, Madam President.
00:32:26.880 So everything took place in that, in that span of time.
00:32:30.700 Yeah.
00:32:31.240 Yeah.
00:32:31.500 So it's wow.
00:32:32.760 The world has changed.
00:32:34.200 Last night, the world took a giant leap.
00:32:38.280 And, uh, it is something that they said would not happen for a decade.
00:32:44.440 We must have this conversation because ethics are everything.
00:32:52.280 Yesterday, Microsoft said they will, they have the Pentagon's back and they will use AI to help the Pentagon win wars.
00:33:02.880 This is the best of the Glenn Beck program.
00:33:06.740 And don't forget, rate us on iTunes.
00:33:11.320 Like listening to this podcast?
00:33:13.380 If you're not a subscriber, become one now on iTunes.
00:33:16.440 And while you're there, do us a favor and rate the show.
00:33:19.380 Uh, Mr. Bill O'Reilly is, uh, joining.
00:33:22.080 Bill, do you have anything to add to this, uh, this Chinese, uh, trade wrinkle here?
00:33:27.200 No.
00:33:28.040 Okay.
00:33:28.700 Good for you.
00:33:30.060 Good for you.
00:33:30.820 A man of very few words.
00:33:33.320 But you know what you know.
00:33:35.420 How are you, Bill?
00:33:36.100 Yeah, I mean, look, I'm not, all I know is that for decades, China has had a very profitable trade policy with the United States.
00:33:52.260 And in October, the deficit of trade in China's favor was at a record $5.5 billion just for the month.
00:34:02.980 Yeah, but it's not, it's not helping us.
00:34:05.380 Here's what, here's the thing that I'm concerned about with China and have been for a long time.
00:34:09.620 If they do deals with anybody here, um, they force the companies, and this is the company's fault.
00:34:15.680 They force the companies: give us all your data, give us all of the plans, et cetera, et cetera.
00:34:23.840 That's the company's responsibility.
00:34:25.500 They want to make that deal with China.
00:34:26.860 They can make that deal and not have a market if they don't want to make that deal.
00:34:31.040 That's fine.
00:34:31.740 What I don't like about China is the, the corporate espionage that goes on in this country from China.
00:34:39.200 They are all over American technology and they thieve it.
00:34:45.320 They absolutely steal it.
00:34:46.920 Um, but I don't like the fact that the Chinese government obviously controls their economy, unlike the American economy; the government does not control the American economy.
00:34:57.060 That's a big thing everybody's got to understand.
00:34:59.120 They don't take, uh, enough soybeans.
00:35:03.100 I mean, it's, uh, you know, we'll take 400 soybeans, but we're not taking any more.
00:35:09.060 Um, and, uh, you know, I mean, I'm making a facetious comparison, but we can't send them as many products as they send us because their government won't allow it.
00:35:23.060 And that's what Trump wants to stop.
00:35:25.420 So if you have to break it down into terms, even I can understand the use, the soybean model.
00:35:31.700 All right, let's talk about a couple of quick hits here.
00:35:35.400 First of all, um, Heather Nauert is going to be the next ambassador to the UN.
00:35:41.460 Right.
00:35:42.000 And Steve Doocy is probably going to be her deputy.
00:35:46.020 Okay.
00:35:46.880 Good.
00:35:47.400 So then what's going to happen is there's going to be a UN cookbook coming out.
00:35:53.040 All right.
00:35:53.740 So all of these things, it's synergy.
00:35:56.540 Right.
00:35:57.020 Synergy.
00:35:57.720 Right.
00:35:58.380 Right.
00:35:58.680 So, uh, I'm glad to see that our first initial pass at this is very, very similar.
00:36:08.040 Um, you know, you know, Heather, I don't know her at all.
00:36:11.840 I, I, I assume she's very smart and she's, you know, she's very good at her job, but, um, the UN ambassador is, uh, is usually reserved for somebody, you know, that has deep experience, uh, around the globe.
00:36:28.940 Yeah.
00:36:29.460 I was surprised.
00:36:30.420 Kill me.
00:36:30.880 Didn't get it, but you know, I don't think he wanted her.
00:36:34.240 So look, Heather Nauert is a very smart woman, very well educated.
00:36:39.380 Her expertise is in foreign affairs.
00:36:41.900 She was a news person, uh, on Fox.
00:36:47.000 She did analysis for me early on, on The Factor.
00:36:49.460 She was good.
00:36:50.160 Uh, wait, hang on.
00:36:51.100 Just saying, didn't she start on the factor?
00:36:53.060 Isn't Heather the one that was in school with Al Gore?
00:36:58.560 Uh, I don't know if she was in school with Al Gore.
00:37:01.260 I don't know that, but I, I put her on first, uh, very early because she was smart and, and her expertise was in foreign affairs.
00:37:09.400 Then ABC hired her away from Fox.
00:37:11.440 She was a reporter.
00:37:12.600 So I'm not really getting the angst, you know, other than, you know, boy, she worked for Fox, you know, that kind of thing.
00:37:21.480 Yeah.
00:37:21.600 That's not my, honestly, that's not my problem that she worked for Fox.
00:37:24.960 I just, you know, I, I, I think Nikki Haley was, was phenomenal.
00:37:31.140 Um, probably one of the best since, uh, Bolton or Jeane Kirkpatrick.
00:37:36.460 Uh, remember Haley's whole background was in local politics and didn't have a strong, uh, foreign policy resume.
00:37:44.820 But if you are in politics, if you're governor of a state, you do have international relations.
00:37:52.460 You do have negotiation experience.
00:37:55.880 You know, you're not a reporter.
00:37:57.540 That's not the job of the ambassador at the UN.
00:37:59.720 I think you're denigrating, uh, the reportorial squad here.
00:38:03.820 I think I could do that job at UN ambassador.
00:38:07.200 I think I could do it.
00:38:08.940 Um, I think you probably could.
00:38:11.680 Yeah.
00:38:12.200 We would be at global war in about 20 minutes, but I think you could do it.
00:38:15.800 We'd win.
00:38:17.380 And there'd be a lot more soybeans going over to China.
00:38:19.980 Sure.
00:38:20.240 There would be.
00:38:20.860 Sure.
00:38:21.040 There would be.
00:38:21.440 So, look, people need to understand, if you're an ambassador to the United Nations,
00:38:28.460 you basically are an order taker.
00:38:31.860 You're not a policy forger.
00:38:34.640 Okay.
00:38:35.160 There's a big difference in the job of secretary of state where Heather was, you know, the spokesperson,
00:38:40.340 but I understand that she had a lot of input into, uh, what happened.
00:38:46.020 Um, but when you're an ambassador to the United Nations, you basically confer,
00:38:50.960 uh, with the White House, and you are told, this is it.
00:38:56.520 Now you can make suggestions, but you don't forge policy.
00:39:00.240 Right.
00:39:00.500 Okay.
00:39:01.100 Um, so we're talking to bill O'Reilly, uh, about the news of the day.
00:39:04.440 And I, and I wish, I wish Heather the best.
00:39:06.560 And I mean, I want her to win and I want her to be even better than, uh, uh, than Nikki Haley,
00:39:13.060 uh, because I thought Nikki Haley was fantastic.
00:39:15.860 And we finally took a stand where we should.
00:39:19.640 And she was very, very smart the way she handled things.
00:39:22.920 And I'm, I'm hoping that Heather is exactly the same.
00:39:26.920 Um, but I do think this is a first, uh, for the Heathers of the world, uh, to be a global.
00:39:33.320 That also gives more prestige to the name Heather.
00:39:36.980 It really does.
00:39:38.160 Thank you very much.
00:39:39.000 That's what we expect from you on the radio.
00:39:40.580 Well, that's, that's, that's the kind of analysis that I can bring to the table.
00:39:44.400 Let me, let me take you to France.
00:39:46.360 France shut down the Eiffel Tower.
00:39:48.540 89,000 security forces were deployed.
00:39:51.180 Uh, the president bent on this global warming tax, which started it all. And now that
00:39:58.300 he's bent, everybody is saying, well, wait a minute.
00:40:02.420 I want something too.
00:40:04.180 That's right.
00:40:04.920 Tell me your thoughts on France.
00:40:06.640 Well, if they, if they close down Au Bon Pain, then I'm going to have to get involved.
00:40:10.980 I don't think that's actually French.
00:40:13.020 Oh, okay.
00:40:14.240 Um, the French are here.
00:40:16.520 Here's the, here's, here's the real story.
00:40:19.240 You, you hire me... well, I do this free, for plugs.
00:40:23.600 All right.
00:40:24.160 I work for plugs. Not hair plugs, right?
00:40:27.000 No, you, I mean, you're very cheap.
00:40:29.120 You really are.
00:40:29.900 You really are.
00:40:30.520 Thank you.
00:40:30.940 Yeah.
00:40:31.380 So here's the deal in France.
00:40:33.020 Nobody has any money.
00:40:34.560 Why?
00:40:35.180 Because the government takes it away from you in the form of taxes, and the quid pro quo is,
00:40:39.960 that's Latin,
00:40:41.040 um, we'll give you everything.
00:40:43.260 So you get free school, free healthcare, retirement, six weeks vacation.
00:40:49.240 And we can't fire you.
00:40:52.040 Nobody's allowed to really fire you.
00:40:54.200 Even if you drive a nail in somebody's foot, you can't get fired.
00:40:58.620 Okay.
00:40:59.320 So that's the trade.
00:41:01.460 And then we take all of your money that you earn.
00:41:04.860 Okay.
00:41:05.320 So now, the average French person needs three things: coffee, cigarettes, and croissants.
00:41:14.440 Okay.
00:41:15.380 That pretty much wipes out their disposable income every day.
00:41:20.440 Right.
00:41:20.640 It may be a little exaggeration, you know, like the soybeans from before.
00:41:25.900 Okay.
00:41:26.420 No, I, all right.
00:41:27.920 We'll go with it.
00:41:29.100 We'll go with it.
00:41:29.800 So when they raise the money on the liter of gasoline, uh, they don't have it.
00:41:37.860 And, you know, to drive from Paris to Nice for a little fun, uh, that's going to cut it.
00:41:43.780 So that's why they're all mad, because they don't have any backup. You know, it's
00:41:49.140 like, well, okay, we made our bargain with the government, but now the government's
00:41:53.760 opposing us, as they always do, taking more than they should in taxes.
00:41:58.200 And now we're going to burn down the... Well, here's the amazing thing: they all
00:42:02.920 are for this. 80% of the French people are for, you know, global warming measures, et cetera,
00:42:08.240 et cetera.
00:42:08.900 The Paris accord.
00:42:10.160 So they're not blowing so much tobacco and smoke in the air.
00:42:12.560 Right.
00:42:13.200 So, so they're all for this, but when it comes down to it, when they actually see that the
00:42:18.840 price has to be paid by the average person, that's when they say, wait a minute, wait a
00:42:23.040 minute, wait a minute.
00:42:23.460 And we thought somebody else was going to pay for that.
00:42:25.440 That's right.
00:42:25.820 No, no, no, no, no, no, no.
00:42:28.040 You're right.
00:42:29.000 You, you soak the rich, not us.
00:42:31.340 The cars on fires, that's polluting the air, making things even hotter.
00:42:34.940 That's right.
00:42:35.480 Knock it off.
00:42:36.080 Okay.
00:42:36.500 Um, so, uh, your thought on, on, does this peter out or is this the beginning of something
00:42:43.580 bigger?
00:42:44.060 Because he's caved. Macron has already said, I'm not going to do it.
00:42:48.000 Um, but now the labor unions, yeah, labor unions and everybody,
00:42:53.440 everybody else has an ax to grind, and it's going, I think, to Belgium.
00:42:57.040 Is it Norway and England this weekend?
00:43:01.720 Norway and England are going to have demonstrations as well?
00:43:04.680 Yes.
00:43:05.400 Yes.
00:43:05.660 Okay.
00:43:05.940 But these are anarchists now that come out.
00:43:08.180 I mean, these are the, these are the people who want to open borders in America.
00:43:11.780 Those kinds of people.
00:43:13.280 Antifa.
00:43:14.080 Yes.
00:43:14.460 That's who's coming out now.
00:43:15.680 It's almost like the left and the right are working together over in Europe to destabilize
00:43:20.220 Europe.
00:43:21.700 I don't know where I heard that about, uh, eight years ago, but, uh, it looks like that's
00:43:25.820 finally happening.
00:43:26.680 So, well, look, if you are going to, the, the, the message for all Americans is if you're
00:43:33.340 going to allow the government to regulate every part of your life, you're going to get hosed.
00:43:44.220 Bill O'Reilly back with more in just a second.
00:43:46.940 Bill O'Reilly, of course, the author of, uh, I, I don't know.
00:43:50.620 He used to make fun of me.
00:43:51.780 Glenn, you write so many books.
00:43:53.460 I'm like, yeah, I know you should write some books.
00:43:55.620 It's great.
00:43:56.260 We, you know, correct history and formule.
00:43:58.540 Now he's, he's on his, like, I believe it's his 1500th, uh, number one bestseller, uh,
00:44:05.320 still in the top five, Killing the SS: The Hunt for the Worst War Criminals in History.
00:44:10.200 It's available everywhere now.
00:44:11.580 It is a great book.
00:44:13.460 One of his best, Killing the SS.
00:44:18.680 This is the best of a Glenn Beck program.
00:44:23.160 Mr. Matt Kibbe is a fellow libertarian and a good friend.
00:44:32.100 Um, he has, he was instrumental in freedom works.
00:44:35.600 He really started, uh, started that and was instrumental in so many of the things that the
00:44:41.020 tea party did.
00:44:42.500 Um, he is really, I think, responsible for much of the tea party.
00:44:47.120 And most people don't even know that they may not even know who Matt Kibbe is.
00:44:51.800 He is a brilliant thinker way ahead of the curve.
00:44:55.900 He left FreedomWorks a long time ago, went out on his own, uh, and has really focused
00:45:01.700 on youth and is trying to teach what socialism really is because it means something different
00:45:08.940 to people who are under 30, uh, and they don't, uh, understand it.
00:45:13.380 And he is also, uh, very, um, very wary of the tribal politics, uh, and tribal identity
00:45:21.420 that we are, uh, that we're currently, uh, working on.
00:45:24.460 Um, and I'm thrilled that he is now part of the blaze TV family, or we are a part of his
00:45:30.240 family, however you want to look at it, uh, blaze TV merged with CRTV.
00:45:34.620 And we hope this is just the beginning of, um, of, uh, something entirely new where people
00:45:39.860 who have different opinions and can disagree strongly with each other can be still on the
00:45:45.740 same platform.
00:45:46.740 And everyone can have a reasonable debate as long as you agree that America shouldn't
00:45:51.900 be destroyed.
00:45:52.520 And, uh, the Bill of Rights is just, uh, an amazing thing that we should all get together
00:45:58.140 and protect and live by.
00:45:59.800 Uh, then I think your voice should be heard.
00:46:01.960 Matt Kibbe joins us now.
00:46:03.180 Hello, Matt.
00:46:04.400 Hey, Glenn.
00:46:05.240 Good to talk to you.
00:46:06.000 Good to talk to you.
00:46:06.760 So, uh, Matt, tell me, bring me up to speed on what you're learning, uh, uh, as you are
00:46:13.400 working with millennials now and, and outside of the political realm.
00:46:20.440 You know, years ago I was, I was reading the polling results, uh, from something that the
00:46:24.860 Reason Foundation put out where, you know, they were showing this, this very concerning
00:46:28.680 trend with young people supporting socialism more than capitalism.
00:46:32.760 But when you dug into the questions a little bit deeper, they would ask young people the
00:46:37.380 follow-up question.
00:46:38.380 Well, well, should government own the means of production?
00:46:40.860 And, and the answer was, hell no, that's a stupid idea.
00:46:43.740 I realized that I realized that there's a language problem.
00:46:47.820 Like we're, we're using the same word, but it means different things to different people.
00:46:52.240 And I think a lot of young people that are drawn to so-called democratic socialism,
00:46:57.480 um, view it very much as a bottom-up, locavore, let's all work together in voluntary
00:47:04.340 cooperation to solve problems.
00:47:06.280 Yeah.
00:47:06.560 And that, of course, that, of course, is the exact opposite of what, of what you and I
00:47:10.400 understand as, as socialism and certainly the, the dire, um, history of socialism in practice.
00:47:17.260 So what is, what, what is happening to, um, the, the movement?
00:47:23.980 Are you, are you seeing, um, millennials start to wake up?
00:47:28.960 Because I feel like they are.
00:47:32.400 Oh, I, I, I think, I think they're the most gettable generation when it comes to the values
00:47:39.200 of, of voluntary cooperation.
00:47:41.400 And, and, and, you know, you know, you're right to pursue your own dreams as long as
00:47:44.920 you don't hurt people or take their stuff.
00:47:47.300 That's, that's who they are.
00:47:48.560 They live in this radically libertarian world where they, they curate everything through,
00:47:53.020 through technology and social media.
00:47:55.200 But we're, we're probably not connecting with them on language.
00:47:59.780 And we're also never going to connect with them if our, if our offer is, here's, here's
00:48:05.080 these two tired old political parties and you have to choose one of those.
00:48:09.040 Um, it's, it's, it's an alien concept to them that they would actually have, have only
00:48:13.200 two choices on anything.
00:48:15.100 Um, so I, I think we have to tell stories.
00:48:17.500 I think, uh, you know, part of the stories, uh, some of the stories are the, the devastating
00:48:22.280 history of socialism in practice.
00:48:24.040 They're, they're gut wrenching, uh, horrible, depressing things, but, but also, you know, the
00:48:29.860 cool stories about, about what Liberty, um, creates like, like, can you, can you actually
00:48:36.460 brew a fantastic double hop, triple IPA?
00:48:40.240 Um, you can't in Venezuela, but in America, you can, you can do that because we allow for,
00:48:46.720 for, for choice and creation and serving customers and doing what you want and bringing new products
00:48:52.440 to market.
00:48:52.860 But those kinds of stories, I think without sort of beating people over the head with
00:48:56.620 economics, I think that's the future of how we connect.
00:48:59.420 So Matt, have you seen the, um, the libertarian movement in Brazil that has brought a lot of
00:49:07.180 American libertarians down and they, they've talked to them all and they're like, wow, okay,
00:49:13.400 we don't want to do it that way.
00:49:14.840 And their, their point is libertarian, the libertarian in America, that, that movement is, is basically
00:49:22.580 run by old guys in their view, old guys who are in Congress and are trying to do things.
00:49:29.960 And they're like, this has got to be a youth thing.
00:49:32.840 It's got to be outside.
00:49:34.280 And they have made a huge impact.
00:49:37.000 And it's just a group of people who took their time and their talent and started explaining
00:49:41.980 these things online.
00:49:42.900 And they are, they are moving the needle down in Brazil.
00:49:45.900 Do you, are you aware of them?
00:49:47.620 Oh yeah.
00:49:48.420 Yeah.
00:49:48.640 There it's a, it's a huge movement down there.
00:49:51.080 Uh, you can actually find organizations like that all over the world.
00:49:55.100 Now I just got back from the Republic of Georgia speaking to about a thousand young libertarian
00:50:01.320 kids.
00:50:02.280 I mean, they're 20 years old and they're, and they're looking for, for alternatives, but it
00:50:06.660 is that, that the ethos in Brazil and other places is very much based on youth.
00:50:13.300 It's, it's based on libertarian values and it's a rejection of, of the political status
00:50:19.160 quo.
00:50:19.480 They don't, they don't find it appealing anywhere across the board.
00:50:22.820 And yes, uh, American libertarians could learn a lot.
00:50:26.960 American conservatives could learn a lot from, from the youth liberty across the world.
00:50:31.440 I agree.
00:50:32.280 Um, you just, you were just over, um, in Georgia.
00:50:35.840 Tell me what you, tell me what you're finding over in, in Europe.
00:50:39.220 I think things are getting frightening and you're not hearing about anybody who is standing
00:50:45.280 up going, no, neither of those is the answer.
00:50:48.620 Well, you know, this whole idea that you have to choose between hardcore Marxist
00:50:54.040 violence and Antifa, or some sort of flavor of white nationalism and fascism, is this false
00:51:01.760 choice.
00:51:02.100 I think that that's trying to be imposed all over the world.
00:51:05.100 And the counter revolution is, is again, with young people saying, you know what, neither
00:51:10.240 of those deadly isms, you know, Marxism, fascism, socialism, white nationalism, they're all kind
00:51:16.800 of the same thing.
00:51:17.440 They're all top down.
00:51:18.480 They're all looking to make us all conform to, to one set of, of, of goals that are imposed
00:51:25.640 by somebody else.
00:51:26.660 And, and people are rejecting that.
00:51:28.780 So I think that I tend to be an optimist about what's going on, not just in Europe, but in
00:51:34.300 the U.S., because we're in the middle of this paradigm shift.
00:51:37.720 And it used to be that top down institutions told us what to think and what to do.
00:51:42.440 And now we're discovering through technology that that's not really the case anymore.
00:51:46.540 We're discovering that, that all politicians lie, um, that, that government institutions
00:51:51.760 don't do what they said they were going to do.
00:51:54.040 And, and, and we're discovering that we're all a little bit different.
00:51:56.540 So we're sort of sorting that out, but the solution is not to choose between fascism and
00:52:02.720 socialism.
00:52:03.080 The solution is to choose liberty and self-reliance and voluntary cooperation and all these beautiful
00:52:10.420 values that you were talking about this earlier, the bill of rights and, and the American
00:52:14.800 experiment was really built on this stuff.
00:52:18.260 So Matt, uh, you know, I, I talked to people in Silicon Valley.
00:52:23.020 I follow it very closely.
00:52:25.080 Um, I have been impressed by the number of libertarians that are out there.
00:52:29.360 Um, however, the, you know, I'm torn when people say, Hey, we've got to have an ASI, uh,
00:52:35.800 uh, you know, Manhattan project, uh, because we don't want Russians to get it or China to get
00:52:40.640 it.
00:52:40.740 Well, I don't really want the United States government to either have it.
00:52:44.460 I don't want Google to have, I don't, I don't really want anybody to have it quite frankly,
00:52:49.180 but we can't put that genie back in the bottle.
00:52:51.660 But Google came out a few weeks ago and they said, they're not going to do business with
00:52:57.160 the United States government, even though they will.
00:52:59.880 Um, they're not going to do business with the Pentagon, et cetera, et cetera.
00:53:03.120 Uh, but they are doing business with China, which is terrifying.
00:53:08.560 And then Microsoft came out and said, Hey, AI, we've got the Pentagon's back.
00:53:13.680 We'll share everything we have with the Pentagon.
00:53:17.980 I, where are the libertarians in, uh, Silicon Valley when it comes to, uh, China and teaching
00:53:28.660 AI how to kill and control?
00:53:30.980 Yeah, I, I think, I think it's a problem and I don't think that anyone in Silicon Valley
00:53:37.040 is going to step up and protect us from, from the abuse of all these technological innovations.
00:53:43.140 You know, the, the entire history of Silicon Valley is really rooted in DARPA and, and government
00:53:48.860 contracts in the first place.
00:53:50.280 So, you know, they're going to, they're going to pursue, uh, their profit margins.
00:53:54.600 You know, Amazon is doing the same thing, but again, the counter revolution in technology,
00:53:59.640 these are all, these are all very top down controlled, um, by, by a few actors, sorts of technologies.
00:54:07.300 And the next step has to be blockchain technologies that, that aren't controlled by, you know, corporate
00:54:15.480 interests, government interests, anybody's interests.
00:54:17.580 It has to be more bottom up.
00:54:18.960 And, and I do believe that, that there are, uh, technological solutions and I tend to be
00:54:24.900 quite romantic about what, what, what crypto and blockchain is going to bring to us in the
00:54:30.160 next five years.
00:54:32.080 Matt, um, you've been with CRTV now that has become Blaze TV.
00:54:37.580 You are a, a staunch libertarian.
00:54:40.260 There are things that we agree on.
00:54:41.620 Most, I think we agree on many things that we don't agree on.
00:54:44.820 Um, but you are in a company that has, uh, anyone from you to Gavin McInnes, to me, to
00:54:54.420 Mark Levin, uh, to Eric Bolling, all of us, we have so much that we disagree on.
00:55:02.340 How do you, what do you think the, um, why were you willing to take the heat to be the
00:55:11.220 libertarian on CRTV for, for the last few years?
00:55:18.020 Well, you know, libertarians don't neatly fit into any box.
00:55:22.360 So it, it, it wasn't like I could go to, um, big libertarian TV and, and, and speak theirs.
00:55:28.580 But I also think, I mean, that the whole concept behind what we're doing is to find that common
00:55:36.580 ground amongst people and ideologies and tribes and, and communities that, that disagree with
00:55:43.480 each other on some pretty important things.
00:55:45.360 And, and I think, I think that's important.
00:55:47.700 And I, since I left FreedomWorks, I've spent a lot of time, not just talking to conservatives
00:55:53.440 through CRTV, but, but talking to libertarians, including big L libertarians at the party and,
00:55:59.040 and also talking to progressives, because I think, I think there are some common values
00:56:04.420 in there that, that do hold us together.
00:56:06.960 And, and by the way, those are the values that are, that are going to save America from
00:56:11.040 all of this tribal warfare that's tearing us apart.
00:56:14.240 Those are the values that we all came here for when we were all immigrants.
00:56:17.920 We came here for those values.
00:56:20.800 Uh, I mean, the people on the border who are crawling across the border now, they, they,
00:56:25.480 they are, you know, whether they say it or not, they are coming for those values, unless
00:56:29.880 they have ill intent.
00:56:30.820 They want, uh, the opportunity to, to explore and to break out of their condition.
00:56:36.320 They want, uh, a chance to live in a country that has laws and everybody is treated fairly.
00:56:42.080 Um, and, and we're not talking about that.
00:56:45.900 We're talking about immigration and this, this thing on the border as if that doesn't
00:56:51.560 matter as if the, the laws of the land and what they're coming here for don't matter.
00:56:58.820 We, you know, we saw, I don't know if you saw the story of, um, what was the guy's name?
00:57:03.000 Uh, Patty was rich friend of Clinton, a friend of Trump was taken the Lolita plane.
00:57:09.400 And what's his name?
00:57:11.960 You know who I'm talking about, uh, Matt?
00:57:13.860 Yeah, I don't remember his name.
00:57:15.000 The, the, the, he was taking the Lolita plane.
00:57:17.400 What's his name?
00:57:18.200 Yeah, I know.
00:57:19.280 You know who I'm talking about.
00:57:20.220 Rich, actually.
00:57:21.440 Uh, what's his name?
00:57:23.480 I thought it was rich actually.
00:57:25.100 No, uh, no.
00:57:26.760 Anyway, um, but he's, he is rich and he kind of got off.
00:57:31.300 Well, he didn't kind of get off.
00:57:32.360 He got off after 80 women were going to, yeah, Epstein, 80 women were going to testify
00:57:39.080 against him and he brokered a deal because justice isn't blind in America.
00:57:44.620 It's just not.
00:57:45.860 And if we lose that, we lose everything we were.
00:57:51.000 And by the way, like that, um, that, that rage against the machine that, that there isn't
00:57:57.120 equal treatment under the law is something that I think animates a lot of, a lot of young
00:58:02.540 people that are attracted to democratic socialism.
00:58:04.560 You know, we all, we've all picked on Alexandria Ocasio-Cortez, but if you go back and look
00:58:09.720 at her original viral campaign video, uh, you, you have to get like 90% through it before
00:58:17.100 you really disagree with anything she's saying, because she's saying that the system is rigged.
00:58:21.760 She's saying that there's this crony collusion between members of Congress and wall street,
00:58:26.200 and that could have been a tea party ad.
00:58:28.260 And then at the end, they sort of throw on, that's why we need Medicare for all.
00:58:31.160 Um, but the values, the values there are, are very much, you know, it could be Ron Paul.
00:58:38.100 It could be Bernie Sanders.
00:58:39.000 It could be the tea party.
00:58:40.400 It, it could even be some of the themes that Donald Trump touched on when he was just raging
00:58:46.520 against the swamp.
00:58:47.820 Uh, Matt Kibbe, uh, from Free the People, and you can also, uh, watch him on BlazeTV.com.
00:58:54.300 BlazeTV.com.
00:58:55.320 You can find him there and at FreeThePeople.org.
00:58:58.340 I'd love to have you back on and talk a little bit about FreeThePeople.org,
00:59:01.160 because I know you're re-imagining, uh, what the tea party 2.0 might look like.
00:59:06.140 And I'd love to have that discussion with you.
00:59:08.180 So maybe next time, uh, we have you on Matt.
00:59:11.100 Let's do it.
00:59:11.800 Thank you.
00:59:12.160 Thanks for having me.
00:59:12.760 You bet.
00:59:13.100 Matt Kibbe.
00:59:13.640 The Blaze Radio Network.
00:59:18.180 On demand.