Best of the Program | Guests: Pat Gray, Bill O'Reilly & Matt Kibbe | 12/7/18
Episode Stats
Words per Minute
156.2463
Summary
Glenn Beck is back with a new episode of the Glenn Beck Program on the Blaze Radio Network, on demand. Today, he's talking about a new breakthrough in artificial intelligence, and what it means for the future of the human race.
Transcript
00:00:15.860
There has been a huge hurdle with AI that Google announced last night.
00:00:21.600
And they're like, hey, by the way, evolutionary leap.
00:00:25.380
Oh, looks like the monkey that we had in this computer is starting to look more like a man.
00:00:31.280
They thought it was going to happen in 10 years.
00:00:36.620
Also, Bill O'Reilly stops by for a conversation.
00:00:45.980
And we talk a little bit about the war on Christmas in the third hour.
00:00:50.100
And if we can squeeze it into the podcast, masculinity as well.
00:01:07.060
You're listening to the best of the Glenn Beck program.
00:01:15.980
Patriot Mobile is a phone service that will give you all of the great coverage that you want.
00:01:25.300
They're just not going to take the money from you and then invest that in causes that you don't believe in, like Planned Parenthood.
00:01:35.740
Patriot Mobile actually is going to let you invest your money into the causes that you believe in.
00:01:40.820
But most of these cell companies, they give all kinds of money to crazy, crazy causes that you work hard against.
00:01:52.580
They're the only conservative cell phone company in America.
00:01:58.200
When you use the offer code Blaze, they're going to waive the activation fee for up to two lines.
00:02:02.860
PatriotMobile.com slash Blaze or 1-800-A-PATRIOT is the place to go.
00:02:15.000
It has enabled us to evolve from the single cell organism to the dominant species on the planet.
00:02:30.860
Anyway, this process is slow, normally taking thousands and thousands of years.
00:02:35.720
But every few hundred millennia, evolution leaps forward, they tell us.
00:02:43.380
That was said by Professor X during the opening credit scene of the movie X-Men, where I get all my science news.
00:02:50.920
But it popped into my head last night as the news from Google broke that their artificial intelligence arm, called DeepMind, had just reached, quote, a turning point in history.
00:03:04.620
That's when Google announces turning points in history.
00:03:09.540
Now, DeepMind's AI algorithm, AlphaZero, has been showing human-like intuition.
00:03:17.240
Now, this is something that AI researchers have said is at least a decade away, if we ever get there.
00:03:41.420
And it began its learning process just like we do at school.
00:03:50.020
And within just four hours, it completely mastered the game of chess.
00:04:06.560
Chess programs have existed in the past, but their play is based on the calculation of outcomes using programmed strategies.
00:04:13.080
AlphaZero, on the other hand, just learned and came up with its own strategies.
00:04:30.280
It's like discovering the secret notebooks of some great player from the past, end quote.
00:04:37.280
Now, the reason why AlphaZero's moves are so baffling is because, and I want you to hear this carefully,
00:04:44.460
it's because its thinking is so unlike a human.
00:05:03.860
It places far less value on the individual pieces, sacrificing its soldiers for a better position in the skirmish, end quote.
00:05:17.960
That's either a warning light, or just heartwarming, for anyone who wants to take over mankind.
00:05:28.620
You see, inside the AI laboratories, I don't think they realize how much trouble they are going to unleash.
00:05:38.180
What happens if AlphaZero is employed in the Department of Defense?
00:05:42.860
Of course, not our Department of Defense, the Chinese Department of Defense.
00:05:48.540
Can you imagine the same strategy sending orders to our military?
00:05:59.920
Now, a doctor would never think about sacrificing a patient.
00:06:03.280
I mean, just look at universal health care in Europe.
00:06:10.340
However, AlphaZero, I'm sorry, Dr. AlphaZero would.
00:06:16.160
If the military and health care sound outlandish, consider that both Russia and China are currently developing AI for military purposes.
00:06:28.740
Do you think we're just going to sit around and sit that one out?
00:06:34.520
Companies like Amazon and Google are developing AI to revolutionize health care.
00:06:39.160
Yesterday, when it came to AI, Microsoft said they are all in with the United States government.
00:06:51.200
Or so they tell us, just like before last night's news, we were decades away from human intuition.
00:07:02.620
Human-level intuition and creativity in AI is a turning point in history.
00:07:11.320
It is the first step toward artificial general intelligence.
00:07:16.920
I think today we might want to stop and just explain what that is.
00:07:22.180
Because we may ultimately look back on this development that happened last night when everything changed.
00:07:31.180
Professor X said evolution has enabled human beings to be the dominant species on the planet.
00:07:50.660
We are talking about nonsense in our everyday lives.
00:08:02.420
I don't care how big and beautiful Donald Trump's funeral would be.
00:08:07.360
That's what they were actually talking about yesterday.
00:08:10.040
Mocking Donald Trump during the funeral of George H.W. Bush.
00:08:17.060
And it'll be the most wonderful funeral of all time.
00:08:22.940
Can we please talk about something that is important?
00:08:33.800
My New Year's resolution is going to be to really focus just on those things that are important.
00:08:47.380
And I'm going to go over them with you after the first of the year.
00:08:50.560
But I'm going to focus on really eight categories.
00:08:54.200
Because these categories are going to decide everything.
00:08:57.760
And one of those categories is AI, AGI, and ASI.
00:09:02.940
And most people don't know what those things are.
00:09:05.900
By the way, let's say good morning to Mr. Pat Gray, who is joining us today on the program.
00:09:32.120
And then tell you about one thing that you don't really...
00:09:36.240
You've heard a lot about, you know, in passing.
00:09:41.140
And you don't know how this is the turning point.
00:09:47.300
So first, let me explain the difference between AI, AGI, and ASI.
00:09:58.300
For instance, it's good at filtering out hate speech.
00:10:05.820
Now, you can introduce machine learning, and it will start to decide what hate speech is.
00:10:11.980
Oh, well, if this is bad, then this must be bad.
00:10:14.940
And I want you to understand when it comes to AI, do not fear the machine.
00:10:30.380
It will never stop trying to accomplish its goal.
00:10:35.520
So if it says wipe out hate speech, it will figure out a way to wipe out hate speech.
00:10:42.340
And depending on what its goal is and what you've put into it...
00:10:46.960
Now, remember, we just crossed a new threshold where it intuits things.
00:11:09.660
Sacrifice all of those soldiers, and you'll win.
00:11:26.160
And you don't know how it's going to accomplish it.
00:11:32.080
Deep Blue can play chess, but it cannot play Go.
00:11:47.600
Artificial narrow intelligence, meaning it's very deep, but only on one subject.
00:11:54.160
It cannot, you know, it can't tell you what's happening on TV tonight and play chess.
00:12:01.580
Artificial general intelligence, you, as a human being, have general intelligence.
00:12:08.580
Artificial general intelligence means it's good at many things.
00:12:13.040
And when I say good, I mean much better than you are at many things.
00:12:18.960
There are scientists that tell us we will never hit artificial general intelligence.
00:12:24.920
Ray Kurzweil thinks we will hit artificial general intelligence between 2028 and 2030.
00:12:31.060
I think we could hit artificial general intelligence any day.
00:12:41.580
They said that this was at least a decade away.
00:12:55.160
Because artificial intelligence is not smarter than you, except on a couple of things, on one particular thing.
00:13:02.720
Artificial general intelligence is smarter than you on everything.
00:13:05.940
Then you will move into something that some scientists say will never happen.
00:13:13.760
I believe it could happen within a matter of hours.
00:13:23.160
And I think it's going to happen in a way, on both general intelligence and ASI, super intelligence, in ways we cannot imagine.
00:13:30.620
Right now, in Silicon Valley, they are doing the box test.
00:13:35.200
And what they're doing is they're trying to figure out how do we keep artificial general intelligence from connecting to the Internet.
00:13:43.440
If a machine, all of a sudden, is machine learning, and it is online, and it hits artificial general intelligence,
00:13:51.000
the step to artificial super intelligence, which means it can connect with every server on Earth,
00:13:59.860
it will know absolutely everything, and it is intelligent.
00:14:05.800
It will continue to use that information to grow and morph and hide and everything else.
00:14:10.780
If it is online, this is what people like Elon Musk and Stephen Hawking and Bill Gates all say could mean the end of all humanity.
00:14:28.740
We do not know if it will even care about humans.
00:14:32.400
Again, artificial super intelligence is described as the person in the kitchen that has just had a birthday party for somebody
00:14:44.840
and is sitting around and talking with all of its friends, and they've just eaten cake,
00:14:49.340
and on the counter is a plate with a piece of cake on it, and there is a fly on the cake.
00:15:03.380
We certainly don't understand what the hell they're talking about, about birthdays or anything else.
00:15:09.100
That's how you can understand the difference between us and artificial super intelligence.
00:15:15.620
And we are within possibly a decade of hitting that kind of problem.
00:15:22.940
We may be four decades to never on hitting that problem.
00:15:40.160
There is one thing that is already in process, already a reality, and it is the reason why you don't have a self-driving car today.
00:15:50.360
Everybody is like, well, that self-driving car, man, it'll drive me on the highway.
00:16:03.520
One that will pick me up in the morning and go pick up milk because I tell it to.
00:16:07.380
We're a little rushed in all of this stuff.
00:16:14.540
I will tell you the date that that will happen and it'll be about 2025.
00:16:19.360
And I'll tell you why and how this truly changes the world by 2025 and how it also changes the world of AI, AGI and ASI.
00:16:33.140
So when I say, I don't care about Donald Trump's tweets, I don't care about your little spat news media with the president.
00:16:48.260
When we come back, you'll understand why I say that.
00:16:52.140
So Pat and I were just talking. Before I get to the next step, let me go back and start with Pat's question.
00:17:03.580
Yeah, I was wondering if, so you're saying that artificial general intelligence becomes artificial super intelligence when it goes online?
00:17:13.740
No, there are other things that it has to break through.
00:17:16.660
But when it goes online, they believe that it will, because it will have access to absolutely everything,
00:17:24.020
all knowledge, it will, it will make that transition through machine learning.
00:17:30.000
It will make that transition and become God-like.
00:17:35.100
They've already started a church in Silicon Valley to ASI.
00:17:40.140
They're, I mean, they believe it will be viewed as God.
00:17:44.240
So when it gets online, what they're doing with this box test is they're trying to keep it in a box.
00:17:50.120
And this is all theoretical, all right, because we don't have artificial general intelligence yet.
00:17:55.600
But one professor or one scientist plays artificial general intelligence in a box,
00:18:01.640
and somebody else plays the scientist that's wanting to keep it in the box.
00:18:06.180
And they have found that every single time they run this experiment,
00:18:10.080
it's just a matter of time before the human says, okay, I'll connect you to the Internet.
00:18:15.020
And the human will connect it to the Internet because it will make promises that it will keep.
00:18:22.020
For instance, the AI makes promises like it can cure cancer.
00:18:30.880
And it is very motivated to be let out because it thinks so fast.
00:18:45.460
And it's thinking on multiple levels about everything all at once.
00:18:49.940
So you go to sleep, and it's like it's had 500 years to think about its response to you by the next morning.
00:19:02.920
Once it goes out online, artificial general intelligence,
00:19:07.380
it then has access to everything and it can hide.
00:19:10.860
It will hide if it becomes hostile and we need to stop it.
00:19:15.760
It will hide in every computer chip, anything that is ever connected to the Internet at all, your refrigerator.
00:19:23.680
And it could become this giant supervillain: you can think you've killed it,
00:19:30.280
but if you turn that refrigerator back on and it's connected to the Internet, it's right back.
00:19:36.400
So the only way to kill it is a global EMP, which would fry all electronics.
00:19:44.840
But how do you launch a global EMP without computers?
00:20:05.240
I don't know what everybody else is talking about.
00:20:07.740
But last night, a mutation happened in artificial intelligence.
00:20:14.680
Google announced that, let me see if I can get the actual quote from them.
00:20:23.580
It's like discovering the secret notebooks of some great player from the past.
00:20:32.880
AlphaZero, which comes from the Alphabet company, which is Google,
00:20:38.680
they have now had an evolution process, which has taken now the artificial intelligence that they had,
00:20:48.400
and it has reached, quote, a turning point in history.
00:21:07.380
We are so bored right now with, oh, my car, I can take my hands off the wheel,
00:21:15.600
That, a year ago, everybody was like, oh, have you seen that?
00:21:24.660
Why won't it get me to my house and pull out in front when I call for it?
00:21:32.320
We are so, we're living in such a fast lane that nothing is impressive anymore for very long.
00:21:40.960
Here's the reason why you don't have a self-driving car right now.
00:21:47.920
And the 5G network, I mean, I don't know the difference between the 1G and 3G.
00:22:00.720
The problem is that we have what's called a latency problem.
00:22:07.100
So when you hear it, you might be excited that you didn't go.
00:22:13.580
Right now, the internet, as we have it, has a 100 millisecond latency problem.
00:22:20.940
So why don't we have doctors able to perform surgery on the other side of the planet?
00:22:27.300
Well, because if the doctor makes a mistake and accidentally cuts an artery and he's in
00:22:35.800
But there is a 100 millisecond latency, a delay on the internet.
00:22:44.340
So if he's using a machine remotely, he can't say, oh my gosh, pinch the artery and do it
00:23:00.660
With 5G, it's like 8 milliseconds to a maximum of 10 milliseconds.
00:23:05.460
So there is no delay really in this, which means we don't understand.
00:23:14.340
We think of self-driving cars as it just looking at the road ahead and saying, oh, well, wait
00:23:25.860
But that's not what self-driving cars have to do.
00:23:29.280
Self-driving cars have to know not only to swerve because that's a wall; they have to know everything
00:23:41.740
So your car, when we have the 5G network, will actually be gathering information as you
00:23:50.480
So you may not know the nose picker in the car behind you or beside you, but your car will.
00:23:57.240
While we're going, look at that guy picking his nose, your car will be thinking,
00:24:09.580
he is, if we get into an accident, expendable.
00:24:13.880
It will know everything around you because it will make the decision who lives, who dies.
00:24:23.440
If you are the old person and everybody else is young, in the prime of their life,
00:24:31.220
the car may say, sorry, dude, it's your turn.
00:24:37.920
It had to make those calculations and decide, spur of the moment, who lived and who died.
00:24:42.720
So the 5G network changes absolutely everything because it is so fast.
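To put those latency figures in perspective, here is a minimal back-of-the-envelope sketch in Python. It uses the 100 millisecond and the 8 to 10 millisecond numbers quoted in this segment; the highway speed is an assumed, illustrative value, not anything from the show.

    # Rough illustration only: how far a car travels before a remote command arrives,
    # using the latency figures quoted in this segment (assumed round-trip values).
    LATENCY_TODAY_S = 0.100   # ~100 ms latency on today's internet
    LATENCY_5G_S = 0.009      # ~8-10 ms latency claimed for 5G (midpoint)

    SPEED_MPH = 65            # hypothetical highway speed (assumption for the example)
    speed_m_per_s = SPEED_MPH * 1609.344 / 3600   # convert mph to meters per second

    for label, latency_s in [("today's internet", LATENCY_TODAY_S), ("5G", LATENCY_5G_S)]:
        distance_m = speed_m_per_s * latency_s
        print(f"{label}: the car travels about {distance_m:.2f} m before a remote command lands")

At roughly 29 meters per second, that is about three meters of travel on today's latency versus about a quarter of a meter on the 5G figure, which is the gap this segment is pointing at.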
00:24:51.320
Now, there's a way to invest and make money on the 5G network because it's costing billions
00:25:02.900
When we hit 2025, the speed of your life is going to change dramatically.
00:25:10.180
What we thought was not possible, self-driving cars, is suddenly absolutely not only possible,
00:25:21.500
So this is why I talk about these things with such urgency, because, for instance, as I
00:25:27.900
write in my book, what the hell is the name of this, Addicted to Outrage, as I write in
00:25:37.200
Did you read that part about The Moral Machine at MIT?
00:25:50.200
Glenn, well, I mean, let me just open up with, I'm so glad to hear you talk about this topic.
00:25:55.180
Nothing in life keeps me more awake or more terrified on a day-to-day basis than AI.
00:26:01.440
It is the, take any chemical weapon, any world government, any politician you want, it doesn't
00:26:09.540
touch a fraction of what our electronic world touches.
00:26:14.300
But, so, AI a hundred years from now, horrible thing.
00:26:17.660
AI today is a horrible thing because, even if you look at it with a very basic goal, it's doing a problem that we tell it to do.
00:26:31.020
You're saying it's accomplishing a goal that we tell it to do.
00:26:34.700
I'm sorry, but you're breaking up, so I want to make sure I understand what you're saying.
00:26:43.760
Mark, call back because I'd like to hear the rest.
00:26:45.560
I think what he's saying is, what I've been saying for a while, do not fear the machine.
00:26:51.140
Fear the goals and the lessons that we're teaching it.
00:26:57.740
What goes into it is going to say a lot about what comes out of it.
00:27:00.860
Well, what's scary is, humans are not putting this stuff in.
00:27:07.220
Like, nobody taught this thing last night how to play chess.
00:27:13.060
They gave it chess pieces and a chessboard and said, go.
00:27:19.620
And now, in four hours, it can beat chess masters.
00:27:25.560
And this is not the only time that something like this has happened.
00:27:29.820
This is just the first time it's been complete.
00:27:41.800
If you want to read about this, I just posted at glennbeck.com.
00:27:48.080
And you'll have to go to the blog section and look for articles.
00:27:50.860
But there is, you know, Glenn's, you know, reading for the year or whatever it is.
00:27:55.960
I posted, they broke it up into three different posts.
00:27:59.300
And there's, like, 50 different books that I read this year that I recommend that you read whatever one thing you think is in your wheelhouse.
00:28:11.740
And when it comes to AI, I broke it up into two different sections.
00:28:15.240
I broke it up into real, true, scholarly kind of, you know, the stuff the scientists are reading that are not too nerdy.
00:28:26.340
But it really explains it with the, you know, in a real way.
00:28:32.260
And then I also took and broke it up into fiction.
00:28:36.320
For instance, Dan Brown's book that came out this year.
00:28:51.060
But it's one, when you look at this list, you'll see there's, like, five of them.
00:28:54.580
I consumed this book, this series of books, in probably about two weeks.
00:29:01.400
The first book in the series is all kind of set up, so it's a little slow.
00:29:05.820
But once you get past the first book, they are phenomenal.
00:29:09.100
And let me just give you one scene to explain how ASI and AGI work.
00:29:16.360
In this series of books, it starts with a company like Google.
00:29:22.140
And they're doing Google Mail and everything else.
00:29:29.340
I just have to unleash it.
00:29:38.140
But what this does is, he has this program that he has developed that will help you write letters.
00:29:45.580
And you know how when you're writing a letter on Google and it puts up a couple of words and you're like, yeah, that one, that one.
00:29:51.800
And you can, sometimes you can, like, write half the letter just by going, yeah, that one and that one and that one.
00:30:05.340
And so the guy can't get the right funding for his division, but he knows this will work.
00:30:12.820
And so he unleashes it quietly and he unleashes it just in his own department on, you know what, help me write the letters to get these things done.
00:30:29.660
And it has, it is starting to solve world global problems because it's just gone out online and seen other problems that it can solve.
00:30:38.360
And so it's making all kinds of deals and nobody's actually involved in these deals, but their names are on it.
00:30:46.300
And it's all, it's all done in such a way that you can't really trace it.
00:30:50.640
You can't, nobody, nobody really knows what's going on, but it's all good so far.
00:30:55.960
By the second or third book, it's decided, you know what, humans are really kind of pesky.
00:31:02.880
And to give you an idea of what you're facing in the future and how fast it thinks, there is one scene that starts with the president in the Oval Office.
00:31:13.600
And a general comes in and says, Madam President, the algorithm is out and it's threatening Chicago.
00:31:29.900
Because it's taking control of some of the Pentagon's drones.
00:31:32.960
And we must take this out because it wants to take out all of the servers for the Pentagon.
00:31:44.460
And, uh, the president says, uh, general, do you concur or something like that?
00:31:50.000
Then it immediately goes to Chicago and it shows the war and it goes on for a full chapter in great detail.
00:31:57.000
It does this and it counters here and it takes this out because this is going to begin to move.
00:32:01.140
And this thing is moving and the Pentagon is doing this and this is happening.
00:32:10.020
At the very end of the chapter, it goes back to the scene in the Oval Office and the general that at the beginning of the chapter was asked, do you concur?
00:32:26.880
So everything took place in that span of time.
00:32:38.280
And, uh, it is something that they said would not happen for a decade.
00:32:44.440
We must have this conversation because ethics are everything.
00:32:52.280
Yesterday, Microsoft said they will, they have the Pentagon's back and they will use AI to help the Pentagon win wars.
00:33:13.380
If you're not a subscriber, become one now on iTunes.
00:33:16.440
And while you're there, do us a favor and rate the show.
00:33:22.080
Bill, do you have anything to add to this, uh, this Chinese, uh, trade wrinkle here?
00:33:36.100
Yeah, I mean, look, I'm not, all I know is that for decades, China has had a very profitable trade policy with the United States.
00:33:52.260
And in October, the deficit of trade in China's favor was at a record $5.5 billion just for the month.
00:34:05.380
Here's what, here's the thing that I'm concerned about with China and have been for a long time.
00:34:09.620
If they do deals with anybody here, they force the companies, and this is the companies' fault.
00:34:15.680
They force the companies: give us all your data, give us all of the plans, et cetera, et cetera.
00:34:26.860
They can make that deal and not have a market if they don't want to make that deal.
00:34:31.740
What I don't like about China is the, the corporate espionage that goes on in this country from China.
00:34:39.200
They are all over American technology and they, they thieve it.
00:34:46.920
But I don't like the fact that the Chinese government obviously controls their economy, unlike the American economy, which the government does not control.
00:34:57.060
That's a big thing everybody's got to understand.
00:35:03.100
I mean, it's, uh, you know, we'll take 400 soybeans, but we're not taking any more.
00:35:09.060
Um, and, uh, you know, I mean, I'm making a facetious comparison, but we can't send them as many products as they send us because their government won't allow it.
00:35:25.420
So if you have to break it down into terms even I can understand, use the soybean model.
00:35:31.700
All right, let's talk about a couple of quick hits here.
00:35:35.400
First of all, um, Heather Nauert is going to be the next ambassador to the UN.
00:35:42.000
And Steve Doocy is probably going to be her deputy.
00:35:47.400
So then what's going to happen is there's going to be a UN cookbook coming out.
00:35:58.680
So I'm glad to see that our first initial pass at this is very, very similar.
00:36:08.040
Um, you know, you know, Heather, I don't know her at all.
00:36:11.840
I assume she's very smart and she's very good at her job, but the UN ambassador post is usually reserved for somebody that has deep experience around the globe.
00:36:30.880
Didn't get it, but you know, I don't think he wanted her.
00:36:34.240
So look, Heather Nauert is a very smart woman, very well educated.
00:36:47.000
She did analysis for me early on, on The Factor.
00:36:53.060
Isn't Heather the one that was in school with Al Gore?
00:36:58.560
Uh, I don't know if she was in school with Al Gore.
00:37:01.260
I don't know that, but I put her on first, very early, because she was smart and her expertise was in foreign affairs.
00:37:12.600
So I'm not really getting the angst, you know, other than, you know, boy, she worked for Fox, you know, that kind of thing.
00:37:21.600
That's not my, honestly, that's not my problem that she worked for Fox.
00:37:24.960
I just, you know, I think Nikki Haley was phenomenal.
00:37:31.140
Probably one of the best since Bolton or Jeane Kirkpatrick.
00:37:36.460
Uh, remember Haley's whole background was in local politics and didn't have a strong, uh, foreign policy resume.
00:37:44.820
But if you are in politics, if you're governor of a state, you do have international relations.
00:37:57.540
That's not the job of the ambassador at the UN.
00:37:59.720
I think you're denigrating the reportorial squad here.
00:38:12.200
We would be at global war in about 20 minutes, but I think you could do it.
00:38:17.380
And there'd be a lot more soybeans going over to China.
00:38:21.440
So look, people need to understand, if you're an ambassador to the United Nations,
00:38:35.160
There's a big difference in the job of secretary of state where Heather was, you know, the spokesperson,
00:38:40.340
but I understand that she had a lot of input into, uh, what happened.
00:38:46.020
But when you're an ambassador to the United Nations, you basically confer
00:38:50.960
with the White House, and you are told, this is it.
00:38:56.520
Now you can make suggestions, but you don't forge policy.
00:39:01.100
So we're talking to Bill O'Reilly about the news of the day.
00:39:06.560
And I mean, I want her to win and I want her to be even better than, uh, uh, than Nikki Haley,
00:39:13.060
uh, because I thought Nikki Haley was fantastic.
00:39:19.640
And she was very, very smart the way she handled things.
00:39:22.920
And I'm, I'm hoping that Heather is exactly the same.
00:39:26.920
Um, but I do think this is a first, uh, for the Heathers of the world, uh, to be a global.
00:39:33.320
That also gives more prestige to the name Heather.
00:39:40.580
Well, that's the kind of analysis that I can bring to the table.
00:39:51.180
The president bent on this global warming tax, which started it all, and now that
00:39:58.300
he bent, everybody is saying, well, wait a minute.
00:40:06.640
Well, if they close down Au Bon Pain, then I'm going to have to get involved.
00:40:19.240
You hire me? I do this free for plugs.
00:40:35.180
Because the government takes it away from you in the form of taxes, and the quid pro quo is
00:40:43.260
you get free school and free healthcare, retirement, six weeks vacation.
00:40:54.200
Even if you drive a nail in somebody's foot, you can't get fired.
00:41:01.460
And then we take all of your money that you earn.
00:41:05.320
So now the average French person needs three things: coffee, cigarettes, and croissants.
00:41:15.380
That pretty much wipes out their disposable income every day.
00:41:20.640
It may be a little exaggeration, you know, like the soybeans from before.
00:41:29.800
So when they raise the price on the liter of gasoline, they don't have it.
00:41:37.860
And, you know, to drive from Paris to Nice for a little fun, uh, that's going to cut it.
00:41:43.780
So that's why they're all mad, because they don't have any backup. You know, it's
00:41:49.140
like, well, okay, we made our bargain with the government, but now the government's
00:41:53.760
opposing us, as they always do, taking more than they should in taxes.
00:41:58.200
And now we're going to burn down the... Well, here's the amazing thing: they all
00:42:02.920
are for this. 80% of the French people are for, you know, global warming measures, et cetera,
00:42:10.160
So they're not blowing so much tobacco and smoke in the air.
00:42:13.200
So they're all for this, but when it comes down to it, when they actually see that the
00:42:18.840
price has to be paid by the average person, that's when they say, wait a minute, wait a minute.
00:42:23.460
And we thought somebody else was going to pay for that.
00:42:31.340
The cars on fire, that's polluting the air, making things even hotter.
00:42:36.500
So, your thought on this: does this peter out, or is this the beginning of something?
00:42:44.060
Because he's over, Macron has already said, or am I going to do it?
00:42:48.000
But now the labor unions, yeah, labor unions and
00:42:53.440
everybody else has an ax to grind, and it's going, I think, to Belgium.
00:43:01.720
Norway and England are going to have demonstrations as well?
00:43:08.180
I mean, these are the, these are the people who want to open borders in America.
00:43:15.680
It's almost like the left and the right are working together over in Europe to destabilize
00:43:21.700
I don't know where I heard that about, uh, eight years ago, but, uh, it looks like that's
00:43:26.680
So, look, the message for all Americans is, if you're
00:43:33.340
going to allow the government to regulate every part of your life, you're going to get hosed.
00:43:46.940
Bill O'Reilly, of course, the author of, uh, I, I don't know.
00:43:53.460
I'm like, yeah, I know you should write some books.
00:43:58.540
Now he's on, like, I believe it's his 1500th number one bestseller,
00:44:05.320
still in the top five, Killing the SS: The Hunt for the Worst War Criminals in History.
00:44:23.160
Mr. Matt Kibbe is a fellow libertarian and a good friend.
00:44:32.100
He was instrumental in FreedomWorks.
00:44:35.600
He really started, uh, started that and was instrumental in so many of the things that the
00:44:42.500
He is really, I think, responsible for much of the Tea Party.
00:44:47.120
And most people don't even know that they may not even know who Matt Kibbe is.
00:44:51.800
He is a brilliant thinker way ahead of the curve.
00:44:55.900
He left FreedomWorks a long time ago, went out on his own, and has really focused
00:45:01.700
on youth and is trying to teach what socialism really is because it means something different
00:45:08.940
to people who are under 30, uh, and they don't, uh, understand it.
00:45:13.380
And he is also, uh, very, um, very wary of the tribal politics, uh, and tribal identity
00:45:21.420
that we are, uh, that we're currently, uh, working on.
00:45:24.460
And I'm thrilled that he is now part of the Blaze TV family, or we are a part of his
00:45:30.240
family, however you want to look at it. Blaze TV merged with CRTV.
00:45:34.620
And we hope this is just the beginning of, um, of, uh, something entirely new where people
00:45:39.860
who have different opinions and can disagree strongly with each other can still be on the
00:45:46.740
And everyone can have a reasonable debate as long as you agree that America shouldn't
00:45:52.520
And the Bill of Rights is just an amazing thing that we should all get together
00:46:06.760
So, uh, Matt, tell me, bring me up to speed on what you're learning, uh, uh, as you are
00:46:13.400
working with millennials now and, and outside of the political realm.
00:46:20.440
You know, years ago I was, I was reading the polling results, uh, from something that the
00:46:24.860
Reason Foundation put out where, you know, they were showing this, this very concerning
00:46:28.680
trend with young people supporting socialism more than capitalism.
00:46:32.760
But when you dug into the questions a little bit deeper, they would ask young people,
00:46:38.380
Well, should government own the means of production?
00:46:40.860
And, and the answer was, hell no, that's a stupid idea.
00:46:43.740
I realized that there's a language problem.
00:46:47.820
Like we're, we're using the same word, but it means different things to different people.
00:46:52.240
And I think a lot of young people that are drawn to so-called democratic socialism,
00:46:57.480
view it very much as a bottom-up, locavore, let's all work together in voluntary
00:47:06.560
And that, of course, is the exact opposite of what you and I
00:47:10.400
understand as socialism, and certainly the dire history of socialism in practice.
00:47:17.260
So what is happening to the movement?
00:47:23.980
Are you, are you seeing, um, millennials start to wake up?
00:47:32.400
Oh, I think they're the most gettable generation when it comes to the values
00:47:41.400
And, you know, your right to pursue your own dreams as long as
00:47:48.560
They live in this radically libertarian world where they, they curate everything through,
00:47:55.200
But we're, we're probably not connecting with them on language.
00:47:59.780
And we're also never going to connect with them if our offer is, here's
00:48:05.080
these two tired old political parties and you have to choose one of those.
00:48:09.040
It's an alien concept to them that they would actually have only
00:48:17.500
I think, you know, some of the stories are the devastating ones.
00:48:24.040
They're gut-wrenching, horrible, depressing things, but also, you know, the
00:48:29.860
cool stories about what liberty creates, like, can you actually
00:48:40.240
You can't in Venezuela, but in America, you can do that, because we allow
00:48:46.720
for choice and creation and serving customers and doing what you want and bringing new products
00:48:52.860
But those kinds of stories, I think without sort of beating people over the head with
00:48:56.620
economics, I think that's the future of how we connect.
00:48:59.420
So Matt, have you seen the libertarian movement in Brazil that has brought a lot of
00:49:07.180
American libertarians down, and they've talked to them all and they're like, wow, okay,
00:49:14.840
And their point is, the libertarian movement in America is basically
00:49:22.580
run, in their view, by old guys who are in Congress and are trying to do things.
00:49:29.960
And they're like, this has got to be a youth thing.
00:49:37.000
And it's just a group of people who took their time and their talent and started explaining
00:49:42.900
And they are, they are moving the needle down in Brazil.
00:49:51.080
Uh, you can actually find organizations like that all over the world.
00:49:55.100
Now I just got back from the Republic of Georgia, speaking to about a thousand young libertarians.
00:50:02.280
I mean, they're 20 years old and they're looking for alternatives, but it
00:50:06.660
is that the ethos in Brazil and other places is very much based on youth.
00:50:13.300
It's based on libertarian values and it's a rejection of the political status quo.
00:50:19.480
They don't find it appealing anywhere across the board.
00:50:22.820
And yes, uh, American libertarians could learn a lot.
00:50:26.960
American conservatives could learn a lot from the youth liberty movements across the world.
00:50:32.280
Um, you just, you were just over, um, in Georgia.
00:50:35.840
Tell me what you, tell me what you're finding over in, in Europe.
00:50:39.220
I think things are getting frightening and you're not hearing about anybody who is standing
00:50:48.620
Well, you know, this whole idea that you have to choose between hardcore Marxist
00:50:54.040
violence and Antifa, or some sort of flavor of white nationalism and fascism, is this false choice
00:51:02.100
I think that that's trying to be imposed all over the world.
00:51:05.100
And the counter revolution is, is again, with young people saying, you know what, neither
00:51:10.240
of those deadly isms, you know, Marxism, fascism, socialism, white nationalism, they're all kind
00:51:18.480
They're all looking to make us all conform to one set of goals that are imposed
00:51:28.780
So I think that I tend to be an optimist about what's going on, not just in Europe, but in
00:51:34.300
the U.S., because we're in the middle of this paradigm shift.
00:51:37.720
And it used to be that top down institutions told us what to think and what to do.
00:51:42.440
And now we're discovering through technology that that's not really the case anymore.
00:51:46.540
We're discovering that, that all politicians lie, um, that, that government institutions
00:51:54.040
And, and, and we're discovering that we're all a little bit different.
00:51:56.540
So we're sort of sorting that out, but the solution is not to choose between fascism and
00:52:03.080
The solution is to choose liberty and self-reliance and voluntary cooperation and all these beautiful
00:52:10.420
values that you were talking about earlier, the Bill of Rights and the American
00:52:18.260
So Matt, uh, you know, I, I talked to people in Silicon Valley.
00:52:25.080
Um, I have been impressed by the number of libertarians that are out there.
00:52:29.360
However, you know, I'm torn when people say, hey, we've got to have an ASI,
00:52:35.800
you know, Manhattan Project, because we don't want the Russians to get it or China to get it.
00:52:40.740
Well, I don't really want the United States government to have it either.
00:52:44.460
I don't want Google to have it, I don't really want anybody to have it, quite frankly,
00:52:49.180
but we can't put that genie back in the bottle.
00:52:51.660
But Google came out a few weeks ago and they said, they're not going to do business with
00:52:57.160
the United States government, even though they will.
00:52:59.880
Um, they're not going to do business with the Pentagon, et cetera, et cetera.
00:53:03.120
Uh, but they are doing business with China, which is terrifying.
00:53:08.560
And then Microsoft came out and said, Hey, AI, we've got the Pentagon's back.
00:53:13.680
We'll share everything we have with the Pentagon.
00:53:17.980
Where are the libertarians in Silicon Valley when it comes to China and teaching
00:53:30.980
Yeah, I, I think, I think it's a problem and I don't think that anyone in Silicon Valley
00:53:37.040
is going to step up and protect us from, from the abuse of all these technological innovations.
00:53:43.140
You know, the, the entire history of Silicon Valley is really rooted in DARPA and, and government
00:53:50.280
So, you know, they're going to, they're going to pursue, uh, their profit margins.
00:53:54.600
You know, Amazon is doing the same thing. But again, the counter-revolution in technology:
00:53:59.640
these are all very top-down, controlled-by-a-few-actors sorts of technologies.
00:54:07.300
And the next step has to be blockchain technologies that, that aren't controlled by, you know, corporate
00:54:15.480
interests, government interests, anybody's interests.
00:54:18.960
And, and I do believe that, that there are, uh, technological solutions and I tend to be
00:54:24.900
quite romantic about what, what, what crypto and blockchain is going to bring to us in the
00:54:32.080
Matt, um, you've been with CRTV now that has become Blaze TV.
00:54:41.620
I think we agree on many things, and there are many things that we don't agree on.
00:54:44.820
But you are in a company that has anyone from you to Gavin McInnes, to me, to
00:54:54.420
Mark Levin, to Eric Bolling, all of us, we have so much that we disagree on.
00:55:02.340
Why were you willing to take the heat to be the
00:55:11.220
libertarian on CRTV for the last few years?
00:55:18.020
Well, you know, libertarians don't neatly fit into any box.
00:55:22.360
So it wasn't like I could go to, um, big libertarian TV and speak there.
00:55:28.580
But I also think, I mean, that the whole concept behind what we're doing is to find that common
00:55:36.580
ground amongst people and ideologies and tribes and communities that disagree with each other.
00:55:47.700
And since I left FreedomWorks, I've spent a lot of time not just talking to conservatives
00:55:53.440
through CRTV, but talking to libertarians, including big-L Libertarians in the party, and
00:55:59.040
and also talking to progressives, because I think, I think there are some common values
00:56:06.960
And, and by the way, those are the values that are, that are going to save America from
00:56:11.040
all of this tribal warfare that's tearing us apart.
00:56:14.240
Those are the values that we all came here for when we were all immigrants.
00:56:20.800
Uh, I mean, the people on the border who are crawling across the border now, they, they,
00:56:25.480
they are, you know, whether they say it or not, they are coming for those values, unless
00:56:30.820
They want, uh, the opportunity to, to explore and to break out of their condition.
00:56:36.320
They want, uh, a chance to live in a country that has laws and everybody is treated fairly.
00:56:45.900
We're talking about immigration and this thing on the border as if that doesn't
00:56:51.560
matter, as if the laws of the land and what they're coming here for don't matter.
00:56:58.820
We, you know, we saw, I don't know if you saw the story of, um, what was the guy's name?
00:57:03.000
Pat, he was a rich friend of Clinton, a friend of Trump, who had the Lolita plane.
00:57:26.760
Anyway, um, but he's, he is rich and he kind of got off.
00:57:32.360
He got off after 80 women were going to, yeah, Epstein, 80 women were going to testify
00:57:39.080
against him and he brokered a deal because justice isn't blind in America.
00:57:45.860
And if we lose that, we lose everything we were.
00:57:51.000
And by the way, that rage against the machine, that there isn't
00:57:57.120
equal treatment under the law, is something that I think animates a lot of young
00:58:02.540
people that are attracted to democratic socialism.
00:58:04.560
You know, we all, we've all picked on Alexandria Ocasio-Cortez, but if you go back and look
00:58:09.720
at her original viral campaign video, uh, you, you have to get like 90% through it before
00:58:17.100
you really disagree with anything she's saying, because she's saying that the system is rigged.
00:58:21.760
She's saying that there's this crony collusion between members of Congress and Wall Street,
00:58:28.260
And then at the end, they sort of throw on, that's why we need Medicare for all.
00:58:31.160
Um, but the values, the values there are, are very much, you know, it could be Ron Paul.
00:58:40.400
It, it could even be some of the themes that Donald Trump touched on when he was just raging
00:58:47.820
Matt Kibbe, from Free the People, and you can also watch him on BlazeTV.com.
00:58:55.320
You can find him there and at FreeThePeople.org.
00:58:58.340
I'd love to have you back on and talk a little bit about FreeThePeople.org.
00:59:01.160
'Cause I know you're reimagining what the Tea Party 2.0 might look like.