Friendly Fire: Rising Prices, Rising AI, and Rise of the Merlin World Premiere Trailer
Episode Stats
Length
1 hour and 10 minutes
Words per Minute
220.76
Summary
In this episode, Ben and Matt discuss the dangers of artificial intelligence (AI) and whether or not it's good or bad for all of us. Plus, the trailer for the new movie "Pendragon: The Rise of the Merlin" is out now.
Transcript
00:00:06.020
We're fluent in data digitization and expansion into foreign markets.
00:00:09.940
And we can talk all day about streamlining manufacturing processes.
00:00:14.240
Because at Desjardins Business, we speak the same language you do.
00:00:18.720
So join the more than 400,000 Canadian entrepreneurs who already count on us.
00:00:30.000
This is why you guys need me here as a community college dropout with all you Ivy League nerds.
00:00:35.180
You were just making fun of me because I brought that up.
00:00:44.280
Okay, now I really want to move on because Matt's offering a moderate opinion and Ben is agreeing with him.
00:00:58.300
All Daily Wire Plus subscriptions are 50% off right now.
00:01:08.140
Also stick around because we have the world premiere of the trailer of Pendragon.
00:01:12.560
The Pendragon Cycle, The Rise of the Merlin that is coming up at the end of the show.
00:01:16.080
But before we get to any of that, speaking of wizardry, I want to talk about AI and whether AI is really good like everyone seems to think it is,
00:01:25.380
like all the financial speculators have thought, which is why it boosted the MAG7 stocks until recently before our impending stock market collapse,
00:01:32.760
or whether AI is probably mostly bad for all of us.
00:01:36.840
To kick it off, the most optimistic person on the panel, Mr. Walsh.
00:01:41.200
Yeah, I've become more anti-AI with each passing day.
00:01:48.660
If I could, I said before, if I could commit some sort of anti-AI genocide, I would totally do it.
00:01:55.160
And here's what blows my mind about it: most of us, anyway, can see,
00:02:01.400
even people who are behind AI, like Elon Musk, can see coming this, like, potential civilizational level catastrophe.
00:02:10.340
And basically nothing is being done about it at all, because what is absolutely going to happen,
00:02:16.480
as far as I can tell, is AI, at a minimum, is going to wipe out many millions of jobs over the next 5 to 10 years.
00:02:23.740
How many millions, there's no way to say for sure.
00:02:26.680
I did ask, by the way, ChatGPT before we went on, I asked ChatGPT to estimate how many jobs it will take,
00:02:36.020
And I think the answer I got was 15 million or something like that.
00:02:42.020
It's millions of jobs are going out the window.
00:02:44.920
And they're not going to be replaced by anything.
00:02:50.940
We're going to be, we're already, we're almost there now, but we will soon be in a situation online
00:02:55.300
where you just simply cannot tell reality from fiction at all,
00:02:59.100
where the AI videos are going to be so good that if anybody wants to smear any of us here,
00:03:05.020
I can't imagine anyone would want to smear any of us because we're so, we're all so beloved.
00:03:09.140
But if anyone wanted to do that, I could just make a video of any of us doing or saying something horrible,
00:03:12.980
and there'd be no way for us to prove it didn't happen.
00:03:16.340
I don't know if you guys saw the cat playing the didgeridoo and everything.
00:03:20.480
If I didn't, if I didn't know that most cats don't play didgeridoo, I would have thought that was a 100% real video.
00:03:26.220
Well, that's, but Michael, that's the other, that's, that's the other thing that's going to happen with AI
00:03:29.520
is that people are just sitting there looking at this slop made by an algorithm all day, every day,
00:03:36.640
And then on top of all those other things, it's going to completely destroy every creative industry
00:03:46.780
Are we just going to sit back and let it happen?
00:03:48.720
Because that seems to be the kind of defeatist attitude that most people have is like,
00:03:53.680
So let's just, I guess, you know, we had a good run, human beings.
00:03:59.820
And I, I, I, Matt, I do want, Matt, I want to ask you seriously,
00:04:02.400
do you think that AI is going to kill all of us?
00:04:04.240
Or is this kind of your list of, because I know that's the sort of the most catastrophist take on this.
00:04:09.340
Is that AI is going to turn around and do gigantic murder to all of us?
00:04:14.020
But you know, like this is your list of complaints.
00:04:16.040
I just want to make sure that that's the list of complaints that I can argue with them.
00:04:18.920
No, the, the Terminator thing, I don't, that's like, I'd prefer that.
00:04:22.420
I mean, at least that's, that, you know what, if, if, if AI becomes Terminator,
00:04:26.780
then that at least gives us jobs that we could do because we're fighting the AI.
00:04:32.280
I'm not looking at any science, you know, sci-fi scenario.
00:04:35.840
The main thing is people will not have much to do because AI is going to do everything and it's going to take all of our jobs.
00:04:42.320
And I don't think that we have the capacity to sustain that.
00:04:45.460
I don't think we have any plan for what we do when 20 million people all of a sudden have no job.
00:04:51.160
I'm going to argue with everything you just said.
00:04:52.860
So I'm, I'm not a person who, who believes that AI is the cure for all problems.
00:04:59.120
I also do not think that what we are in right now is sustainable economically.
00:05:03.980
I've actually been saying for well over a year that I think we are in a bubble.
00:05:11.420
It just means that the overinvestment in infrastructure at some point is going to have to pay off in actual earnings or the entire pyramid is going to crumble, at least for, for most of these companies.
00:05:19.600
As far as I'm hearing kind of three arguments there.
00:05:22.480
One is the AI is going to take all of our jobs.
00:05:25.660
Two is that if the AI takes all of our jobs, what are we going to do with our lives?
00:05:28.860
And three is that the quality of AI is demeaning to the human being: what's going to happen to human art, what's going to happen to quality?
00:05:37.820
It's all going to kind of descend into AI slop mediocrity.
00:05:41.240
So one at a time, I will say that AI is going to cause job dislocation, but it's not going to take out nearly all of the jobs.
00:05:48.780
And in the end, what you will see is a job shift actually predominantly away from the white collar industries and more toward the blue collar industries.
00:05:55.680
So what you'll see is all the people who were telling welders to learn to code 15 years ago, all those people are now going to have to go learn to weld.
00:06:03.240
There are going to be a lot of people who will have to be in sort of more physical industries.
00:06:06.680
They're going to have to do more nursing, for example.
00:06:08.660
Like there's certain things human beings want from other human beings that AI isn't going to provide.
00:06:12.120
It's going to be more of an aid than anything else.
00:06:14.560
And it's going to take longer to work its way into the market than everybody thinks.
00:06:17.780
Everybody always thinks the transition is going to be, boom, like tomorrow, all jobs replaced by AI.
00:06:23.120
The people who it's first going to replace are the coders.
00:06:24.860
I've already started to see some of this happen at Google.
00:06:27.440
And I know people, friends and family to whom this has happened, but it's going to take a while for it to filter into all business.
00:06:33.320
And there will be transitional job loss, and then it will move into other areas.
00:06:38.820
This is what happens with every, you know, kind of great industrial age invention is that there's a tremendous job dislocation at the beginning, and then the job market moves.
00:06:46.500
And I don't think AI is going to destroy wholesale all of these jobs.
00:06:50.340
But let's, let's move to part number two, which is sort of the idea that it will destroy all the jobs.
00:06:57.480
I was actually at a conference with a bunch of people who are like the creators of these systems.
00:07:01.060
And they were arguing kind of what you're arguing, Matt, that, that eventually AI will be better at everything.
00:07:16.440
I'm going to actually spend more time getting in touch with God.
00:07:18.820
Like, I think that actually religious people and community-oriented people will be fine because we actually have a thing to do with our day.
00:07:25.080
I think that secular humanism is going to have a real problem determining what to do with its day in a way that many religious people will not.
00:07:31.720
And then just as far as the quality of it, I'm not sure that AI is ever going to be creative enough.
00:07:39.280
But in terms of the actual creativity of truly great writing, I don't think AI is ever going to be a great writer.
00:07:46.380
I think that, because AI is a predictive text mechanism, you will end up with mid-range slop for the most part.
00:07:52.860
But the way that I've used AI in my own work is to save time asking a sophisticated question that would take me a while to research, for example.
00:08:00.260
Or if I'm doing a creative writing project, and I don't want to take a lot of time looking up the details of Soviet Russia in 1938 or something, then I can ask a multi-part question and it'll spit out an answer.
00:08:10.040
If I asked it to write dialogue, the dialogue would just not be as good.
00:08:13.000
And so I agree that there will be a lot of slop, but I think that the people who are best at their craft will actually end up benefiting from AI.
00:08:18.940
And usually when the best get better, that's actually good for everybody else because it tends to drag everybody else along in terms of quality.
00:08:24.880
So your point is about the religious people who, you know, will know what to do with their time, or the educated people, or the cultural elites.
00:08:36.000
But to me, this is what's really worrisome about Matt's point, that it's going to displace 15 million jobs, and most people are not going to know what to do.
00:08:45.780
No, no, no, but in the white-collar jobs, Michael.
00:08:47.780
If you're a blue-collar person, those are the jobs, but the people you're talking about, like you're saying, you know, all the people who are like the intellectual elite, those are the people who are now most likely to lose their jobs.
00:09:03.720
There are plenty of people in white-collar jobs who are complete Philistines, who are secular humanists, who I don't know that they are going to figure out what to do.
00:09:11.020
Because really what it gets down to is a perennial question, which is what we do for leisure time, you know.
00:09:16.040
That's what the liberal arts were supposed to teach us how to do.
00:09:18.840
Now we think of them more as trade school, but it was supposed to teach us what to do with our freedom, how aristocrats are supposed to live.
00:09:26.160
So my fear is that the promise of AI is really just an extension of the promise of the Internet.
00:09:34.260
We were going to have all of human knowledge at our fingertips.
00:09:40.200
And the reality is, for some people, the Internet did make them smarter and more productive and more thoughtful and have fuller lives.
00:09:48.220
And for more people than that, really for most people I think, it made them dumber and it made them more vicious.
00:09:54.400
And I think it made them more likely to look at porn and it made them more likely to ignore the great works.
00:10:00.440
It's Plato's dialogue where Socrates is saying that written language, books essentially, are going to make people dumber because they're going to have the simulacrum of wisdom.
00:10:09.660
But they're not actually going to memorize anything.
00:10:14.040
I think for people who have their lives in order and are religious and have a cohesive view of the purpose of life, I think it could improve their lives.
00:10:21.100
And I think for most people, it probably won't.
00:10:23.500
Drew, well, if I could take you and Ben and mash you together just for my own personal pleasure, that would be great.
00:10:30.040
But also I think that what you're saying, you're hitting that.
00:10:36.820
I mean, people talk about, are we going to have to regulate an industry?
00:10:42.480
You have to regulate human beings because they're sinful and broken and will kill each other and rob each other and do all these things.
00:10:47.420
Because already we see with AI, I mean, recently, last week I think it was, they brought out an AI where you can record somebody, and then after he's dead, you can continue to talk to him.
00:10:57.200
It will give you an AI version of your dead relative so you can talk to mom even after she's passed.
00:11:01.840
I mean, that is idolatry of the worst possible kind.
00:11:05.340
There have been AI dolls that have been put in children's rooms that talk them out of believing in God and tell them how to get drugs and things like this.
00:11:21.660
I mean, I can already see that it will do anything you want it to do.
00:11:25.140
It's going to rob people of their desire to read.
00:11:30.520
I mean, people are already condensing books.
00:11:36.640
But that's a complete destruction of what it means.
00:11:38.700
And so people who don't have the meaning of life or don't know where it lies, which is in the internal life, are going to be lost.
00:11:47.940
You and I, Knowles, had a conversation with a very powerful leader in AI just the other week or so.
00:11:55.100
And I went up to him and I said to him, don't you understand that when AI speaks, it's not speaking.
00:12:02.560
And I said, it's like it's like I quoted the great Louis Armstrong saying, I see friends shaking hands saying, how do you do?
00:12:08.740
They're really saying, I love you, meaning that when we speak, we deliver our inner selves to one another, even if our words are not precisely that meaning.
00:12:20.360
They are convinced that because it can imitate an inner life, they think the Turing test, which is the stupidest idea anybody ever had, is indicative of an inner life.
00:12:29.640
If it can confuse us about its inner life, it has one.
00:12:32.480
So what I'm worried about it is it is in some ways the ultimate idol.
00:12:38.100
You know, we know that all Moses has to do is leave town for five minutes, and they start worshiping the golden calf.
00:12:51.260
You know, to your point, Drew, it's a really important point because part of that conversation, and I've had this conversation with other people too, is can AI write a poem?
00:12:58.560
And people get really, really, I don't know, vitriolic about this.
00:13:01.880
They're very, because it's really the heart of the AI debate.
00:13:04.540
And my argument was they can't write a poem because to write a poem, you have to have sensual experience.
00:13:11.120
You have to be able to describe a grape in a way that gives someone the sensory experience of that.
00:13:16.900
And you have to be able to take language, which is just full of dead metaphors.
00:13:22.560
And you have to create a new metaphor, you know, something that's evocative.
00:13:26.540
And AI in particular cannot do that because it doesn't have any senses yet.
00:13:31.780
It's worth pointing out that with robotics, it actually might have sensory experience.
00:13:37.260
So, in my view, it can't make a poem, but I don't know, maybe it can.
00:13:42.180
And all of this is a little bit beside the question of, all right, if it's going to have these negative effects, what do we do about it?
00:13:53.700
This is why you guys need me here as a community college dropout with all you Ivy League nerds, who immediately turn this into, like, can AI make a poem?
00:14:05.080
And what will we, what will we think about in our leisure time about AI?
00:14:15.260
How are you going to make money to buy a house?
00:14:20.240
Because, and if the answer is, well, we'll live in some sort of AI socialist dystopia where AI will provide all that stuff for you.
00:14:29.620
Well, I'm very skeptical that it will work out that way.
00:14:32.080
I think what's actually going to happen is you're going to end up with, you know, a handful of trillionaires off this AI stuff and a lot of other people who are totally destitute.
00:14:39.140
But even if it did work out that way, okay, well, then that's our life that now we're living as people that are totally dependent on this non-human algorithm to provide for us.
00:14:47.500
I think that's a pretty horrifying vision of the future.
00:14:49.480
Why should that happen now when it's never happened before?
00:14:51.340
It's also, this is, it's not just white-collar jobs.
00:14:56.280
Okay, delivery drivers, truck drivers, Uber drivers, that's all going away.
00:15:04.320
This is not creating new jobs because this is different from any other technology that has ever existed on the planet.
00:15:09.860
It is not analogous to anything else because the whole point of it, the whole point is to take the human element out of it completely.
00:15:19.420
It's not like going from a carriage driver to now you're driving an automobile.
00:15:27.740
And so these jobs are leaving and they're not being replaced.
00:15:30.160
For all the drivers who are not going to have a job anymore, there's not some new thing.
00:15:40.240
They say this every time a new technology comes.
00:15:43.720
I want to get, I want to get, no, Drew, it's fine.
00:15:46.200
But before we get to it, we need to, we need to eat.
00:15:49.380
The only way we're going to eat is if I read this ad right here, this one.
00:15:57.900
Hey guys, did you know that up until the 1990s, cryptography was classified as a strategic weapon by the United States government?
00:16:03.620
During the Cold War, it was added to the same U.S. munitions list that restricts the export of rifles and rockets.
00:16:09.620
In 1954, encryption hardware and algorithms were added to the list to prevent the Soviets from acquiring tools that protected American military secrets.
00:16:16.540
Well, just the same way that we are allowed to possess firearms to protect life and liberty because we have an amazing Second Amendment,
00:16:21.960
we also can create, share, and wield strong cryptographic arms to safeguard our communications, data, and digital lives from any adversary, foreign or domestic.
00:16:31.940
ExpressVPN is an app that encrypts and reroutes your internet connection through secure servers, making your online activity private.
00:16:37.620
No one can monitor, record, manipulate, or profit from it without your consent.
00:16:41.040
ExpressVPN works on every device, phone, laptop, tablet, you name it.
00:16:44.120
And you can protect up to 14 devices with one subscription.
00:16:46.860
Get four extra months of ExpressVPN just by using our special link, expressvpn.com slash friendlyfire.
00:16:52.740
That's E-X-P-R-E-S-S-V-P-N dot com slash friendlyfire to get four extra months.
00:17:00.820
I don't want anybody else looking over my shoulder at the data that I'm using or the stuff that I'm searching.
00:17:08.820
Head on over to expressvpn.com slash friendlyfire.
00:17:11.740
That's E-X-P-R-E-S-S-V-P-N dot com slash friendlyfire.
00:17:14.680
Get four extra months and start protecting yourself today.
00:17:19.640
I also have to jump in, I'm told, with another momentum-killing advertisement.
00:17:26.440
Anyway, right when it's getting interesting, let's jump in with the answer.
00:17:31.920
Because I do want to tell you about Helix Sleep.
00:17:36.220
Actually, we have Helix mattresses in our house.
00:17:40.620
All of our kids, all of our 90 kids all have Helix mattresses.
00:17:46.380
I'm not getting a lot of sleep right now because after we fall back with Daylight Savings,
00:17:51.540
everyone talks about how, oh, we save an hour of sleep.
00:17:54.280
Well, the problem is when you have young kids, they don't realize, they don't care about the clock.
00:18:00.760
So now I've got twin toddlers waking up at 4:30 in the morning who are rousing me out of sleep,
00:18:08.780
So Helix will help you sleep like a baby at night, unless you have babies in the house,
00:18:18.780
You can go to helixsleep.com slash friendlyfire for 27% off site-wide.
00:18:23.900
That's helixsleep.com slash friendlyfire for 27% off site-wide.
00:18:27.740
You go to their website, you take a sleep quiz, and you get matched with the perfect mattress for you
00:18:33.080
because everyone is different, and they take care of that there.
00:18:37.720
Make sure you enter our show name into the post-purchase survey so they know we've sent you.
00:18:46.560
Canada can be a global leader in reducing the harm caused by smoking, but it requires actionable steps.
00:18:53.120
Now is the time to modernize Canadian laws so that adult smokers have information and access to better alternatives.
00:19:15.620
It's that it comes up every single time there's a new technology.
00:19:19.200
Every time, and it's why government is so bad at managing economies.
00:19:23.880
It's why you don't want a top-down economy because when the cart and horse goes out of style,
00:19:28.680
the government says, we must save the jobs of buggy whip makers, you know?
00:19:38.060
I think this has happened a million times before.
00:19:39.900
You can't imagine what the new job is going to be.
00:19:42.020
But there'll be jobs to do because people are endlessly creative.
00:19:44.920
It's like the people who worry about running out of oil, you know, you don't run out of energy
00:19:56.400
And if we run out of oil, we'll turn to something else.
00:20:02.940
The human mind and imagination and creativity is bottomless.
00:20:06.660
I don't fear this about AI at all, although I do think Ben is right that there could be
00:20:10.720
difficult transitions, and that, knowing how people are, we will handle them in the worst way.
00:20:16.100
But I do think when you have a powerful new tool, you have to start to think.
00:20:21.740
You have to start to think about the things we're going to use it for that are destructive.
00:20:27.360
I mean, my worry about AI is the endless pornography, the endless narcissism, the things
00:20:33.760
that social media has done to human beings by exacerbating our worst qualities and that
00:20:40.300
But as far as sort of the economic point here, I'm significantly less worried about that, for a couple of reasons.
00:20:45.200
One, because I'm just less worried about it based on the history of technological innovation.
00:20:49.780
If you go back to the early 20th century, well over 80 percent of jobs in the United
00:20:53.200
States were agriculturally based or early industry based.
00:20:58.880
Now, if you go to the middle of the 20th century, America was a manufacturing based economy.
00:21:03.780
Jobs tend to move around and human beings are quite adaptable.
00:21:06.940
If the question is, you know, will I be endlessly poor while a few people are trillionaires?
00:21:10.580
That wouldn't work because they wouldn't be trillionaires if everybody is endlessly poor.
00:21:14.500
That's not the way that actually wealth distribution happens.
00:21:16.760
They don't take their wealth from a bunch of super duper poor people.
00:21:19.800
If there's no wealth for them to take, then they don't generate the product.
00:21:22.480
So the actual thing that would happen, the kind of worst case scenario that people are
00:21:25.640
talking about actually would be a sort of Star Trek replicator machine.
00:21:28.800
So in Star Trek, I know not a lot of Trekkies online here, but if you are Trekkie, my understanding
00:21:32.920
is that there is a replicator machine whereby you can literally generate any product from
00:21:40.560
Well, if you don't have to worry about anything, I thought that that was mostly the goal of
00:21:43.920
human beings because work, I mean, we all understand that work is important, but there
00:21:49.480
Like, for example, spending time with your family, it's a different type of fulfillment.
00:21:54.900
What we would call in Hebrew avodah, the same word for both work and service.
00:22:00.500
The same type of thing, I think, is true in our lives, right?
00:22:03.900
When I think of like the things that I do that are important, my work actually comes maybe
00:22:06.520
third or fourth on the list after family and religion and the stuff that I'm doing in my
00:22:12.000
So, you know, I'm less worried about the kind of how do I get my stuff?
00:22:14.960
If things work out great, we're all going to be way richer and have a lot more leisure
00:22:17.980
If you're worried about the leisure time, that's a human nature problem.
00:22:21.960
And then there is the other problem, which is what's the alternative?
00:22:25.300
People keep talking about, okay, we could regulate it out of existence, right?
00:22:27.900
We're just going to regulate it, stop it from taking trucker jobs.
00:22:31.960
Let's say that we were able to ban all the self-driving cars.
00:22:34.040
Does anybody think that any other place on Earth is going to ban the self-driving cars?
00:22:38.480
So the actual thing that will happen is that China will gain complete economic dominance
00:22:42.460
over planet Earth, unless you are going to essentially make America autarkic and poor.
00:22:48.160
China will gain the advantage of every efficiency on planet Earth while we hamper ourselves.
00:22:52.900
And we will live in relative poverty compared to what we are now, while China gains significantly
00:22:57.960
more power globally, and then uses that power in order to cram down its terrible vision
00:23:03.600
Is your view then like pure laissez-faire, no regulation whatsoever, let the market lead,
00:23:09.420
and that way we'll beat China and we'll maintain our dominance?
00:23:11.840
Yes, except for morality and national security, yes.
00:23:15.140
So I don't think we should be selling NVIDIA chips to China, because I think China is our
00:23:19.700
And I also think that we should be heavily regulating pornography, period, and that applies
00:23:25.240
But if we're talking about like, should we stop AI from generating healthcare solutions
00:23:29.820
because people in the healthcare industry are going to lose their jobs?
00:23:34.400
Much like it's easy for us living in a first world country with an average life expectancy
00:23:38.960
above 80 to talk about, you know, the evils of AI.
00:23:41.980
But if AI, for example, in medical industry extends lifespans by another 20 years, which could
00:23:47.360
easily happen, you know, that seems like a pretty good thing to happen.
00:23:52.360
And I think that one of the biggest mistakes I see people make, there's a mistake that
00:23:57.920
And that is, I think it happens on the Marxist left, and I think it sometimes happens on the
00:24:02.100
And that is, they take a spiritual problem, people's emptiness and inability to function
00:24:08.480
And then they say, there's a material solution for that.
00:24:11.240
And it is very rare to me that there's actually a material solution to a spiritual problem.
00:24:15.320
That's a very good point, Ben, because it is true.
00:24:17.940
Sometimes people think like with the birth rate problem, you can just fix it with a lot
00:24:21.580
of material solutions, and there's not a lot of evidence.
00:24:24.080
However, there's a distinction between a material solution and a government solution, because
00:24:28.780
the government influences culture, it promotes certain ideas, suppresses others, it promotes
00:24:32.660
religion traditionally, and I think inevitably.
00:24:35.740
And so, you know, to use the birth rate example, the only thing that seems to reliably increase
00:24:42.200
But the government can do things there, either explicitly promote religion, or at least stop
00:24:46.960
the suppression of religion, like, you know, we saw under Joe Biden, and we see under a
00:24:51.280
So is there any role, just before we get to the other guys, is there any role for the
00:24:55.680
government here in maybe not providing a material solution to the consequences of AI, but some
00:25:04.960
And this, by the way, no one, the problem with AI is a bunch of unknown unknowns, right?
00:25:08.600
It's not known unknowns, it's just we literally don't know what's going to happen next.
00:25:12.720
Which is why the Kalshi markets, right, Kalshi is one of our sponsors, right now in the Kalshi
00:25:16.180
markets, there's like a 5% shot that there's any serious regulation of AI, because no one even would
00:25:23.280
I mean, this is a question, honestly, Matt, this is a question for you, because you want
00:25:28.080
You want to do something to stop sort of the forward march of AI.
00:25:30.900
So on a practical level, what does that look like?
00:25:33.420
Well, I think that, and I don't have all the answers, I'll fully admit that.
00:25:36.080
That's why it's so frustrating to me that we're not at a serious level even having this
00:25:42.200
conversation, I mean, we're having this conversation right now, but including like our lawmakers
00:25:46.580
having this debate about what can we do, what should we do, and that conversation just isn't
00:25:53.860
And if I had all the answers myself, then I guess I wouldn't be frustrated by that, because
00:25:57.300
I could just say, well, here's the answers, guys.
00:25:58.780
I don't have them, but what I do know is the answer can't be, well, whatever, we'll see how it
00:26:05.300
That can't be the answer when you're facing something that is going to fundamentally alter
00:26:09.640
our civilization in the way that this is going to.
00:26:13.200
I mean, people have suggested, and when it comes to, and this is kind of on a lower level,
00:26:16.520
but things like intellectual property, this is another huge problem with AI, and I think
00:26:21.840
some of you guys have already kind of touched on it, that AI cannot create anything.
00:26:30.540
You were just making fun of me because I brought that up, and now you're bringing that up.
00:26:36.400
So the reason why it can't do that is because it's stealing from what
00:26:40.060
other people have done, and right now AI lives in this kind of like bubble where the rules
00:26:46.620
So there are things that you could do there legislatively.
00:26:50.920
Again, it's not easy to do, but I do think you have to do something there to protect people
00:26:55.520
from having their creative property, but I would flip it back the other way because
00:27:02.060
what I'm going to ask is, okay, the drivers are all going to lose their jobs, most likely.
00:27:08.800
Customer service, the customer service industry, a lot of that is just going away because when
00:27:14.660
AI is adopted. And this is not some kind of, like, sci-fi speculation.
00:27:21.820
It's like pretty clear that if we keep applying this stuff, there's not going to be anything
00:27:27.240
So I think a lot of these customer service jobs are going to go away.
00:27:30.160
And then, yes, there's also the white collar, but I care about those people too.
00:27:34.800
Anyone who sits in a cubicle all day and enters data into computers, which is millions of people,
00:27:41.880
probably a lot of their jobs are going away, and I think that that matters too.
00:27:45.280
So my question is, if that were to happen, let's just say, and maybe AI all breaks down
00:27:50.960
and it doesn't happen, I think it probably will.
00:27:52.840
If that happens over the next five to 10 years and you've got tens of millions of people
00:27:57.000
who, not just their job, but really their entire industry just went away, what are we
00:28:09.140
It'll take me, I promise, like four sentences, okay?
00:28:12.680
If I had asked you that same question in 1998, the advent of the internet is going to kill
00:28:16.380
a bunch of jobs, and it will kill a bunch of jobs, you know, based on all the supply
00:28:20.060
chains being changed, everything getting a lot shorter, you won't have to go to the local
00:28:23.260
mom and pop shop, you can order off the internet.
00:28:25.120
And I said to you, don't worry, in 20 years, there will be literally millions of people
00:28:29.720
who are working on AI coding and database building, data center building.
00:28:33.660
You would say, what the hell are you even talking about?
00:28:38.220
If I said to you, there would be legitimately thousands of jobs that were people who were
00:28:44.340
You'd say, what the hell, what's social media and how does it work, right?
00:28:47.660
Like, this is the whole point of the market, is that jobs that we don't even know exist
00:28:51.580
will come about because that's what the market does.
00:28:54.060
The market generates innovation because human desire is endless, and the human desire for
00:29:05.940
Yes, yes, we can't, you know, we can't imagine these things.
00:29:09.620
I think there are going to be jobs that we have no idea could possibly exist.
00:29:13.020
But the question that Knowles asked and actually Ben referred to is the really important
00:29:19.060
When, back in the day, when you wanted to get a pornographic magazine, you had to walk
00:29:23.720
into a store, shame yourself, you had to make sure none of the neighbors
00:29:26.720
saw you, you know, you went home with this piece of paper that you could look at and all
00:29:31.280
Not that, Drew, not that you have any experience in describing it.
00:29:37.960
So, but nobody, when people said, oh, we've got to ban this, and they did ban it, and you
00:29:42.040
know, they censored things, and then they said, oh yeah, we got to censor Ulysses too.
00:29:48.320
Now you've got this sewer of porn wiping people's lives away with no regulation whatsoever.
00:29:54.760
And so now conservatives, when I come out and say things, for instance, like, you should
00:29:58.760
not be able to censor opinions on YouTube, conservatives go, oh my, regulation, regulation.
00:30:06.500
It needs new regulations to make sure the freedom of speech lives, because if you censor things
00:30:11.000
on YouTube, you have virtually taken them out of the public square.
00:30:15.760
I mean, I, who would have said, you know, so what, pornography, 30 years ago, now think,
00:30:22.720
holy, this is a toxin being pumped into the human psyche like never before.
00:30:31.480
And what it was going to do to destroy young people in 2005.
00:30:38.100
No, I've written many words of pornography over the years.
00:30:40.600
These are the questions that we're not addressing now, where we know the danger, we can see
00:30:47.040
These are the issues I think we should be addressing, not whether jobs are going to disappear, because
00:30:52.820
We don't even know what that's going to look like.
00:30:55.940
Yeah, on the regulation side of it, I mean, obviously, the most, the most, you know, the
00:31:00.200
sort of the most heavy-handed and obvious thing, if we're talking about regulation, is,
00:31:03.940
you know, the government saying that, hey, okay, you want to wipe out all the driver
00:31:08.380
jobs, you want to wipe out, you want to, you know, you want to get rid of all your customer
00:31:12.320
service jobs if you're McDonald's, and there's a law saying, well, you can't do that.
00:31:18.800
We're not going to let you do that, because we're not going to let you put millions of
00:31:22.040
people out of work all at the same time, because we just can't, we can't sustain that
00:31:29.100
That's the kind of thing that I normally would not support, and there is this tension between
00:31:34.440
like free markets and then this other huge civilization level concern.
00:31:41.240
And I do think, and I just go back to that this is a different kind of thing.
00:31:48.080
Well, the internet is a different kind of thing.
00:31:49.920
The internet is a, you know, a very high-tech, sophisticated form of communication.
00:31:55.820
It's just a way of, for people to communicate and connect with each other.
00:31:59.120
And, and so that in and of itself is not going to take away jobs.
00:32:02.560
It might change what the jobs are, but you still need the, you still have humans who are
00:32:06.780
on the internet, communicating with each other.
00:32:09.260
And that's the case with all of these technological innovations, that it's just a different tool
00:32:14.600
And so, yeah, maybe the job where you use the, the more primitive tool goes away, but
00:32:18.340
now you use the more sophisticated tool and that's the job.
00:32:20.940
And I think with AI, it's just different because, as I said, it's artificial intelligence,
00:32:25.440
which means the entire point of it is that we don't need a person to do this at all.
00:32:32.200
And because we're facing this totally new kind of thing, which I really believe is unprecedented
00:32:37.140
in human history, I think we might need to embrace solutions that otherwise would make
00:32:44.440
In fairness, we don't know if that's even Matt really talking right now.
00:32:51.220
Now, I want to get to, it was something we touched on though.
00:32:53.420
It's related, but it's a totally separate topic, is affordability.
00:32:57.500
It's the word, it's the meme that everyone's talking about, it's the new six-seven, everyone's
00:33:04.500
But first, I want to restore a little balance to this conversation.
00:33:07.380
Yes, I want to say there's, here is something that AI cannot do.
00:33:15.760
It's a very, very complicated thing, these vegetables.
00:33:18.500
And if you want to get enough of them, you need to use balance of nature because I love
00:33:22.320
vegetables, but if I ate enough, the kinds of things that, you know, nutrition experts
00:33:26.940
recommend, it would be all over my beard, my face.
00:33:30.680
So instead, I have balance of nature, fruits and veggies.
00:33:34.980
And you may say, well, if you use them all the time, which I do, why aren't they open?
00:33:38.920
It's because I have so many of these dead things that I don't even have to open them.
00:33:45.180
Balance of nature, what they do is they freeze dry fruits and veggies, then powder them
00:33:48.680
and blend them into the most convenient nutritional value.
00:33:52.180
You can take the fruits and veggies supplements with water, chew them or open them up and mix
00:33:56.180
the powder into your food or drinks, which just sounds silly to me, but it's still, it's
00:34:03.220
You wonder how an animated corpse like myself can look like a 30-year-old man.
00:34:08.800
So go to balanceofnature.com and get a free fiber and spice supplement.
00:34:14.040
You didn't even have time to talk about the fiber and spices.
00:34:16.200
Plus, you get 35% off your first set as a new preferred customer by using discount code
00:34:23.280
Go to balanceofnature.com and use the discount code FRIENDLYFIRE.
00:34:28.580
Well, I was saying with the rest of your money, you need to go to dailywire.com slash subscribe
00:34:33.040
because we have the biggest deal of the year right now.
00:34:42.460
You're going to get the world premiere of that trailer coming out at the end of the show.
00:34:56.700
I love building culture, but I also like doing it on a good deal.
00:35:03.980
And so when you can do it for 50% off, it's a great time to do it.
00:35:10.580
Absolutely fitting, apt way to talk about affordability, which is a very serious problem.
00:35:16.000
You know, usually sweet little Elisa does the shopping in the house.
00:35:19.220
Occasionally, I had to go out the other day to get lemons for a cocktail that I was making.
00:35:23.160
Not even for food, just for a cocktail I was making.
00:35:29.220
Anyway, I go to the grocery store, and the prices are insane.
00:35:32.500
I see why Elisa had been keeping me from them, largely.
00:35:35.040
I mean, you know, the affordability problem is very real.
00:35:38.740
It's not that it's not being pounced on by political actors,
00:35:41.340
and it's obviously become a big political talking point.
00:35:46.840
A lot of the fundamentals of the economy are a little shaky right now,
00:35:50.100
even though those MAG-7 stocks that we were just talking about, AI, is pumping up the market.
00:35:57.740
One, can the government do something to fix this?
00:36:02.000
Or is the government only going to make things worse?
00:36:05.640
How is this going to affect the midterms in the 2028 election?
00:36:13.780
because there was a short clip of you going around saying,
00:36:17.000
yeah, listen, you know, if you can't afford stuff, move out of your town.
00:36:20.900
Even if it's your hometown, even if your family's been there for a long time,
00:36:23.460
just get, you've got to get out, you've got to be mobile.
00:36:25.840
And you were variously exalted and pilloried for this comment.
00:36:35.440
That was a piece of personal advice to people that I think every single young person that I know
00:36:40.080
has at some point taken, which is if you're living in a place that you can't afford,
00:36:43.640
and the policies aren't going to change, and you want to make your life better,
00:36:46.640
you do have to make a significant calculation as to whether you think your life is going to get better
00:36:49.960
where you are or whether you're going to have to go pursue a dream someplace else.
00:36:54.360
You've seen tremendous population movement in this country right now,
00:36:59.020
You've seen tremendous population movement from the blue areas to the red areas of the country,
00:37:02.460
specifically because people are seeking economic opportunity.
00:37:04.780
So what I thought I was saying was something that's fairly obvious,
00:37:07.400
which is that if you are on a personal level in a place where you're stuck and you can't afford to live there,
00:37:12.540
you have to make the best decision for yourself and your family,
00:37:15.120
and that does include the possibility of actually moving as opposed to shouting at the wind
00:37:20.400
That's a separate question from what sort of policies could be pursued in order to make things more affordable.
00:37:26.480
If you're talking about Manhattan, Manhattan will never be as affordable as Des Moines.
00:37:31.160
And anybody who says that it is going to is totally lying to you.
00:37:35.620
The reality is there are only two ways to make things more affordable.
00:37:38.980
One is to drop the demand for a product and retain the same supply.
00:37:42.280
The other is to radically increase the supply of a product and to retain the same demand.
00:37:46.520
Those are the only ways that things become more affordable.
00:37:49.460
The only way things become more affordable is if the supply greatly outstrips the demand,
00:37:53.520
and the only ways to do that are to increase supply or reduce demand.
00:37:57.040
So if you're talking about how to make things more affordable,
00:37:59.560
one of the things you can do to increase supply is remove regulations.
00:38:02.480
You can get rid of tax structures that disincentivize investment.
00:38:05.980
You can get rid of a lot of the difficulty in building, for example, in New York.
00:38:10.080
But are you ever going to build enough units so that suddenly the real estate prices there reflect
00:38:13.740
what it would be across the river in sort of rural parts of New Jersey?
00:38:19.340
And when people talk about affordability, the thing that makes me totally crazy about this
00:38:24.520
I'm sick of people in politics doing this routine where they say the problem over and
00:38:32.700
I don't really see a solution to the thing you're talking about.
00:38:34.500
They pillory you for noting the obvious like, OK, if you're not providing, Zohran Mamdani is
00:38:39.760
Him saying affordability didn't make affordability magically appear like Beetlejuice if he said
00:38:45.640
And also politicians are in the business of lying to you.
00:38:48.680
OK, when when the president of the United States, who I generally agree with, he made
00:38:52.540
a mistake when he came into office and said, I'm going to make things affordable again.
00:38:57.780
And the reason you're probably not is because all of the inflation that Joe Biden embedded in
00:39:01.800
the economy already made things so wildly unaffordable that the best you're probably
00:39:07.340
What the Federal Reserve seeks to do is keep the inflation rate at like two percent, which
00:39:11.480
is an increase in the prices just by the very nature of it.
00:39:14.460
And what people actually want is for there to be deflation.
00:39:17.400
They want the prices to be back at 2019 levels.
00:39:19.800
And they're not talking about going back to 2024 levels.
00:39:23.520
The only way to get back to 2019 levels is probably an economic recession.
00:39:28.920
And so, again, saying unpopular things, the best that the inflation rate could look like
00:39:33.880
for President Trump is like this under Joe Biden and then like this under Trump.
00:39:39.820
OK, so here's OK, this would be Biden, this gigantic spike.
00:39:45.260
The problem is people are looking at the prices here and they're saying, well, they don't look
00:39:50.560
Well, yeah, what's Trump supposed to do about that?
00:39:53.200
Absent a radical increase in the interest rates that would sink the that would sink the
00:39:57.740
So one thing that has happened, everyone was predicting that Trump's tariffs were going
00:40:03.240
And the Treasury Secretary, Scott Bessent, was doing a little victory lap because when
00:40:07.520
he was being confirmed for his position, he said, no, I actually think tariffs are going
00:40:12.340
And the San Francisco Fed just came out and said the tariffs are deflationary.
00:40:19.980
What that study says is that when you look at tariffs over time, there's a spike at the
00:40:27.420
beginning because things get more expensive because you're reducing the supply and the
00:40:31.180
So the price goes up temporarily and then people start to lose their jobs.
00:40:34.380
And when people start to lose their jobs, the demand goes down.
00:40:36.780
And when the demand goes down, the prices come down.
00:40:40.340
There was a big caveat, even in the popular reporting, which is the caveat is it hurts
00:40:48.920
There's one further point on it, just to why I think your video went viral, Ben, is because
00:40:54.480
one thing people are hearing is they're missing the context of you're giving personal advice
00:40:59.760
to someone who's asking, you know, but at a macro level, at a political level, what people
00:41:04.000
are hearing is, hold on, you're telling me my family's been in this town forever.
00:41:08.600
I got dozens of family members buried in the local cemetery in my hometown.
00:41:14.180
And even before that, the Knowleses initially were from New Hampshire and they arrived here,
00:41:22.400
The Knowles family home stood from 1660 until 1994 when the home burned down.
00:41:28.500
There are still Knowleses all over that area in New Hampshire and Maine.
00:41:32.780
And what I think a lot of people are looking around at is part of the reason that housing
00:41:37.260
in particular is unaffordable right now is because of government decisions, government decisions
00:41:42.840
to flood the country with a bunch of like Venezuelan criminals or Somalis or something and increase
00:41:47.740
the cost of housing or government decisions that are going to compromise certain industries
00:41:52.940
or certain jobs because of trade deals or whatever, going all the way back to NAFTA or
00:41:56.840
We don't need to litigate those in particular, but you're saying, no, there's part of this
00:42:01.400
political order that has led to this crisis, at the very least with migration.
00:42:06.360
And so why is it that I'm just supposed to say, oh, shucks, I got to lose my hometown
00:42:10.600
because, well, you know, Republicans and Democrats together flooded the country with aliens.
00:42:16.480
Isn't there a good to having, you know, long family histories in a single place?
00:42:23.300
And there's a good to having your family live near you.
00:42:29.740
I spent my entire life living in L.A. until I was 35, one mile from my parents.
00:42:33.220
And then I moved to Florida and I still live one mile from my parents because I took them
00:42:37.160
One of the things I talk about on the show all the time is having family structures nearby
00:42:40.360
because you need those supportive family structures.
00:42:42.300
That's not the case that I'm making is that you should abandon this sort of stuff or that
00:42:46.200
mass migration should replace you in your hometown.
00:42:48.260
I think everyone here is very much against mass migration, is very much in favor of what
00:42:52.240
President Trump has been doing on the immigration program.
00:42:58.100
But if there's a mentality that sets in that says, I bear no responsibility in changing my
00:43:01.880
own life if I can't change the outside circumstances.
00:43:04.400
And now I'm just going to sit here and bitch about it.
00:43:05.880
Like that doesn't seem like a specific recipe for individual success.
00:43:09.480
But Matt, I want to know what you take because I think you and I are
00:43:12.080
as usual, we are on opposite ends of the spectrum in some ways.
00:43:17.180
And I agree also with maybe I'm somewhere in between because I agree with your point.
00:43:21.620
I also agree with some of the criticism, the more the more rational.
00:43:29.220
And I've said the same thing many times that especially as a young man, I also think there's
00:43:33.220
a gender element to this that is a sort of a different topic.
00:43:36.040
But as a parent, I want my sons when they become adults to move out of the house.
00:43:41.480
I don't want them to move 10 hours away, hopefully.
00:43:44.800
I do want them to move out and experience living on their own a little bit before they
00:43:51.520
My daughters, I would love for them to just stay home with me until they get married many,
00:44:00.320
I think if I totally agree that if you're in a spot, particularly if you're a young
00:44:05.920
man and you can't afford anything, you can't get a job, can't afford to live anywhere while
00:44:11.820
you're single, you have no kids, you have no dependents, you can go anywhere and do anything
00:44:21.400
I mean, worst case scenario, you go somewhere, you end up sleeping in your car or something
00:44:25.460
I mean, that's not good, but it's like, well, it's just you.
00:44:28.000
You can handle that, especially as a young man.
00:44:29.840
And so you could take risks, you can go out and pursue opportunities.
00:44:35.120
However, at the same time, it's also true that you shouldn't have to do that.
00:44:40.300
Like something is wrong that so many people have to do that.
00:44:43.960
You should be able to, to Michael's point, if you're a young man and you're looking at,
00:44:48.540
okay, well, my parents were born here, they lived here.
00:44:54.700
So generations of a family lived in the same place.
00:44:57.140
And now all of a sudden, and I'm, I have the same kind of skills that they do.
00:45:01.020
I might even be more, more educated than they were.
00:45:03.480
So I'm in many ways more qualified for a job than even any of them were.
00:45:07.000
And yet all of a sudden everything's broken down.
00:45:09.120
It doesn't work for me to live in this town anymore.
00:45:15.480
So, but on the practical level, well, it is this way now and we want you to still succeed.
00:45:20.620
So you might have to go somewhere else, hopefully with the intent of eventually coming back to live
00:45:24.900
around your family, because I totally believe, I mean, we, we emphasize the nuclear family so much,
00:45:29.200
which is important, but also the, the quote unquote extended family is also important.
00:45:33.360
So getting back to them and that's what, you know, what a lot of us do, what I kind of did move
00:45:37.180
around, move around, end up back with your family.
00:45:42.360
You shouldn't have to, it shouldn't be that way.
00:45:46.560
And so we need policies in place that make it possible for people to live with their family and then move next door
00:45:53.020
and stay with generations of families their entire life.
00:45:56.200
You should be able to do that in a functioning and thriving society.
00:45:59.180
One of the ways to make that happen is the thing we all agree with, uh, get all the illegals out.
00:46:04.600
There's a lot, we've been, they've been saying 20 million illegals in this country.
00:46:07.540
They've been telling me that since like 20 years ago, they were saying it was 20 million.
00:46:12.140
We don't know how many get them all out, shut down immigration.
00:46:15.620
And, uh, that's one of the policy changes that can be made and we need to do that.
00:46:19.020
But until that happens, yeah, you got to figure out what you're going to do in your own life.
00:46:25.340
And I want to hear from my great, great grandfather, Andrew, play this.
00:46:30.520
I was just going to say, I agree with Matt actually.
00:46:32.040
So Matt and I are actually in total agreement on this.
00:46:34.180
Now I really want to move on because Matt's offering a moderate opinion.
00:46:38.520
I want to tell you at the other end of the age spectrum about pre-born.
00:46:43.040
I want you to go to pre-born.com slash fire right now, because pre-born is, is one of my
00:46:53.560
I encourage you to personally support it, to give what you can.
00:46:55.840
They've saved over 380,000 babies, uh, through their rescue program.
00:47:06.460
And, uh, when a woman sees an ultrasound, it doubles the baby's chance of life.
00:47:10.480
When a woman is considering abortion, they provide amazing care and work.
00:47:15.060
Not only do they introduce the babies to the mothers, they also take care of those mothers
00:47:18.960
afterward, radically increase the chances that that baby is going to live and that they
00:47:24.780
This giving season, do not let another life be lost.
00:47:29.860
Be the hope for worried mothers and at-risk babies to donate securely.
00:47:34.540
If you like your phone, if you're a little more of a Luddite than some of us, you're not
00:47:37.560
down with being on the AI train, you dial #250, you say keyword baby, #250, keyword
00:47:42.800
baby, or you go to pre-born.com slash fire, pre-born.com slash fire.
00:47:48.820
So it's another way of not having to pay all those bureaucrats in Washington.
00:47:52.400
It's a, your money can be put to good use and not be put to bad use.
00:48:03.060
So I disagree with Ben in a couple of ways here.
00:48:05.640
I mean, first of all, Zohran Mamdani is one of the scummiest politicians I've ever seen
00:48:15.100
And when you raise the issue, people, people perk up.
00:48:19.460
He raised the issue and then offered socialist solutions that we know will be utterly, utterly
00:48:24.660
It's not playing Candyman to say the word that people are thinking about.
00:48:28.760
The worst thing a politician can do, and the thing that will destroy any administration, is to
00:48:33.440
show people a chart that shows them they're not suffering when they can't afford Christmas
00:48:40.500
You know, and people know exactly how they're doing and it makes them incredibly frustrated.
00:48:45.040
What they're frustrated with Trump now is he's doing something I think is urgently
00:48:49.440
I think we're going to be very grateful to Trump for what he did five, six, seven years
00:48:53.200
down the line when China finally invades Taiwan.
00:48:56.020
I think he's totally rearranged America's priorities in absolute great ways, but he didn't
00:49:01.540
pay attention to the thing that's right there on the table and he has to pay attention to
00:49:05.540
The other thing I disagree with is normally it is true that you have to put people out of
00:49:13.200
He didn't lose that, the houses, but he lost the midterms because of it.
00:49:18.780
And then the economy turned around for the next 25 years because of what Reagan did.
00:49:23.040
But the other thing that there is a third way of dealing with inflation, which is raising
00:49:32.820
If you can steady, you know, if you can cut inflation off and make the prices level out
00:49:37.460
and then wages start to rise, then you can actually, that is the same thing as bringing
00:49:42.000
down inflation because now people can afford the things they couldn't afford before.
00:49:45.780
So Matt is totally right that we got to get rid of all the illegals.
00:49:49.880
And as far as I'm concerned, I don't care who it is.
00:49:52.160
I've lost all sympathy with the illegal immigrants.
00:49:55.120
I know some of these people are great people who snuck in.
00:49:59.160
And we got to give the country back to the people who are here and who were born here.
00:50:03.740
I cannot have compassion for 20 million people.
00:50:06.500
I can only have compassion for one person at a time.
00:50:08.380
If one guy sneaks in, I can have compassion for him.
00:50:11.300
I can't have compassion for an invading army, which is what the Biden administration
00:50:15.900
But the other thing is we have to have capitalist solutions.
00:50:20.520
For instance, I think a lot of companies are now offering people stock.
00:50:24.780
A lot more companies are offering people stock and investment as payment, as part of the
00:50:31.600
I was a reader for Columbia Pictures and Coca-Cola owned them and they gave me Coke stock.
00:50:39.900
And now I had an investment in the company and in the economy.
00:50:44.420
Trump is talking about personal savings accounts that I think is also a really good idea.
00:50:48.920
Some of his ideas, like the 50-year mortgage, I'm not too happy about because it's going
00:50:59.040
But I think that there are ways for capitalists to increase people's participation in the economy
00:51:06.280
so that when things work for the bosses, they work for the people too.
00:51:11.040
I think it's a wonderful thing that this country, when it is working on all cylinders and when
00:51:15.960
the capitalism is in place, it makes so much money that the big guys can afford to share
00:51:21.360
a little bit with the little guys, not by having the government redistribute it, but by saying,
00:51:33.240
And so I think that there are ways of dealing with this.
00:51:35.220
But I think that dealing with it is something government has to do.
00:51:50.360
And I think one thing, you're right that we don't want deflation because it means the
00:51:53.500
economy is tanking, but you can get wages growing in a lot of different ways, one of
00:51:57.800
them by reducing the workforce, by getting rid of the people who shouldn't be here, would
00:52:02.760
I don't disagree with some of those policy prescriptions, but I think that the thing
00:52:06.700
that I am kind of stuck in, and it's driving me a little crazy, and I think it's the reason
00:52:10.980
why the country is penduluming side to side incredibly wildly.
00:52:14.360
You'll see, like, right now, you know, Kalshi is one of our sponsors, so I'll mention them
00:52:21.180
But if you look at the polls, like the Kalshi markets right now, Democrats, according to
00:52:25.080
that market, and I kind of agree with this, are actually the favorites in 2028.
00:52:28.540
And I think the reason for that, and I think the reason that the country just keeps swinging
00:52:31.680
wildly poll to poll, is because when you have politicians who are actually saying the
00:52:36.900
same thing, but none of them are saying what is true, this is what you end up with.
00:52:40.200
So if everybody says affordability is- I agree, affordability is a problem.
00:52:45.600
Labeling problems is the easiest thing in the world.
00:52:49.340
And I can agree with my wife on every single problem that exists in our life.
00:52:52.340
It's when you get to the solutions that things get a little bit complicated.
00:52:55.400
And when you have politicians who always say the same thing, but from different sides
00:52:58.660
of the aisle, which is, you're right, it's government's job to solve it.
00:53:02.760
If the thing that you're saying is not going to solve it, and you're asking for additional
00:53:06.760
centralized power in order to solve the thing, what you are going to end up with
00:53:10.220
is failure, and then the other guy is going to say, give it to me.
00:53:13.300
And so they're just passing the ball side to side.
00:53:14.880
The only thing that is going to create affordability is a dynamic and innovative economy, which
00:53:20.920
One, a consistent level of regulation or less regulation, right?
00:53:24.440
Like actual certainty, what's going to happen tomorrow in the economy?
00:53:27.060
Two, you're actually going to need innovators to innovate, and you need to leave them alone
00:53:30.960
and allow them to innovate and actually capture the profits that they're creating through
00:53:35.860
And then you're going to need to get the hell out of the way.
00:53:37.400
I mean, the magic of the Reagan economy, I know Reagan has now become anathema for some
00:53:42.620
I can't imagine why the right has decided that Reagan was suddenly bad, other than because
00:53:47.100
we need to cast up a false villain in order to elevate, you know, whatever the new-
00:53:51.560
The mass amnesty irritated some people in retrospect.
00:53:53.680
I'm not saying everything about Reagan was wonderful, but I don't think everything about
00:53:58.340
I do think that the Reagan economy generated more job growth and pulled us out of a greater
00:54:03.240
economic morass than any president in history, probably.
00:54:09.840
And so if you look at, you know, Reagan's pitch, his pitch was, I can't solve all your
00:54:14.720
problems for you, but I can get the government out of your way so you can solve your own
00:54:18.180
And I just want one politician who will say that, like just one, as opposed to this kind
00:54:22.160
of centralized government bullshit where everybody says, no, no, don't worry.
00:54:25.480
You sit there and I'll solve all your problems for you.
00:54:27.080
No one is going to solve the vast majority of problems in your life.
00:54:31.320
The best they can do is get rid of the obstacles that are in your way, the systemic obstacles
00:54:35.620
And then most of the decisions in a free country ought to be up to you.
00:54:38.780
And that is scary because it means that actually your success or failure is largely on your
00:54:47.620
No, but in defense of those who are critiquing Reagan, obviously I still love St. Gipper and politicians
00:54:58.760
Now people are looking more toward, I don't know, they like Teddy Roosevelt.
00:55:01.760
So this happens as we rethink history and as we move on to new circumstances.
00:55:06.100
Part of the reason that there's a little more of a critical lens, you know, as opposed
00:55:10.380
to just exalting St. Reagan as being perfect in all ways is because, you know, in the 80s,
00:55:17.040
mass amnesty for illegal aliens, for example, wasn't really all that big a deal.
00:55:24.720
In the 80s, you know, obviously Reagan was massively successful in his economic policy,
00:55:30.380
as was Thatcher, as was that whole kind of movement.
00:55:35.340
And so it's not to say we throw out all of their solutions.
00:55:37.380
It's not to say that we throw out all of their solutions, but it's to recognize that
00:55:40.120
there are more difficult economic problems that we have to deal with.
00:55:42.860
And so, you know, Drew actually offered some real solutions here, which is, you
00:55:47.780
pointed out, Drew, that having people really bought into the economy, you know, Coca-Cola
00:55:52.180
giving you some stock back in the day is helpful.
00:55:54.540
Back when we were rethinking some of the problems with industrial capitalism 100 years ago, you
00:55:58.980
had writers, especially Catholic writers like Chesterton and Belloc saying we need some
00:56:02.900
option, not socialism and communism, not pure unbridled capitalism, but some other option.
00:56:08.160
They proposed something called distributism, which is too complicated to get into here and probably
00:56:13.060
But a lot of what it comes down to is: give people some ownership, give people a stake.
00:56:19.720
And so here's maybe another criticism of what came out of the Reagan era: we made GDP the measure of everything.
00:56:26.480
And GDP is a fine economic indicator, but it's not the be all and end all of everything.
00:56:30.600
And I think what a lot of people are looking around at today and saying is, look, you can show
00:56:34.360
a lot of economic activity in all sorts of ways by the pornography industry, to use the topic
00:56:43.400
You know, there are all sorts of very destructive industries.
00:56:46.220
We brag now about how women's employment is the highest ever.
00:56:51.320
You know, I mean, who's taking care of the kids?
00:56:55.500
And so I just wonder if one slightly practical solution might be to say, all right, look, maybe
00:57:01.060
GDP isn't the be all and end all of everything.
00:57:03.140
And maybe there are certain areas of the economy that are legitimately immoral and destructive.
00:57:07.340
And we used to heavily regulate them, like pornography, for instance, but all sorts of other kinds
00:57:17.780
Maybe that takes GDP down a little bit, but it doesn't matter.
00:57:20.180
I don't think that's really great for the true health of an economy.
00:57:22.140
Maybe we need to rethink what economic health really looks like, because the changes that
00:57:27.300
came about in the late part of the 20th century did have some negative side effects as well
00:57:33.280
Rinse takes your laundry and hand delivers it to your door, expertly cleaned and folded.
00:57:38.060
So you could take the time once spent folding and sorting and waiting to finally pursue
00:58:03.360
Can I address the Reagan thing for a minute, though?
00:58:05.300
Because a lot of this, I think, started with that Caldwell book, The Age of Entitlement,
00:58:09.220
in which he blamed Reagan for things that Reagan actually didn't do.
00:58:11.740
Reagan said he failed to cut down the government.
00:58:13.460
That was the big failure of his administration.
00:58:20.200
He freed, like, a huge, huge section of the world, of the globe.
00:58:29.980
You can't imagine how unheard of that was, how unexpected it was, how nobody thought it
00:58:35.340
would ever happen, how we were dealing with the Soviet Union for the rest of our lives.
00:58:38.440
Not just people who thought that communism was going to work, but people who thought it's
00:58:48.940
We now are living in an absolutely new economy.
00:58:59.040
The basics of deregulation and freedom and free markets are absolutely the same.
00:59:06.040
You know, but the problems that arise because no system solves human problems because human
00:59:14.080
The problems that arise, and the places where the problems peak, change.
00:59:21.180
One of the key ones is the role of women in our society, which I think is screwed up.
00:59:28.440
We've actually stopped reproducing, which to me is always a bad sign.
00:59:32.160
You know, that economic indicator, another indicator.
00:59:37.860
I should let Drew finish his sentences because when he finishes them, I'm more likely to agree with him.
00:59:41.520
But at the same time, you know, Knowles, I'll pick on you a little bit.
00:59:47.720
When we say, you know, GDP is terrible, we shouldn't look at GDP.
00:59:51.800
No, it's not that we shouldn't, but it's not the be all and end all.
00:59:54.420
Okay, so there's no such thing as an economic be all and end all.
00:59:57.060
Okay, but I think that we are mixing up a bit of terminology here.
01:00:00.280
And I think that we ought to tease out the strands for one second.
01:00:03.000
There's a difference between economic health and societal health.
01:00:06.560
You can have a very economically healthy society that is breaking down in a lot of social ways
01:00:14.800
And so, yes, it turns out that we are materially significantly better off than we were in the
01:00:19.660
In fact, we are materially significantly better off than we were in the mid-2000s.
01:00:23.100
When people talk about the unaffordability of homes, that's because an average home in 1950
01:00:26.920
was a 980-square-foot brick house with no insulation and no heating or air conditioning.
01:00:35.060
Like, this kind of idea that you're living worse than your parents or grandparents is just not true.
01:00:40.340
Maybe you're living worse than your grandparents are right now, but you're not living worse
01:00:44.300
than your grandparents were at the same age, right?
01:00:46.460
If you're a 20-year-old living in 2025, you're not worse off than your grandparents were at 20.
01:00:53.480
You have an iPhone, but you don't have a house.
01:00:55.600
I mean, I agree that houses are nicer now, but you don't have one.
01:00:58.000
Dude, your apartment is nicer than their house was.
01:01:00.420
Okay, that is a reality if you're living anywhere except for New York City.
01:01:03.680
And by the way, the idea that you couldn't move somewhere and get a house, now you're
01:01:08.340
getting back to my original point, which is on a personal level, if you want to live
01:01:11.160
a life like your grandparents, you might have to do the thing that your grandparents did.
01:01:14.140
Okay, your grandparents went to a war and then they came back and moved to a town that they
01:01:18.800
And then they got a house that was like off the lot from some big corporation that built
01:01:24.540
a bunch of standard box-looking houses that you now drive past on the freeway and
01:01:27.780
you say, I can't believe somebody ever lived in those.
01:01:29.520
So it's kind of, you know, rose-colored glasses about the past.
01:01:33.720
And again, I think that if we want to look at the real problems in our society, we shouldn't
01:01:37.020
create a mythical past and we shouldn't create a mythically terrible present.
01:01:40.200
We should actually look at the problems in our society.
01:01:42.440
And one of those would be people not having kids.
01:01:44.540
One of those would be deep depression and unhappiness.
01:01:46.640
People killing themselves with opioids, you know, people having their
01:01:50.540
jobs taken by illegal immigrants in certain industries.
01:01:52.720
Like those are actual real solvable problems, but I don't have a DeLorean.
01:01:56.000
All I have right now is the way that people are living right now.
01:01:59.620
And so now we have to look at the problems in front of us and how do we solve those?
01:02:02.000
Yeah, but that's the one part where, at the buzzer, I get to disagree with you,
01:02:07.380
Ben. I remember there was one thing you said in that clip that I did disagree with
01:02:11.560
but that I couldn't remember, and then you just said it again.
01:02:12.860
And so the one part is about, well, this is, you know, how America has
01:02:18.120
always been, that you leave and you go somewhere else, away from your family.
01:02:22.400
And I think that like back in the pioneer days, I mean, that there is something about that
01:02:26.820
that's in the American spirit of like literally going out into a wilderness and building your
01:02:32.300
own life, maybe a thousand miles away from anyone that you know.
01:02:35.380
And so that's American in a certain sense, but that was back in the pioneer days.
01:02:39.980
I think for most of American history, it's like anywhere else
01:02:48.060
They stayed where their support systems were.
01:02:51.760
And by the stats, we're less mobile now than we have ever been anytime in American history.
01:02:56.020
If you are currently living in the town where you grew up.
01:03:00.020
But you're saying we're less mobile now.
01:03:01.840
And I'm saying that we are a unique breed, in that we actually, we're a little
01:03:08.260
But the people who tend to be more successful, and again, as a piece of
01:03:11.960
advice are the people who tend to actually move in pursuit of opportunity.
01:03:15.560
And if you look, historically speaking, it is not true that in 1920 everybody was living
01:03:20.300
In fact, in 1920, there were more people moving across the country at great expense
01:03:24.440
and difficulty than there are today, in 2025.
01:03:31.840
They go into the wilderness, they build new towns, but most people are not exceptional.
01:03:38.360
And you want a country filled with communities and filled with, you
01:03:42.540
know, people with traditions and things like that.
01:03:46.120
I do believe that exceptional people should and will move, but I think that Matt
01:03:50.340
is right, that it shouldn't be like that for everybody.
01:03:52.460
Sorry, go back to Matt so Matt can finish this disagreement with me.
01:03:55.260
Uh, no, I think, I don't know, the claim that people
01:04:04.320
There's also a technological side of this too, that for a lot of American
01:04:07.940
history, you know, moving away from your family and going to another state over
01:04:14.560
And, you know, people are going to die along the way.
01:04:16.740
So that is one of the reasons why we know that, for a lot of, you know, American
01:04:21.820
history and human history, people didn't tend to do that.
01:04:24.580
I mean, sometimes they did, but again, that's like being a pioneer.
01:04:28.340
I think that, at the very least, and I don't think we're disagreeing
01:04:32.060
on this point that the desire to stay in your community, where you were born, where your
01:04:38.680
family is, to stay with your support system, with your family
01:04:47.260
And a healthy country is one where people, if they want to do that, can.
01:04:53.400
But I think that's the part we all agree on, right?
01:04:57.960
You know, this gets back, though, to this point of the neat and pat distinction between economic health and societal health.
01:05:03.900
Obviously they're distinct concepts, but I'm not sure that we can
01:05:08.000
totally separate them, you know, especially as increasingly in the modern age, we think of
01:05:11.840
ourselves as homo economicus, you know, we're like primarily, uh, economic creatures.
01:05:16.400
And I think we're just integral creatures, and we have all of these
01:05:21.460
And so, you know, especially at this kind of moment, you look now, compare it to 1980 or
01:05:28.740
One of the major problems that we have is that social solidarity has really frayed, that
01:05:33.920
religiosity has declined precipitously, though there are some signs that that's turning around
01:05:39.080
and you can't divorce that from the birth rate problem.
01:05:42.820
You know, you can't divorce that from the fact that people aren't having kids.
01:05:45.760
These are great predictors, you know; stability, tradition, and religion are predictors of having children.
01:05:50.480
And you can't divorce that from the economic problems because if we don't import the entire
01:05:54.680
third world, we're told that our economy is going to collapse, that GDP is going to collapse.
01:05:59.100
So that's the whole argument for mass migration.
01:06:00.920
And so these problems are all so deeply intertwined that it seems to me that there has to be some
01:06:06.420
firmer political solution, rather than just saying, look, we're going to let the free
01:06:12.740
hand of the market, you know, work its way and we'll let the chips fall where they may.
01:06:16.660
A lot of people are looking around and saying, I don't like where the chips are falling.
01:06:19.120
Well, I mean, this is a great place for us to conclude, because I'm going to disagree
01:06:22.960
for one second with Knowles and just say that there are many, many countries more impoverished
01:06:27.880
than the United States that have less severe pathologies.
01:06:32.200
And in the past, we were a less wealthy nation with less severe pathologies.
01:06:36.560
And so this is why I say that trying to tie the economic situation to the pathologies,
01:06:40.300
I think in some cases, in most cases actually, it can be a fool's errand, but we'll
01:06:44.880
have to save that for next time because here's the deal before we leave, folks: our biggest
01:06:48.840
and best sale of the year is happening right this very instant, like at this moment, while
01:06:53.080
you're listening to us, all Daily Wire Plus annual memberships are 50% off.
01:06:56.960
You get everything, you get access to the DW Library of Movies, Documentaries, Matt's
01:07:01.220
Documentaries mostly is what we're talking about there, because those are the best ones
01:07:03.720
that have ever been made, and series that stand for the ideals that keep America free.
01:07:07.260
And that, of course, includes The Pendragon Cycle, The Rise of the Merlin.
01:07:12.200
All Access members get episodes one and two a month early, on Christmas Day,
01:07:18.220
You empower DW Plus to build culture, defend values, launch stories that ensure your voice and
01:07:22.920
your values shape the future of the United States.
01:07:25.080
Whether you want to join or give the gift of a DW membership to someone, now's the time
01:07:32.640
You can head on over to dailywire.com slash subscribe.
01:07:35.600
We will all be very happy to see you over there.
01:07:38.280
Well, in just a moment, we are going to bring you the magical, mystical trailer for, finally,
01:07:49.140
We will see you here, hopefully never for the rest of us, but actually we will see you here
01:07:52.920
in a couple of weeks and we'll get together and disagree in friendly fashion on Friendly
01:08:29.760
They say he was a king in Dyfed, the son of a princess of lost Atlantis.
01:08:35.980
They say the future and the past are known to him.
01:08:41.800
That the fire and the wind tell him their secrets.
01:08:45.360
That the magic of the hillfolk and druids come forth at his easy command.
01:08:57.060
That the world burned and trembled at his wrath.
01:09:00.100
The Merlin died long before you and I were born.
01:09:09.360
Merlin Emrys has returned to the land of the living.
01:09:21.660
Saxon Hengist has assembled the greatest war host ever seen in the Island of the Mighty.
01:09:25.800
And before the summer is through, he means to take the throne.
01:09:32.280
If we are too busy squabbling amongst ourselves to take up arms against him, here is your hope.
01:09:38.740
A king will arise to hold all Britain in his hand.
01:09:53.580
There'll be no peace in these lands till we are all dust.
01:10:11.900
These brothers are our only hope to stand against it.
01:10:20.820
They say Merlin slew 17 men with his own hands.