Friendly Fire: Rising Prices, Rising AI, and Rise of the Merlin World Premiere Trailer
Episode Stats
Length
1 hour and 11 minutes
Words per Minute
219.17
Summary
On today's show, we have a special guest, Ben Walsh, a community college dropout, who joins us to talk about whether AI is a good or bad thing for all of us. We also have the World Premiere of the trailer for Pendragon: The Pendragon Cycle, The Rise of the Merlin, coming up at the end of the show.
Transcript
00:00:00.000
Canada can be a global leader in reducing the harm caused by smoking,
00:00:11.620
so that adult smokers have information and access to better alternatives.
00:00:33.020
With a towing capacity of 3,500 kilograms and a wading depth of 900 millimeters,
00:00:50.360
Break through the busiest time of year with the brand new Peloton Cross Training Tread Plus.
00:00:56.580
With real-time guidance and endless ways to move,
00:00:59.060
you can personalize your workouts and train with confidence,
00:01:08.580
Explore the new Peloton Cross Training Tread Plus at onepeloton.ca.
00:01:12.540
This is why you guys need me here as a community college dropout with all you Ivy League nerds.
00:01:18.600
You were just making fun of me because I brought that up.
00:01:27.720
Okay, now I really want to move on because Matt's offering a moderate opinion
00:01:41.740
All Daily Wire Plus subscriptions are 50% off right now.
00:01:51.380
Also, stick around because we have the world premiere of the trailer of Pendragon,
00:01:56.020
the Pendragon cycle, the rise of the Merlin that is coming up at the end of the show.
00:01:59.680
But before we get to any of that, speaking of wizardry,
00:02:02.200
I want to talk about AI and whether AI is really good like everyone seems to think it is,
00:02:08.820
like all the financial speculators have thought,
00:02:10.960
which is why it boosted the MAG7 stocks until recently before our impending stock market collapse,
00:02:16.200
or whether AI is probably mostly bad for all of us.
00:02:20.180
To kick it off, the most optimistic person on the panel, Mr. Walsh.
00:02:26.020
Yeah, I'm very, I've become more anti-AI with each passing day.
00:02:32.100
If I could, I said before, if I could commit some sort of anti-AI genocide,
00:02:38.620
I think that, and here's what blows my mind about it,
00:02:41.720
is that we can all, most of us anyway, can see,
00:02:48.060
can see coming this, like, potential civilizational level catastrophe,
00:02:54.120
and basically nothing is being done about it at all.
00:02:57.480
Because what is absolutely going to happen, as far as I can tell,
00:03:03.940
many millions of jobs over the next five to ten years.
00:03:07.200
How many millions, there's no way to say for sure.
00:03:10.020
I did ask, by the way, ChatGPT before we went on,
00:03:13.180
I asked ChatGPT to estimate how many jobs
00:03:16.760
AI will take from us in the next ten years.
00:03:19.680
And I think the answer I got was 15 million or something like that.
00:03:25.460
Millions of jobs are going out the window.
00:03:28.360
And they're not going to be replaced by anything.
00:03:34.440
We're going to be, we're already, we're almost there now,
00:03:38.760
where you just simply cannot tell reality from fiction at all,
00:03:48.460
I can't imagine anyone would want to smear any of us
00:03:53.800
I could just make a video of any of us doing or saying something horrible
00:03:56.240
and there'd be no way for us to prove it didn't happen.
00:03:59.700
I don't know if you guys saw the cat playing the didgeridoo and everything.
00:04:04.120
If I didn't, if I didn't know that most cats don't play didgeridoo,
00:04:07.020
I would have thought that was a 100% real video.
00:04:11.420
that's the other thing that's going to happen with AI
00:04:12.960
is that people are just sitting there looking at this slop
00:04:21.740
it's going to completely destroy every creative industry
00:04:30.240
Are we just going to sit back and let it happen?
00:04:32.220
Because that seems to be the kind of defeatist attitude
00:04:34.520
that most people have is like, well, we can't do anything.
00:04:37.120
So let's just, I guess, you know, we had a good run, human beings.
00:04:43.100
And I, Matt, I do want, Matt, I want to ask you seriously,
00:04:45.840
do you think that AI is going to kill all of us?
00:04:49.220
because I know that's the sort of the most catastrophist take on this.
00:04:52.720
It's that AI is going to turn around and do gigantic murder to all of us.
00:04:57.460
But you know, like this is your list of complaints.
00:04:59.640
I just want to make sure that that's the list of complaints
00:05:02.300
No, the Terminator thing, I don't, that's like, I'd prefer that.
00:05:10.220
then that at least gives us jobs that we could do
00:05:15.720
I'm not looking at any science, you know, sci-fi scenario.
00:05:19.280
The main thing is people will not have much to do
00:05:25.980
And I don't think that we have the capacity to sustain that.
00:05:30.580
when 20 million people all of a sudden have no job.
00:05:34.600
I'm going to argue with everything you just said.
00:05:36.320
So I'm not a person who believes that AI is the cure for all problems.
00:05:42.560
I also do not think that what we are in right now
00:05:47.420
I've actually been saying it for well over a year
00:05:54.860
It just means that the overinvestment in infrastructure
00:05:56.940
at some point is going to have to pay off in actual earnings
00:06:03.120
As far as, I'm hearing kind of three arguments there.
00:06:05.940
One is that AI is going to take all of our jobs.
00:06:12.300
And three is the quality of AI is demeaning to sort of the human being.
00:06:21.260
It's all going to kind of descend into AI slop mediocrity.
00:06:24.840
So one at a time, I will say that AI is going to cause job dislocation,
00:06:29.940
but it's not going to take out nearly all of the jobs.
00:06:32.200
And in the end, what you will see is a job shift
00:06:34.360
actually predominantly away from the white collar industries
00:06:39.120
So what you'll see is all the people who are telling welders to code 15 years ago,
00:06:43.520
all those people are now going to have to go learn to weld.
00:06:46.700
There are going to be a lot of people who are going to have to be
00:06:50.200
They're going to have to do more nursing, for example.
00:06:52.100
Like there are certain things human beings want from other human beings
00:06:55.560
It's going to be more of an aid than anything else.
00:06:58.020
And it's going to be slower to work its way into the market
00:07:01.220
Everybody always thinks it's going to be transitional boom,
00:07:06.560
The people who it's first going to replace are the coders.
00:07:08.300
You've already started to see some of this happen at Google.
00:07:10.900
And I know people, friends and family to whom this has happened.
00:07:14.140
But it's going to take a while for it to filter into all business.
00:07:22.260
This is what happens with every kind of great industrial age invention
00:07:26.020
is there's a tremendous job dislocation at the beginning.
00:07:29.960
And I don't think AI is going to destroy wholesale all of these jobs.
00:07:33.780
But let's move to part number two, which is sort of the idea
00:07:40.920
I was actually at a conference with a bunch of people
00:07:44.520
And they were arguing kind of what you're arguing, Matt,
00:07:46.540
that eventually AI will be better at everything
00:07:59.900
I'm going to actually spend more time getting in touch with God.
00:08:06.480
because we actually have a thing to do with our day.
00:08:08.840
I think that secular humanism is going to have a real problem
00:08:17.300
I'm not sure that AI is ever going to be creative enough.
00:08:22.540
But in terms of the actual creativity of truly great writing,
00:08:26.960
I don't think AI is ever going to be a great writer.
00:08:29.660
I think that AI, because it's a predictive text mechanism,
00:08:33.120
and you will end up with mid-range slop for the most part.
00:08:38.980
is to save time asking a sophisticated question
00:08:41.880
that would take me a while to research, for example.
00:08:46.660
looking up the details of Soviet Russia in 1938 or something,
00:08:56.460
And so I agree that there will be a lot of slop,
00:08:58.380
but I think that the people who are best at their craft
00:09:06.060
because it tends to drag everybody else along in terms of quality.
00:09:12.600
who, you know, they'll know what to do with their time
00:09:19.440
But to me, this is what's really worrisome about Matt's point
00:09:24.520
and most people are not going to know what to do
00:09:29.420
No, no, no, but in the white-collar jobs, Michael,
00:09:30.600
in the white-collar jobs, you're a blue-collar person.
00:09:32.280
But those are the people who you're talking about,
00:09:36.280
like largely blue-collar people who are, like you're saying,
00:09:40.160
you know, all the people who are like the intellectual elite,
00:09:41.760
those are the people who are now most likely to lose their jobs.
00:09:44.580
No, no, no, but I'm drawing a distinction here.
00:09:47.160
There are plenty of people in white-collar jobs
00:09:48.600
who are complete Philistines, who are secular humanists,
00:09:51.220
who I don't know that they are going to figure out what to do
00:09:54.460
because really what it gets down to is a perennial question,
00:09:57.340
which is what we do for leisure time, you know.
00:09:59.820
That's what the liberal arts were supposed to teach us how to do.
00:10:03.840
but it was supposed to teach us what to do with our freedom,
00:10:12.500
is really just an extension of the promise of the Internet.
00:10:17.700
We were going to have all of human knowledge at our fingertips.
00:10:26.860
the Internet did make them smarter and more productive
00:10:35.080
it made them dumber and it made them more vicious.
00:10:37.820
And I think it made them more likely to look at porn
00:10:39.760
and it made them more likely to ignore the great works.
00:10:42.300
And this goes all the way back to the Phaedrus, you know,
00:10:50.460
because they're going to have the simulacrum of wisdom,
00:10:52.900
but they're not actually going to memorize anything.
00:10:57.360
I think for people who have their lives in order
00:11:07.620
Well, if I could take you and Ben and mash you together
00:11:35.560
they brought out an AI where you can record somebody
00:11:40.660
It will give you an AI version of your dead relative
00:11:42.780
so you can talk to mom even after she's passed.
00:11:45.280
I mean, that is idolatry of the worst possible kind.
00:11:53.220
and tell them how to get drugs and things like this.
00:11:58.140
It is, it's what people are going to do with it.
00:12:13.940
I mean, it's already people are like condensing books.
00:12:20.080
but that's a complete destruction of what it means.
00:12:22.140
And so people who don't have the meaning of life
00:12:47.120
it's like I quoted the great Louis Armstrong saying,
00:12:49.640
I see friends shaking hands saying, how do you do?
00:12:58.160
even if our words are not precisely that meaning.
00:13:03.800
They are convinced that because it can imitate an inner life,
00:13:13.120
If it can confuse us about its inner life, it has one.
00:13:15.700
So what I'm worried about it is it is in some ways the ultimate idol.
00:13:21.760
You know, we know that when all Moses has to do is leave town for five minutes,
00:13:35.140
it's a really important point because part of that conversation,
00:13:38.220
and I've had this conversation with other people too,
00:13:41.980
And people get really, really, I don't know, vitriolic about this.
00:13:45.200
Because it's really the heart of the AI debate.
00:13:50.620
because to write a poem, you have to have sensual experience.
00:13:54.560
You have to be able to describe a grape in a way
00:13:57.940
that gives someone the sensory experience of that.
00:14:25.620
And all of this is a little bit beside the question of,
00:14:28.140
all right, if it's going to have these negative effects,
00:14:48.540
And what will we think about in our leisure time about AI?
00:14:58.700
How are you going to make money to buy a house?
00:15:05.700
well, we'll live in some sort of AI socialist dystopia
00:15:13.060
Well, I'm very skeptical that it will work out that way.
00:15:20.440
and a lot of other people who are totally destitute.
00:15:27.360
that are totally dependent on this non-human algorithm
00:15:30.940
I think that's a pretty horrifying vision of the future.
00:15:34.780
It's also, this is, it's not just white-collar jobs.
00:15:39.720
Okay, delivery drivers, truck drivers, Uber drivers,
00:15:49.080
because this is different from any other technology
00:16:23.660
They say this every time a new technology comes.
00:16:43.780
cryptography was classified as a strategic weapon
00:17:01.960
to possess firearms to protect life and liberty
00:17:17.220
your internet connection through secure servers.
00:17:44.260
I don't want anybody else looking over my shoulder
00:18:15.380
Because I do want to tell you about Helix Sleep.
00:18:19.700
Actually, we have Helix mattresses in our house.
00:18:31.720
because after we fall back with Daylight Savings,
00:18:39.540
they don't realize, they don't care about the clock.
00:18:52.220
So Helix will help you sleep like a baby at night
00:19:13.820
and you get matched with the perfect mattress for you
00:19:59.940
because when the cart and horse goes out of style,
00:20:03.760
we must save the jobs of buggy whip makers, you know?
00:20:12.520
I think this has happened a million times before.
00:20:14.360
You can't imagine what the new job is going to be.
00:20:19.620
It's like the people who worry about running out of oil.
00:20:50.160
But I do think when you have a powerful new tool,
00:20:57.280
we're going to use it for that are destructive.
00:21:01.820
I mean, my worry about AI is the endless pornography,
00:21:07.240
the things that social media has done to human beings
00:21:22.020
based on the history of technological innovation.
00:21:28.080
were agriculturally based or early industry based.
00:21:31.140
And obviously, very few people do agriculture now.
00:22:01.120
would be a sort of Star Trek replicator machine.
00:22:14.920
Well, if you don't have to worry about anything,
00:23:00.740
okay, we could regulate it out of existence, right?