The Matt Walsh Show - November 20, 2025


Friendly Fire: Rising Prices, Rising AI, and Rise of the Merlin World Premiere Trailer


Episode Stats

Length

1 hour and 11 minutes

Words per Minute

219.17

Word Count

15,590

Sentence Count

773

Misogynist Sentences

14

Hate Speech Sentences

33


Summary

On today's show, we have a special guest, Ben Walsh, a community college dropout, who joins us to talk about whether AI is a good or a bad thing for all of us. We also have the World Premiere of the trailer for Pendragon: The Pendragon Cycle, The Rise of the Merlin, coming up at the end of the show.


Transcript

00:00:00.000 Canada can be a global leader in reducing the harm caused by smoking,
00:00:06.400 but it requires actionable steps.
00:00:09.160 Now is the time to modernize Canadian laws
00:00:11.620 so that adult smokers have information and access to better alternatives.
00:00:16.180 By doing so, we can create lasting change.
00:00:19.340 If you don't smoke, don't start.
00:00:21.780 If you smoke, quit.
00:00:23.420 If you don't quit, change.
00:00:25.860 Visit unsmoked.ca.
00:00:30.000 This episode is brought to you by Defender.
00:00:33.020 With a towing capacity of 3,500 kilograms and a wading depth of 900 millimeters,
00:00:38.520 the Defender 110 pushes what's possible.
00:00:41.660 Learn more at LandRover.ca.
00:00:47.780 This episode is brought to you by Peloton.
00:00:50.360 Break through the busiest time of year with the brand new Peloton Cross Training Tread Plus.
00:00:55.020 Powered by Peloton IQ.
00:00:56.580 With real-time guidance and endless ways to move,
00:00:59.060 you can personalize your workouts and train with confidence,
00:01:02.580 helping you reach your goals in less time.
00:01:04.620 Let yourself run, lift, sculpt, push, and go.
00:01:08.580 Explore the new Peloton Cross Training Tread Plus at onepeloton.ca.
00:01:12.540 This is why you guys need me here as a community college dropout with all you Ivy League nerds.
00:01:18.600 You were just making fun of me because I brought that up.
00:01:20.480 And now you're bringing that up.
00:01:21.460 Well, I'm bringing it back to the real world.
00:01:24.200 No, no, no.
00:01:24.920 You're reading the study totally wrong.
00:01:26.220 That's not what the study says.
00:01:27.720 Okay, now I really want to move on because Matt's offering a moderate opinion
00:01:30.980 and Ben is agreeing with him.
00:01:32.120 Friends are these goodies enemies.
00:01:35.500 Friends are these goodies enemies.
00:01:39.440 Everybody, welcome to Friendly Fire.
00:01:41.740 All Daily Wire Plus subscriptions are 50% off right now.
00:01:46.940 Get them right now.
00:01:48.080 Dailywire.com slash subscribe.
00:01:51.380 Also, stick around because we have the world premiere of the trailer of Pendragon,
00:01:56.020 the Pendragon cycle, the rise of the Merlin that is coming up at the end of the show.
00:01:59.680 But before we get to any of that, speaking of wizardry,
00:02:02.200 I want to talk about AI and whether AI is really good like everyone seems to think it is,
00:02:08.820 like all the financial speculators have thought,
00:02:10.960 which is why it boosted the MAG7 stocks until recently before our impending stock market collapse,
00:02:16.200 or whether AI is probably mostly bad for all of us.
00:02:20.180 To kick it off, the most optimistic person on the panel, Mr. Walsh.
00:02:26.020 Yeah, I'm very, I become more anti-AI with each passing day.
00:02:30.620 I hate AI.
00:02:32.100 If I could, I said before, if I could commit some sort of anti-AI genocide,
00:02:36.540 I would totally do it.
00:02:38.620 I think that, and here's what blows my mind about it,
00:02:41.720 is that we can all, most of us anyway, can see,
00:02:44.860 even people who are behind AI, like Elon Musk,
00:02:48.060 can see coming this, like, potential civilizational level catastrophe,
00:02:54.120 and basically nothing is being done about it at all.
00:02:57.480 Because what is absolutely going to happen, as far as I can tell,
00:03:01.600 is AI, at a minimum, is going to wipe out
00:03:03.940 many millions of jobs over the next five to ten years.
00:03:07.200 How many millions, there's no way to say for sure.
00:03:10.020 I did ask, by the way, ChatGPT before we went on,
00:03:13.180 I asked ChatGPT to estimate how many jobs
00:03:16.760 AI will take from us in the next ten years.
00:03:19.680 And I think the answer I got was 15 million or something like that.
00:03:23.220 15 to 25 million.
00:03:24.560 So, who knows?
00:03:25.460 It's millions of jobs are going out the window.
00:03:27.140 We know that because of AI.
00:03:28.360 And they're not going to be replaced by anything.
00:03:29.780 They're just, they're going away.
00:03:30.880 They're not coming back.
00:03:32.760 That's going to happen.
00:03:34.440 We're going to be, we're already, we're almost there now,
00:03:37.080 but we will soon be in a situation online
00:03:38.760 where you just simply cannot tell reality from fiction at all,
00:03:42.540 where the AI videos are going to be so good
00:03:44.900 that if anybody wants to smear any of us here,
00:03:48.460 I can't imagine anyone would want to smear any of us
00:03:50.460 because we're so, we're all so beloved.
00:03:52.580 But if anyone wanted to do that,
00:03:53.800 I could just make a video of any of us doing or saying something horrible
00:03:56.240 and there'd be no way for us to prove it didn't happen.
00:03:58.040 That cat video radicalized me.
00:03:59.700 I don't know if you guys saw the cat playing the didgeridoo and everything.
00:04:02.500 It was very good.
00:04:04.120 If I didn't, if I didn't know that most cats don't play didgeridoo,
00:04:07.020 I would have thought that was a 100% real video.
00:04:09.800 Well, that's, but Michael, that's the other,
00:04:11.420 that's the other thing that's going to happen with AI
00:04:12.960 is that people are just sitting there looking at this slop
00:04:16.180 made by an algorithm all day, every day,
00:04:18.540 while their minds are melted.
00:04:20.080 And then on top of all those other things,
00:04:21.740 it's going to completely destroy every creative industry.
00:04:24.780 It's all going out the window.
00:04:27.180 And so what are we doing about this?
00:04:30.240 Are we just going to sit back and let it happen?
00:04:32.220 Because that seems to be the kind of defeatist attitude
00:04:34.520 that most people have is like, well, we can't do anything.
00:04:37.120 So let's just, I guess, you know, we had a good run, human beings.
00:04:41.260 Let's, let's pack it in.
00:04:43.100 And I, Matt, I do want, Matt, I want to ask you seriously,
00:04:45.840 do you think that AI is going to kill all of us?
00:04:47.680 Or is this kind of your list of,
00:04:49.220 because I know that's the sort of the most catastrophist take on this.
00:04:52.720 That AI is going to turn around and do gigantic murder to all of us.
00:04:56.340 No.
00:04:56.880 Terminator 2.
00:04:57.460 But you know, like this is your list of complaints.
00:04:59.640 I just want to make sure that that's the list of complaints
00:05:01.120 so that I can argue with them.
00:05:02.300 No, the Terminator thing, I don't, that's like, I'd prefer that.
00:05:05.860 I mean, at least that's, you know what?
00:05:08.700 If AI becomes Terminator,
00:05:10.220 then that at least gives us jobs that we could do
00:05:11.980 because we're fighting the AI.
00:05:13.820 So it's not that at all.
00:05:15.720 I'm not looking at any science, you know, sci-fi scenario.
00:05:18.380 It really is.
00:05:19.280 The main thing is people will not have much to do
00:05:22.620 because AI is going to do everything
00:05:24.480 and it's going to take all of our jobs.
00:05:25.980 And I don't think that we have the capacity to sustain that.
00:05:28.900 I don't think we have any plan for what we do
00:05:30.580 when 20 million people all of a sudden have no job.
00:05:33.160 That's the main thing.
00:05:34.460 Okay.
00:05:34.600 I'm going to argue with everything you just said.
00:05:36.120 Okay.
00:05:36.320 So I'm not a person who believes that AI is the cure for all problems.
00:05:42.560 I also do not think that what we are in right now
00:05:45.220 is sustainable economically.
00:05:46.340 I've been saying this for a while.
00:05:47.420 I've actually been saying for well over a year
00:05:49.280 that I think we are in a bubble.
00:05:50.780 I think pretty clearly we're in an AI bubble.
00:05:52.540 That doesn't mean AI is not important.
00:05:54.860 It just means that the overinvestment in infrastructure
00:05:56.940 at some point is going to have to pay off in actual earnings
00:05:59.400 or the entire pyramid is going to crumble
00:06:01.260 at least for most of these companies.
00:06:03.120 As far as, I'm hearing kind of three arguments there.
00:06:05.940 One is the AI is going to take all of our jobs.
00:06:09.120 Two is that if the AI takes all of our jobs,
00:06:11.500 what are we going to do with our lives?
00:06:12.300 And three is the quality of AI is demeaning to sort of the human being.
00:06:19.220 What's going to happen to human art?
00:06:20.360 What's going to happen to quality?
00:08:21.260 It's all going to kind of descend into AI slop mediocrity.
00:06:24.840 So one at a time, I will say that AI is going to cause job dislocation,
00:06:29.940 but it's not going to take out nearly all of the jobs.
00:06:32.200 And in the end, what you will see is a job shift
00:06:34.360 actually predominantly away from the white collar industries
00:06:37.840 and more toward the blue collar industries.
00:06:39.120 So what you'll see is all the people who are telling welders to code 15 years ago,
00:06:43.520 all those people are now going to have to go learn to weld.
00:06:45.620 That's actually what's going to happen.
00:06:46.700 There are going to be a lot of people who are going to have to be
00:06:48.460 in sort of more physical industries.
00:06:50.200 They're going to have to do more nursing, for example.
00:06:52.100 Like there are certain things human beings want from other human beings
00:06:54.220 that AI isn't going to provide.
00:06:55.560 It's going to be more of an aid than anything else.
00:06:58.020 And it's going to be slower to work its way into the market
00:07:00.420 than everybody thinks.
00:07:01.220 Everybody always thinks it's going to be transitional boom,
00:07:03.180 like tomorrow all jobs replaced by AI.
00:07:05.760 And it's not true.
00:07:06.560 The people who it's first going to replace are the coders.
00:07:08.300 You've already started to see some of this happen at Google.
00:07:10.900 And I know people, friends and family to whom this has happened.
00:07:14.140 But it's going to take a while for it to filter into all business.
00:07:16.960 And there will be transitional job loss.
00:07:19.400 And then it will move into other areas.
00:07:20.900 This is what happened with the internet.
00:07:22.260 This is what happens with every kind of great industrial age invention
00:07:26.020 is there's a tremendous job dislocation at the beginning.
00:07:28.300 And then the job market moves.
00:07:29.960 And I don't think AI is going to destroy wholesale all of these jobs.
00:07:33.780 But let's move to part number two, which is sort of the idea
00:07:36.620 that it will destroy all the jobs.
00:07:38.060 Let's take that as an assumption.
00:07:39.620 So here's my thing.
00:07:40.920 I was actually at a conference with a bunch of people
00:07:42.700 who are like the creators of these systems.
00:07:44.520 And they were arguing kind of what you're arguing, Matt,
00:07:46.540 that eventually AI will be better at everything
00:07:49.360 and none of us will have jobs anymore.
00:07:51.420 And what are we going to do with our day?
00:07:52.660 And I raised my hand.
00:07:53.460 I said, you know what?
00:07:53.940 I know what I'm going to do with my day.
00:07:55.280 I'm going to take care of my family.
00:07:56.280 I'm going to go to synagogue more often.
00:07:57.720 I'm going to learn the holy books.
00:07:59.900 I'm going to actually spend more time getting in touch with God.
00:08:02.320 Like I think that actually religious people
00:08:03.880 and community-oriented people will be fine
00:08:06.480 because we actually have a thing to do with our day.
00:08:08.840 I think that secular humanism is going to have a real problem
00:08:11.760 determining what to do with its day
00:08:13.060 in a way that many religious people will not.
00:08:15.220 And then just as far as the quality of it,
00:08:17.300 I'm not sure that AI is ever going to be creative enough.
00:08:20.940 Visually, it will be.
00:08:21.600 It'll be able to fool you visually.
00:08:22.540 But in terms of the actual creativity of truly great writing,
00:08:26.960 I don't think AI is ever going to be a great writer.
00:08:28.840 It's all derivative.
00:08:29.660 I think that because AI is a predictive text mechanism,
00:08:33.120 you will end up with mid-range slop for the most part.
00:08:36.300 But the way that I've used AI in my own work
00:08:38.980 is to save time asking a sophisticated question
00:08:41.880 that would take me a while to research, for example.
00:08:43.900 Or if I'm doing a creative writing project
00:08:45.540 and I don't want to take a lot of time
00:08:46.660 looking up the details of Soviet Russia in 1938 or something,
00:08:49.960 then I can ask a multi-part question.
00:08:52.160 It'll spit out an answer.
00:08:53.480 If I asked it to write dialogue,
00:08:54.780 the dialogue would just not be as good.
00:08:56.460 And so I agree that there will be a lot of slop,
00:08:58.380 but I think that the people who are best at their craft
00:09:00.500 will actually end up benefiting from AI.
00:09:02.400 And usually, when the best get better,
00:09:04.780 that's actually good for everybody else
00:09:06.060 because it tends to drag everybody else along in terms of quality.
00:09:08.320 So, Ben, you point on the religious people
00:09:12.600 who, you know, they'll know what to do with their time
00:09:15.300 or the educated people or the cultural elites.
00:09:18.000 I totally agree with that.
00:09:19.440 But to me, this is what's really worrisome about Matt's point
00:09:22.240 that it's going to displace 15 million jobs
00:09:24.520 and most people are not going to know what to do
00:09:26.160 because I agree with you.
00:09:27.460 You will figure out what to do with your time.
00:09:29.420 No, no, no, but in the white-collar jobs, Michael,
00:09:30.600 in the white-collar jobs, you're a blue-collar person.
00:09:32.280 But those are the people who you're talking about,
00:09:36.280 like largely blue-collar people who are, like you're saying,
00:09:40.160 you know, all the people who are like the intellectual elite,
00:09:41.760 those are the people who are now most likely to lose their jobs.
00:09:44.580 No, no, no, but I'm drawing a distinction here.
00:09:47.160 There are plenty of people in white-collar jobs
00:09:48.600 who are complete Philistines, who are secular humanists,
00:09:51.220 who I don't know that they are going to figure out what to do
00:09:54.460 because really what it gets down to is a perennial question,
00:09:57.340 which is what we do for leisure time, you know.
00:09:59.820 That's what the liberal arts were supposed to teach us how to do.
00:10:02.280 Now we think of them more as trade school,
00:10:03.840 but it was supposed to teach us what to do with our freedom,
00:10:06.300 how aristocrats are supposed to live.
00:10:08.080 We obviously don't really have that.
00:10:09.600 So my fear is that the promise of AI
00:10:12.500 is really just an extension of the promise of the Internet.
00:10:15.720 The Internet was going to make us all smarter.
00:10:17.700 We were going to have all of human knowledge at our fingertips.
00:10:19.740 We were going to learn a new language.
00:10:22.200 It's all the same stuff we're hearing with AI.
00:10:24.100 And the reality is, for some people,
00:10:26.860 the Internet did make them smarter and more productive
00:10:29.320 and more thoughtful and have fuller lives.
00:10:31.660 And for more people than that,
00:10:33.600 really for most people, I think,
00:10:35.080 it made them dumber and it made them more vicious.
00:10:37.820 And I think it made them more likely to look at porn
00:10:39.760 and it made them more likely to ignore the great works.
00:10:42.300 And this goes all the way back to the Phaedrus, you know,
00:10:44.440 Plato's dialogue where Socrates is saying
00:10:46.760 that written language, books essentially,
00:10:49.220 are going to make people dumber
00:10:50.460 because they're going to have the simulacrum of wisdom,
00:10:52.900 but they're not actually going to memorize anything.
00:10:54.740 They're not going to know anything.
00:10:55.840 And so I fear, I think you're right.
00:10:57.360 I think for people who have their lives in order
00:10:59.140 and are religious and have a cohesive view
00:11:01.180 of the purpose of life,
00:11:02.140 I think it could improve their lives.
00:11:04.280 And I think for most people, it probably won't.
00:11:06.820 Drew?
00:11:07.620 Well, if I could take you and Ben and mash you together
00:11:10.960 just for my own personal pleasure,
00:11:12.740 that would be great.
00:11:13.460 But also I think that what you're saying,
00:11:15.680 you're hitting that.
00:11:16.520 The problem is not AI.
00:11:17.660 The problem is human beings.
00:11:19.020 And it's always the problem.
00:11:20.260 I mean, people talk about,
00:11:21.300 are we going to have to regulate an industry?
00:11:23.300 You don't regulate industries.
00:11:24.500 You regulate human beings.
00:11:25.960 You have to regulate human beings
00:11:27.200 because they're sinful and broken
00:11:28.660 and will kill each other and rob each other
00:11:30.040 and do all these things.
00:11:31.460 Already we see with AI,
00:11:33.100 I mean, recently, last week, I think it was,
00:11:35.560 they brought out an AI where you can record somebody
00:11:38.260 and then after he's dead,
00:11:39.580 you can continue to talk.
00:11:40.660 It will give you an AI version of your dead relative
00:11:42.780 so you can talk to mom even after she's passed.
00:11:45.280 I mean, that is idolatry of the worst possible kind.
00:11:48.800 There have been AI dolls
00:11:49.980 that have been put in children's rooms
00:11:51.380 that talk them out of believing in God
00:11:53.220 and tell them how to get drugs and things like this.
00:11:55.420 So the problem is not the AI per se.
00:11:58.140 It is, it's what people are going to do with it.
00:12:00.200 It is going to make porn spectacular.
00:12:03.180 I mean, the porn is going to come out of AI.
00:12:05.100 I mean, I can already see
00:12:06.200 that it will do anything you want it to do.
00:12:08.580 It's going to, it's going to rob people
00:12:10.880 of their desire to read.
00:12:13.940 I mean, it's already people are like condensing books.
00:12:16.680 Well, now I've got, you know, War and Peace.
00:12:18.460 It's just, give me two paragraphs,
00:12:20.080 but that's a complete destruction of what it means.
00:12:22.140 And so people who don't have the meaning of life
00:12:24.720 or don't know where it lies,
00:12:27.000 which is in the internal life,
00:12:28.440 are going to be lost.
00:12:31.380 You and I, Knowles, had a conversation
00:12:34.040 with a very powerful leader in AI
00:12:36.800 just the other week or so.
00:12:38.660 And I went up to him and I said to him,
00:12:40.560 don't you understand that when AI speaks,
00:12:43.440 it's not speaking, it's not conscious.
00:12:45.840 And I said, it's like,
00:12:47.120 it's like I quoted the great Louis Armstrong saying,
00:12:49.640 I see friends shaking hands saying, how do you do?
00:12:52.320 They're really saying, I love you.
00:12:53.920 Meaning that when we speak,
00:12:55.560 we deliver our inner selves to one another,
00:12:58.160 even if our words are not precisely that meaning.
00:13:01.040 AI has no inner life.
00:13:02.340 And these guys don't know that.
00:13:03.800 They are convinced that because it can imitate an inner life,
00:13:06.800 they think the Turing test,
00:13:08.100 which is the stupidest idea anybody ever had,
00:13:10.500 is indicative of an inner life.
00:13:13.120 If it can confuse us about its inner life, it has one.
00:13:15.700 So what I'm worried about it is it is in some ways the ultimate idol.
00:13:19.860 And we know what people do with idols.
00:13:21.760 You know, we know that when all Moses has to do is leave town for five minutes,
00:13:25.660 they start worshiping the golden calf.
00:13:27.780 That's where I think the danger lies.
00:13:29.380 I think jobs will be created.
00:13:31.340 I think creativity will exist.
00:13:33.580 You know, to your point, Drew,
00:13:35.140 it's a really important point because part of that conversation,
00:13:38.220 and I've had this conversation with other people too,
00:13:39.680 is can AI write a poem?
00:13:41.980 And people get really, really, I don't know, vitriolic about this.
00:13:45.200 Because it's really the heart of the AI debate.
00:13:48.340 And my argument was they can't write a poem
00:13:50.620 because to write a poem, you have to have sensual experience.
00:13:54.560 You have to be able to describe a grape in a way
00:13:57.940 that gives someone the sensory experience of that.
00:14:00.640 And you have to be able to take language,
00:14:02.560 which is just full of dead metaphors.
00:14:04.200 It's like the graveyard of dead metaphors.
00:14:06.000 And you have to create a new metaphor,
00:14:07.940 something that's evocative.
00:14:10.000 And AI in particular cannot do that
00:14:13.100 because it doesn't have any senses yet.
00:14:15.220 It's worth pointing out that with robotics,
00:14:16.680 it actually might have sensory experience.
00:14:18.300 And two, it's just learning on dead language.
00:14:20.900 So in my view, it can't make a poem.
00:14:23.640 But I don't know, maybe it can.
00:14:25.620 And all of this is a little bit beside the question of,
00:14:28.140 all right, if it's going to have these negative effects,
00:14:30.040 what do we do about it?
00:14:30.920 Do we regulate it?
00:14:31.920 Or do we let the market run its course?
00:14:34.740 What are we going to do?
00:14:35.680 You guys, hang on a second.
00:14:38.160 This is why you guys need me here
00:14:41.060 as a community college dropout
00:14:42.460 with all you Ivy League nerds
00:14:44.260 who immediately, this becomes a,
00:14:45.920 this becomes a, like, can AI make a poem?
00:14:48.540 And what will we think about in our leisure time about AI?
00:14:53.020 My question is, how are people going to eat?
00:14:55.200 Okay, I'm not talking about leisure time.
00:14:57.100 How are you going to feed yourself?
00:14:58.700 How are you going to make money to buy a house?
00:15:00.700 Like, that's the first question here.
00:15:04.000 Because, and if the answer is,
00:15:05.700 well, we'll live in some sort of AI socialist dystopia
00:15:09.500 where AI will provide all that stuff for you.
00:15:13.060 Well, I'm very skeptical that it will work out that way.
00:15:15.500 I think what's actually going to happen
00:15:16.480 is you're going to end up with, you know,
00:15:18.000 a handful of trillionaires off this AI stuff
00:15:20.440 and a lot of other people who are totally destitute.
00:15:22.560 But even if it did work out that way,
00:15:24.540 okay, well, then that's our life
00:15:25.800 that now we're living as people
00:15:27.360 that are totally dependent on this non-human algorithm
00:15:30.020 to provide for us.
00:15:30.940 I think that's a pretty horrifying vision of the future.
00:15:33.140 But why should that happen now?
00:15:34.020 It's never happened before.
00:15:34.780 It's also, this is, it's not just white-collar jobs.
00:15:38.200 It's also blue-collar jobs.
00:15:39.720 Okay, delivery drivers, truck drivers, Uber drivers,
00:15:42.780 that's all going away.
00:15:43.800 That's gone.
00:15:44.500 That's finished.
00:15:45.180 And that's just the beginning of it.
00:15:46.480 And they're not being replaced.
00:15:47.780 This is not creating new jobs
00:15:49.080 because this is different from any other technology
00:15:51.160 that has ever existed on the planet.
00:15:53.300 It's not analogous to anything else
00:15:54.940 because the whole point of it,
00:15:57.020 the whole point is to take the human element
00:15:59.260 out of it completely.
00:16:00.860 It's not a new tool for humans to use.
00:16:02.860 It's not like going from a carriage driver
00:16:04.660 to now you're driving an automobile.
00:16:06.460 This is, the human is gone.
00:16:08.080 We don't need you anymore.
00:16:09.360 It's artificial intelligence.
00:16:10.820 And so these jobs are leaving
00:16:12.640 and they're not being replaced.
00:16:13.840 For all the drivers
00:16:14.680 who are not going to have a job anymore,
00:16:16.440 there's not some new thing,
00:16:17.740 oh, well, you'll go over here and do this.
00:16:19.240 It's, there's nothing for you.
00:16:20.940 You're out now.
00:16:21.880 And that's a million-
00:16:22.580 Why do you think that's true?
00:16:23.660 They say this every time a new technology comes.
00:16:26.020 Okay, I'm not.
00:16:26.460 No, wait a minute.
00:16:27.100 I want to get, I want to get,
00:16:27.960 no, Drew, it's fine.
00:16:28.900 I want to get to it.
00:16:29.620 But before we get to it,
00:16:30.960 we need to, we need to eat.
00:16:32.600 Okay, the only way we're going to eat
00:16:34.000 is if I read this ad right here, this one.
00:16:36.720 So guys, just cut it for a second.
00:16:38.140 AI can do this.
00:16:39.880 Yeah, I hope not.
00:16:41.340 Okay, guys, did you know
00:16:42.500 that up until the 1990s,
00:16:43.780 cryptography was classified as a strategic weapon
00:16:45.880 by the United States government?
00:16:47.380 And during the Cold War,
00:16:48.620 it was the,
00:16:49.000 it was added to the same U.S. munitions list
00:16:50.940 that restricts export of rifles and rockets.
00:16:53.020 In 1954, encryption hardware and algorithms
00:16:55.220 were added to the list
00:16:56.260 to prevent the Soviets from acquiring tools
00:16:57.760 that protected American military secrets.
00:16:59.980 Well, just the way that we are allowed
00:17:01.960 to possess firearms to protect life and liberty
00:17:03.820 because we have an amazing Second Amendment,
00:17:05.100 we also can create, share, and wield
00:17:07.040 strong cryptographic arms
00:17:08.640 to safeguard our communications, data,
00:17:10.600 and digital lives
00:17:11.340 from any adversary, foreign or domestic.
00:17:13.060 That's what ExpressVPN does for you.
00:17:14.700 That's what it does for me.
00:17:15.460 It's an app that encrypts and reroutes
00:17:17.220 your internet connection through secure servers.
00:17:19.280 That makes your online activity private.
00:17:21.020 No one can monitor, record, manipulate,
00:17:22.380 or profit from it without your consent.
00:17:24.540 ExpressVPN works on every device,
00:17:25.860 phone, laptop, tablet, you name it.
00:17:27.580 And you can protect up to 14 devices
00:17:29.240 with one subscription.
00:17:30.300 Get four extra months of ExpressVPN
00:17:31.780 just by using our special link.
00:17:33.520 Go to expressvpn.com
00:17:35.000 slash friendlyfire.
00:17:36.160 That's E-X-P-R-E-S-S-V-P-N dot com
00:17:38.780 slash friendlyfire
00:17:39.960 to get four extra months.
00:17:41.020 Start protecting yourself today.
00:17:42.020 I know, when I'm traveling,
00:17:43.400 I'm using public Wi-Fi,
00:17:44.260 I don't want anybody else looking over my shoulder
00:17:46.200 at the data that I'm using
00:17:47.720 or the stuff that I'm searching.
00:17:49.040 So that's why I use ExpressVPN.
00:17:50.580 I'm using it all the time.
00:17:51.500 You should do the same.
00:17:52.260 Head on over to expressvpn.com
00:17:53.860 slash friendlyfire.
00:17:55.180 That's E-X-P-R-E-S-S-V-P-N dot com
00:17:57.020 slash friendlyfire.
00:17:58.140 Get four extra months
00:17:58.920 and start protecting yourself today.
00:18:00.840 Okay, now, Drew, you want to say something?
00:18:02.380 Yes, Drew.
00:18:02.780 Hang on.
00:18:03.100 I also have to jump in, I'm told,
00:18:05.940 with another momentum-killing advertisement.
00:18:09.760 Anyway, right when it's getting interesting,
00:18:13.260 let's jump in with the ad.
00:18:14.320 It's fine, though.
00:18:14.940 It's good.
00:18:15.380 Because I do want to tell you about Helix Sleep.
00:18:17.520 And I do love Helix Sleep.
00:18:19.700 Actually, we have Helix mattresses in our house.
00:18:23.140 We all sleep on Helix.
00:18:24.060 All of our kids, all of our 90 kids
00:18:25.980 all have Helix mattresses.
00:18:27.680 And it's great.
00:18:29.820 I'm not getting a lot of sleep right now
00:18:31.720 because after we fall back with Daylight Savings,
00:18:34.980 everyone talks about,
00:18:36.320 oh, we save an hour of sleep.
00:18:37.720 Well, the problem is when you have young kids,
00:18:39.540 they don't realize, they don't care about the clock.
00:18:44.300 So now I've got twin toddlers waking up
00:18:47.040 at 4.30 in the morning
00:18:47.880 who are rousing me out of sleep
00:18:50.480 out of my very comfortable Helix mattress.
00:18:52.220 So Helix will help you sleep like a baby at night
00:18:54.720 unless you have babies in the house
00:18:56.260 and they will wake you up.
00:18:57.220 There's nothing we can do about that.
00:18:58.900 But Helix is great.
00:19:00.980 I can't recommend it enough.
00:19:02.140 You can go to helixsleep.com
00:19:03.660 slash friendlyfire for 27% off site-wide.
00:19:07.320 That's helixsleep.com slash friendlyfire
00:19:09.400 for 27% off site-wide.
00:19:11.320 You go to their website,
00:19:12.480 you take a sleep quiz
00:19:13.820 and you get matched with the perfect mattress for you
00:19:16.540 because everyone is different
00:19:18.560 and they take care of that there.
00:19:21.160 Make sure you enter our show name
00:19:22.300 into the post-purchase survey
00:19:23.880 so they know we sent you.
00:19:25.220 helixsleep.com slash friendlyfire.
00:19:28.540 Okay, only 10 more presents to wrap.
00:19:30.960 You're almost at the finish line.
00:19:32.980 But first...
00:19:38.840 There, the last one.
00:19:42.860 Enjoy a Coca-Cola for a pause that refreshes.
00:19:47.500 So here's my problem with the no-job scenario
00:19:49.840 is that it comes up every single time
00:19:52.260 there's a new technology.
00:19:54.180 Every time.
00:19:54.920 And it's why government is so bad
00:19:56.920 at managing economies.
00:19:58.320 It's why you don't want a top-down economy
00:19:59.940 because when the cart and horse goes out of style,
00:20:03.060 the government says,
00:20:03.760 we must save the jobs of buggy whip makers, you know?
00:20:07.660 And the thing is, there'll be new jobs.
00:20:09.340 There will be new jobs.
00:20:10.300 And the thing is,
00:20:11.200 maybe we can't even imagine.
00:20:12.520 I think this has happened a million times before.
00:20:14.360 You can't imagine what the new job is going to be.
00:20:16.460 But there'll be jobs to do
00:20:17.540 because people are endlessly creative.
00:20:19.620 It's like the people who worry about running out of oil.
00:20:22.220 You know, you don't run out of energy
00:20:25.420 because energy is a product of the human mind.
00:20:27.960 The human mind turns things into energy.
00:20:30.860 And if we run out of oil,
00:20:31.940 we'll turn something else.
00:20:32.940 You know, we'll mash up Knowles.
00:20:34.000 We'll use him for energy.
00:20:34.880 I mean, you can always make energy.
00:20:37.400 The human mind and imagination and creativity
00:20:39.600 is bottomless.
00:20:40.440 It's endless.
00:20:41.100 I don't fear this about AI at all,
00:20:43.300 although I do think Ben is right
00:20:44.520 that there could be difficult transitions
00:20:46.220 and, knowing how people are,
00:20:48.120 we'll handle that in the worst way possible.
00:20:50.160 But I do think when you have a powerful new tool,
00:20:53.800 you have to start to think about human sin.
00:20:56.200 You have to start to think about the things
00:20:57.280 we're going to use it for that are destructive.
00:20:59.520 And that's where I see all the danger.
00:21:00.720 Yeah, I totally agree with this, Drew.
00:21:01.820 I mean, my worry about AI is the endless pornography,
00:21:05.240 the endless, you know, narcissism,
00:21:07.240 the things that social media has done to human beings
00:21:09.840 by exacerbating our worst qualities
00:21:11.160 and that getting even worse, obviously.
00:21:13.620 That's the thing I worry about.
00:21:14.760 But as far as sort of the economic point here,
00:21:16.720 I'm significantly less worried about that
00:21:18.500 for a couple of reasons.
00:21:19.520 One, because I'm just less worried about it
00:21:22.020 based on the history of technological innovation.
00:21:24.380 If you go back to the early 20th century,
00:21:26.080 well over 80% of jobs in the United States
00:21:28.080 were agriculturally based or early industry based.
00:21:31.140 And obviously, very few people do agriculture now.
00:21:33.460 If you go to the middle of the 20th century,
00:21:35.040 America was a manufacturing-based economy.
00:21:36.720 Now we're a service-based economy.
00:21:38.220 Jobs tend to move around
00:21:39.260 and human beings are quite adaptable.
00:21:41.380 If the question is, you know,
00:21:42.620 will I be endlessly poor
00:21:43.780 while a few people are trillionaires,
00:21:45.160 that wouldn't work
00:21:46.160 because they wouldn't be trillionaires
00:21:47.340 if everybody is endlessly poor.
00:21:48.680 That's not the way
00:21:49.580 that actually wealth distribution happens.
00:21:51.180 They don't take their wealth
00:21:51.900 from a bunch of super-duper poor people.
00:21:54.040 If there's no wealth for them to take,
00:21:55.680 then they don't generate the product.
00:21:56.920 So the actual thing that would happen,
00:21:58.660 the kind of worst-case scenario
00:21:59.700 that people are talking about actually
00:22:01.120 would be a sort of Star Trek replicator machine.
00:22:03.240 So in Star Trek,
00:22:04.100 I know not a lot of Trekkies online here,
00:22:05.460 but if you are a Trekkie,
00:22:06.820 my understanding is
00:22:07.740 that there is a replicator machine
00:22:09.180 whereby you can literally generate
00:22:10.580 any product from nothing
00:22:11.560 with no resource use, essentially.
00:22:13.420 And so you don't have to worry about anything.
00:22:14.920 Well, if you don't have to worry about anything,
00:22:16.620 I thought that that was mostly
00:22:17.820 the goal of human beings
00:22:18.880 because work,
00:22:19.860 I mean, we all understand
00:22:20.980 that work is important,
00:22:21.800 but there are other types of work, right?
00:22:23.940 Like, for example,
00:22:25.020 spending time with your family,
00:22:26.240 it's a different type of fulfillment.
00:22:27.460 It's not really work,
00:22:28.280 but it's service,
00:22:29.360 what we would call in Hebrew avoda,
00:22:30.520 because in Hebrew,
00:22:31.780 the word for work and the word for service are the same.
00:22:33.820 It's avoda.
00:22:34.980 The same type of thing, I think,
00:22:36.740 is true in our lives, right?
00:22:38.360 When I think of like the things
00:22:39.200 that I do that are important,
00:22:40.020 my work actually comes
00:22:40.780 maybe third or fourth on the list
00:22:41.940 after family and religion
00:22:43.220 and the stuff that I'm doing
00:22:44.580 in my community and for the country.
00:22:46.460 So, you know,
00:22:47.180 I'm less worried about the kind of
00:22:48.460 how do I get my stuff?
00:22:49.420 If things work out great,
00:22:50.300 we're all going to be way richer
00:22:51.160 and have a lot more leisure time.
00:22:52.440 If you're worried about the leisure time,
00:22:53.560 that's a human nature problem.
00:22:54.600 That's what Drew is talking about.
00:22:56.420 And then there is the other problem,
00:22:58.260 which is what's the alternative?
00:22:59.760 People keep talking about,
00:23:00.740 okay, we could regulate it out of existence, right?
00:23:02.360 We're just going to regulate it,
00:23:03.320 stop it from taking trucker jobs.
00:23:04.980 Okay, let's say that we were able to do that.
00:23:06.440 Let's say that we were able
00:23:07.080 to ban all the self-driving cars.
00:23:08.600 Does anybody think
00:23:09.420 that any other place on earth
00:23:11.060 is going to ban the self-driving cars?
00:23:12.940 So the actual thing that will happen
00:23:14.460 is that China will gain
00:23:15.680 complete economic dominance
00:23:16.900 over planet earth,
00:23:17.840 unless you are going to
00:23:18.880 essentially make America autarkic and poor.
00:23:20.860 That is the way that trade actually works.
00:23:22.660 China will gain the advantage
00:23:24.080 of every efficiency on planet earth
00:23:26.060 while we hamper ourselves.
00:23:27.400 And we will live in relative poverty
00:23:29.600 compared to what we are now,
00:23:30.700 while China gains significantly
00:23:32.420 more power globally
00:23:33.600 and then uses that power
00:23:34.800 in order to cram down
00:23:35.560 its terrible vision of the world,
00:23:36.900 which will eventually affect us.
00:23:38.340 Is your view then,
00:23:39.540 like pure laissez-faire,
00:23:41.620 no regulation whatsoever,
00:23:42.900 let the market lead in it
00:23:44.200 and that way we'll beat China
00:23:45.220 and we'll maintain our dominance?
00:23:46.140 Yes, except for morality
00:23:47.380 and national security, yes.
00:23:49.580 So I don't think
00:23:50.060 we should be selling
00:23:50.740 NVIDIA chips to China
00:23:52.580 because I think China's our enemy.
00:23:54.180 And I also think
00:23:55.220 that we should be heavily
00:23:56.040 regulating pornography, period,
00:23:57.740 and that applies also to AI.
00:23:59.980 But if we're talking about like,
00:24:01.480 should we stop AI
00:24:02.540 from generating healthcare solutions
00:24:04.260 because people in the healthcare industry
00:24:05.420 are going to lose their jobs?
00:24:06.400 I mean, let's be real about this.
00:24:09.220 It's easy for us
00:24:10.420 living in a first world country
00:24:12.140 with an average life expectancy
00:24:13.420 above 80
00:24:13.940 to talk about the evils of AI.
00:24:16.680 But if AI, for example,
00:24:19.000 in the medical industry
00:24:19.600 extends lifespans
00:24:20.480 by another 20 years,
00:24:21.420 which could easily happen,
00:24:23.320 that seems like
00:24:24.240 a pretty good thing to happen.
00:24:27.280 And I think that
00:24:27.760 one of the big mistakes
00:24:29.000 I see people make,
00:24:30.280 there's a mistake
00:24:31.280 that I just generally object to.
00:24:32.360 And that is,
00:24:32.980 I think it happens
00:24:33.660 on the Marxist left
00:24:34.420 and I think it sometimes happens
00:24:35.440 on the populist right.
00:24:36.580 And that is,
00:24:37.280 they take a spiritual problem,
00:24:38.620 people's emptiness
00:24:39.220 and inability to function
00:24:40.820 in the absence
00:24:41.480 of particular guardrails.
00:24:42.840 And then they say,
00:24:43.600 there's a material solution for that.
00:24:45.680 And it is very rare to me
00:24:47.100 that there's actually
00:24:47.640 a material solution
00:24:48.400 to a spiritual problem.
00:24:49.780 That's a very good point, Ben,
00:24:51.220 because it is true.
00:24:52.400 Sometimes people think
00:24:53.280 like with the birth rate problem,
00:24:54.860 you can just fix it
00:24:55.680 with a lot of material solutions
00:24:57.040 and there's not a lot of evidence.
00:24:58.540 However, there's a distinction
00:24:59.640 between a material solution
00:25:01.440 and a government solution
00:25:02.780 because the government
00:25:03.680 influences culture,
00:25:04.680 it promotes certain ideas,
00:25:05.780 suppresses others,
00:25:06.560 it promotes religion traditionally
00:25:08.520 and I think inevitably.
00:25:10.200 And so, you know,
00:25:11.000 to use the birth rate example,
00:25:13.200 the only thing that seems
00:25:14.140 to reliably increase birth rate
00:25:15.380 is the promotion of religion.
00:25:16.660 But the government can do things there,
00:25:18.940 either explicitly permit religion
00:25:20.380 or at least stop
00:25:21.420 the suppression of religion
00:25:22.520 like, you know,
00:25:23.280 we saw under Joe Biden
00:25:24.220 and we see under a lot of liberals.
00:25:25.760 So is there any role,
00:25:27.220 just before we get to the other guys,
00:25:28.840 is there any role
00:25:29.680 for the government here
00:25:30.820 in maybe not providing
00:25:32.520 a material solution
00:25:33.300 to the consequences of AI,
00:25:34.880 but some role for the government?
00:25:36.780 I mean, I want to know the specifics.
00:25:38.160 It always comes down
00:25:38.800 to the specifics.
00:25:39.420 And this, by the way,
00:25:40.240 no one, the problem with AI
00:25:41.420 is a bunch of unknown unknowns, right?
00:25:43.060 It's not known unknowns,
00:25:43.940 it's just we literally don't know
00:25:45.280 what's going to happen next.
00:25:46.180 How do you regulate for that?
00:25:47.180 Which is why the Kalshi markets,
00:25:48.520 right, Kalshi is one of our sponsors,
00:25:49.860 right now in the Kalshi markets,
00:25:51.360 like 5% shot
00:25:52.480 that there's any serious
00:25:53.380 regulation of AI
00:25:54.300 because no one would even know
00:25:55.740 what that looks like.
00:25:56.980 What does that even look like?
00:25:57.720 I mean, this is a question,
00:25:58.780 honestly, Matt,
00:25:59.400 this is a question for you
00:26:00.100 because you want to regulate AI,
00:26:02.120 I assume.
00:26:02.540 You want to do something
00:26:03.320 to stop sort of
00:26:04.020 the forward march of AI.
00:26:05.360 So on a practical level,
00:26:06.280 what does that look like?
00:26:07.860 Well, I think that,
00:26:08.660 and I don't have all the answers,
00:26:09.840 I'll fully admit that.
00:26:10.580 That's why it's so frustrating to me
00:26:13.600 that we're not at a serious level
00:26:15.720 even having this conversation.
00:26:17.060 I mean, we're having
00:26:17.380 this conversation right now,
00:26:18.740 but including like our lawmakers
00:26:21.040 having this debate
00:26:21.980 about what can we do,
00:26:24.180 what should we do,
00:26:25.140 and that conversation
00:26:25.980 just isn't happening at all.
00:26:28.340 And if I had all the answers myself,
00:26:29.980 then I guess I wouldn't
00:26:30.560 be frustrated by that
00:26:31.580 because I could just say,
00:26:32.400 well, here's the answers, guys.
00:26:33.460 I don't have them,
00:26:34.300 but what I know,
00:26:35.400 the answer can't be,
00:26:36.520 well, whatever.
00:26:38.420 We'll see how it plays out.
00:26:39.860 That can't be the answer
00:26:40.520 when you're facing something
00:26:41.520 that is going to fundamentally
00:26:43.680 alter our civilization
00:26:44.880 in the way that this is going to.
00:26:46.140 Now, there are some things
00:26:47.060 that can be done.
00:26:47.660 I mean, people have suggested
00:26:48.500 when it comes to,
00:26:49.600 and this is kind of
00:26:50.200 on a lower level,
00:26:50.980 but things like intellectual property.
00:26:53.640 This is another huge problem with AI,
00:26:55.840 and I think some of you guys
00:26:56.760 have already kind of touched on it,
00:26:58.380 that AI cannot create anything.
00:27:00.400 It can't make a poem.
00:27:03.040 Like, it can't write a poem.
00:27:03.760 It can't do a screenplay.
00:27:05.000 You were just making fun of me
00:27:06.040 because I brought that up,
00:27:06.880 and now you're bringing that up.
00:27:08.140 Well, I'm bringing it back
00:27:09.960 to the real world.
00:27:10.900 So the reason
00:27:12.580 why it can't do that
00:27:13.240 is because it's stealing
00:27:14.020 from what other people have done,
00:27:15.760 and right now AI lives
00:27:17.220 in this kind of, like, bubble
00:27:18.340 where the rules of plagiarism
00:27:19.760 don't apply to it.
00:27:21.060 So there are things
00:27:22.260 that you could do there legislatively.
00:27:24.380 There's, again,
00:27:25.900 it's not easy to do,
00:27:27.000 but I do think you have to do
00:27:28.200 something there
00:27:29.060 to protect people
00:27:29.980 from having their...
00:27:30.720 I heard that.
00:27:31.340 ...from having their creative property stolen.
00:27:32.980 But I would kind of...
00:27:34.860 I would flip it back
00:27:35.700 the other way
00:27:36.240 because what I'm going to ask is,
00:27:38.680 okay, the drivers
00:27:40.840 are all going to lose their jobs,
00:27:42.140 most likely.
00:27:43.280 Customer service,
00:27:44.480 the customer service industry,
00:27:46.180 a lot of that
00:27:46.940 is just going away
00:27:48.160 because when AI is adopted,
00:27:50.440 and I don't think...
00:27:51.620 This is not some kind
00:27:52.360 of, like, sci-fi speculation.
00:27:54.040 It's just extending out
00:27:55.800 a little bit.
00:27:56.260 It's, like, pretty clear
00:27:57.220 that if we keep applying this stuff,
00:27:59.520 there's not going to be anything
00:28:00.420 for people to do
00:28:01.020 in a lot of these jobs.
00:28:01.680 So I think a lot of these
00:28:02.280 customer service jobs
00:28:03.200 are going to go away.
00:28:04.680 And then, yes,
00:28:05.740 there's also the white collar,
00:28:07.160 but I care about those people, too.
00:28:09.280 Anyone who sits
00:28:10.140 in a cubicle all day
00:28:11.840 and enters data
00:28:13.700 into computers,
00:28:14.540 which is millions of people,
00:28:16.340 probably a lot of their jobs
00:28:17.980 are going away,
00:28:18.700 and I think that
00:28:19.140 that matters, too.
00:28:19.800 So my question is,
00:28:21.180 if that were to happen,
00:28:22.080 let's just say,
00:28:22.740 and maybe AI all breaks down
00:28:25.420 and it doesn't happen,
00:28:26.360 I think it probably will.
00:28:27.300 If that happens
00:28:28.100 over the next five to ten years,
00:28:29.160 and you've got tens of millions
00:28:30.780 of people who,
00:28:32.500 not just their job,
00:28:33.560 but really their entire industry
00:28:34.940 just went away,
00:28:36.300 what are we doing with them?
00:28:37.820 What are we doing for them?
00:28:38.720 I want to go back.
00:28:39.400 Wait, wait, wait.
00:28:40.560 I want to go back to the question.
00:28:42.220 Hold on.
00:28:42.720 Let me just answer that.
00:28:43.580 It'll take me,
00:28:43.940 I promise,
00:28:44.500 like four sentences, okay?
00:28:45.860 So here's the answer to that.
00:28:47.320 If I had asked you
00:28:48.000 that same question in 1998,
00:28:49.540 the advent of the internet
00:28:50.360 is going to kill
00:28:50.840 a bunch of jobs,
00:28:51.500 and it will kill
00:28:52.000 a bunch of jobs,
00:28:52.800 you know,
00:28:53.060 based on all the supply chains
00:28:54.980 being changed,
00:28:55.780 everything getting a lot shorter,
00:28:56.900 you won't have to go
00:28:57.380 to the local mom and pop shop,
00:28:58.460 you can order it off the internet,
00:28:59.580 and I said to you,
00:29:00.620 don't worry,
00:29:01.320 in 20 years,
00:29:02.340 there will be literally
00:29:03.260 millions of people
00:29:04.180 who are working on AI coding
00:29:05.780 and database building,
00:29:07.260 data center building.
00:29:08.120 You would say,
00:29:08.580 what the hell
00:29:09.140 are you even talking about?
00:29:10.260 What do those words mean?
00:29:11.480 I don't know
00:29:11.960 what those words mean.
00:29:12.840 If I said to you,
00:29:13.520 there would be legitimately
00:29:14.540 thousands of jobs
00:29:15.820 that were people
00:29:16.520 who were social media editors
00:29:18.020 and marketers.
00:29:18.800 You say,
00:29:19.000 what the hell,
00:29:19.440 what's a social media
00:29:20.800 and how do it work,
00:29:21.920 right?
00:29:22.120 Like this is the whole point
00:29:23.120 of the market
00:29:23.560 is that jobs
00:29:24.440 that we don't even know exist
00:29:26.040 will come about
00:29:26.960 because that's what
00:29:27.960 the market does.
00:29:28.520 The market generates innovation
00:29:29.660 because human desire
00:29:31.640 is endless
00:29:32.180 and the human desire
00:29:34.080 for new and innovative things
00:29:36.040 is also endless.
00:29:37.080 I have pure faith.
00:29:38.000 This is different.
00:29:38.500 This is all done.
00:29:39.180 I want to hear from Drew.
00:29:40.360 Yes, yes.
00:29:40.980 We can't imagine these things.
00:29:42.880 I totally agree with Ben.
00:29:44.060 I think there are going to be jobs
00:29:44.940 that we have no idea
00:29:46.180 could possibly exist.
00:29:47.280 But the question
00:29:48.200 that Knowles asked
00:29:49.020 and actually Ben referred to
00:29:50.740 is the really important question.
00:29:54.060 Back in the day
00:29:54.900 when you wanted to get
00:29:55.860 a pornographic magazine,
00:29:57.580 you had to walk into a store,
00:29:58.920 shame yourself.
00:29:59.700 You had to make sure
00:30:00.180 none of the neighbors saw you.
00:30:01.920 You went home
00:30:02.500 with this piece of paper
00:30:04.240 that you could look at
00:30:05.200 and all this stuff.
00:30:05.880 Not that you have any experience
00:30:07.620 in describing it.
00:30:07.720 No, I'm talking about that.
00:30:08.920 The one right on.
00:30:09.960 Theoretically.
00:30:10.620 A friend of mine,
00:30:11.440 theoretically, right.
00:30:12.720 But nobody,
00:30:14.020 when people said,
00:30:14.700 oh, we've got to ban this
00:30:15.560 and they did ban it
00:30:16.320 and they censored things
00:30:17.940 and then they said,
00:30:18.660 oh yeah,
00:30:18.840 we've got to censor Ulysses too.
00:30:20.300 It was silly.
00:30:21.680 You had to get rid of it.
00:30:22.780 Now you've got this sewer of porn
00:30:25.180 wiping people's lives away
00:30:27.180 with no regulation whatsoever.
00:30:29.260 And so now conservatives,
00:30:30.320 when I come out
00:30:30.800 and say things,
00:30:31.680 for instance,
00:30:32.240 like you should not be able
00:30:33.740 to censor opinions on YouTube,
00:30:35.960 conservatives go,
00:30:36.800 oh my, regulation, regulation.
00:30:38.980 Well, no, it's a new thing.
00:30:40.960 It needs new regulations
00:30:42.160 to make sure
00:30:42.680 the freedom of speech lives
00:30:44.340 because if you censor things
00:30:45.460 on YouTube,
00:30:46.060 you have virtually taken them
00:30:47.440 out of the public square.
00:30:48.520 So what do you do
00:30:49.540 with pornography?
00:30:50.600 I mean,
00:30:51.040 I would have said,
00:30:53.300 you know,
00:30:53.620 "so what" about pornography
00:30:54.560 30 years ago.
00:30:56.140 Now I think,
00:30:57.180 holy,
00:30:57.900 this is a toxin
00:30:59.940 being pumped
00:31:01.000 into the human psyche
00:31:02.640 like never before.
00:31:03.680 Dude,
00:31:03.800 I wrote a literal book
00:31:04.780 on pornography
00:31:05.460 and what it was going to do
00:31:06.880 to destroy young people
00:31:08.040 in 2005.
00:31:09.500 Absolutely.
00:31:09.680 And I was mocked for it.
00:31:10.480 I was 21 years old
00:31:11.460 when I wrote that book.
00:31:12.560 No,
00:31:12.780 I've written many words
00:31:13.640 of pornography over the years.
00:31:15.060 These are the questions
00:31:15.840 that we're not addressing now
00:31:17.280 where we know the danger,
00:31:19.180 we can see the danger,
00:31:20.020 it's only going to get worse.
00:31:21.500 These are the issues
00:31:22.320 I think we should be addressing,
00:31:23.420 not whether jobs
00:31:24.600 are going to disappear
00:31:25.200 because everything will change.
00:31:27.260 We don't even know
00:31:28.220 what that's going to look like.
00:31:29.160 All right,
00:31:29.420 Matt,
00:31:29.620 last word.
00:31:30.420 Yeah,
00:31:30.600 on the regulation side of it,
00:31:32.140 I mean,
00:31:32.400 obviously the most,
00:31:33.620 the most,
00:31:34.380 you know,
00:31:34.580 the sort of the most
00:31:35.080 heavy-handed
00:31:35.580 and obvious thing
00:31:36.580 if we're talking about regulation
00:31:37.640 is,
00:31:38.380 you know,
00:31:39.340 the government saying that,
00:31:40.960 hey,
00:31:41.100 okay,
00:31:41.340 you want to wipe out
00:31:42.180 all the driver jobs,
00:31:43.220 you want to wipe out,
00:31:44.300 you want to,
00:31:44.960 you know,
00:31:45.360 you want to get rid of
00:31:46.240 all your customer service jobs
00:31:47.380 if you're McDonald's
00:31:48.220 and it's a law saying,
00:31:50.220 well,
00:31:50.280 you can't do that.
00:31:51.280 We're just,
00:31:51.880 you're not,
00:31:52.420 you can't do that.
00:31:53.260 We're not going to let you do that
00:31:54.420 because we're not going to let you
00:31:56.000 put millions of people
00:31:56.620 out of work
00:31:57.000 all at the same time
00:31:57.660 because we just can't,
00:31:58.760 we can't sustain that
00:31:59.680 as a society.
00:32:00.540 We can't,
00:32:00.900 it can't happen.
00:32:02.020 And now that's very complicated.
00:32:03.660 That's the kind of thing
00:32:05.000 that I normally would not support.
00:32:07.180 And there is this tension
00:32:08.520 between like free markets
00:32:09.800 and then this other huge
00:32:10.900 civilization level concern.
00:32:13.020 So that's just,
00:32:13.860 that's,
00:32:14.160 that's the thing.
00:32:14.660 That's what we're dealing with.
00:32:15.660 And I do think,
00:32:17.040 and I just go back to that
00:32:18.080 this is a different kind of thing.
00:32:19.640 I think any analogy breaks down.
00:32:21.600 Ben,
00:32:21.740 you brought up the internet.
00:32:22.540 Well,
00:32:22.620 the internet is a different kind of thing.
00:32:24.380 The internet is a,
00:32:25.440 you know,
00:32:26.020 a very high tech,
00:32:27.100 sophisticated form of communication.
00:32:30.280 It's just a way of,
00:32:31.140 for people to communicate
00:32:31.860 and connect with each other.
00:32:33.580 And,
00:32:33.680 and so that in and of itself
00:32:35.340 is not going to take away jobs.
00:32:37.020 It might change what the jobs are,
00:32:38.620 but you still need the,
00:32:39.900 you still have humans
00:32:40.940 who are on the internet
00:32:41.780 communicating with each other.
00:32:43.700 And that's the case
00:32:44.680 with all of these
00:32:45.320 technological innovations
00:32:46.320 that it's just a different tool
00:32:47.480 for people to use.
00:32:48.820 And so,
00:32:49.340 yeah,
00:32:49.480 maybe the job where you use
00:32:50.700 the,
00:32:51.020 the more primitive tool goes away,
00:32:52.620 but now you use
00:32:53.360 the more sophisticated tool
00:32:54.260 and that's the job.
00:32:55.360 And I think with AI,
00:32:56.680 it's just different
00:32:57.180 because as I said,
00:32:58.620 it's artificial intelligence,
00:33:00.000 which means the entire point of it
00:33:01.660 is that we don't need a person
00:33:03.220 to do this at all.
00:33:04.360 It's not a new thing for you to do.
00:33:05.760 You're not needed.
00:33:06.920 And because we're facing
00:33:08.200 this totally new kind of thing,
00:33:10.060 which I really believe
00:33:10.740 is unprecedented in human history.
00:33:12.860 I think we might need
00:33:13.900 to embrace solutions
00:33:15.820 we otherwise would,
00:33:17.040 that otherwise would make us uncomfortable.
00:33:18.900 In fairness,
00:33:19.840 we don't know
00:33:20.660 if that's even Matt
00:33:21.580 really talking right now.
00:33:22.720 That could be
00:33:23.500 Grok or Gemini or something.
00:33:25.680 Now I want to get to,
00:33:26.660 it was something we touched on though.
00:33:27.860 It's related,
00:33:28.660 but it's a totally separate topic:
00:33:29.720 affordability.
00:33:31.800 It's the word,
00:33:32.800 it's the meme
00:33:33.320 that everyone's saying.
00:33:33.920 It's the new six, seven.
00:33:35.240 Everyone's just saying
00:33:35.900 affordability all the time.
00:33:37.100 I want to get into
00:33:37.820 what that actually means.
00:33:38.960 But first,
00:33:39.440 I want to restore a little balance
00:33:40.960 to this conversation.
00:33:41.820 Yes, I want to say this.
00:33:43.120 And true, I want you to tell us.
00:33:43.740 Here is something that AI cannot do.
00:33:45.560 It cannot eat your vegetables.
00:33:47.300 It can't even eat my vegetables.
00:33:48.700 In fact, I can't eat your vegetables.
00:33:50.220 It's a very, very complicated thing,
00:33:51.740 these vegetables.
00:33:52.920 And if you want to get enough of them,
00:33:54.420 you need to use balance of nature
00:33:55.900 because I love vegetables,
00:33:57.280 but if I ate enough of
00:33:58.780 the kinds of things
00:33:59.820 that nutrition experts recommend,
00:34:02.020 it would be all over my beard,
00:34:03.520 my face.
00:34:03.900 It would be just disgusting.
00:34:05.140 So instead,
00:34:06.080 I have balance of nature,
00:34:08.300 fruits and veggies.
00:34:09.460 And you may say,
00:34:09.980 well, if you use them all the time,
00:34:11.480 which I do,
00:34:12.300 why aren't they open?
00:34:13.380 It's because I have so many
00:34:14.740 of these damn things
00:34:15.760 that I don't even have to open them.
00:34:17.360 I got more downstairs
00:34:18.120 that are open.
00:34:19.600 Balance of nature,
00:34:20.200 what they do is
00:34:20.700 they freeze dry fruits and veggies,
00:34:22.420 then powder them
00:34:23.140 and blend them
00:34:24.320 into the most convenient
00:34:25.440 nutritional value.
00:34:26.620 You can take the fruits
00:34:27.340 and veggies supplements
00:34:28.160 with water,
00:34:29.180 chew them or open them up
00:34:30.320 and mix the powder
00:34:30.980 into your food or drinks,
00:34:31.900 which just sounds silly to me,
00:34:33.400 but it's still,
00:34:34.040 it's made from 100%
00:34:35.820 whole food ingredients.
00:34:37.560 You wonder how an animated corpse
00:34:39.360 like myself
00:34:39.900 can look like a 30-year-old man.
00:34:41.700 It's because I use balance of nature.
00:34:43.420 So go to balanceofnature.com
00:34:45.780 and get a free fiber and spice supplement.
00:34:48.500 You didn't even have time
00:34:49.380 to talk about the fiber and spices.
00:34:51.020 Plus,
00:34:51.260 you get 35% off your first set
00:34:53.580 as a new preferred customer
00:34:55.000 by using discount code
00:34:56.320 FRIENDLYFIRE.
00:34:57.740 Go to balanceofnature.com
00:34:59.120 and use the discount code
00:35:00.740 FRIENDLYFIRE.
00:35:01.740 Now,
00:35:01.920 Knowles,
00:35:02.160 what were you saying?
00:35:03.040 Well,
00:35:03.540 I was saying
00:35:04.080 with the rest of your money,
00:35:05.540 you need to go to
00:35:06.100 dailywire.com
00:35:06.860 slash subscribe
00:35:07.520 because we have the biggest deal
00:35:08.660 of the year right now.
00:35:09.840 This is the Black Friday deal,
00:35:11.440 50% off.
00:35:12.360 It's really,
00:35:13.160 really big.
00:35:14.380 You're going to get everything.
00:35:15.520 Obviously,
00:35:15.840 we have Pendragon coming out.
00:35:17.140 You're going to get
00:35:17.580 the world premiere
00:35:18.740 of that trailer
00:35:19.460 coming out
00:35:20.020 at the end of the show.
00:35:21.060 Really big stuff though.
00:35:22.080 New docs,
00:35:22.600 new hosts,
00:35:23.140 new everything.
00:35:23.860 It's very exciting.
00:35:25.540 You guys are
00:35:26.360 who empower DW
00:35:27.840 to build culture
00:35:28.740 and so right now
00:35:30.180 you can save 50%.
00:35:31.220 I love building culture
00:35:33.000 but I also like doing it
00:35:34.840 on a good deal.
00:35:35.820 You know,
00:35:36.160 I want to build culture
00:35:37.500 frugally
00:35:38.120 and so when you can
00:35:39.600 do it for 50% off,
00:35:40.860 it's a great time to do it.
00:35:42.020 Go to dailywire.com
00:35:43.340 slash subscribe.
00:35:45.000 Absolutely fitting,
00:35:46.320 apt way to talk
00:35:46.960 about affordability
00:35:47.680 which is a very serious problem.
00:35:50.460 You know,
00:35:50.760 usually sweet little Elisa
00:35:52.120 does the shopping
00:35:52.740 in the house.
00:35:53.620 Occasionally,
00:35:54.060 I had to go out
00:35:54.500 the other day
00:35:54.880 to get lemons
00:35:55.460 for a cocktail
00:35:57.120 that I was making,
00:35:57.620 not even for food,
00:35:58.320 just for a cocktail
00:35:58.900 I was making
00:35:59.360 and so, I mean,
00:36:01.080 it's just a great cocktail.
00:36:02.240 That's a story
00:36:02.900 for another time.
00:36:03.700 Anyway,
00:36:04.020 I go to the grocery store
00:36:04.900 and the prices are insane.
00:36:06.980 I see why Elisa
00:36:07.680 had been keeping me
00:36:08.560 from them largely.
00:36:09.960 I mean,
00:36:10.280 you know,
00:36:11.140 the affordability problem
00:36:12.200 is very real.
00:36:13.200 It's not just that it's
00:36:13.800 being pounced on
00:36:14.900 by political actors
00:36:15.720 and it's obviously
00:36:16.280 become a big
00:36:16.760 political talking point
00:36:17.540 but it's very,
00:36:18.540 very real.
00:36:19.140 A ton of Americans
00:36:20.000 are hurting.
00:36:21.380 A lot of the fundamentals
00:36:22.520 of the economy
00:36:23.180 are a little shaky
00:36:24.020 right now
00:36:24.500 even though
00:36:24.980 those Mag 7 stocks
00:36:26.300 that we were just talking about
00:36:27.160 AI is pumping up
00:36:28.040 the market.
00:36:28.760 It's really,
00:36:29.700 really tough
00:36:30.020 and so there are
00:36:30.940 a bunch of related questions.
00:36:32.200 One,
00:36:33.000 can the government
00:36:33.880 do something
00:36:34.520 to fix this
00:36:35.560 or is the government
00:36:38.220 only going to make things worse?
00:36:40.080 How is this going to affect
00:36:41.080 the midterms
00:36:41.580 and the 2028 election?
00:36:43.320 Are we headed
00:36:44.480 for an economic disaster?
00:36:46.320 Ben,
00:36:46.620 you got in a huge amount
00:36:47.560 of trouble
00:36:47.980 because there was
00:36:48.660 a short clip of you
00:36:49.760 going around
00:36:50.380 saying,
00:36:51.480 yeah,
00:36:51.660 listen,
00:36:52.120 you know,
00:36:52.300 if you can't afford stuff
00:36:53.300 move out of your town
00:36:55.280 even if it's your hometown,
00:36:56.300 even if your family's been there
00:36:57.120 for a long time,
00:36:57.920 just get,
00:36:58.460 you got to get out,
00:36:59.200 you got to be mobile
00:36:59.940 and you were variously
00:37:02.140 exalted and pilloried
00:37:03.900 for this comment.
00:37:04.860 So,
00:37:05.200 what's it mean?
00:37:06.380 Yeah,
00:37:06.600 so let me start
00:37:07.680 with what that comment meant.
00:37:10.000 That was a piece
00:37:10.440 of personal advice
00:37:11.240 to people
00:37:11.680 that I think
00:37:12.480 every single young person
00:37:13.740 that I know
00:37:14.520 has at some point taken
00:37:15.580 which is,
00:37:16.180 if you're living in a place
00:37:16.980 that you can't afford
00:37:17.860 and the policies
00:37:18.840 aren't going to change
00:37:19.680 and you want to make
00:37:20.500 your life better,
00:37:21.100 you do have to make
00:37:21.660 a significant calculation
00:37:22.780 as to whether you think
00:37:23.660 your life is going to get
00:37:24.260 better where you are
00:37:24.900 or whether you're going
00:37:25.360 to have to go pursue
00:37:26.040 a dream someplace else
00:37:27.700 and you've seen this.
00:37:28.800 You've seen tremendous
00:37:29.240 population movement
00:37:29.940 in this country right now
00:37:30.880 out of New York
00:37:31.560 to places like Austin, Texas.
00:37:33.500 You've seen tremendous
00:37:34.060 population movement
00:37:34.820 from the blue areas
00:37:35.560 to the red areas
00:37:36.220 of the country
00:37:36.600 specifically because people
00:37:37.880 are seeking economic opportunity.
00:37:39.240 So what I thought
00:37:39.680 I was saying
00:37:40.160 was something
00:37:40.740 that's fairly obvious
00:37:41.860 which is that
00:37:42.660 if you are on a personal level
00:37:44.180 in a place
00:37:44.960 where you're stuck
00:37:45.540 and you can't afford
00:37:46.240 to live there,
00:37:47.140 you have to make
00:37:47.660 the best decision
00:37:48.240 for yourself and your family
00:37:49.380 and that does include
00:37:50.480 the possibility
00:37:51.100 of actually moving
00:37:52.200 as opposed to
00:37:52.900 shouting at the wind
00:37:53.760 if the policy
00:37:54.260 isn't going to change.
00:37:54.900 That's a separate question
00:37:55.860 from what sort of policies
00:37:57.000 could be pursued
00:37:57.740 in order to make things
00:37:58.760 more affordable.
00:37:59.680 I mean, I'll start with this.
00:38:00.940 If you're talking
00:38:01.400 about Manhattan,
00:38:02.380 Manhattan will never be
00:38:03.100 as affordable as Des Moines.
00:38:04.560 It just is not going to
00:38:05.400 and anybody who says
00:38:06.380 that it is going to
00:38:07.020 is totally lying to you.
00:38:08.080 It's just a flat out lie.
00:38:10.080 The reality is
00:38:11.080 there are only two ways
00:38:12.060 to make things
00:38:12.520 more affordable.
00:38:13.520 One is to drop
00:38:14.360 the demand for a product
00:38:15.280 and retain the same supply.
00:38:16.920 The other is to radically
00:38:17.920 increase the supply
00:38:18.740 of a product
00:38:19.200 and to retain the same demand.
00:38:20.500 That's it.
00:38:20.980 Those are the only ways
00:38:21.660 that things become
00:38:22.180 more affordable.
00:38:22.740 There is no magical third way.
00:38:23.920 The only way things
00:38:25.020 become more affordable
00:38:25.700 is if the supply
00:38:26.720 greatly outstrips the demand
00:38:27.860 and the only ways
00:38:28.720 to do that
00:38:29.060 are to increase supply
00:38:29.800 or reduce demand.
00:38:30.940 That's it.
00:38:31.500 So if you're talking
00:38:32.560 about how to make
00:38:33.080 things more affordable,
00:38:34.020 one of the things
00:38:34.600 you can do to increase supply
00:38:35.580 is remove regulations.
00:38:36.880 You can get rid
00:38:37.780 of tax structures
00:38:38.580 that disincentivize investment.
00:38:40.420 You can get rid
00:38:41.360 of a lot of the difficulty
00:38:42.840 in building,
00:38:43.580 for example,
00:38:44.080 in New York.
00:38:44.520 But are you ever going
00:38:45.000 to build enough units
00:38:45.720 so that suddenly
00:38:46.320 the real estate prices
00:38:47.640 there reflect
00:38:48.200 what it would be
00:38:48.900 across the river
00:38:49.560 in sort of rural
00:38:50.800 parts of New Jersey?
00:38:51.540 The answer,
00:38:52.040 of course,
00:38:52.640 is no.
00:38:53.800 And when people
00:38:54.380 talk about affordability,
00:38:55.360 the thing that makes me
00:38:55.900 totally crazy about this
00:38:56.960 is that I'm sick to shit
00:38:59.920 of people in politics
00:39:00.900 doing this routine
00:39:01.640 where they say
00:39:02.540 the problem
00:39:03.240 over and over and over.
00:39:04.200 while providing
00:39:04.580 zero solution.
00:39:06.000 And then when you say,
00:39:06.820 you know what,
00:39:07.140 I don't really see
00:39:07.720 a solution to the thing
00:39:08.400 you're talking about,
00:39:09.000 they pillory you
00:39:09.740 for noting the obvious.
00:39:11.240 Like, okay,
00:39:11.640 if you're not providing,
00:39:12.360 Zohran Mamdani
00:39:12.900 is not providing a solution.
00:39:14.200 Him saying affordability
00:39:15.080 didn't make affordability
00:39:16.060 magically appear
00:39:16.900 like Beetlejuice
00:39:17.620 if you said affordability
00:39:18.600 three times.
00:39:20.000 And also,
00:39:20.760 politicians are in the business
00:39:21.840 of lying to you.
00:39:23.100 Okay,
00:39:23.280 when the president
00:39:24.040 of the United States,
00:39:25.200 who I generally agree with,
00:39:26.740 he made a mistake
00:39:27.300 when he came into office
00:39:27.940 and said,
00:39:28.340 I'm going to make things
00:39:29.220 affordable again.
00:39:30.360 The answer is,
00:39:31.260 no, you're probably not.
00:39:32.240 And the reason you're probably not
00:39:33.320 is because
00:39:33.840 all of the inflation
00:39:35.080 that Joe Biden
00:39:35.680 embedded in the economy
00:39:36.700 already made things
00:39:37.500 so wildly unaffordable
00:39:38.680 that the best you're probably
00:39:39.860 going to do
00:39:40.160 is keep prices stable.
00:39:41.680 Right?
00:39:41.800 What the Federal Reserve
00:39:42.420 seeks to do
00:39:43.120 is keep the inflation rate
00:39:44.840 at like 2%,
00:39:45.580 which is an increase
00:39:46.780 in the prices
00:39:47.480 just by the very nature of it.
00:39:48.900 And what people actually want
00:39:50.020 is for there to be deflation.
00:39:51.880 They want the prices
00:39:52.540 to be back at 2019 levels.
00:39:54.300 And they're not talking
00:39:55.020 about going back
00:39:55.520 to 2024 levels.
00:39:56.960 They're talking about 2019 levels.
00:39:58.180 The only way to get back
00:39:58.940 to 2019 levels
00:39:59.740 is probably
00:40:00.620 an economic recession.
00:40:02.340 That's just the reality.
00:40:03.740 And so,
00:40:04.200 again,
00:40:04.920 saying unpopular things,
00:40:06.440 the best that the inflation rate
00:40:07.920 could look like
00:40:08.340 for President Trump
00:40:09.020 is like this
00:40:09.880 under Joe Biden
00:40:10.920 and then
00:40:11.960 like this
00:40:13.020 under Trump.
00:40:14.340 Okay?
00:40:14.760 So here's,
00:40:15.480 okay,
00:40:16.140 this would be Biden,
00:40:17.200 this gigantic spike,
00:40:18.360 and then Trump stays steady.
00:40:19.700 The problem is
00:40:20.440 people are looking
00:40:21.000 at the prices here
00:40:21.740 and they're saying,
00:40:22.900 well,
00:40:23.100 they don't look like
00:40:23.640 the prices here.
00:40:25.080 Well,
00:40:25.560 yeah,
00:40:25.760 what's Trump supposed
00:40:26.340 to do about that
00:40:27.340 absent a radical increase
00:40:28.980 in the interest rates
00:40:30.420 that would sink
00:40:31.060 the economy?
00:40:32.140 So one thing
00:40:32.900 that has happened,
00:40:33.740 everyone was predicting
00:40:34.840 that Trump's tariffs
00:40:35.960 were going to be inflationary
00:40:37.580 and the Treasury Secretary,
00:40:38.560 Scott Bessent,
00:40:39.180 was doing a little victory lap
00:40:41.100 because when he was
00:40:42.340 being confirmed
00:40:43.260 for his position,
00:40:44.180 he said,
00:40:44.620 no,
00:40:44.820 I actually think
00:40:45.280 tariffs are going
00:40:45.760 to be deflationary
00:40:46.660 and the San Francisco Fed
00:40:47.840 just came out
00:40:48.320 and said the tariffs
00:40:48.840 are deflationary.
00:40:49.640 No,
00:40:49.880 no,
00:40:50.080 you're reading
00:40:50.600 the study totally wrong.
00:40:51.600 That's not what
00:40:51.960 the study says.
00:40:52.620 I read the entire study.
00:40:53.540 It's 150 pages.
00:40:54.840 What that study says
00:40:55.860 is that
00:40:56.720 when you look
00:40:57.540 at tariffs,
00:40:59.980 over time,
00:41:01.020 there's a spike
00:41:01.640 at the beginning
00:41:02.100 because things
00:41:02.580 get more expensive
00:41:03.140 because you're
00:41:03.460 reducing the supply
00:41:04.260 and the demand
00:41:04.760 stays the same,
00:41:05.520 right?
00:41:05.640 So the price
00:41:05.940 goes up temporarily
00:41:06.560 and then people
00:41:07.320 start to lose
00:41:07.900 their jobs
00:41:08.540 and when people
00:41:09.160 start to lose
00:41:09.740 their jobs,
00:41:10.400 the demand goes down
00:41:11.160 and when the demand
00:41:11.680 goes down,
00:41:12.140 the prices come down.
00:41:13.260 No,
00:41:13.400 no,
00:41:13.440 no.
00:41:13.560 So you can say
00:41:14.160 it's deflationary.
00:41:14.800 There was a big caveat
00:41:15.560 even in the popular
00:41:16.380 reporting which is
00:41:17.120 the caveat is
00:41:18.220 it hurts employment
00:41:19.360 and it hurts
00:41:20.080 the economic growth.
00:41:21.240 Yeah,
00:41:21.360 yeah,
00:41:21.400 so there's obviously
00:41:22.000 a big caveat to it.
00:41:22.660 That's been the traditional...
00:41:23.400 There's one further point
00:41:25.000 on it just to why
00:41:26.480 I think your video
00:41:27.220 went viral,
00:41:27.860 Ben,
00:41:28.300 is because one thing
00:41:29.500 people are hearing
00:41:30.160 is they're not,
00:41:31.820 they're missing
00:41:32.440 the context of
00:41:33.220 you're giving
00:41:33.520 personal advice
00:41:34.220 to someone who's asking,
00:41:35.080 you know,
00:41:35.320 but at a macro level,
00:41:36.920 at a political level,
00:41:38.040 what people are hearing
00:41:38.820 is,
00:41:39.140 hold on,
00:41:39.700 you're telling me
00:41:40.220 my family's been
00:41:40.960 in this town forever.
00:41:42.080 I'll use my own example.
00:41:43.180 I got,
00:41:43.500 I have dozens
00:41:44.000 of family members
00:41:44.940 buried in the local
00:41:46.540 cemetery in my hometown
00:41:48.640 and even before that,
00:41:50.280 the Knowles'
00:41:50.900 initially were from
00:41:51.940 New Hampshire
00:41:52.360 and they,
00:41:53.420 you know,
00:41:53.700 they arrived here,
00:41:54.500 the Knowles side,
00:41:55.500 in 1660.
00:41:56.840 The Knowles family home
00:41:58.240 stood from 1660
00:42:00.020 until 1994
00:42:01.220 when the home
00:42:02.300 burned down.
00:42:02.960 There's still Knowles'
00:42:04.100 all over that area
00:42:05.400 in New Hampshire
00:42:06.400 and Maine
00:42:06.800 and what I think
00:42:08.140 a lot of people
00:42:08.600 are looking around at
00:42:09.320 is part of the reason
00:42:10.960 that housing in particular
00:42:12.200 is unaffordable right now
00:42:13.860 is because of
00:42:15.280 government decisions.
00:42:16.560 Government decisions
00:42:17.260 to flood the country
00:42:18.460 with a bunch of
00:42:19.180 like Venezuelan criminals
00:42:20.320 or Somalis or something
00:42:21.460 and increase the cost
00:42:22.580 of housing
00:42:23.040 or government decisions
00:42:24.440 that are going to
00:42:26.260 compromise certain industries
00:42:27.400 or certain jobs
00:42:28.080 because of trade deals
00:42:28.940 or whatever,
00:42:29.460 going all the way back
00:42:30.060 to NAFTA
00:42:30.540 or even further.
00:42:31.580 We don't need to
00:42:32.480 litigate those
00:42:33.380 in particular,
00:42:34.560 but you're saying,
00:42:35.060 no,
00:42:35.200 there's part of this
00:42:35.860 political order
00:42:36.640 that has led to this crisis.
00:42:39.320 At the very least
00:42:39.840 with migration.
00:42:40.820 And so,
00:42:41.460 why is it
00:42:42.240 that I'm just supposed
00:42:43.040 to say,
00:42:43.440 oh, shucks,
00:42:43.900 I've got to lose
00:42:44.380 my hometown
00:42:45.060 because,
00:42:46.020 well,
00:42:46.560 you know,
00:42:46.880 Republicans and Democrats
00:42:47.920 together flooded
00:42:48.540 the country
00:42:48.980 with aliens.
00:42:51.400 Isn't there a good
00:42:52.960 to having,
00:42:53.960 you know,
00:42:54.460 long family histories
00:42:55.700 in a single place?
00:42:56.960 Of course,
00:42:57.300 sure.
00:42:57.800 And there's a good
00:42:59.120 to having your family
00:42:59.900 live near you.
00:43:00.420 I have tons of family
00:43:01.160 that lives near me.
00:43:02.040 I'm a person
00:43:02.520 who grew up in LA.
00:43:04.180 I spent my entire life
00:43:05.200 living in LA
00:43:05.840 until I was 35,
00:43:06.760 one mile from my parents.
00:43:07.680 And then I moved to Florida
00:43:08.380 and I still live one mile
00:43:09.260 from my parents
00:43:09.720 because I took them with me.
00:43:10.480 So I'm very much in favor.
00:43:11.600 One of the things
00:43:11.980 I talk about on the show
00:43:12.660 all the time
00:43:13.180 is having family structures nearby
00:43:14.800 because you need
00:43:15.560 those supportive family structures.
00:43:17.060 That's not the case
00:43:17.680 that I'm making
00:43:18.280 is that you should abandon
00:43:19.360 this sort of stuff
00:43:20.260 or that mass migration
00:43:21.520 should replace you
00:43:22.140 in your hometown.
00:43:22.700 I think everyone here
00:43:23.620 is very much against
00:43:24.740 mass migration
00:43:25.460 and is very much in favor
00:43:26.440 of what President Trump
00:43:27.180 has been doing
00:43:27.620 on the immigration program.
00:43:29.020 The problem that I see
00:43:29.960 is not any of that.
00:43:31.220 I agree with all this
00:43:31.800 on policy,
00:43:32.560 but if there's a mentality
00:43:33.280 that sets in
00:43:33.920 that says,
00:43:34.740 I bear no responsibility
00:43:35.560 in changing my own life
00:43:36.680 if I can't change
00:43:37.340 the outside circumstances
00:43:38.440 and now I'm just going to sit here
00:43:39.640 and bitch about it,
00:43:40.500 like that doesn't seem
00:43:41.140 like a specific recipe
00:43:42.480 for individual success.
00:43:43.780 But Matt,
00:43:44.480 I want to know what you'd say
00:43:45.360 because I think you and I
00:43:46.400 are, as usual,
00:43:47.700 we are on opposite ends
00:43:48.420 of the spectrum in some ways.
00:43:49.700 I agree with your practical point
00:43:51.500 and I agree also with,
00:43:53.320 maybe I'm somewhere in between
00:43:54.620 because I agree with your point.
00:43:56.080 I also agree with
00:43:56.780 some of the criticism,
00:43:58.120 the more rational criticism.
00:43:59.400 You have a moderate position, Matt?
00:44:00.880 Well, no,
00:44:01.400 because here's the way I put it.
00:44:02.980 Ben's correct
00:44:03.600 and I've said the same thing
00:44:04.660 many times that,
00:44:05.820 especially as a young man,
00:44:07.140 I also think there's
00:44:07.680 a gender element to this
00:44:08.780 that is a sort of
00:44:09.880 a different topic.
00:44:10.660 But like as a parent,
00:44:12.880 I want my sons
00:44:14.340 when they become adults
00:44:15.120 to move out of the house.
00:44:15.940 I don't want them
00:44:16.320 to move 10 hours away,
00:44:17.640 hopefully,
00:44:18.020 but if they have to,
00:44:18.560 they have to.
00:44:19.380 I do want them to like move out
00:44:20.820 and, you know,
00:44:21.400 experience living on their own
00:44:22.500 a little bit
00:44:22.960 before they become,
00:44:24.340 before they become husbands
00:44:25.140 and fathers.
00:44:25.980 My daughters,
00:44:26.820 I would love for them
00:44:27.800 to just stay home with me
00:44:29.280 until they get married
00:44:30.720 many, many, many years
00:44:31.900 in the future.
00:44:32.440 So I do think
00:44:32.800 there's like a gender element to it,
00:44:33.700 but that's a separate thing.
00:44:35.180 I think if,
00:44:36.760 I totally agree
00:44:37.860 that if you're in a spot,
00:44:39.560 particularly if you're a young man
00:44:40.660 and you can't afford anything,
00:44:42.740 you can't get a job,
00:44:44.480 can't afford to live anywhere,
00:44:46.060 while you're single,
00:44:47.400 you have no kids,
00:44:48.460 you have no dependents,
00:44:49.660 you can go anywhere
00:44:50.580 and do anything
00:44:51.480 and you can take risks
00:44:52.800 and, you know,
00:44:54.040 the stakes are pretty low.
00:44:55.860 I mean,
00:44:56.280 worst case scenario,
00:44:57.140 you go somewhere,
00:44:57.920 you end up sleeping in your car
00:44:58.980 or something for a while.
00:45:00.000 I mean,
00:45:00.060 that's not good,
00:45:00.980 but it's like,
00:45:01.620 well, it's just you.
00:45:02.360 You can handle that,
00:45:03.480 especially as a young man.
00:45:04.480 So you could take risks,
00:45:05.840 you can go out
00:45:06.240 and pursue opportunities.
00:45:09.580 However,
00:45:10.080 at the same time,
00:45:11.300 it's also true
00:45:12.180 that you shouldn't have to do that.
00:45:14.760 Like something is wrong
00:45:16.000 that so many people
00:45:17.460 have to do that.
00:45:18.420 You should be able to,
00:45:19.640 to Michael's point,
00:45:21.560 if you're a young man
00:45:22.480 and you're looking at,
00:45:23.000 okay,
00:45:23.140 well,
00:45:23.280 my parents were born here,
00:45:24.460 they lived here.
00:45:25.340 My grandparents lived here.
00:45:26.980 Maybe my great-grandparents
00:45:28.680 lived here.
00:45:28.960 So generations of a family
00:45:30.500 lived in the same place
00:45:31.600 and now all of a sudden,
00:45:32.780 and I have the same kind of skills
00:45:34.900 that they do.
00:45:35.480 I might even be more educated
00:45:37.060 than they were.
00:45:37.920 So I'm in many ways
00:45:39.040 more qualified for a job
00:45:40.280 than even any of them were.
00:45:41.460 And yet all of a sudden,
00:45:42.600 everything's broken down
00:45:43.520 and it doesn't work for me
00:45:44.480 to live in this town anymore.
00:45:45.840 Something is wrong.
00:45:46.820 Something is broken.
00:45:47.640 It should not be this way.
00:45:48.900 We need to fix it.
00:45:50.280 So,
00:45:50.580 but on the practical level,
00:45:51.620 well,
00:45:51.820 it is this way now
00:45:52.980 and we want you to still succeed.
00:45:55.060 So you might have to go somewhere else,
00:45:56.920 hopefully with the intent
00:45:57.820 of eventually coming back
00:45:58.860 to live around your family
00:46:00.060 because I totally believe,
00:46:01.420 I mean,
00:46:01.600 we emphasize the nuclear family so much,
00:46:03.760 which is important,
00:46:04.620 but also the quote-unquote
00:46:06.080 extended family is also important.
00:46:08.080 So getting back to them,
00:46:09.420 and that's what a lot of us did,
00:46:10.860 what I kind of did,
00:46:11.480 move around,
00:46:11.960 move around,
00:46:12.520 end up back with your family.
00:46:14.100 So you might have to do that practically.
00:46:16.800 You shouldn't have to.
00:46:17.940 It shouldn't be that way.
00:46:19.120 That's the policy end of it.
00:46:21.020 And so we need policies in place
00:46:22.440 that make it possible
00:46:23.580 for people to live with their family
00:46:26.180 and then move next door
00:46:27.480 and stay with generations of families
00:46:29.940 their entire life.
00:46:30.660 You should be able to do that
00:46:31.660 in a functioning and thriving society.
00:46:33.800 One of the ways to make that happen
00:46:35.080 is the thing we all agree with,
00:46:36.920 get all the illegals out.
00:46:39.120 There's a lot,
00:46:39.720 they've been saying
00:46:40.480 20 million illegals in this country.
00:46:42.240 They've been telling me that
00:46:42.900 since like 20 years ago,
00:46:44.420 they were saying it was 20 million.
00:46:45.620 It's way more than that.
00:46:46.580 We don't know how many.
00:46:47.680 Get them all out,
00:46:48.900 shut down immigration.
00:46:50.140 And that's one of the policy changes
00:46:51.720 that can be made
00:46:52.320 and we need to do that.
00:46:53.480 But until that happens,
00:46:54.520 yeah, you got to figure out
00:46:55.520 what you're going to do
00:46:55.960 in your own life.
00:46:56.540 It's actually a rare moment
00:46:57.680 of total agreement.
00:46:58.620 I want to hear Ben's point
00:46:59.660 and I want to hear
00:47:00.420 from my great-great-grandfather,
00:47:02.120 Andrew, play this.
00:47:02.660 No, no, no.
00:47:03.120 I wasn't even going to make a point.
00:47:04.500 I was just going to say
00:47:05.520 I agree with Matt, actually.
00:47:06.500 So Matt and I are actually
00:47:07.380 in total agreement on this.
00:47:08.340 Okay, now I really want to move on
00:47:09.920 because Matt's offering
00:47:10.860 a moderate opinion
00:47:11.580 and Ben is agreeing with him.
00:47:12.960 I want to tell you
00:47:14.120 at the other end
00:47:14.980 of the age spectrum
00:47:15.840 about pre-born.
00:47:17.500 I want you to go to
00:47:18.220 pre-born.com
00:47:20.040 slash fire right now
00:47:22.240 because pre-born is one
00:47:23.820 of my absolute favorite charities.
00:47:26.820 I personally support it.
00:47:28.020 I encourage you
00:47:28.500 to personally support it,
00:47:29.360 to give what you can.
00:47:30.320 They've saved over 380,000 babies
00:47:32.920 through their rescue program.
00:47:35.620 What they do is pretty simple.
00:47:37.560 They introduce babies
00:47:38.800 to their mothers.
00:47:41.060 And when a woman
00:47:42.120 sees an ultrasound,
00:47:43.040 it doubles the baby's chance
00:47:44.200 of life.
00:47:44.940 When a woman
00:47:45.340 is considering abortion,
00:47:46.900 they provide amazing care
00:47:48.940 and work.
00:47:49.680 Not only do they
00:47:50.560 introduce the babies
00:47:51.420 to the mothers,
00:47:52.000 they also take care
00:47:52.860 of those mothers afterward,
00:47:54.340 radically increasing
00:47:55.280 the chances
00:47:55.780 that that baby
00:47:56.880 is going to live
00:47:57.600 and that they will
00:47:58.220 have a successful life.
00:47:59.680 This giving season,
00:48:02.140 do not let another life
00:48:03.320 be lost.
00:48:04.300 Be the hope for worried mothers
00:48:05.420 and at-risk babies.
00:48:06.620 To donate securely,
00:48:07.880 there are two ways to do it.
00:48:09.000 If you like your phone,
00:48:09.900 if you're a little more
00:48:10.500 of a Luddite than some of us,
00:48:11.700 you're not down
00:48:12.280 on the AI train,
00:48:13.480 you dial pound 250,
00:48:15.240 you say keyword baby,
00:48:16.420 pound 250, keyword baby,
00:48:17.740 or you go to
00:48:18.060 preborn.com slash fire,
00:48:19.960 preborn.com slash fire.
00:48:21.460 Every gift is tax deductible,
00:48:23.300 so it's another way
00:48:24.500 of not having to pay
00:48:25.660 all those bureaucrats
00:48:26.360 in Washington.
00:48:27.100 Your money can be put
00:48:28.320 to good use
00:48:28.820 and not be put
00:48:29.460 to bad use.
00:48:30.460 Okay, Ben agrees
00:48:31.800 with Matt.
00:48:32.620 Matt has a moderate opinion.
00:48:34.080 I am totally scandalized
00:48:35.540 and I want to hear
00:48:36.620 from Drew.
00:48:37.580 So I disagree with Ben
00:48:38.760 in a couple of ways here.
00:48:40.000 I mean, first of all,
00:48:41.680 Zoramandani is one
00:48:43.380 of the scummiest politicians
00:48:44.500 I've ever seen
00:48:45.260 in my entire life,
00:48:46.380 but he did do half the job.
00:48:48.060 He did raise the issue
00:48:49.340 and when you raise the issue,
00:48:51.120 people perk up.
00:48:52.640 No, it's a terrible thing.
00:48:53.980 He raised the issue
00:48:54.600 and then offered
00:48:55.060 socialist solutions
00:48:56.500 that we know will be
00:48:57.480 utterly, utterly destructive.
00:48:59.120 It's not playing Candyman
00:49:00.800 to say the word
00:49:01.720 that people are thinking about.
00:49:03.220 The worst thing
00:49:04.080 a politician can do
00:49:05.020 and the thing
00:49:06.320 that will destroy
00:49:06.320 any administration
00:49:07.160 is to show people
00:49:08.360 a chart that shows them
00:49:09.800 they're not suffering
00:49:10.640 when they can't afford
00:49:11.580 Christmas presents
00:49:12.260 for their kids.
00:49:13.040 Like, here's this chart,
00:49:13.960 you're doing great,
00:49:14.940 you know,
00:49:15.140 people know exactly
00:49:16.300 how they're doing
00:49:16.920 and it makes them
00:49:17.980 incredibly frustrated.
00:49:19.480 What they're frustrated
00:49:20.240 with Trump now
00:49:21.200 is he's doing something
00:49:22.320 I think is urgently important.
00:49:23.880 I think we're going
00:49:24.340 to be very grateful
00:49:25.080 to Trump
00:49:25.520 for what he did
00:49:26.340 five, six, seven years
00:49:27.660 down the line
00:49:28.200 when China finally
00:49:29.000 invades Taiwan.
00:49:30.480 I think he's totally
00:49:31.380 rearranged America's
00:49:33.000 priorities
00:49:33.780 in absolute great ways,
00:49:35.520 but he didn't pay attention
00:49:36.500 to the thing
00:49:37.240 that's right there
00:49:37.780 on the table
00:49:38.260 and he has to pay
00:49:38.940 attention to it now.
00:49:40.000 The other thing
00:49:40.480 I disagree with
00:49:41.200 is normally it is true
00:49:42.780 that you have to put
00:49:43.420 people out of work
00:49:44.060 to bring down inflation.
00:49:45.100 That's what Reagan did
00:49:45.900 and he lost the midterms.
00:49:47.660 He didn't lose
00:49:48.180 the houses
00:49:49.400 but he lost the midterms
00:49:50.480 because of it
00:49:51.320 and everybody said,
00:49:52.060 oh, this is a disaster
00:49:52.880 and then the economy
00:49:53.840 turned around
00:49:54.240 for the next 25 years
00:49:55.620 because of what Reagan did.
00:49:57.500 But the other thing
00:49:58.360 that there is a third way
00:50:00.020 of dealing with inflation
00:50:02.700 which is raising
00:50:03.580 the investments
00:50:05.460 and the salaries
00:50:06.520 of people.
00:50:07.320 If you can steady,
00:50:08.180 you know,
00:50:08.320 if you can cut
00:50:08.920 inflation off
00:50:10.220 and make the prices
00:50:11.360 level out
00:50:11.920 and then wages
00:50:12.700 start to rise
00:50:13.560 then you can actually,
00:50:15.120 that is the same thing
00:50:16.160 as bringing down inflation
00:50:16.940 because now people
00:50:17.460 can afford
00:50:18.260 the things
00:50:18.900 they couldn't afford before.
00:50:20.240 So Matt is totally right
00:50:22.600 that we got to get rid
00:50:23.380 of all the illegals
00:50:24.260 and as far as I'm concerned
00:50:25.260 I don't care who it is.
00:50:26.640 I've lost all sympathy
00:50:27.620 for the illegal immigrants.
00:50:29.640 I know some of these people
00:50:30.400 are great people
00:50:31.040 who snuck in.
00:50:32.000 They got to go.
00:50:32.620 Everybody's got to go
00:50:33.480 and we got to give
00:50:34.200 the country back
00:50:34.760 to the people
00:50:35.200 who are here
00:50:35.680 and who were born here.
00:50:36.420 No question about that
00:50:37.480 in my mind.
00:50:38.220 I cannot have compassion
00:50:39.560 for 20 million people.
00:50:40.880 I can only have compassion
00:50:41.780 for one person at a time.
00:50:43.140 If one guy sneaks in
00:50:44.200 I can have compassion
00:50:45.220 for him.
00:50:45.760 I can't have compassion
00:50:46.620 for an invading army
00:50:47.800 which is what the Biden
00:50:48.960 administration gave us.
00:50:50.360 But the other thing
00:50:51.120 is we have to have
00:50:52.360 capitalist solutions
00:50:53.220 and I think there are
00:50:53.980 capitalist solutions.
00:50:55.000 For instance,
00:50:55.860 I think a lot of companies
00:50:56.980 are now offering people
00:50:58.260 stock.
00:50:59.240 A lot more companies
00:51:00.080 are offering people
00:51:00.840 stock and investment
00:51:01.760 as payment
00:51:02.660 as part of the payment.
00:51:03.540 I got that
00:51:03.960 when I worked for Coca-Cola.
00:51:06.060 I was a reader
00:51:06.780 for Columbia Pictures
00:51:07.680 and Coca-Cola owned them
00:51:09.120 and they gave me Coke stock.
00:51:10.000 It was transformative.
00:51:11.960 I mean,
00:51:12.220 all I had to do
00:51:13.380 was hold on to it
00:51:14.100 and now I had an investment
00:51:15.320 in the company
00:51:16.060 and in the economy
00:51:17.340 and I think that's
00:51:17.960 really important.
00:51:18.900 Trump is talking about
00:51:19.680 personal savings accounts
00:51:20.800 that I think is also
00:51:22.140 a really good idea.
00:51:23.380 Some of his ideas
00:51:24.020 like the 50-year mortgage
00:51:25.020 I'm not too happy about
00:51:25.960 because it's going to
00:51:26.600 double the price of homes
00:51:28.340 but still,
00:51:29.320 it might bring down...
00:51:30.160 Lifetime debt slavery.
00:51:31.120 That's right.
00:51:31.780 Lifetime debt slavery.
00:51:32.800 Exactly, yeah.
00:51:34.300 But I think that there are ways
00:51:35.920 for capitalists
00:51:37.360 to increase people's participation
00:51:39.280 in the economy
00:51:40.720 so that when things work
00:51:42.820 for the bosses,
00:51:44.260 they work for the people too.
00:51:45.740 I think it's a wonderful thing
00:51:47.280 that this country,
00:51:48.020 when it is working
00:51:49.000 on all cylinders
00:51:49.800 and when the capitalism
00:51:50.980 is in place,
00:51:51.880 it makes so much money
00:51:53.740 that the big guys
00:51:55.020 can afford to share
00:51:55.800 a little bit
00:51:56.640 with the little guys,
00:51:58.040 not by having the government
00:51:59.260 redistribute it,
00:52:00.180 but by saying,
00:52:00.840 here's a piece
00:52:01.280 of what you're working for.
00:52:02.500 Starbucks did it.
00:52:03.300 It worked really well
00:52:04.060 for a long time
00:52:05.040 and I think a lot of companies
00:52:06.820 should do it.
00:52:07.700 And so I think that there are
00:52:08.680 ways of dealing with this
00:52:09.620 but I think that dealing with it
00:52:11.300 is something government
00:52:12.180 has to do.
00:52:13.080 It is a policy problem.
00:52:14.200 Government creates inflation.
00:52:16.180 People do not,
00:52:17.000 it's not the greedy banks,
00:52:18.340 it's not the greedy
00:52:19.040 drugstores or whatever.
00:52:20.620 It's the government
00:52:21.460 that creates inflation.
00:52:22.860 They can actually do things
00:52:24.200 to bring it down
00:52:24.740 and I think one thing,
00:52:25.780 you're right,
00:52:26.220 we don't want deflation
00:52:27.320 because it means
00:52:27.800 the economy is tanking
00:52:29.260 but you can get wages
00:52:30.400 growing in a lot
00:52:31.160 of different ways,
00:52:32.000 one of them by reducing
00:52:32.840 the workforce by getting
00:52:33.780 rid of the people
00:52:34.380 who shouldn't be here
00:52:35.580 would be a great first step.
00:52:37.220 I don't disagree
00:52:38.160 with some of those
00:52:39.120 policy prescriptions
00:52:40.040 but I think that
00:52:40.660 the thing that I am
00:52:42.180 kind of stuck in
00:52:42.940 and it's driving me
00:52:43.500 a little crazy
00:52:43.920 and I think it's the reason
00:52:45.440 why the country
00:52:45.960 is penduluming side to side
00:52:47.860 incredibly wildly.
00:52:49.480 You'll see,
00:52:50.000 you'll see like right now
00:52:51.760 that, you know,
00:52:52.580 Kalshi is one of our sponsors
00:52:53.440 so I'll mention them again here
00:52:54.360 because I did on my show earlier
00:52:55.520 but if you look at the polls
00:52:56.620 like the Kalshi markets
00:52:57.900 right now,
00:52:58.760 Democrats according to that market
00:52:59.940 and I kind of agree with this
00:53:00.980 are actually the favorites
00:53:01.700 in 2028
00:53:02.400 and I think the reason
00:53:04.000 for that
00:53:04.400 and I think the reason
00:53:05.100 that the country
00:53:05.540 just keeps swinging wildly
00:53:06.740 poll to poll
00:53:07.380 is because
00:53:08.220 when you have politicians
00:53:10.000 who are actually
00:53:10.780 saying the same thing
00:53:11.740 but none of them
00:53:12.300 are saying what is true
00:53:13.360 this is what you end up with.
00:53:14.700 So if everybody says
00:53:15.360 affordability is a problem,
00:53:16.080 I agree affordability
00:53:17.100 is a problem.
00:53:17.880 This is why I'm kind of
00:53:18.360 waving that away.
00:53:20.040 Labeling problems
00:53:20.720 is the easiest thing
00:53:21.460 in the world.
00:53:21.780 You can do it in your life
00:53:22.540 all day long
00:53:23.520 and I can agree with my wife
00:53:24.760 on every single problem
00:53:25.840 that exists in our life.
00:53:26.740 It's when you get to the solutions
00:53:27.660 that things get
00:53:28.320 a little bit complicated
00:53:29.480 and when you have politicians
00:53:30.740 who always say the same thing
00:53:32.340 but from different sides
00:53:33.120 of the aisle
00:53:33.420 which is you're right
00:53:34.140 it's government's job
00:53:34.900 to solve it.
00:53:35.680 Okay, there's only one problem.
00:53:37.220 If the thing that you're saying
00:53:38.240 is not going to solve it
00:53:39.200 and you're asking
00:53:40.680 for additional centralized power
00:53:41.960 in order to solve the thing
00:53:43.200 what you are going to end up with
00:53:44.660 is failure
00:53:45.160 and then the other guy
00:53:46.180 is going to say
00:53:46.780 give it to me
00:53:47.560 and so they're just passing
00:53:48.420 the ball side to side.
00:53:49.400 The only thing
00:53:50.120 that is going to create affordability
00:53:51.760 is a dynamic
00:53:52.700 and innovative economy
00:53:53.780 which means
00:53:54.620 a few things.
00:53:55.300 One, a consistent level of regulation
00:53:57.220 or less regulation, right?
00:53:58.900 Like actual certainty
00:53:59.660 and what's going to happen
00:54:00.280 tomorrow in the economy.
00:54:01.540 Two, you're actually going to need
00:54:03.120 innovators to innovate
00:54:04.400 and you need to leave them alone
00:54:05.420 and allow them to innovate
00:54:06.540 and actually capture the profits
00:54:08.040 that they're creating
00:54:08.760 through innovation
00:54:09.860 and then you're going to need
00:54:10.960 to get the hell out of the way.
00:54:11.940 I mean the magic
00:54:12.720 of the Reagan economy
00:54:13.540 I know Reagan has now become
00:54:14.520 anathema for some reason
00:54:15.460 that I cannot even imagine.
00:54:17.060 I can't imagine
00:54:17.780 why the right has decided
00:54:18.840 that Reagan was suddenly bad
00:54:20.480 other than because
00:54:21.560 we need to cast up
00:54:22.720 a false villain
00:54:23.440 in order to elevate
00:54:24.820 whatever the new
00:54:25.700 This amnesty
00:54:26.380 irritated some people
00:54:27.480 in retrospect.
00:54:28.160 I'm not saying everything
00:54:28.980 about Reagan was wonderful
00:54:30.900 but I don't think
00:54:31.380 everything about Trump
00:54:31.940 is wonderful either.
00:54:32.820 I do think
00:54:33.540 that the Reagan economy
00:54:34.360 generated more job growth
00:54:36.440 and pulled us out
00:54:37.160 of a greater economic morass
00:54:38.820 than any president
00:54:39.800 in history probably.
00:54:42.140 And so I think
00:54:42.680 that is worth something.
00:54:44.300 And so if you look
00:54:44.980 at Reagan's pitch
00:54:47.140 his pitch was
00:54:47.800 I can't solve
00:54:48.920 all your problems for you
00:54:49.700 but I can get the government
00:54:50.500 out of your way
00:54:51.160 so you can solve
00:54:51.840 your own problems.
00:54:52.480 And I just want
00:54:53.240 one politician
00:54:53.980 who will say that.
00:54:55.020 Like just one.
00:54:55.900 As opposed to this kind
00:54:56.600 of centralized government
00:54:57.560 bulls**t
00:54:58.300 where everybody says
00:54:59.100 no no don't worry
00:54:59.880 you sit there
00:55:00.340 and I'll solve
00:55:00.760 all your problems for you.
00:55:01.560 No one is going to solve
00:55:02.920 the vast majority
00:55:03.600 of problems in your life.
00:55:04.580 No politician will do it.
00:55:05.760 The best they can do
00:55:06.420 is get rid of the obstacles
00:55:07.460 that are in your way.
00:55:08.200 The systemic obstacles
00:55:09.060 that are in your way
00:55:09.760 and then most of the decisions
00:55:11.380 in a free country
00:55:11.920 ought to be up to you
00:55:12.940 and that is scary
00:55:13.960 because it means that
00:55:14.760 actually your success
00:55:15.580 or failure
00:55:15.980 is largely on your own shoulders.
00:55:17.460 I agree with you 100%
00:55:18.720 on this Ben.
00:55:19.420 This is different.
00:55:20.520 I agree with Ben 100%
00:55:21.560 on all of them.
00:55:22.080 No but in defense
00:55:23.540 of those who are critiquing
00:55:24.860 Reagan obviously
00:55:25.320 I still love St. Gipper
00:55:26.320 and politicians come and go
00:55:28.000 you know Nixon was
00:55:28.700 in the crater for a while
00:55:30.260 now Nixon's making a comeback.
00:55:31.920 Coolidge was the man
00:55:32.680 for a while
00:55:33.220 now people are looking
00:55:34.020 more toward
00:55:34.420 I don't know
00:55:34.680 they like Teddy Roosevelt
00:55:35.440 they used to hate him.
00:55:36.220 So this happens
00:55:36.860 as we rethink history
00:55:38.600 and as we move on
00:55:39.460 to new circumstances.
00:55:40.560 Part of the reason
00:55:41.360 that there's a little more
00:55:42.720 of a critical lens
00:55:43.660 you know as opposed
00:55:44.840 to just exalting
00:55:45.920 St. Reagan
00:55:46.920 of being perfect
00:55:47.880 in all ways
00:55:48.440 is because
00:55:49.380 you know
00:55:50.100 in the 80s
00:55:51.300 mass amnesty
00:55:52.260 for illegal aliens
00:55:53.060 for example
00:55:53.640 wasn't really
00:55:54.740 all that big a deal
00:55:55.560 but it did set the stage
00:55:56.720 for a major problem
00:55:57.580 and so we're rethinking that.
00:55:59.120 In the 80s
00:56:00.040 you know
00:56:00.980 obviously Reagan
00:56:01.980 was massively successful
00:56:03.220 in his economic policy
00:56:04.560 as was Thatcher
00:56:05.700 as was that
00:56:06.300 whole kind of movement.
00:56:07.640 We do live
00:56:08.420 in a different world today
00:56:09.660 and so
00:56:10.060 it's not to say
00:56:10.700 we throw out
00:56:11.180 all of their solutions
00:56:11.840 it's not to say
00:56:12.240 that we throw out
00:56:12.640 all of their solutions
00:56:13.320 but it's to recognize
00:56:14.340 that there are
00:56:14.900 difficult economic problems
00:56:16.580 that we have to deal with
00:56:17.320 and so
00:56:17.620 Drew actually offered
00:56:19.140 some real solutions here
00:56:20.080 which is
00:56:20.900 you pointed out Drew
00:56:22.940 that having people
00:56:24.360 really bought into
00:56:25.340 the economy
00:56:25.920 you know Coca-Cola
00:56:26.640 giving you some stock
00:56:27.500 back in the day
00:56:28.100 is helpful.
00:56:29.240 Back when we were
00:56:29.940 rethinking some of the problems
00:56:31.080 with industrial capitalism
00:56:32.060 a hundred years ago
00:56:33.020 you had writers
00:56:33.960 especially Catholic writers
00:56:34.840 like Chesterton
00:56:35.740 and Belloc
00:56:36.260 saying we need some option
00:56:37.740 not socialism and communism
00:56:39.120 not pure unbridled capitalism
00:56:41.020 but some other option
00:56:42.540 they proposed something
00:56:43.100 called distributism
00:56:43.880 which is too complicated
00:56:44.900 to get into here
00:56:45.560 and probably isn't
00:56:46.580 all that practical
00:56:47.200 but a lot of what
00:56:48.580 it comes down to
00:56:49.120 is give people
00:56:49.940 some ownership
00:56:50.620 give people some stake
00:56:51.840 and I think that's
00:56:53.060 really really important
00:56:53.880 and so here's another
00:56:54.980 criticism maybe
00:56:55.740 of what came out
00:56:56.340 of the Reagan era
00:56:56.940 is that we judge
00:56:57.840 the health of an economy
00:56:58.780 purely by GDP
00:57:00.140 and GDP is a fine
00:57:01.980 economic indicator
00:57:02.780 but it's not the be all
00:57:03.900 and end all of everything
00:57:04.760 and I think what a lot
00:57:05.700 of people are looking
00:57:06.240 around at today
00:57:07.000 is saying look
00:57:07.860 you can show a lot
00:57:09.060 of economic activity
00:57:10.160 in all sorts of ways
00:57:11.640 by the pornography industry
00:57:12.980 to use the topic
00:57:13.700 we keep coming back to
00:57:14.580 the pornography industry
00:57:15.860 is booming
00:57:16.200 look at that
00:57:16.740 GDP is going up
00:57:17.660 there are all sorts
00:57:18.520 of very destructive industries
00:57:20.320 we brag now
00:57:21.540 about how women's employment
00:57:23.040 is the highest ever
00:57:24.040 I'm not sure that's
00:57:25.100 a great thing
00:57:25.700 who's taking care
00:57:26.740 of the kids
00:57:27.180 who's watching the home
00:57:28.040 isn't there some cost
00:57:29.080 to that as well
00:57:29.720 and so I just
00:57:30.500 I wonder
00:57:31.360 one slightly practical
00:57:32.780 solution might be
00:57:33.740 to say
00:57:34.440 alright look
00:57:35.240 maybe GDP
00:57:35.960 isn't the be all
00:57:36.720 and end all
00:57:37.120 of everything
00:57:37.580 and maybe there are
00:57:38.440 certain areas
00:57:38.980 of the economy
00:57:39.520 that are legitimately
00:57:40.600 immoral and destructive
00:57:41.780 and we used to
00:57:42.600 heavily regulate them
00:57:43.740 like pornography
00:57:44.820 for instance
00:57:45.460 but all sorts of other
00:57:46.540 kind of vicious
00:57:47.540 and degrading avenues
00:57:49.300 we've liberalized gambling
00:57:50.680 I don't know that
00:57:51.400 that's really great
00:57:52.140 maybe it ticks up
00:57:53.200 GDP a little bit
00:57:53.860 but it doesn't
00:57:54.520 I don't think
00:57:54.920 that's really great
00:57:55.420 for the true health
00:57:56.080 of an economy
00:57:56.580 maybe we need to rethink
00:57:58.020 what economic health
00:57:59.120 really looks like
00:57:59.880 because the changes
00:58:01.560 that came about
00:58:02.080 in the late part
00:58:02.660 of the 20th century
00:58:03.420 did have some negative
00:58:05.280 side effects
00:58:05.900 as well as
00:58:06.560 positive outcomes
00:58:07.480 at Desjardins
00:58:09.820 we speak business
00:58:11.160 we speak startup funding
00:58:12.820 and comprehensive game plans
00:58:14.320 we've mastered
00:58:15.380 made to measure growth
00:58:16.340 and expansion advice
00:58:17.460 and we can talk your ear off
00:58:19.200 about transferring
00:58:19.980 your business
00:58:20.660 when the time comes
00:58:21.660 because at Desjardins Business
00:58:23.180 we speak the same language
00:58:24.880 you do
00:58:25.360 business
00:58:26.000 so join the more than
00:58:27.540 400,000 Canadian entrepreneurs
00:58:29.480 who already count on us
00:58:31.040 and contact Desjardins today
00:58:32.800 we'd love to talk
00:58:34.380 business
00:58:35.280 can I address
00:58:38.480 the Reagan thing
00:58:39.060 for a minute though
00:58:39.640 because a lot of this
00:58:40.680 I think started
00:58:41.280 with that Caldwell book
00:58:42.280 The Age of Entitlement
00:58:43.160 in which he blamed Reagan
00:58:44.840 for things that Reagan
00:58:45.540 actually didn't
00:58:46.140 Reagan said he failed
00:58:46.980 to cut down the government
00:58:47.800 that was the big failure
00:58:48.640 of his administration
00:58:49.380 but we've edited
00:58:50.560 the Cold War
00:58:51.280 out of history
00:58:51.880 and you know
00:58:52.800 Reagan like won
00:58:53.980 the Cold War
00:58:54.520 he freed
00:58:55.520 like a huge
00:58:57.160 huge section
00:58:58.020 of the world
00:58:58.800 of the globe
00:58:59.480 he set people free
00:59:00.540 and what they did
00:59:01.600 with that is up to them
00:59:02.460 but he actually did that
00:59:03.720 you can't imagine
00:59:05.440 how unheard of that was
00:59:07.540 how unexpected it was
00:59:08.600 how nobody thought
00:59:09.680 it would ever happen
00:59:10.360 how we were dealing
00:59:11.080 with the Soviet Union
00:59:11.840 for the rest of our lives
00:59:12.780 not just people
00:59:13.540 who thought that
00:59:14.100 communism was going to work
00:59:15.460 but people who thought
00:59:16.420 it's just never going to go away
00:59:17.920 he made it go away
00:59:19.560 and I think for that
00:59:20.260 he's a hero
00:59:21.220 and yeah
00:59:21.880 what Knowles is saying
00:59:23.020 is true
00:59:23.320 we now are living
00:59:24.240 in an absolutely new economy
00:59:26.040 and while the basics
00:59:27.240 of deregulation
00:59:28.180 I totally disagree
00:59:29.460 there's no such thing
00:59:30.160 as a new economy
00:59:30.760 the basis of deregulation
00:59:35.280 and freedom
00:59:36.240 and free markets
00:59:38.000 are absolutely the same
00:59:39.580 they don't change at all
00:59:40.480 but the problems
00:59:42.460 that arise
00:59:43.020 because no system
00:59:44.520 solves human problems
00:59:46.160 because human beings
00:59:46.940 can't be solved
00:59:47.620 the problems that arise
00:59:49.800 and the places
00:59:50.640 where the peaks
00:59:51.180 of problems are
00:59:51.860 change
00:59:52.400 and then we have
00:59:52.860 to address those
00:59:53.480 and one of them
00:59:54.280 you're absolutely right
00:59:55.400 one of the key ones
00:59:57.500 is the role of women
00:59:58.260 in our society
00:59:59.060 which I think
00:59:59.760 has screwed up
01:00:00.460 so badly
01:00:01.300 that it's destroying everything
01:00:02.660 we've actually
01:00:03.360 stopped reproducing
01:00:04.740 which to me
01:00:05.220 is always a bad sign
01:00:06.600 you know
01:00:07.080 that economic indicator
01:00:08.200 another indicator
01:00:09.840 I mean so
01:00:10.280 actually
01:00:10.840 this teaches me a lesson
01:00:12.180 I should let Drew
01:00:12.740 finish his sentences
01:00:13.500 because when he finishes them
01:00:14.620 I'm more likely
01:00:15.180 to agree with them
01:00:15.780 but
01:00:15.940 there'll be a whole new
01:00:17.980 relationship
01:00:18.400 but at the same time
01:00:19.760 you know
01:00:20.200 Knowles
01:00:20.820 I'll pick on you
01:00:21.720 a little bit
01:00:22.080 when we say
01:00:22.500 you know
01:00:22.740 terrible
01:00:23.160 we shouldn't look at GDP
01:00:24.100 it's not a good
01:00:24.820 indicator of economic
01:00:26.160 but it's not the be all
01:00:26.900 and end all
01:00:27.180 okay it's not the be all
01:00:27.820 but it's the be all
01:00:28.860 okay so
01:00:29.320 there's no such thing
01:00:30.080 as an economic
01:00:30.580 be all and end all
01:00:31.200 okay but
01:00:31.900 I think that
01:00:32.520 we are mixing up
01:00:33.300 a few terminologies
01:00:34.260 here
01:00:34.700 and I think that
01:00:35.280 we ought to
01:00:35.760 tease out the strands
01:00:36.740 for one second
01:00:37.280 there's a difference
01:00:37.900 between economic health
01:00:38.800 and societal health
01:00:39.460 these are not the same thing
01:00:40.300 you can have a very
01:00:41.840 economically healthy society
01:00:43.020 that is breaking down
01:00:45.020 in a lot of social ways
01:00:46.060 with tremendous pathologies
01:00:47.340 I think that's what
01:00:48.060 you're actually seeing
01:00:48.840 and so
01:00:49.520 yes
01:00:50.000 it turns out that
01:00:50.880 we are materially
01:00:51.660 significantly better off
01:00:52.920 than we were in the 1980s
01:00:53.900 in fact
01:00:54.460 we are materially
01:00:55.180 significantly better off
01:00:56.060 than we were in the mid 2000s
01:00:57.240 when people talk about
01:00:58.440 the unaffordability of homes
01:00:59.720 that's because an average
01:01:00.540 home in 1950
01:01:01.360 was a 980 foot
01:01:03.400 you know
01:01:04.200 square foot brick house
01:01:05.620 with no insulation
01:01:06.560 and no heating or air
01:01:07.740 and maybe a bathroom outside
01:01:09.220 like this kind of idea
01:01:10.680 that we're living worse
01:01:11.420 than your parents
01:01:11.880 or grandparents
01:01:12.340 is just belied
01:01:13.220 by every available fact
01:01:14.460 maybe you're living worse
01:01:15.960 than your grandparents
01:01:16.480 are right now
01:01:17.520 but you're not living worse
01:01:18.740 than your grandparents
01:01:19.340 were at the same age
01:01:20.580 right if you're a 20 year old
01:01:21.620 living in 2025
01:01:22.760 you are not worse off
01:01:24.220 than your grandparents
01:01:24.860 were living as a 20 year old
01:01:26.440 in 1958 or 1960
01:01:27.900 you have an iphone
01:01:28.880 but you don't have a house
01:01:29.880 I mean I agree
01:01:30.820 the houses are nicer now
01:01:31.860 but you don't have one
01:01:32.400 your apartment is nicer
01:01:33.540 than their house was
01:01:34.440 okay that is a reality
01:01:35.820 if you're living anywhere
01:01:36.620 except for New York City
01:01:37.560 and by the way
01:01:38.860 the idea that you couldn't
01:01:40.020 move somewhere
01:01:40.480 and get a house
01:01:41.140 now you're getting back
01:01:43.020 to my original point
01:01:43.700 which is on a personal level
01:01:44.960 if you want to live a life
01:01:45.840 like your grandparents
01:01:46.400 you might have to do the thing
01:01:47.600 that your grandparents did
01:01:48.500 okay your grandparents
01:01:49.120 went to a war
01:01:49.960 and then they came back
01:01:50.800 and moved to a town
01:01:51.540 that they actually probably
01:01:52.280 did not grow up in
01:01:53.240 and then they got a house
01:01:54.320 that was like off the lot
01:01:55.900 from some big corporation
01:01:58.640 that built a bunch of
01:01:59.300 standard box looking houses
01:02:00.560 that now you drive past
01:02:01.500 those on the freeway
01:02:02.140 and you say
01:02:02.320 I can't believe
01:02:02.760 somebody ever lived in those
01:02:03.740 so it's kind of
01:02:04.380 rose colored glasses
01:02:05.720 about the past
01:02:06.360 drives me a little bit insane
01:02:07.760 and again
01:02:08.800 I think that
01:02:09.560 if we want to look
01:02:10.080 at the real problems
01:02:10.560 in our society
01:02:11.060 we shouldn't create
01:02:11.680 a mythical past
01:02:12.420 and we shouldn't create
01:02:13.320 a mythically terrible present
01:02:14.440 we should actually look
01:02:15.540 at the problems
01:02:16.160 in our society
01:02:16.720 and one of those would be
01:02:17.620 people not having kids
01:02:18.780 one of those would be
01:02:19.460 deep depression
01:02:20.340 and unhappiness
01:02:21.060 people killing themselves
01:02:22.000 with opioids
01:02:22.580 people having their jobs
01:02:25.180 taken by illegal immigrants
01:02:26.320 in certain industries
01:02:27.080 those are actual
01:02:27.820 real solvable problems
01:02:28.840 but I don't have
01:02:29.620 a DeLorean
01:02:30.180 all I have right now
01:02:31.680 is the way that people
01:02:32.800 are living right now
01:02:33.980 and so now we have to
01:02:34.560 look at the problems
01:02:35.000 in front of us
01:02:35.440 and how do we solve those
01:02:36.360 yeah but that's the one
01:02:37.960 that's the one part
01:02:39.100 where I
01:02:39.420 so at the buzzer
01:02:40.680 I get to disagree
01:02:41.440 with you Ben
01:02:41.860 I remember
01:02:43.020 there was one thing
01:02:44.060 you said in that
01:02:44.680 in that clip
01:02:45.160 that I did disagree
01:02:46.000 that I couldn't remember
01:02:46.500 but then you just
01:02:46.920 said it again
01:02:47.300 so the one part
01:02:49.940 about well this is
01:02:51.080 you know America
01:02:51.680 it's how America
01:02:52.440 has always been
01:02:53.000 that you leave
01:02:54.380 and you go somewhere
01:02:55.180 else away from
01:02:55.780 your family
01:02:56.280 and I think that
01:02:57.920 like back in the
01:02:58.660 pioneer days
01:02:59.380 I mean there is
01:03:00.260 something about that
01:03:01.240 that's in the
01:03:01.740 American spirit
01:03:02.520 of like literally
01:03:03.380 going out into
01:03:04.340 a wilderness
01:03:04.900 and building
01:03:06.460 your own life
01:03:07.320 maybe a thousand
01:03:08.400 miles away
01:03:08.940 from anyone
01:03:09.320 that you know
01:03:09.780 and so there is
01:03:10.700 that's American
01:03:12.020 in a certain sense
01:03:12.940 but that was back
01:03:13.600 in the pioneer days
01:03:14.320 I think for most
01:03:14.980 of American history
01:03:17.060 it's like anywhere
01:03:18.420 else in the world
01:03:19.000 people they grew up
01:03:20.760 in a place
01:03:21.280 they didn't move
01:03:21.860 that far away
01:03:22.460 they stayed
01:03:23.260 where their support
01:03:23.860 systems were
01:03:24.460 we are less mobile
01:03:25.660 now and by the stats
01:03:26.840 we are less mobile
01:03:27.460 now than we have
01:03:28.000 ever been any time
01:03:28.820 in American history
01:03:29.500 quick raise your hand
01:03:30.380 if you are currently
01:03:31.340 living in the town
01:03:32.480 where you grew up
01:03:33.380 but you're saying
01:03:35.000 we're less mobile
01:03:36.020 now
01:03:36.280 and I'm saying
01:03:37.300 that we are
01:03:37.660 a unique breed
01:03:38.620 in that we actually
01:03:39.860 like we're a little
01:03:40.680 older than the
01:03:41.280 Gen Zers
01:03:41.760 okay
01:03:42.140 like we
01:03:43.140 but the people
01:03:44.020 who tend to be
01:03:44.520 more successful
01:03:45.100 and again
01:03:45.940 as a piece of advice
01:03:46.600 are the people
01:03:47.360 who tend to actually
01:03:48.280 move in pursuit
01:03:49.360 of opportunity
01:03:49.920 and if you look
01:03:50.660 historically speaking
01:03:51.440 it is not true
01:03:52.040 that in 1920
01:03:52.620 everybody is living
01:03:53.540 in the town
01:03:53.880 where they grew up
01:03:54.460 in fact in 1920
01:03:55.840 there were more
01:03:56.280 people who were
01:03:56.680 moving across the
01:03:57.700 country at great
01:03:58.460 expense and difficulty
01:03:59.380 than there are today
01:04:00.280 in 2025
01:04:01.460 exceptional people
01:04:04.700 exceptional people
01:04:05.640 move
01:04:06.400 they go into the
01:04:07.140 wilderness
01:04:07.440 they build new towns
01:04:08.420 but most people
01:04:09.200 are not exceptional
01:04:09.840 sexy handsome people
01:04:11.120 yes
01:04:11.400 yeah yeah
01:04:12.120 right
01:04:12.300 so and you want
01:04:13.580 a country
01:04:14.520 filled with communities
01:04:15.800 and filled with
01:04:16.540 you know people
01:04:17.360 with traditions
01:04:18.020 and things like that
01:04:18.860 so I kind of
01:04:19.600 half agree with you
01:04:20.260 on this
01:04:20.500 I do believe
01:04:21.100 that exceptional people
01:04:22.020 should and will move
01:04:23.020 but I think that
01:04:24.300 Matt is right
01:04:25.200 that it shouldn't be
01:04:26.040 like that for everybody
01:04:26.900 sorry go back to Matt
01:04:27.800 so Matt can finish
01:04:28.340 disagreeing with me
01:04:28.880 because I'm being a jerk
01:04:29.480 again
01:04:29.720 no I think
01:04:31.740 I think
01:04:32.160 I think that's
01:04:32.860 I don't know
01:04:34.340 the claim that
01:04:35.620 people were more
01:04:36.960 mobile in the 1920s
01:04:38.180 there's also
01:04:39.280 there's a technological
01:04:39.900 side of this too
01:04:40.900 that for a lot
01:04:42.040 of American history
01:04:42.760 you know moving away
01:04:43.800 from your family
01:04:44.540 and going to another
01:04:46.600 state over
01:04:47.300 was like a three month
01:04:48.580 journey
01:04:49.000 and you know
01:04:49.760 people are going to
01:04:50.220 die along the way
01:04:50.940 so that is
01:04:52.280 one of the reasons
01:04:53.100 why we know
01:04:53.720 that for a lot of
01:04:55.260 you know American history
01:04:56.560 and human history
01:04:57.340 people didn't tend
01:04:58.380 to do that
01:04:59.000 I mean sometimes
01:04:59.500 they did
01:05:00.060 but that was
01:05:00.620 again that's like
01:05:01.480 you're a pioneer
01:05:02.120 I think that the
01:05:04.400 at the very least
01:05:05.320 and I don't think
01:05:05.980 we're disagreeing
01:05:06.500 on this point
01:05:06.940 that the desire
01:05:08.560 to stay
01:05:10.420 in your community
01:05:11.500 where you were born
01:05:12.680 where your family is
01:05:13.780 stay with your
01:05:14.780 support system
01:05:15.940 with your families
01:05:16.700 and your family
01:05:17.540 and your friends
01:05:18.020 that's a good desire
01:05:19.820 there's nothing wrong
01:05:20.680 with that
01:05:21.080 I know that
01:05:21.500 and a healthy
01:05:23.500 country is one
01:05:24.640 where people
01:05:25.200 if they want to do that
01:05:26.500 are able to do it
01:05:27.760 so I think that's the part
01:05:29.260 I think we all agree
01:05:30.580 on that right
01:05:31.020 that's
01:05:31.400 you know this gets back
01:05:33.340 though to this point
01:05:34.260 of the neat and pat
01:05:35.760 distinction between
01:05:36.580 economic health
01:05:37.340 and social health
01:05:38.080 I'm not sure
01:05:39.180 that we can
01:05:39.720 obviously they're
01:05:40.420 distinct concepts
01:05:41.040 but I'm not sure
01:05:42.020 that we can totally
01:05:42.760 separate them
01:05:43.520 you know especially
01:05:43.940 as increasingly
01:05:45.080 in the modern age
01:05:45.840 we think of ourselves
01:05:46.600 as homo economicus
01:05:48.100 you know we're like
01:05:48.760 primarily economic creatures
01:05:50.820 and I don't
01:05:52.080 I think we're just
01:05:52.760 integral creatures
01:05:53.820 and we have all
01:05:54.720 of these things
01:05:55.260 together
01:05:55.880 and so
01:05:56.360 you know especially
01:05:57.540 at this kind of moment
01:05:58.880 you look now
01:06:00.020 compare it to 1980
01:06:01.460 or 1880
01:06:02.400 for that matter
01:06:03.020 one of the major
01:06:04.200 problems that we have
01:06:05.000 is that social solidarity
01:06:06.340 has really frayed
01:06:07.920 that religiosity
01:06:09.200 has declined
01:06:10.500 precipitously
01:06:11.220 though there are some
01:06:12.180 signs that that's
01:06:12.880 turning around
01:06:13.500 and you can't divorce
01:06:15.180 that from the
01:06:16.420 birth rate problem
01:06:17.240 you know you can't
01:06:17.940 divorce that
01:06:18.380 from the fact that
01:06:19.220 people aren't having kids
01:06:20.040 these are great predictors
01:06:21.120 you know stability
01:06:21.700 tradition and religion
01:06:22.840 are predictors of people
01:06:24.420 having kids
01:06:24.800 and you can't divorce
01:06:25.700 that from the economic
01:06:26.500 problems because
01:06:27.460 if we don't import
01:06:28.480 the entire third world
01:06:29.700 we're told that our
01:06:30.940 our economy is going
01:06:31.800 to collapse
01:06:32.200 that GDP is going
01:06:32.980 to collapse
01:06:33.400 so that's the whole
01:06:34.220 argument for mass
01:06:34.940 migration
01:06:35.360 and so these problems
01:06:36.400 are all so deeply
01:06:37.340 intertwined
01:06:38.100 that it seems to me
01:06:39.140 that there has to be
01:06:40.360 some firmer
01:06:41.760 political solution
01:06:43.720 to rather than
01:06:45.340 just say look
01:06:45.980 we're going to let
01:06:46.820 the free hand of the
01:06:47.560 market you know
01:06:48.380 work its way
01:06:49.340 and we'll let the chips
01:06:50.060 fall where they may
01:06:51.040 a lot of people
01:06:51.460 are looking around
01:06:51.860 and saying I don't like
01:06:52.640 where the chips
01:06:53.000 are falling
01:06:53.400 well I mean
01:06:53.800 this is a great place
01:06:55.500 for us to conclude
01:06:56.600 because I'm going to
01:06:56.940 disagree for one second
01:06:58.360 with Knowles
01:06:58.820 and just say
01:06:59.300 that there are many
01:07:00.680 many more impoverished
01:07:01.840 countries than the
01:07:02.800 United States
01:07:03.420 that have less severe
01:07:04.880 pathologies than the
01:07:05.800 United States
01:07:06.340 and in the past
01:07:07.400 we were a less
01:07:08.440 wealthy nation
01:07:09.380 with less severe
01:07:10.300 pathologies
01:07:10.840 and so this is why
01:07:11.540 I say that trying to
01:07:12.320 tie the economic
01:07:13.080 situation to the
01:07:14.040 pathologies
01:07:14.560 I think in some cases
01:07:15.960 and in most cases
01:07:16.760 actually
01:07:17.140 can be a fool's errand
01:07:18.560 but we'll have to
01:07:19.600 save that for next time
01:07:20.680 because here's the deal
01:07:21.620 before we leave folks
01:07:22.500 our biggest and best
01:07:23.580 sale of the year
01:07:24.000 is happening right
01:07:24.700 this very instant
01:07:25.980 like at this moment
01:07:27.200 while you're listening
01:07:28.000 to us
01:07:28.520 all Daily Wire
01:07:29.460 Plus annual memberships
01:07:30.300 are 50% off
01:07:31.400 you get everything
01:07:32.060 you get access
01:07:32.900 to the DW library
01:07:33.900 of movies
01:07:34.420 documentaries
01:07:35.080 Matt's documentaries
01:07:36.040 mostly is what we're
01:07:36.680 talking about there
01:07:37.340 because those are
01:07:37.680 the best ones
01:07:38.140 that have ever been
01:07:38.600 made
01:07:38.780 and series that
01:07:39.560 stand for the ideals
01:07:40.260 that keep America
01:07:41.080 free
01:07:41.400 and that of course
01:07:42.080 includes
01:07:42.540 the Pendragon cycle
01:07:43.740 Rise of the Merlin
01:07:44.540 it is coming January
01:07:45.440 22nd
01:07:46.360 all access members
01:07:47.240 get early access
01:07:48.180 to episodes 1 and 2
01:07:49.320 1 month early
01:07:50.160 on Christmas day
01:07:51.000 which is a bit
01:07:51.480 of a sweetener
01:07:51.900 for you there
01:07:52.340 you empower DW
01:07:53.500 plus to build culture
01:07:54.940 defend values
01:07:55.680 launch stories
01:07:56.340 that ensure your voice
01:07:57.200 and your values
01:07:57.760 shape the future
01:07:58.540 of the United States
01:07:59.520 whether you want to join
01:08:00.700 or give the gift
01:08:01.300 of a DW membership
01:08:02.200 to someone
01:08:02.620 now is the time
01:08:03.360 to do it
01:08:03.620 at 50% off
01:08:04.900 it is our best deal
01:08:06.040 of the year
01:08:06.960 you can head on over
01:08:07.540 to dailywire.com
01:08:08.700 slash subscribe
01:08:09.900 we will all be very happy
01:08:11.060 to see you over there
01:08:12.440 well in just a moment
01:08:13.580 we are going to bring you
01:08:14.780 the magical
01:08:16.100 mystical
01:08:16.620 trailer
01:08:17.420 for
01:08:18.180 finally
01:08:18.700 the Pendragon cycle
01:08:19.740 Rise of the Merlin
01:08:20.640 is coming January 22nd
01:08:22.020 guys
01:08:22.260 thanks for stopping by
01:08:23.480 we will see you here
01:08:24.400 hopefully never
01:08:25.740 for the rest of us
01:08:26.280 but actually
01:08:26.720 we will see you here
01:08:27.340 in a couple of weeks
01:08:28.140 and we will get together
01:08:28.820 and disagree
01:08:29.480 in friendly fashion
01:08:30.940 on friendly fire
01:08:31.920 with one another
01:08:32.720 without further ado
01:08:33.700 here is the trailer
01:08:34.260 oh this is an illusion
01:08:39.120 an echo of a voice
01:08:41.080 that has died
01:08:41.800 and soon that echo
01:08:45.460 will cease
01:08:46.040 they say
01:08:57.720 that Merlin
01:08:58.760 is mad
01:09:00.020 they say
01:09:05.040 he was a king
01:09:05.780 in Dyfed
01:09:06.400 the son of a princess
01:09:08.980 of lost Atlantis
01:09:10.400 they say
01:09:11.900 the future
01:09:12.740 and the past
01:09:13.960 are known to him
01:09:15.160 that the fire
01:09:17.000 and the wind
01:09:17.580 tell him
01:09:18.260 their secrets
01:09:19.220 that the magic
01:09:20.600 of the hillfolk
01:09:21.500 and druids
01:09:22.220 come forth
01:09:23.080 at his easy command
01:09:24.560 they say
01:09:27.100 he slew
01:09:28.240 hundreds
01:09:29.120 hundreds
01:09:30.400 do you hear
01:09:31.200 that the world
01:09:32.220 burned
01:09:32.800 and trembled
01:09:33.640 at his wrath
01:09:34.540 the Merlin
01:09:38.880 died long
01:09:40.020 before you
01:09:40.820 and I
01:09:41.320 were born
01:09:42.220 Merlin
01:09:44.280 Emrys
01:09:44.840 has returned
01:09:46.320 to the land
01:09:46.900 of the living
01:09:47.400 Vortigern is gone
01:09:50.640 Rome is gone
01:09:52.140 the Saxon
01:09:54.020 is here
01:09:54.820 Saxon Hengist
01:09:57.060 has assembled
01:09:57.560 the greatest war host
01:09:58.540 ever seen
01:09:59.160 in the island
01:09:59.700 of the mighty
01:10:00.240 and before the summer
01:10:01.380 is through
01:10:01.900 he means to take
01:10:03.140 the throne
01:10:03.640 and he will have it
01:10:06.360 if we are too busy
01:10:08.040 squabbling amongst ourselves
01:10:09.340 to take up arms
01:10:10.220 against him
01:10:10.880 here is your hope
01:10:12.600 a king will arise
01:10:14.400 to hold all Britain
01:10:15.620 in his hand
01:10:16.400 a high king
01:10:17.900 who will be the wonder
01:10:18.940 of the world
01:10:19.560 you
01:10:21.480 to a future
01:10:24.640 of peace
01:10:25.980 there'll be no peace
01:10:29.100 in these lands
01:10:29.820 till we are all dust
01:10:31.060 men
01:10:31.980 of the island
01:10:32.780 of the mighty
01:10:33.340 you stand together
01:10:35.580 you stand
01:10:37.940 as Britons
01:10:38.960 you stand as one
01:10:40.800 great darkness
01:10:44.960 is falling
01:10:45.500 upon this land
01:10:46.360 these brothers
01:10:48.960 are our only hope
01:10:49.620 to stand against it
01:10:50.620 not our only hope
01:10:54.000 they say Merlin
01:10:56.060 slew 70 men
01:10:57.320 with his own hands
01:10:58.340 I could say
01:10:59.980 he slew 500
01:11:01.260 no man is capable
01:11:04.780 of such a thing
01:11:05.660 no mortal man
01:11:07.400 you