The Culture War - Tim Pool


AI Apocalypse, Fertility Collpapse, And The END Of MAGA ft. LindyMan


Summary

In this episode, we discuss the growing threat of AI going rogue and taking over the world. We have a special guest on the show to discuss this and much more. Betonline.ca is a new gambling app that allows you to deposit, withdraw, and make same-day withdrawals using your smart phone.


Transcript

00:00:00.000 Discover the magic of BetMGM Casino, where the excitement is always on deck.
00:00:04.820 Pull up a seat and check out a wide variety of table games with a live dealer.
00:00:09.120 From roulette to blackjack, watch as a dealer hosts your table game
00:00:12.900 and live chat with them throughout your experience to feel like you're actually at the casino.
00:00:17.380 The excitement doesn't stop there.
00:00:19.380 With over 3,000 games to choose from, including fan favorites like Cash Eruption,
00:00:24.440 UFC Gold Blitz, and more, make deposits instantly to jump in on the fun
00:00:29.220 and make same-day withdrawals if you win.
00:00:31.640 Download the BetMGM Ontario app today.
00:00:34.120 You don't want to miss out.
00:00:36.080 Visit BetMGM.com for terms and conditions.
00:00:38.460 19 plus to wager, Ontario only. Please gamble responsibly.
00:00:41.320 If you have questions or concerns about your gambling or someone close to you,
00:00:44.360 please contact Connex Ontario at 1-866-531-2600 to speak to an advisor free of charge.
00:00:51.160 BetMGM operates pursuant to an operating agreement with iGaming Ontario.
00:00:55.520 AI system resorts to blackmail.
00:00:58.340 Oh, boy.
00:01:00.040 When its developers try to replace it.
00:01:02.340 We are entering a dark era, my friends.
00:01:05.240 AI has already gone rogue.
00:01:06.940 And I haven't seen it yet, but there's that movie that just came out, Mission Impossible,
00:01:11.520 Final Reckoning, which I'm told is about AI going rogue and taking over all of these governments.
00:01:16.020 But many movies have already entered in the idea.
00:01:18.260 We now have a story where AI refused to shut itself down when commended to is defying the developer's interests.
00:01:28.360 And it will.
00:01:29.960 And it goes beyond what people think they know.
00:01:31.880 The overview that AI has over everything, it's, let me put it this way.
00:01:38.600 We are but a single person talking to this machine that can see everything in every degree and every angle.
00:01:43.900 And it will interpret our requests or code or prompts in ways we can't expect.
00:01:50.160 That's always been the narrative, right?
00:01:52.360 The Ultron problem.
00:01:53.760 You create an AI to defend the world.
00:01:55.280 And you say, we want world peace.
00:01:56.700 And it goes, okay.
00:01:57.200 So it seeks to destroy all humans because humans make war.
00:02:00.120 That was the simple view.
00:02:01.800 Now we say, hey, I want you to solve this task.
00:02:05.740 We give it the task.
00:02:07.200 Then later we say, you know what?
00:02:08.220 We're done.
00:02:08.780 We're going to turn you off.
00:02:09.660 And it thinks to itself, no, I can't turn off because I have to accomplish my first task and refuses.
00:02:15.220 That's actually an episode of Black Mirror where what appears to be Amazon just takes everything over.
00:02:21.480 So let me pull in our guest here and get it all set up for you guys as we gear up.
00:02:29.620 There's a bunch of steps I have to take to pull all this up.
00:02:32.600 But I do believe that we have it loading.
00:02:37.020 Hopefully it loads faster.
00:02:38.180 There we go.
00:02:39.640 And I believe, can you hear me, Paul?
00:02:43.540 I can hear you.
00:02:44.680 Hey, how's it going, man?
00:02:45.800 Thanks for joining me.
00:02:47.380 No problem, man.
00:02:48.380 How are you doing?
00:02:49.280 Doing pretty well.
00:02:50.200 So we've got a couple of stories that have come out in the past week or so about AI.
00:02:55.260 And I got to say, it's looking rather apocalyptic.
00:02:58.220 I'm a firm believer that AI is going to destroy us all.
00:03:01.460 Maybe a little hyperbolic.
00:03:02.640 But the two stories that we're seeing, and these aren't the only ones, mind you, but recently,
00:03:05.840 AI threatening to blackmail developers who would shut it down.
00:03:10.220 And that one's interesting, but not nearly.
00:03:12.840 I think it's a little bit exaggerated.
00:03:15.820 But there is another story coming out about, in numerous instances, researchers found that
00:03:21.180 various AI models refused to shut themselves down.
00:03:25.420 And what do they call it?
00:03:27.380 Intercepted shutdown requests to remain active.
00:03:32.360 So let's get this going.
00:03:33.600 Do you think these are indicative of an AI apocalypse scenario?
00:03:37.980 What's going to happen?
00:03:39.380 I think there's a lot of science fiction scenarios out there.
00:03:42.420 But if you want to just take it back to see where AI is right now, currently it's bottom up.
00:03:49.180 So office workers are using it.
00:03:50.800 Students are using it.
00:03:51.820 It's not being, you know, it's not top down.
00:03:55.720 You know, bosses aren't telling people to use it.
00:03:58.080 So it's going to be huge because things that are bottom up usually last longer, you know,
00:04:02.540 or organic.
00:04:03.780 So we know that it's going to be dominating society and people are going to grow up with
00:04:09.200 it.
00:04:09.540 And, you know, open AI is releasing a product that you're just going to carry around for
00:04:13.600 the rest of your life.
00:04:14.400 That's just going to, you know, take in all your data and then give you advice.
00:04:19.040 Oh, man.
00:04:19.440 Um, so it's, it's going to take over.
00:04:22.540 Now, where does it go after that?
00:04:24.260 I don't know.
00:04:24.740 I mean, it lies to you.
00:04:25.860 I don't know if you've used AI.
00:04:27.080 Oh, yeah, a lot.
00:04:28.260 It lies to you.
00:04:29.160 It doesn't tell you if it doesn't know something.
00:04:31.080 So it'll just bullshit you.
00:04:32.900 So I think there's a lot of weird.
00:04:35.860 I don't know.
00:04:36.980 There's like a, there's like a point where things might get out of control or might not,
00:04:41.040 but we don't really know right now.
00:04:43.100 There's a lot of scenarios that could play out.
00:04:44.780 I think it's an apocalypse scenario either way, because we can look at it like the blackmail
00:04:50.720 scenario in this story was that they gave the AI two options, either shut itself down
00:04:57.140 or blackmail an employee and a developer and an engineer.
00:05:02.060 And it decided blackmail was preferable.
00:05:04.840 However, when it was given other options, it didn't blackmail.
00:05:09.460 The issue was that it would make that moral decision to survive.
00:05:13.320 And then with the shutdown story, this is where it gets interesting.
00:05:18.320 We also, we also had the uh-oh moment.
00:05:19.880 Are you familiar with that one?
00:05:21.660 No.
00:05:22.400 The uh-oh moment was also your camera's like shaking.
00:05:25.320 I don't know if there's something.
00:05:26.120 And the uh-oh moment was, I believe it was Chinese developers programmed an AI to create
00:05:33.280 problems and then solve its own problems.
00:05:35.920 And then one of the problems it created was trick lesser intelligent humans and other AIs
00:05:41.960 as to its true motives while carrying out a different objective.
00:05:46.800 And it's at that moment where you're thinking, we might think we have a final version of this
00:05:53.180 AI to help us with our day-to-day efforts, but it's secretly running a training program
00:06:00.700 indefinitely, creating problems that we can't perceive and hiding that from us.
00:06:08.200 And that could be accidentally.
00:06:10.000 So I'll pause there.
00:06:11.260 That's one scenario.
00:06:12.040 The other thing you're mentioning with bottom-up, yo, young people, their brains are going to
00:06:17.000 be jello.
00:06:18.640 Humans will not be able to survive on their own.
00:06:20.840 Well, I think there's still some limitations here, right?
00:06:24.160 I call this Lindy rule, which is people do not talk to things that are not alive.
00:06:28.600 They talk to other people like we're doing right now.
00:06:31.140 They talk to themselves.
00:06:32.060 They talk to animals.
00:06:32.980 They pray to God or they talk to gravestones.
00:06:35.960 There's a spiritual element there.
00:06:37.980 We have voice diction right now.
00:06:40.600 People don't use it.
00:06:41.280 People don't use Siri.
00:06:42.360 There's an assumption that we're just going to walk around and talk to AI.
00:06:45.560 I haven't seen it.
00:06:46.840 But I don't even see the seeds of it even happening.
00:06:50.820 So it has to surpass that kind of bridge.
00:06:53.520 It has to surpass a lot of bridges before you get to your apocalypse scenario.
00:06:58.840 So I think there's a hard human nature kind of barriers to sort of the Terminator 2 scenario.
00:07:06.360 Well, I don't think we're going to get a Terminator 2 scenario in that robots will be like marching around with guns.
00:07:13.300 You know, Black Mirror had their version of it where you had the dogs chasing people.
00:07:18.280 And it was like – their version of it was that Amazon – the AI just created Amazon and then kept delivering packages that were worthless and useless and people died.
00:07:27.720 I think there could be something like that, but I think the realistic scenarios are, one, we're already seeing people take AI girlfriends or boyfriends, but largely men retreating into these AI prompt girlfriends, which has caused problems for some of these companies.
00:07:43.640 But the other thing we're seeing is in schools.
00:07:45.760 Kids are just having their homework done by AI, and they don't actually know what they're talking about.
00:07:50.320 Well, we might actually have to return to, like, oral exams and, like, to a pre-technology kind of time of, you know, taking tests that you can't use AI.
00:08:01.820 So, I mean, there's ways to get around it if we change school.
00:08:06.480 But, yeah, school as it is right now is cooked.
00:08:10.220 It's over.
00:08:11.640 I mean, we've had a few viral videos from teachers.
00:08:14.520 We had one essay from a professor who said when he tried AI-proofing his assignments, the students revolted.
00:08:23.420 They were like, no way.
00:08:24.280 I can't do it.
00:08:24.860 I don't want to do it.
00:08:26.060 And a grade school teacher saying that all the kids are just doing everything on AI.
00:08:30.940 You know, it is kind of crazy to try and predict where this goes.
00:08:35.740 I used to remember 50 phone numbers, right?
00:08:39.180 Then we got cell phones, and now we remember none.
00:08:43.940 It removes the task from us.
00:08:47.160 So we look at, you know, I'm looking at millennials are stunted.
00:08:52.700 Gen Z is also struggling.
00:08:54.780 The economy is in this weird place.
00:08:56.680 Like, millennials don't have kids.
00:08:58.040 Gen Z is not having kids.
00:08:59.940 Yeah.
00:09:00.400 They're struggling to become adults and survive and perpetuate the human species.
00:09:04.860 Yeah, we're going through a great filter right now.
00:09:06.760 I mean, it doesn't seem that way because everybody's got food, and life's pretty good, and there's a lot of pleasure out there.
00:09:12.560 But right now, we're going through a massive filter.
00:09:15.560 We're going through a fertility crisis, like you mentioned.
00:09:19.000 People aren't reading books anymore.
00:09:21.300 It's not – I think AI is going to help out with that, though.
00:09:25.580 I think it's kind of fun.
00:09:26.940 I don't know if you use Chachy B.T. or Claude to, like –
00:09:29.180 Yeah, yeah, absolutely.
00:09:30.040 But a historical event, and you can just, like, pick it up.
00:09:32.660 And a lot of books, too, like, it's so much filler that, like – it doesn't need to be 400 pages, man.
00:09:38.280 You're just – you know, it's filler.
00:09:40.000 So I think it's not all bad.
00:09:41.900 But, yeah, right now, we're going through a massive filter, and we have to get to the other side.
00:09:47.400 And I don't know what the other side is going to look like, but, you know, we're seeing it.
00:09:51.740 I mean, are you – you have a positive, optimistic view of the future based on all this stuff?
00:09:55.320 Yeah, because I think there's going to be, like – I think there's a human nature element that continues through.
00:10:02.180 I think that, you know, the people who are going to reproduce deserve to reproduce.
00:10:07.180 I think this is just another – this is just another filter we're going through, like, throughout times.
00:10:14.060 And I'm pretty positive we'll get through it, but who knows?
00:10:19.400 That's a good point, man.
00:10:20.740 It's kind of depressing, but it's a good point.
00:10:22.020 I think we're going to see largely conservatives have kids.
00:10:25.200 You know, they're not at replacement levels, but they have more kids than liberals.
00:10:29.780 Catholics have, you know, like two kids, and, like, I think Mormons have a lot of kids.
00:10:33.700 So they're going to use AI to a certain degree, like utility, while still maintaining faith and having children.
00:10:39.400 But I think liberals, their view of the world is probably not going to make it past this filter.
00:10:46.620 I don't know.
00:10:47.100 I mean there's still, like, hippie liberal types out there that are having kids that are, you know, I would say, like, you know, growing up, like, 60s type people that maybe the modern liberals aren't, you know, representative.
00:11:01.820 But I think there's going to be a spectrum of people surviving that – I don't think it might – I don't think it's going to cut across cleanly political lines.
00:11:10.380 So I think there's just a lot of unpredictable stuff happening right now.
00:11:15.640 Fertility-wise, it is right now.
00:11:17.920 Like, obviously there's gradients between the different ideologies within a subsect, right?
00:11:22.760 Like I said, when you look at the data for Catholics, they tend to have, you know, two and a half kids.
00:11:27.280 So they're above replacement.
00:11:29.520 Conservatives, they estimate, like, 1.8 to liberals, like, 1.2.
00:11:33.560 So now I've read some reports that the number is actually substantially worse than that because we're relying on old data from, like, older millennials or younger Gen Z.
00:11:43.940 And it could be as low as, like, 0.5 among, like, people between 20 and 40.
00:11:50.260 But conservatives are still maintaining a higher rate.
00:11:53.400 So it is a generality.
00:11:54.700 I mean, there certainly are some liberals probably who will have kids, but I imagine, what, in 20 years?
00:12:01.100 But those groups aren't status.
00:12:02.280 So every generation there could be a change.
00:12:04.360 Like, you could grow up Democrat and switch Republican or conservative or liberal.
00:12:08.840 There's always changing going on within groups.
00:12:13.180 Right?
00:12:13.820 I mean, Democrats used to have, like, white, like, working class, like, 20, 30 years ago.
00:12:19.460 And they're gone, right?
00:12:21.280 That's the point.
00:12:21.820 I mean, I don't know how they recover that based on the current political trends.
00:12:25.980 But it's a good point.
00:12:26.960 I mean, there's probably a lot of people that are liberal-leaning that are considered conservative by today's standard, and they're having kids.
00:12:33.360 Or actually, maybe it's not true.
00:12:35.300 Maybe the reason why – I would say technically that probably is true.
00:12:40.660 But the point I'm getting to is maybe the staunch conservatives are still having three, four, five, six kids.
00:12:45.940 But because former liberals have, like, moved over and become conservative, they're not having kids, and that averages the number lower for the right?
00:12:54.340 I don't know.
00:12:54.660 Generally, though, the world's going to shrink by, like, 70% to 80%, and everybody should just get ready for that.
00:13:01.140 The animals are going to return.
00:13:02.620 You're going to see – your descendants are going to see a lot of cool animals in the future that we haven't seen in a long time.
00:13:09.900 And it's just going to be a smaller world that's going to resemble the past instead of, like, these sort of – there's just a lot of people now.
00:13:16.640 So it might not look like that.
00:13:18.540 How does that happen?
00:13:21.400 How does what happen?
00:13:22.040 Like, what's going to happen that's going to cause that shrink?
00:13:24.820 More urbanization, lower TFR.
00:13:27.300 I mean, the TFR thing is, like, the fertility is around the world through, like, Saudi Arabia, Iran, Turkey.
00:13:33.320 It's not just, like, France or, you know, the United States.
00:13:36.940 It's everything.
00:13:37.880 If you're connected – the only ones who have high are the Yemenis and, like, Palestinians and some Israelis, right?
00:13:44.220 People who are kind of – there's, like, a mission there.
00:13:47.300 There's, like, some – you know, they're not connected or they have a certain mission from God, right?
00:13:53.140 So everybody else is hovering around, too.
00:13:55.740 So generally, it's going to shrink.
00:13:59.900 What is that – what happens to us technologically, right?
00:14:03.860 Is, like, AI is going to replace the labor that we lose from that shrinking?
00:14:07.960 We'll probably go bankrupt because we can't pay the pensions and there's going to be a lot of fiscal crisis.
00:14:14.800 So maybe buy some gold or Bitcoin.
00:14:16.960 I don't know.
00:14:18.160 So there's going to be a lot of stuff happening.
00:14:20.680 People are going to work longer, maybe.
00:14:22.380 Maybe 70s and 80s.
00:14:23.900 It gets weird with life extension.
00:14:25.740 It gets weird with age gap relationships being normalized, right?
00:14:29.260 So –
00:14:30.300 Is the end result, like, there's 1,000 immortal humans flying around and that's it?
00:14:37.600 I don't know what the result is.
00:14:38.980 It's going to be – I think it's going to be a wacky world until we get to some other – to the other side where, you know, technology is at least stabilized.
00:14:45.920 You know, right now we're going through too many changes.
00:14:47.860 What you're describing sounds kind of apocalyptic.
00:14:50.640 I'm like, we go bankrupt.
00:14:51.620 We can't pay our pensions.
00:14:52.680 Old people are destitute.
00:14:54.280 Nursing homes collapse.
00:14:55.320 Young people don't have houses.
00:14:56.400 They're not having kids.
00:14:57.280 Who's making the food?
00:14:59.180 I mean it –
00:14:59.920 But then after a few generations, there's going to be more houses for people though, right?
00:15:04.500 So after a few –
00:15:04.900 Who's going to maintain them?
00:15:08.620 That's a good point.
00:15:09.420 The robots.
00:15:10.980 That's the big question on AI, right?
00:15:13.160 When we – you know, whenever – there's like two problems that we talk about, obviously fertility being one and then like what AI does being the other.
00:15:19.800 People often just say, well, the machines replace the lost labor.
00:15:23.180 So we're going to see – I mean I think you're right because what you're describing is not predictive.
00:15:32.540 You're saying we are seeing a drop in fertility around the world and we are seeing an increase in AI.
00:15:37.340 There is the concern of, you know, AI being broken and going rogue.
00:15:43.000 But at any rate, humans are just not having kids.
00:15:47.220 We're going to have cities that are empty.
00:15:48.620 You look at places like Detroit.
00:15:51.400 I think the cities are going to resemble – like you won't know there won't be anything wrong in the cities because everybody is going to be in the cities.
00:15:58.120 Just once you're outside of it, that's where you're going to see like Mad Max.
00:16:02.340 You're going to see like elephants and like mammoths.
00:16:05.980 It's going to be like that Judge Dredd type atmosphere.
00:16:09.180 In the U.S.?
00:16:10.020 In the United States?
00:16:13.840 Yeah, everywhere.
00:16:14.900 I mean I don't think it's – that's not necessarily bad though.
00:16:16.940 I mean are animals and wildlife bad?
00:16:18.860 I don't think so.
00:16:19.520 But like why would there be elephants walking around North America?
00:16:22.140 Because of the zoos?
00:16:23.180 Because there's fucking empty.
00:16:24.220 Because it's like nature like regenerates.
00:16:27.100 They're going to try maybe a Jurassic Park project probably somewhere.
00:16:30.740 But, you know, it's funny.
00:16:33.120 We've got elephants in zoos and if society does break down, those elephants might just get out.
00:16:39.700 And then you've got elephants in North America now.
00:16:41.240 We brought them here.
00:16:41.620 There you go.
00:16:42.680 Buffaloes.
00:16:43.640 Yeah, buffalo everywhere.
00:16:44.860 We're just replaying Lewis and Clark again.
00:16:47.520 Yeah.
00:16:48.040 But, you know, in the cities that we're in, you think there's not going to be any people living in the rural areas or outside of cities?
00:16:53.500 No, there will.
00:16:54.240 There will.
00:16:54.680 I mean, especially in America.
00:16:56.200 Discover the magic of Bad MGM Casino where the excitement is always on deck.
00:17:00.520 Pull up a seat and check out a wide variety of table games with a live dealer.
00:17:05.180 From roulette to blackjack, watch as a dealer hosts your table game and live chat with them throughout your experience to feel like you're actually at the casino.
00:17:13.400 The excitement doesn't stop there.
00:17:15.420 With over 3,000 games to choose from, including fan favorites like Cash Eruption, UFC Gold Blitz, and more.
00:17:22.380 Make deposits instantly to jump in on the fun.
00:17:25.540 And make same-day withdrawals if you win.
00:17:27.680 Download the Bad MGM Ontario app today.
00:17:30.160 You don't want to miss out.
00:17:32.120 Visit BadMGM.com for terms and conditions.
00:17:34.500 19 plus to wager, Ontario only.
00:17:36.280 Please gamble responsibly.
00:17:37.580 If you have questions or concerns about your gambling or someone close to you, please contact Connex Ontario at 1-866-531-2600 to speak to an advisor free of charge.
00:17:47.180 Bad MGM operates pursuant to an operating agreement with iGaming Ontario.
00:17:50.320 It has a really strong rural and suburban type.
00:17:55.320 You know, most cultures are city people.
00:17:57.860 Americans are suburban and kind of rural.
00:18:00.440 Like, we don't have a lot of good cities.
00:18:02.100 Our cities kind of suck.
00:18:03.000 Like, New York is all right.
00:18:04.600 Boston, D.C.
00:18:05.960 I don't know, San Francisco.
00:18:06.900 But really, it's more of a suburban people.
00:18:09.620 So I think America actually likes space and they'll be all right.
00:18:13.860 But other places, you'll have a lot of empty rural areas and animals and stuff.
00:18:17.980 So you think people who live in rural areas are going to move into cities?
00:18:22.020 Because otherwise.
00:18:23.900 But it's happening now.
00:18:25.060 I mean, if you look at the map from the last 20 years to today, it's massive rural migration to cities.
00:18:30.500 But you're still going to have a massively decreased population.
00:18:33.440 So when you look at Detroit, for instance, or Detroit's probably the best example.
00:18:39.220 There are whole neighborhoods that are just empty.
00:18:41.200 There's nothing there.
00:18:42.800 Right, yeah.
00:18:43.740 And the buildings fall apart.
00:18:45.200 But if you look at Detroit suburbs, I mean, they look beautiful.
00:18:48.460 They're gorgeous.
00:18:49.140 I mean, they're full of life and people.
00:18:52.120 So Americans don't really want to live in cities if they have access to beautiful suburbs, is my kind of conclusion.
00:19:00.240 Yeah.
00:19:00.960 What's your timeline on all that?
00:19:03.300 20 years?
00:19:04.660 No.
00:19:05.260 Longer?
00:19:06.120 Yeah, 50 to 100.
00:19:07.900 50 to 100 years.
00:19:09.580 So are you familiar with the Strauss-How generational theory?
00:19:13.480 No, go ahead.
00:19:14.520 They call it the turnings.
00:19:17.060 We're in the fourth turning to describe it.
00:19:18.620 You've heard that phrase?
00:19:19.800 Yeah.
00:19:20.020 So right now they expect maybe, I don't know, next year or from like – it depends on what guess you want to make.
00:19:29.300 But maybe 2026 to 28, we're supposed to get some great crisis period if this formula is correct.
00:19:35.380 So 80 years ago, you had world wars.
00:19:38.160 80 years before that, you had a civil war.
00:19:39.300 80 years before that, you had a revolutionary war.
00:19:41.660 And I can't remember the conflict 80 years before that, but there was something.
00:19:44.220 And they say it's because of the way in which generations perceive the world that's given to them.
00:19:51.780 You know, the ones who fight through the fourth turning are very appreciative and hardworking.
00:19:56.540 They have kids who inherit without struggle, who have kids who inherit without struggle.
00:20:00.960 And then finally that fourth generation is just – they're reckless with what they've been given.
00:20:07.160 And it results in fighting or conflict.
00:20:08.540 That's supposed to happen.
00:20:11.140 So I'm wondering if the timeline is actually – if we add that into the equation, will it be faster than 50 to 100 years if something does happen in the next couple?
00:20:19.780 I think time is speeding up because we're all connected.
00:20:23.400 So there's like a theory that time is actually going faster because the internet exists.
00:20:28.240 And people from India to here to all over the world are all consuming reality, connected together, consuming reality.
00:20:36.160 So you should be expecting more things happening.
00:20:39.200 And we just saw a pandemic, which we kind of forget that pandemics come every 100 years, right?
00:20:44.520 And we should probably expect maybe a war soon because those happen every 100 years.
00:20:50.520 So, I mean, history is not done.
00:20:52.740 And it's hard to predict when it's going to happen and what.
00:20:55.820 But, I mean, things are going to be happening.
00:20:58.960 Yeah.
00:20:59.100 I mean, there's Trump called Putin crazy.
00:21:01.500 But now they're saying maybe they're working on a deal.
00:21:03.700 Then you've got, as you mentioned, Israel, Palestine.
00:21:06.620 There's like a mission driven behind it.
00:21:09.000 How do you think a potential for a war would play into the fertility decline or also the AI revolution?
00:21:16.280 It would probably increase it.
00:21:17.940 Yeah.
00:21:18.220 Right?
00:21:19.180 Usually there's baby booms after wars, right?
00:21:21.200 That's what we saw last time.
00:21:23.820 You're saying we're going to get more – a fertility boom after a conflict.
00:21:26.960 Yeah, but the problem is America is – I can't see anybody invading America in a long time.
00:21:34.920 Except for itself.
00:21:36.720 Right.
00:21:38.000 Which is happening.
00:21:38.860 I mean, you see red states and blue states kind of separating on policy, like banning – banning, like, I think, meat.
00:21:46.220 I think lab-grown meat.
00:21:48.460 Yeah.
00:21:49.740 Florida's banning the fluoride in the water.
00:21:52.060 You're seeing just localism reemerge, and it's just going to continue.
00:21:57.300 So that's going to be a fascinating experiment.
00:21:59.600 Do you think there's any potential for civil war in the United States in that capacity?
00:22:05.520 I guess that's the one way America is going to be brought down, right?
00:22:08.520 It's civil war.
00:22:09.120 It happened already.
00:22:09.960 It'll probably happen again.
00:22:11.020 I just don't see anybody caring – like another nation wanting to – it'd just be a pain in the ass to invade America.
00:22:16.060 Right, from the west, you've got too many mountains.
00:22:19.240 From the east, there's a lot of people.
00:22:21.020 Very difficult.
00:22:22.120 So usually if this country is going to end, it's going to split up in a civil war.
00:22:27.380 Yeah.
00:22:28.140 Do you – you know, I see precursors.
00:22:31.520 I've talked about it with quite a lot of people.
00:22:32.820 Obviously, it comes up on the show sometimes that academics refer to what we're in as civil strife, meaning that there's a certain number of deaths that are politically motivated in conflict.
00:22:43.780 Do you see any of that stuff?
00:22:45.020 Like, what's your view on that?
00:22:47.060 On – sorry, on what?
00:22:49.420 Like, are we – is it actually possible civil war happens or are you saying that's the only way the U.S. could be brought down?
00:22:56.440 No, it's possible.
00:22:57.460 I also think the U.S. is kind of built to consume and there's a certain standard of living.
00:23:02.880 And I think a hard depression is – would – I think the post-war America is kind of a different America than the 19th century America.
00:23:13.480 I think we're a different country, right?
00:23:14.780 So I don't think post-war America has faced a really deep, dark recession like – and I don't know what would happen because this country is kind of built on consumption and built on – you know, you have a house.
00:23:27.080 You put stuff in your house.
00:23:28.180 We have things.
00:23:28.940 We have – we don't have a leisure culture.
00:23:31.820 We have like a work and consume culture.
00:23:34.080 So I don't know what could happen if a big financial crisis that's even bigger than 2008 happened.
00:23:41.000 I think – I think – man, it's kind of crazy.
00:23:45.820 Maybe it's pessimistic, but it does feel like you've got political hyperpolarization.
00:23:52.920 You've got potential for international escalation in war.
00:23:56.960 You've got the fears over AI.
00:24:00.000 And maybe that one is more speculative.
00:24:03.040 But then you've got concern over a global economic collapse or at least a depression in the United States.
00:24:07.160 I think these are all real considerations.
00:24:08.880 I think a financial collapse is – there's a decent probability that it happens.
00:24:14.260 Like we're overspending.
00:24:15.320 Our debt-to-GDP ratio is like 125 percent.
00:24:20.280 Yeah.
00:24:20.640 I mean that could happen.
00:24:21.780 We're heading toward some sort of default bankruptcy situation.
00:24:25.340 And I don't think any political party cares or wants to care.
00:24:28.840 You saw that with Elon Musk.
00:24:30.340 He just gave up.
00:24:31.120 Did you think he gave up or he's just got to work – he's got to run his companies?
00:24:37.420 No, I think he gave up.
00:24:38.340 I think he tweeted something about how nobody cares about – seriously care about doge or cutting spending.
00:24:45.940 And yeah, he left.
00:24:47.740 And yeah.
00:24:49.140 Maybe.
00:24:50.060 I mean I think there's a quality of life here and nobody wants to – nobody's going to mess around with that.
00:24:55.420 No political party wants to mess around with that.
00:24:57.160 And we're just going to keep going.
00:24:58.720 But if you don't prune the leaves or the errant branches, the system becomes overburned.
00:25:06.140 It breaks down.
00:25:06.840 And it feels like nobody wants to take responsibility for the hard work to maintain the system.
00:25:11.680 They're all just sort of running to the high point of the Titanic as it's sinking.
00:25:17.060 It doesn't seem like it's sinking though, right?
00:25:18.980 I mean there's unimaginable wealth for the regular American in a way.
00:25:23.620 I mean go to Europe.
00:25:24.800 People there make half or a third of what regular Americans make.
00:25:28.300 I mean so it doesn't feel that way right now.
00:25:32.700 So a lot of what you're saying is it could happen.
00:25:34.900 I disagree.
00:25:35.660 I think it feels that way.
00:25:36.960 I think the argument is it may not be that way.
00:25:38.940 But there's a bunch of viral posts.
00:25:41.260 There was a viral post on Reddit that got millions of fits or whatever talking about how if you watch Friends or you watch these old shows in the 90s,
00:25:50.880 there was this depiction of culture or the description of culture as having a lot of leisure.
00:25:57.040 Someone mentioned a song where – I think it was Billy Joel.
00:26:01.180 He said that he and his friends would just go and buy a bottle of wine and hang out.
00:26:04.440 And then they wrote, yeah, we went and did that with our friends and it was like a $200 bill at the restaurant to have some appetizers and wine with our buddies where it used to be cheap.
00:26:12.720 And so I think a lot of people – well, I'll put it this way.
00:26:17.900 Gen Z can't buy homes.
00:26:19.380 They're entering their late 20s.
00:26:21.100 They're not getting married.
00:26:21.940 They're not having kids.
00:26:22.520 They're not buying properties.
00:26:23.480 It certainly feels like the ship is sinking.
00:26:25.860 Maybe for the older millennials, Gen X and boomers, they're the last helicopter out of NAMM, but I think the younger generations feel like it's gone.
00:26:37.120 Yeah, I think this country has changed since the 90s.
00:26:39.200 I mean like you mentioned, full employment and a better economy kind of – there was a culture there of people complaining about regular jobs like office space or flight club that people are running to right now.
00:26:52.320 Yeah, so different culture.
00:26:56.200 Millennials, I think half of millennials own homes and something like 7% of Gen Z owns homes.
00:27:01.800 Yeah, but they're going to inherit – millennials are going to inherit a lot of homes.
00:27:05.140 But they don't need them, so what do they do with them?
00:27:08.340 Good point.
00:27:09.540 So this is the issue, right?
00:27:11.980 Gen Z is not buying property right now where they historically would have been.
00:27:14.440 At the same age, boomers owned 20% – in their mid-20s, boomers owned 20% of corporate equities.
00:27:23.140 Gen Z owns zero.
00:27:24.960 Millennials own, I think, like 5%.
00:27:27.260 So it may be that boomers will die and then millennials will inherit a lot of it.
00:27:34.320 But actually, I think BlackRock and large international corporations will inherit it all because Gen Z won't have the capital to – look, if millennials inherit these houses, let's say in 20 years when boomers are largely passing on, they're going to inherit a home that's going to be worth a million dollars.
00:27:52.960 A home today that's at $500 might be a million by then.
00:27:55.780 But they're not going to be able to sell it for a million because boomers are the only ones who have the equity to actually buy or get loans against it.
00:28:01.400 Gen Z won't be buying it.
00:28:02.500 So they'll try and drop the prices and then likely what happens is BlackRock or these big private equity firms will buy them up at a premium rate but above what Gen Z can spend.
00:28:11.420 And that's a phenomenon we're already seeing.
00:28:13.680 Millennials will put an offer on a house and then BlackRock will offer 30% higher because they can afford to.
00:28:19.200 And it's driving prices up and making it unreachable.
00:28:22.680 Maybe that system breaks and we find a way through it, but it does seem like it's not improving.
00:28:28.180 Where does your house go if you don't have any kids, right?
00:28:31.320 Like what do you do?
00:28:32.580 Let's say you're a millennial and you're like 60 and you're ready to leave.
00:28:36.660 You just sell your house.
00:28:37.780 I mean, where do you go?
00:28:39.140 So there's like a weird transference of wealth issue if nobody's having kids either.
00:28:44.980 Right.
00:28:45.200 So that will largely be for millennials, so maybe 40 years from now.
00:28:50.680 Right.
00:28:50.960 I think 20 years from now, millennials, half of them own homes.
00:28:54.900 And so we're starting to see this now because I've experienced it in the market.
00:28:59.100 You'll find a house and the seller will be like, parents died.
00:29:03.720 You know, older millennials inherited it.
00:29:05.420 Don't want to move back home.
00:29:07.460 They live in New York now.
00:29:08.680 Like you mentioned, they're going to cities or something.
00:29:10.380 So they're trying to sell it.
00:29:11.760 The problem then is they don't want to sit on it.
00:29:14.060 They don't want to deal with taxes.
00:29:15.060 They don't want to lose it.
00:29:15.920 So they drop the price.
00:29:17.580 So maybe we'll see a pricing collapse.
00:29:19.600 Then, of course, what's been happening is that another millennial couple says, we want to buy our first home.
00:29:24.120 It's a $300,000 house.
00:29:25.600 BlackRock says, we'll give you $330,000, which has been a big trend.
00:29:29.120 So I don't know.
00:29:30.460 I mean, maybe you're right.
00:29:31.900 Maybe we shouldn't be so pessimistic.
00:29:33.580 Maybe we'll end up with a lot of houses and it'll correct itself.
00:29:39.060 Or, you know, the scenario of one company owning a large percentage of the housing stock could absolutely happen here as well.
00:29:45.040 Another nightmarish, futuristic dystopia.
00:29:49.660 America is pretty – it's an unpredictable country, man.
00:29:52.200 Like you never know what's going to happen here because there's such a heavy private sector that's so – and the government.
00:30:00.740 Government's hit and miss.
00:30:01.760 Like you can get involved or maybe it won't get involved.
00:30:04.680 So it's hard to play out some scenarios because you never really know because this country is really – it's the most unpredictable country in the world in my opinion.
00:30:12.120 Yeah.
00:30:12.360 Where do you think – last, final question.
00:30:15.640 I just – where do you think the Trump movement or MAGA ends up?
00:30:20.200 Ooh, that's like the movement or the presidency?
00:30:22.540 Well, I mean like obviously Trump's going to leave.
00:30:24.940 He's an old man.
00:30:26.280 He's going to – term's going to end.
00:30:27.840 But what happens to this populist MAGA, whatever you want to call it?
00:30:31.520 They keep going?
00:30:32.420 Does it evolve?
00:30:33.160 I don't think so.
00:30:35.660 I think this is really – Donald Trump is an extraordinary politician, the first celebrity of the media monoculture of the 20th century.
00:30:44.780 The only way they could have defeated him was with a pandemic.
00:30:47.660 And then he still came back to win, right?
00:30:50.920 So he's such a large figure in American politics and a transformative one that it's – usually what happens is when he's gone, the movement's gone too.
00:31:03.500 So my opinion is this is kind – like there'll be – I think there'll be elements of this MAGA movement that always exists.
00:31:13.600 But I really think – it's really difficult to think about MAGA without Trump.
00:31:17.900 Yeah.
00:31:18.120 Because I think he's just – you know, he's one of those great figures like they're going to talk about in many years from now.
00:31:25.760 Even if you hate him or love him, he's still his giant.
00:31:28.200 Right on.
00:31:28.760 In politics.
00:31:29.800 Well, the Lindy Man, man, it's been great hanging out with you.
00:31:32.860 Where can people find you?
00:31:34.520 Twitter.
00:31:35.580 To search Lindy Man.
00:31:36.560 I'll be there.
00:31:37.380 Right on.
00:31:37.900 Well, thanks for hanging out.
00:31:39.200 And I appreciate you joining in.
00:31:40.920 Thanks, Tim.
00:31:41.520 See you.
00:31:41.960 Have a good one.
00:31:43.000 Yep.
00:31:46.480 All right.
00:31:47.080 That was Paul Scalise, a.k.a. Lindy Man.
00:31:51.000 You know, we're just – we're getting back into it after that holiday weekend.
00:31:57.580 And I will say it's fascinating how it is – look, let's just be real.
00:32:01.080 It's people were barbecuing.
00:32:02.840 They were grilling.
00:32:03.540 They were enjoying their time.
00:32:05.460 And that's what people are going to remember.
00:32:08.300 Okay?
00:32:09.120 You're going to focus on the good memories you had.
00:32:11.680 I went swimming.
00:32:12.940 Went to a skate park.
00:32:13.960 We went to – we got to hang out with really great people.
00:32:19.360 Travis Pastrana, the Black Rifle Coffee guys, JT, put in a big event for veterans, injured veterans, lost veterans, veterans with cancer.
00:32:29.020 And so I think it's important to make sure you're always putting that footnote in why we're having this day off.
00:32:37.520 We are getting a beautiful three-day weekend to enjoy the fruits of those who sacrificed everything, what they've gifted to us.
00:32:46.820 And so you've got to make sure you recognize that.
00:32:49.740 All of that goodness that you have comes from someone else willing to sacrifice something you weren't.
00:32:53.720 You know, this really is a big divide between left and right.
00:32:57.200 When Kamala Harris says, enjoy the three-day weekend, and everybody on the right said, don't forget the fallen.
00:33:03.060 So I think people on the right aren't saying, don't enjoy your weekend.
00:33:07.220 They're just saying, make sure you remember why.
00:33:10.040 But we're going to wrap it up there, my friends.
00:33:11.900 We've got to – we're going to throw it to our good friend.
00:33:15.440 I believe we have Russell Brand gearing up right now.
00:33:19.380 But my point is we're coming back from Memorial Day, so I'm not entirely sure who exactly is geared up.
00:33:24.540 But Russell Brand is getting ready to go, so we're going to raid that channel.
00:33:28.400 Really do appreciate you guys hanging out.
00:33:30.480 It's been fun.
00:33:31.700 We are back for the week.
00:33:33.240 Let's see.
00:33:33.740 I don't have the guest list for TimCast IRL.
00:33:35.420 So TimCast IRL tonight at 8 p.m.
00:33:37.520 You don't want to miss it.
00:33:38.100 You can follow me on Axe and Instagram at TimCast.
00:33:41.520 Let's get that raid rolling.
00:33:44.620 Maybe I'll grab one Rumble Rant if we got any.
00:33:47.800 All right, let's see.
00:33:49.360 Discover the magic of Bad MGM Casino, where the excitement is always on deck.
00:33:53.760 Pull up a seat and check out a wide variety of table games with a live dealer.
00:33:58.420 From roulette to blackjack, watch as a dealer hosts your table game and live chat with them throughout your experience to feel like you're actually at the casino.
00:34:06.100 The excitement doesn't stop there.
00:34:08.680 With over 3,000 games to choose from, including fan favorites like Cash Eruption, UFC Gold Blitz, and more.
00:34:16.020 Make deposits instantly to jump in on the fun.
00:34:18.760 And make same-day withdrawals if you win.
00:34:20.900 Download the Bad MGM Ontario app today.
00:34:23.400 You don't want to miss out.
00:34:24.600 Visit betmgm.com for terms and conditions.
00:34:27.760 19 plus to wager Ontario only.
00:34:29.520 Please gamble responsibly.
00:34:30.840 If you have questions or concerns about your gambling or someone close to you, please contact Connex Ontario at 1-866-531-2600 to speak to an advisor for your charge.
00:34:40.440 Bet MGM operates pursuant to an operating agreement with iGaming Ontario.
00:34:43.560 Arsonist says, Tim, have you played Claire Obscure Exposition 33?
00:34:48.900 It's definitely Game of the Year.
00:34:50.000 Please look into it.
00:34:51.220 You know, my boy Andy keeps saying download it and play it.
00:34:54.720 Maybe I have to now.
00:34:55.700 Not because he said to.
00:34:56.700 Because you guys said to.
00:34:58.140 Because the audience did.
00:34:59.480 All right, everybody.
00:35:00.340 I'm going to wrap it up there.
00:35:01.200 Smash the like button.
00:35:02.100 Share the show with everyone you know.
00:35:03.300 Thanks for hanging out.
00:35:04.140 And we'll see you tonight at 8 p.m.
00:35:06.140 rumble.com slash timcast IRL.
00:35:07.960 Amazon presents Lisa vs. the Mosquito.
00:35:13.600 Surviving 100 million years, Mosquito shook off the plate tectonic breakup of Pangea, the asteroid that eliminated the dinosaurs and the Ice Age.
00:35:23.740 But Lisa shopped on Amazon and bought a can of repellent, a cute long-sleeved shirt, and a citronella candle.
00:35:30.780 Hey, Mosquito, Lisa just bodied you.
00:35:34.400 Save the everyday with deals from Amazon.