The Podcast of the Lotus Eaters - November 20, 2025


The Podcast of the Lotus Eaters #1300


Episode Stats

Length

1 hour and 38 minutes

Words per Minute

170.6

Word Count

16,722

Sentence Count

417

Misogynist Sentences

9

Hate Speech Sentences

57


Summary

In this episode of Lotus Eaters, we talk about how calling people racist will backfire on both sides of the political fence, and why we shouldn't be worried about being called racist. We also talk about Elon Musk's AI Tower of Babel, and what socialists get right.


Transcript

00:00:00.240 Good afternoon and welcome to the podcast of the Lotus Eaters episode 1300 on the 20th of November 2025.
00:00:09.800 I'm your host Harry, joined today by part-timer Josh and part-timer Firaz.
00:00:15.200 Hello!
00:00:16.280 And today we're going to be talking about how calling people racist will backfire.
00:00:21.860 I'm going to be talking about Elon Musk's AI Tower of Babel and his utopian dreams,
00:00:27.500 which definitely are just memes and will stay that way.
00:00:32.120 And you're going to be telling us what socialists get right.
00:00:35.360 Yes.
00:00:36.240 It's going to be a very short segment.
00:00:37.700 It's going to be a bit of an involved segment actually.
00:00:40.640 It's about industrial policy.
00:00:42.080 Okay.
00:00:42.540 It should be interesting, what with a libertarian on the panel.
00:00:46.380 I'm not that libertarian these days.
00:00:48.680 When it comes to paying taxes...
00:00:50.220 Age is catching up with you.
00:00:51.300 When it comes to paying taxes, I become very libertarian.
00:00:54.160 But when it comes to use of government power, I'm not.
00:00:56.460 That's the same with me these days.
00:00:58.400 I did this little bit last week on the whole trad tax,
00:01:01.640 and then it got me thinking about all of the taxes
00:01:03.500 and the way that they're built to screw you over.
00:01:06.120 Income tax, national insurance, the way the student loan repayments work.
00:01:10.460 And it really does make my blood boil.
00:01:12.620 You rake me over the coals, but we broadly agree with each other.
00:01:15.480 Basically.
00:01:17.180 But it's fun to poke fun at you, Josh.
00:01:19.680 It is.
00:01:20.320 It's fun.
00:01:21.140 It's what bros do.
00:01:22.360 It's what bros do.
00:01:23.240 But outside of all of that, we might as well get into the news.
00:01:27.060 Actually, I had an announcement before.
00:01:28.720 Oh, all right.
00:01:29.720 It's five years since I first joined Lotus Eaters today.
00:01:32.360 I joined on the 25th of November 2020, and that was five years ago.
00:01:37.560 Well, happy Lotus Eaters' birthday.
00:01:39.420 Thank you very much.
00:01:40.400 It's such a shame you're no longer employed by us.
00:01:42.420 I know.
00:01:43.140 It sort of takes the wind out of the sails a little bit, doesn't it?
00:01:46.500 Although people will point out in the comments, he's back just as much.
00:01:49.340 I was like, no, I'm not.
00:01:50.520 Definitely not.
00:01:51.160 I'm doing a lot less work.
00:01:52.220 It's great.
00:01:53.720 Anyway, I should get on with what I'm actually here for, shall I?
00:01:57.080 Must you?
00:01:58.060 I can if you want.
00:01:59.060 We can just have a chat.
00:02:00.180 Sounds nice.
00:02:01.220 We can reminisce.
00:02:03.200 Well, I've known you for four years, but those past five years, no, we're not going to do that.
00:02:07.120 People are generally ignoring accusations of racism these days, and that's because it's
00:02:14.700 pretty much an oversaturated word.
00:02:17.080 It's been used too much.
00:02:17.980 It's like currency.
00:02:18.740 If you inflate it too much, then it becomes devalued and doesn't hold any weight.
00:02:21.980 It's not a social reputational weapon anymore.
00:02:24.820 And amongst the right, the more someone is called racist, the more we think they're our
00:02:28.060 guy, which is a sort of backfire thing, because it's going to backfire both for the left and
00:02:33.700 the right.
00:02:34.300 For the left, by calling everything racist and insisting that caring about immigration is racist, they're
00:02:39.980 going to marginalize themselves, because people are concerned about immigration and they're
00:02:45.000 willing to put up with being called racist. As we can see from the US,
00:02:50.840 the UK and many other European countries, people calling these parties racist isn't putting
00:02:56.340 people off voting for them and vocally supporting them.
00:02:58.860 So it's sort of backfiring a little bit.
00:03:01.180 But it also backfires for the right as well, because someone getting called
00:03:07.860 racist every day makes them sort of our guy, just like, yes, this person must be on our
00:03:12.900 side because the left's giving them a hard time.
00:03:14.680 And that's not necessarily true, because both Trump and Farage in Britain get this in abundance
00:03:20.900 and Farage is pretty much a Thatcherite neoliberal.
00:03:25.300 No one really looks at Margaret Thatcher and says, you know what her legacy is?
00:03:29.120 It was racism.
00:03:30.360 Well, very few people anyway.
00:03:32.460 He's not really breaking out of the paradigm and it's pretty standard, colourblind, meritocratic
00:03:38.560 stuff.
00:03:39.480 Whereas Trump...
00:03:40.580 A lot of normies don't know that though, because a lot of normies do perceive the world
00:03:46.580 through a mainstream media lens.
00:03:48.800 Last Friday, after I got back home, I went to the pub to meet up with a mate and we were
00:03:53.180 talking to some guy, half Irish, half English, who was very drunk, and as he was leaving the table,
00:03:59.200 because somebody brought up Remembrance
00:04:04.360 Day, he started screeching about how, oh, you're not allowed to support Remembrance Day
00:04:09.420 or remember all of those people who died to fight fascism when fascism's rising in the UK.
00:04:14.100 If you're going to go and vote for Nigel Farage, you should go and kill yourself.
00:04:17.820 And I was not enough pints down and had a banging headache, so I did not want to start
00:04:23.020 an argument over it, but I was just, like, facepalming, going, please go away.
00:04:27.020 He was on his way out at the time he started screeching about it anyway, so I was like,
00:04:31.220 just get out.
00:04:32.840 It's very prevalent, isn't it?
00:04:34.380 So, normies do still perceive the world through this lens.
00:04:38.240 But it's not really true, is it?
00:04:40.280 Objectively, you have to look at the policies and the actual evidence, rather than
00:04:44.400 the hearsay and what, you know, is said in the media. Because the, you know, two best
00:04:49.400 ways to control people are low information and a fast-paced sort of...
00:04:55.320 Low information saturation.
00:04:56.260 Exactly.
00:04:56.560 I mean, this guy was not the most intelligent person that I've ever met.
00:04:59.600 Normally a nice guy, but...
00:05:01.560 And Donald Trump, to pivot as a sort of parallel, is basically a 1990s New York Democrat, you
00:05:09.820 know.
00:05:10.380 A lot of his policies are similar to those of Bill Clinton.
00:05:13.520 You know, Barack Obama did more deportations than Trump, although, of course, they do face
00:05:17.660 different situations as a sort of caveat.
00:05:19.600 They counted differently.
00:05:20.900 They also counted differently.
00:05:25.040 Everybody turned away at the border counted as a deportation under Obama.
00:05:25.040 Which is sort of cheating a little bit.
00:05:27.140 Yes.
00:05:27.520 But it's still fair to say that lots of people are disappointed at the rate of deportation.
00:05:32.420 Yes.
00:05:32.600 And also, the U-turn on H-1Bs and bringing in more legal migration, these sorts of things.
00:05:39.060 To that, they'll point to the lack of border crossings since Trump came into office, which
00:05:44.180 is fair.
00:05:45.160 But I think we have to accept it's easily reversible by the next administration. If you get a Democrat
00:05:51.980 in, who's just going to open the floodgates again, and if you are not deporting the 10
00:05:56.360 million plus illegals who are already in the country, at a rate that is going to reverse
00:06:01.320 their entry into the country, then you're not even making a dent.
00:06:05.020 Because all of those illegals are going to have children, and you're going to face the
00:06:07.660 same birth rate problems that everywhere else in the West is experiencing.
00:06:11.220 I'm not on the Elon birth rates train, but I do admit and accept, because you have to
00:06:17.680 recognize, if white birth rates are what they are, as opposed to non-white birth rates, that
00:06:23.860 is where, even with closed borders, the demographic change is going to happen.
00:06:28.320 Yes.
00:06:29.300 Yeah.
00:06:30.180 And I think that there are lots of other things as well that are points of concern, but just
00:06:34.860 because the Democrats would be worse doesn't mean that you should just be perfectly happy
00:06:38.340 with the alternative.
00:06:39.540 You can still have expectations and hold people you otherwise support to account, and I think
00:06:45.520 that that's fair.
00:06:46.280 But people don't necessarily do that, and it's a shame. And I think that this
00:06:50.500 is sort of playing a part in it: calling both of them, you know, racist and fascist
00:06:56.240 doesn't really defame their reputations anymore.
00:06:59.320 If anything, it makes them more popular with their base, because it legitimizes them as an
00:07:03.500 authority by attacking them, doesn't it?
00:07:05.660 And, you know, the worst consequence is that it contains the genuine right-wing movements,
00:07:12.780 because people think they're much more radical than they really are.
00:07:16.220 Like Trump and Farage, not really that radical in the grand scheme of things.
00:07:20.120 Perfect example, Meloni.
00:07:21.540 Yep, exactly.
00:07:22.200 The rise of fascism in Italy all over again, and people got really hyped up for it, because
00:07:28.100 they're saying, if she's that bad, she must be willing to make the difficult decisions
00:07:32.240 that are going to turn Italy's situation around.
00:07:34.760 She gets in, what do you get?
00:07:37.100 More Blairite neoliberalism.
00:07:39.180 Yeah, exactly.
00:07:39.660 Which is basically a continuation of Thatcherite neoliberalism, anyway.
00:07:42.620 And that's what I see, these sorts of public accusations of, oh, these people are racist.
00:07:47.720 I'm going to use examples from Farage, because there's been a recent story that I find interesting.
00:07:52.300 But it could apply just as much to Meloni, or Trump, or any political leader that is described
00:07:58.580 as, you know, a controversial right-winger, when actually the reality is they're not actually
00:08:02.700 that controversial.
00:08:03.620 And all of these accusations actually serve to legitimize them to their own base, when
00:08:11.900 perhaps people could afford to be a little bit more critical towards these people, because
00:08:15.680 they do deserve it.
00:08:17.240 And you shouldn't just take people for granted and be a cheerleader for them.
00:08:20.960 You can drag these leaders to the right if you employ the right strategy.
00:08:27.100 So here's a recent example from only a month ago.
00:08:29.520 So Shabana Mahmood, a Pakistani lady who currently occupies the position of Home Secretary, called
00:08:36.880 Nigel Farage worse than racist over dog-whistle politics.
00:08:41.740 She's claiming that he's not actually overtly racist, but he dog-whistles, which makes him
00:08:45.960 worse.
00:08:47.140 So we'd be the dogs in this case?
00:08:49.660 I guess so.
00:08:50.600 Right.
00:08:50.800 And we pick up on it.
00:08:51.600 But except for the fact that many people on the right are actually critical of Farage
00:08:55.620 for not being right-wing enough.
00:08:57.620 Yes.
00:08:57.760 And so I don't understand what these dog-whistles are.
00:09:01.300 In fact, many of the people who they're worried about are critical of him for not being radical
00:09:07.520 enough.
00:09:08.280 So what is the dog-whistle?
00:09:09.940 This only serves to legitimize the notion that he is radical and makes people support him
00:09:15.300 from the right.
00:09:16.840 And this is the one that I think is perhaps the most egregious.
00:09:21.920 So this is a story about smoke coming out of his mouth because he said something that
00:09:29.620 is that on fire.
00:09:31.780 They're just making him look cool.
00:09:34.200 Deeply shocking, Nigel Farage faces fresh claims of racism and anti-Semitism at school.
00:09:39.000 When was Nigel Farage in school?
00:09:40.600 I mean, he's 70?
00:09:43.340 I think it was in the early 80s, maybe the late 70s.
00:09:48.020 And lots of these claims are coming from when he was like 13.
00:09:51.820 Ah.
00:09:52.480 Which is interesting.
00:09:54.660 I'm going to read some of this because it's funny how they frame him.
00:09:58.880 Like, they make him just sound like a schoolboy, even from sort of our era, Harry, where he's
00:10:05.400 just saying controversial things to wind people up.
00:10:07.720 Well, you would have been at the right age.
00:10:10.960 You're only a little bit older than me.
00:10:14.260 Do you remember Sycopedia?
00:10:16.140 Yeah, of course.
00:10:16.860 Yeah, okay.
00:10:17.460 Everybody who's around our age should remember Sycopedia.
00:10:20.380 It's exactly what it sounds like.
00:10:21.580 It was an encyclopedia of, like, really terrible, unfunny jokes, but they were as offensive and
00:10:28.820 disgusting as possible.
00:10:30.580 And me and my friends on the school grounds during breaks would take turns looking through
00:10:35.460 and finding the most gross-out or offensive jokes possible to
00:10:40.720 tell one another.
00:10:41.540 That's what schoolboys do.
00:10:43.520 Because at the time, it's funny.
00:10:45.920 So anyone who's had normal schooling, particularly if they were, you know, once a boy, and I don't
00:10:52.120 mean that in the modern sense, I mean they have grown up to be a man, will understand that
00:10:57.320 this is how schools sort of work, even if lots of this is true.
00:11:01.080 And we're also relying on the account of a couple of people remembering something from,
00:11:06.500 what, 50 years ago?
00:11:08.300 So it's just ridiculous.
00:11:10.120 So all his years in UKIP, all his years in frontline politics, this was not an issue.
00:11:16.420 It wasn't an issue in the Brexit debate.
00:11:18.060 And then, when he's on the cusp of being prime minister, all of a sudden
00:11:21.640 it's just like, oh, it comes back to me.
00:11:24.280 Interesting.
00:11:25.180 Modern memory drugs or something like that?
00:11:26.940 I don't understand it.
00:11:28.320 But okay.
00:11:29.880 Yeah, the latest revelations come from someone called Peter Ettedgui.
00:11:35.540 Ettedgui, I don't know.
00:11:37.380 And he says, he would sidle up to me and growl, Hitler was right.
00:11:41.420 Or gas them, sometimes adding a long hiss to simulate the sound of the gas showers.
00:11:47.340 If this even is true, right?
00:11:48.960 If this guy was acting super triggered and getting really upset at it, he's just going
00:11:54.580 to do that more.
00:11:55.420 You're going to find worse and worse ways.
00:11:57.860 Anyone that's been to school knows that if you react by going, oh, no, you can't say
00:12:02.620 that.
00:12:03.160 That's terrible.
00:12:04.100 Oh, no, I'm so offended.
00:12:06.220 In a school, to school children, what you're going to do is invite more of it.
00:12:10.980 And you're going to make yourself more of a target.
00:12:12.400 And you're going to earn more of it.
00:12:14.500 Because you're being a little bitch.
00:12:16.800 What he said.
00:12:17.640 And apparently, Ettedgui, 61, is a BAFTA and Emmy-winning director and producer whose
00:12:24.720 credits include Kinky Boots and McQueen and Super/Man.
00:12:29.460 Sounds terrible.
00:12:31.000 But anyway, back then he was a 13-year-old boy at a loss as how to handle what he described
00:12:36.580 as a sudden and inexplicable intrusion of anti-Semitism into his life.
00:12:41.800 I like how they're talking about bantering between 13...
00:12:44.440 He's reliving the trauma, is he?
00:12:45.400 13-year-olds being edgy to one another and being a bit mean.
00:12:51.600 Don't feed the trolls.
00:12:53.100 Nobody ever learned this.
00:12:54.740 I know.
00:12:55.480 It's so silly.
00:12:57.060 But all this is...
00:12:57.740 As if this even is true.
00:12:58.860 But in the current political landscape, all this does is make people think, oh, 13-year-old
00:13:05.480 Faraj was a bit of a laugh, wasn't he?
00:13:08.200 You know, sidling up to people and just saying the worst thing he could possibly do.
00:13:14.020 And then hissing at...
00:13:15.480 I can admit to doing that sort of thing to people all of the time in school.
00:13:19.880 In Lebanon, in a diverse society, your only friends were people who you could make nasty,
00:13:26.800 obscene jokes with about their religious background.
00:13:30.760 And if you crossed that barrier, you were friends.
00:13:34.220 And you could joke about each other's religions and make fun of each other and that was it.
00:13:37.940 But I have no idea about British schoolboy culture.
00:13:41.860 It's very much similar, except it's just more general abuse.
00:13:46.020 Yeah, fair enough.
00:13:46.520 It's not as sectarian unless you're maybe in Scotland.
00:13:48.780 So basically you're normal boys.
00:13:51.040 Yes.
00:13:53.100 This...
00:13:54.020 I have no idea whether or not this is true.
00:13:56.580 I don't have any reason to believe it now.
00:13:59.480 But okay, somebody said something 50 years ago.
00:14:02.820 Mm-hmm.
00:14:03.760 Moving on.
00:14:04.760 I know.
00:14:05.040 It's pretty desperate, isn't it?
00:14:06.360 And it carries on to say,
00:14:08.040 in recent weeks, the Guardian has heard allegations from more than a dozen school contemporaries of Farage
00:14:13.440 who recount incidents of deeply offensive behavior throughout his teenage years.
00:14:18.120 Okay.
00:14:18.780 This is so silly, isn't it?
00:14:20.360 Because not only are they sort of making him out to normal people as someone who was once a schoolboy
00:14:27.800 who did offensive things. You know, there weren't any Jewish people in my school.
00:14:33.740 So, you know, I couldn't have possibly done this.
00:14:35.880 But there are many equivalents of things that I did that were edgy.
00:14:40.660 And at the time, no one really batted an eye.
00:14:43.520 I don't think anyone's going to come out and speak to the Guardian and say,
00:14:45.640 Josh was, you know, teasing me for my various heritages or something.
00:14:49.740 Are you running for Parliament?
00:14:51.340 No.
00:14:52.020 Oh, okay.
00:14:53.620 It's funny that it comes out then, isn't it?
00:14:55.720 Yes.
00:14:55.940 I'm happy to sit on it until it's the most opportune moment.
00:14:59.220 It is clearly weaponized, right?
00:15:01.220 Whether this is true or false, this is clearly weaponized.
00:15:04.680 I'm sorry, but I can see what you mean as well, Josh,
00:15:07.080 because this very clearly could be aimed at people like us
00:15:12.660 to try and rehabilitate Farage as a bit of a laugh,
00:15:16.340 someone who had a great time in school,
00:15:18.620 somebody who was known for being a bit of a leader.
00:15:21.540 He wasn't afraid to go there and make jokes that other people wouldn't.
00:15:25.080 And there's absolutely an element to this whole story,
00:15:29.040 especially at a time when Zoomers are starting to invert the boomer morality,
00:15:34.040 that this could be an attempt to try and win people over.
00:15:37.600 Now, that might be conspiratorial thinking 4D chess.
00:15:41.340 I don't think it's necessarily deliberate,
00:15:42.840 but it's going to have that effect, I think.
00:15:45.360 Whether it's deliberate or not, it's still going to have that effect.
00:15:48.160 And I think it isn't deliberate in this case.
00:15:50.000 I think it is the Guardian, you know, having a little tantrum.
00:15:52.980 And the funny thing is that this happened all the way back in 2013.
00:15:56.600 There was another letter from his school days.
00:15:58.880 And this time they were trying to call him a fascist instead of a racist.
00:16:03.300 And it's talking about how he was recommended to be a prefect, basically,
00:16:07.760 to give you the TLDR.
00:16:09.560 And to read a little bit of it, I'm going to scroll down.
00:16:13.520 It says,
00:16:13.940 The letter says that one teacher said Farage was a fascist,
00:16:17.280 but that was no reason why he would not make a good prefect,
00:16:20.040 which I thought was a funny line in and of itself.
00:16:22.980 He's an ideal candidate.
00:16:24.360 There was a considerable reaction from colleagues.
00:16:27.560 It's a bit weird to call a schoolchild a fascist anyway, isn't it?
00:16:31.140 Isn't it really weird to just be that extreme?
00:16:34.820 I know.
00:16:35.800 And then the funny thing is,
00:16:37.460 it carries on to say further on down the line,
00:16:40.680 it talks about him, you know,
00:16:42.100 getting a group of boys to march through a small town,
00:16:44.920 singing Hitler Youth songs, apparently.
00:16:47.020 Which just sounds like a bit of a troll.
00:16:51.980 I don't actually think Nigel Farage is a national socialist,
00:16:55.840 and nor does pretty much anyone.
00:16:57.740 I mean, on GB News the other day,
00:16:59.380 he was fantasizing about if we'd just basically gone in
00:17:02.040 after the First World War and genocided the Germans
00:17:04.620 to avoid the Second World War.
00:17:07.020 So he's about as far from being a national socialist as possible.
00:17:10.460 I would say so.
00:17:10.960 Even despite his German wife.
00:17:13.140 So this part I found the most funny.
00:17:14.940 Terry Walsh, who was the deputy master at Dulwich,
00:17:17.100 the school he was at, i.e. deputy head,
00:17:19.080 says Farage was well known for provoking people,
00:17:21.460 especially left-wing English teachers
00:17:23.040 who had no sense of humour.
00:17:25.980 The former master of Dulwich,
00:17:28.160 the man who appointed Farage and received Chloe Deakin's letter,
00:17:31.220 says he has no memory of the meeting or the letter,
00:17:33.580 but he agrees with his former deputy.
00:17:36.160 It was naughtiness, not racism.
00:17:38.300 I didn't probe too closely into that naughtiness.
00:17:41.100 But the staff were fed up with his cheekiness and rudeness.
00:17:43.400 They wanted me to expel him, but I saw his potential,
00:17:45.980 made him a prefect, and I was proved right.
00:17:48.200 And then it carries on to say,
00:17:49.360 but several Dulwich old boys have told me
00:17:51.460 they recall Farage making racist remarks as a pupil
00:17:54.100 and voicing support for right-wing groups,
00:17:56.520 though none have been willing to say so publicly.
00:17:58.980 It's interesting, isn't it?
00:18:00.700 It's just silly.
00:18:01.940 This is just the same sort of thing that they're employing.
00:18:05.520 It's not going to work.
00:18:06.700 And I would also like to point out this,
00:18:10.280 that his deputy leader, Richard Tice,
00:18:13.940 was meeting people in Israel.
00:18:16.840 So I don't know about these accusations of anti-Semitism.
00:18:20.400 Confirming that we at Reform Party UK
00:18:23.060 stand strong with Israel
00:18:24.280 and look forward to working closely together
00:18:26.160 after we win the next general election.
00:18:28.040 And this is a sentiment that Farage himself
00:18:30.320 has echoed many times.
00:18:32.260 And yeah, Nigel Farage denies the Gaza genocide
00:18:36.120 and backs weapons exports to Israel.
00:18:38.320 I don't know.
00:18:39.200 These don't seem like the actions of someone
00:18:41.360 who's truly anti-Semitic to me.
00:18:43.920 Maybe he's just playing the long game.
00:18:45.580 He's going to stab old Nettie in the back.
00:18:48.280 Who knows?
00:18:49.720 And then what's happened here is
00:18:52.540 obviously the Labour Party have seen an opportunity,
00:18:56.540 this is a recent story again now,
00:18:59.460 from just yesterday,
00:19:01.940 where Number 10, i.e. Keir Starmer and his government,
00:19:04.840 is calling on Farage to urgently address
00:19:06.640 the disturbing allegations of past racist behaviour
00:19:09.360 when he was 13.
00:19:12.080 Really?
00:19:13.560 I mean, everything that I've done in the...
00:19:15.160 I mean...
00:19:16.760 Everybody has some things,
00:19:18.500 some skeletons in their closets.
00:19:19.920 If they have to go this far back,
00:19:21.920 my goodness.
00:19:23.060 Yeah.
00:19:23.840 This is really desperate.
00:19:24.720 I mean, if I went back to my school days,
00:19:27.120 I'd have much worse things than this.
00:19:29.400 Putting people headfirst in bins and, you know...
00:19:32.280 Oh, you went headfirst.
00:19:33.460 That's pretty brutal.
00:19:34.580 I know.
00:19:35.700 I was...
00:19:36.460 That was...
00:19:37.180 That's going pretty far, actually.
00:19:38.420 That's pretty mean.
00:19:39.540 Yeah.
00:19:40.540 I mean, it's all meant in good fun.
00:19:41.900 We had a rotation.
00:19:44.360 The way to effectively bully
00:19:46.540 is not to pick on any one person too much
00:19:48.240 because then it becomes actually mean-spirited and nasty.
00:19:50.700 You pick on people in a rotation
00:19:53.020 and you also do it to your friends
00:19:54.260 and you also accept and be a good sport when it's you.
00:19:56.780 Yes.
00:19:57.060 You know, what goes around comes around
00:19:58.340 and you sort of accept the cycle.
00:20:00.860 That's the way you should view it.
00:20:02.020 That's the healthy way of viewing it.
00:20:04.500 But yes, they're obviously trying to jump on this
00:20:06.720 and try...
00:20:07.480 And all this is doing is legitimising him
00:20:09.520 as this really bad man.
00:20:12.420 If you vote for him, he's going to be bad.
00:20:14.080 He's going to deport every foreign person.
00:20:16.360 All of you people across the country
00:20:17.860 who support mass deportations,
00:20:19.860 oh, look at how evil and racist Nigel Farage is.
00:20:22.700 I bet he actually would do mass deportations.
00:20:25.400 Don't want to vote for him, do you?
00:20:27.580 The Stephen Edgington interview
00:20:29.000 where he says it's a political impossibility
00:20:30.700 and although he's been a bit more willing
00:20:32.600 to agree to stuff recently,
00:20:34.260 talking about the Boris wave
00:20:35.440 and how they need to reverse it,
00:20:37.280 I don't know how I feel about believing that necessarily.
00:20:41.980 I'll believe it when I see it.
00:20:43.300 It comes across like political calculation.
00:20:46.140 It does.
00:20:46.940 And in the wake of the Conservative Party betrayal,
00:20:49.640 I don't think believing any politician at their word,
00:20:53.140 even if they're proven to be trustworthy,
00:20:54.840 is a good idea.
00:20:56.880 Josh, you're suggesting something
00:20:58.200 absolutely revolutionary here,
00:20:59.520 which is that we should judge people
00:21:00.640 based on their actions.
00:21:01.860 Yes, rather than their words.
00:21:03.180 Yes, that's absolutely incredible.
00:21:05.360 I've never heard of that before.
00:21:07.160 But unfortunately, most of politics
00:21:08.840 is judging people on their words
00:21:10.080 and not their actions.
00:21:10.860 Because most people don't have to actually
00:21:12.540 look at their actions.
00:21:13.420 It's democracy for you.
00:21:13.820 Yeah.
00:21:14.620 It is a bit of a flaw of democracy, isn't it?
00:21:16.580 And the Labour Party was, you know,
00:21:19.140 throwing its toys out of the pram.
00:21:20.960 These are deeply disturbing allegations
00:21:22.620 Nigel Farage must urgently explain himself.
00:21:25.600 And as Nick Dixon rightfully points out,
00:21:28.300 this rubbish was already in Crick's biography.
00:21:30.440 It's absolutely not urgent.
00:21:31.520 So this is a biography from ages ago,
00:21:33.300 which he himself posted about only yesterday,
00:21:36.400 saying today's Guardian feature on Farage
00:21:38.380 is anti-Semitism and racism at Dulwich.
00:21:40.540 It's largely based on my book,
00:21:42.060 which came out ages ago.
00:21:45.060 So it's also old news as well.
00:21:47.840 So it's all a bit absurd.
00:21:50.160 And here's the sort of cookie cutter leftist response.
00:21:53.820 Remarkable thing about this story
00:21:55.020 is not that Farage is the repulsive,
00:21:56.980 open racist he is,
00:21:58.500 but that so much energy is spent
00:22:00.140 on debating this fact and saying,
00:22:01.880 well, some people didn't have
00:22:03.200 racist experience with him,
00:22:04.720 so it's all very complicated.
00:22:06.080 Yeah, it doesn't matter to them.
00:22:08.480 They're going to think what they think regardless.
00:22:10.200 They don't need any actual tangible evidence.
00:22:13.040 And even if, you know, it is true,
00:22:16.140 is it really fair to hold him accountable
00:22:17.920 for when he's 13?
00:22:18.960 Also, who cares?
00:22:20.400 Exactly, exactly.
00:22:21.980 He made some jokes.
00:22:23.920 I learned some words when I was 13,
00:22:26.480 and if people were filming me
00:22:28.040 when I was learning those words,
00:22:29.960 oh, they'd have a field day as well.
00:22:31.980 And one of the final things is
00:22:36.040 Keir Starmer clearly has been briefed
00:22:39.280 that this isn't a good idea,
00:22:40.560 that saying Farage is racist
00:22:42.460 only legitimizes him.
00:22:45.460 Someone must understand the read of the country
00:22:47.240 that when in 2024
00:22:48.500 the entire country erupted in riots
00:22:50.820 against immigration,
00:22:52.720 maybe that's free PR.
00:22:54.920 Maybe saying Farage is a racist
00:22:56.660 and he's going to get rid of everyone.
00:22:58.120 And people are like, yes, I like this.
00:22:59.920 And I'm not just, you know,
00:23:01.960 saying that, you know, hyperbolically.
00:23:05.700 There's a recent poll from the 1st of October,
00:23:08.320 so not too long ago,
00:23:09.700 asking about,
00:23:10.880 do Britons think Reform UK are racist?
00:23:13.620 And even some Reform supporters were like,
00:23:17.540 yes, we are.
00:23:20.040 I mean...
00:23:21.120 4% of them.
00:23:23.840 Yes.
00:23:24.720 8% of them are unsure.
00:23:26.940 I don't know.
00:23:27.960 Well, sometimes I am,
00:23:29.440 but not everyone, you know.
00:23:30.760 It depends on the group in question.
00:23:32.260 We let ourselves have a cheeky bit of racism
00:23:34.540 on the weekends.
00:23:37.640 The thing is,
00:23:39.520 what you're saying here,
00:23:41.620 the most important thing about it,
00:23:42.780 is really about reading the room.
00:23:44.740 As in,
00:23:46.160 Britain is now at a stage
00:23:47.440 where saying somebody is racist
00:23:50.060 makes them more popular
00:23:51.220 because the experiment of multiracialism,
00:23:56.180 multiculturalism
00:23:57.140 has failed so spectacularly
00:23:59.200 and has really gone so badly
00:24:01.560 for the native British
00:24:02.520 that now they're willing to say,
00:24:05.000 actually, yes,
00:24:06.040 I want to chuck out the baby
00:24:07.060 with the bathwater.
00:24:08.860 I want to be racist,
00:24:10.080 and that's fine,
00:24:10.700 and that will make me support
00:24:11.680 somebody even more.
00:24:13.080 And that's going to happen
00:24:13.660 more and more as this goes on.
00:24:14.740 And that's going to happen
00:24:15.180 more and more.
00:24:15.860 And then you see people like
00:24:16.860 the Gaza MPs
00:24:19.040 and their associates saying,
00:24:20.180 no, we're just as British as you are.
00:24:22.540 And then the next day,
00:24:23.500 no, I'm Pakistani.
00:24:24.500 And then the day after,
00:24:25.480 well, there's going to be a war.
00:24:26.300 No, I'm not going to fight for Britain.
00:24:28.540 Okay.
00:24:31.060 This is completely devalued currency,
00:24:34.040 exactly as you said
00:24:34.900 at the beginning.
00:24:36.160 And it's so entirely pointless
00:24:37.920 to follow this line of argument
00:24:40.000 or even to be interested in it
00:24:42.160 when we know
00:24:43.600 that every single policy
00:24:44.700 under DEI
00:24:45.760 is explicitly targeting
00:24:47.740 white people.
00:24:50.200 And so,
00:24:51.200 it seems to me
00:24:52.100 that the conclusion is
00:24:52.820 everybody is racist
00:24:54.320 and always was.
00:24:56.000 And always will be as well.
00:24:57.140 And always will be
00:24:57.980 because everybody's going to have
00:24:59.720 an in-group preference
00:25:00.620 unless they're an insane leftist.
00:25:02.520 Well, it's just denying
00:25:03.360 an aspect of human nature
00:25:04.500 that is...
00:25:05.420 Because our in-group preferences
00:25:07.260 are tied to basically
00:25:08.780 our preference for
00:25:09.660 our own flesh and blood
00:25:10.860 and family.
00:25:11.880 And, you know...
00:25:12.600 And because there are visual cues
00:25:13.860 that tell you
00:25:14.580 if somebody is your
00:25:15.720 flesh and blood
00:25:16.620 and family or isn't,
00:25:18.340 it sort of leads
00:25:19.720 to that conclusion.
00:25:20.540 So the whole discussion
00:25:21.760 about race and racism
00:25:22.720 is so entirely pointless
00:25:24.080 and empty.
00:25:25.100 I just don't want
00:25:26.840 to be involved in it.
00:25:27.640 I don't care.
00:25:28.320 It's as simple as
00:25:29.120 do they look more like me
00:25:30.540 than not?
00:25:31.600 And even on juries
00:25:32.680 in America,
00:25:33.680 they apply this rule
00:25:35.120 that if there is
00:25:36.000 a member of the jury
00:25:37.040 who looks
00:25:38.540 a lot like
00:25:39.800 the defendant
00:25:40.460 who's going to be,
00:25:41.740 you know,
00:25:42.000 on trial,
00:25:43.160 they will take them
00:25:44.320 off of the jury
00:25:45.100 because...
00:25:46.300 They must have
00:25:46.660 forgotten about that
00:25:47.280 with the Derek Chauvin
00:25:48.120 trial, didn't they?
00:25:48.740 Yeah, they must have.
00:25:49.800 But because
00:25:50.600 they might see
00:25:51.620 themselves in that person.
00:25:53.520 It's as simple
00:25:54.680 as that.
00:25:55.800 It's how OJ
00:25:56.340 was acquitted.
00:25:58.480 It's very much
00:25:59.280 how OJ Simpson
00:25:59.900 was acquitted.
00:26:00.660 Blacks on juries
00:26:01.340 always say
00:26:02.300 a black person
00:26:02.820 is innocent
00:26:03.220 even if they're not.
00:26:04.680 Yes.
00:26:05.020 Or like the Derek Chauvin
00:26:05.940 trial when there was
00:26:06.600 an actual BLM activist.
00:26:08.220 Yeah, exactly.
00:26:09.060 But it's okay
00:26:09.520 when they do it.
00:26:10.060 But I think it's interesting here
00:26:11.640 that basically
00:26:12.120 the people who
00:26:13.200 are trying to frame
00:26:15.340 themselves as
00:26:16.000 opponents to the right,
00:26:17.840 you know,
00:26:18.100 of course the Conservatives
00:26:18.920 pretend they're right wing
00:26:19.940 and so do their supporters,
00:26:21.340 but they're not really.
00:26:22.220 They're of the left
00:26:22.780 just like the rest,
00:26:23.740 but they don't think they are.
00:26:25.040 So you can see this split
00:26:26.120 where in the case
00:26:27.960 of the Greens,
00:26:28.640 Labour,
00:26:29.140 and the Lib Dems,
00:26:30.000 the majority think
00:26:30.660 they are racist,
00:26:31.680 whereas the Conservatives
00:26:32.860 and Reform
00:26:33.360 generally think they're not.
00:26:35.720 I think you shouldn't care.
00:26:37.660 It shouldn't be a big deal.
00:26:38.740 Or just say,
00:26:40.140 actually,
00:26:40.500 Reform's policies are,
00:26:41.420 yes,
00:26:41.740 and I'm going to vote for them.
00:26:44.000 You know,
00:26:44.640 if you got to lose...
00:26:45.080 What, by racist do you mean
00:26:45.980 they favour the native British?
00:26:47.760 You would be insane
00:26:48.460 not to vote for them.
00:26:49.660 Yeah, of course.
00:26:51.140 It's silly, isn't it?
00:26:51.840 And that is really now
00:26:53.120 the definition of racist.
00:26:54.100 Do you favour your own group
00:26:55.080 over others?
00:26:55.680 Well, everybody else does.
00:26:57.060 What are you going to do about it?
00:26:58.760 Be normal.
00:26:59.560 Well, it's like the
00:27:00.440 prisoner's dilemma, isn't it?
00:27:01.580 Where if you don't
00:27:03.720 stick up for yourself...
00:27:04.420 It really is like
00:27:05.000 the prisoner's dilemma.
00:27:05.680 If you don't stick up for yourself,
00:27:07.340 you're just putting yourself
00:27:08.780 on the chopping block
00:27:09.780 for someone to betray you.
00:27:11.160 Yes.
00:27:12.220 Yes.
00:27:12.920 It's ridiculous.
00:27:14.020 It's irrational.
00:27:15.140 It doesn't make any sense.
00:27:16.240 But there are lots of other polls
00:27:17.600 where it's just like,
00:27:18.460 do you think the party
00:27:19.380 or do you think
00:27:20.020 Keir Starmer thinks they are?
00:27:21.500 Do you think
00:27:22.160 Keir Starmer thinks
00:27:23.480 Reform voters are racist?
00:27:25.280 So they're asking
00:27:26.440 random people on YouGov
00:27:28.880 to psychically analyse
00:27:30.820 what they think
00:27:31.840 Keir Starmer might think.
00:27:32.680 The funny thing is
00:27:33.300 Keir Starmer publicly said
00:27:34.420 he's not.
00:27:36.300 He just said that
00:27:37.200 the closest you could say
00:27:38.740 is that the immigration
00:27:39.960 policies were.
00:27:41.580 What a worthless
00:27:42.860 poll to get a perception
00:27:45.220 of what people think
00:27:46.720 is Keir Starmer's perception.
00:27:49.140 But the most interesting
00:27:50.400 tidbit here is this.
00:27:52.620 Generally,
00:27:53.380 most Reform voters
00:27:54.500 don't think that
00:27:55.260 the party is racist,
00:27:56.320 but they accept
00:27:57.060 that the left
00:27:57.660 believe that they are.
00:27:58.600 So again,
00:28:01.260 they have an awareness
00:28:03.700 of how they are perceived
00:28:04.720 and that kind of
00:28:06.220 confirms your point
00:28:07.160 that the more
00:28:07.840 they use the term racist
00:28:09.280 the more it is devalued
00:28:10.940 because the people
00:28:12.780 who support
00:28:13.240 the Conservatives
00:28:13.880 and Reform here
00:28:14.760 seem to accept
00:28:16.260 that yes,
00:28:17.020 you're going to be
00:28:17.620 called racist
00:28:18.020 no matter what
00:28:18.600 so you might as well
00:28:19.440 live with it
00:28:19.780 and move on.
00:28:20.660 And you know,
00:28:21.460 there's also the argument
00:28:22.400 that well,
00:28:23.000 you may as well
00:28:23.400 embrace it as well.
00:28:24.980 Exactly.
00:28:26.040 If you're going
00:28:27.140 to be called racist anyway,
00:28:28.260 why should you moderate
00:28:29.080 your actions?
00:28:29.880 Yes.
00:28:30.480 And that's what I've said
00:28:32.520 about Reform
00:28:33.100 the whole time
00:28:33.760 is that they're going
00:28:34.360 to say these things anyway
00:28:35.140 so you may as well
00:28:35.940 be as radical as you like
00:28:37.360 and actually solve
00:28:38.120 the problems, right?
00:28:39.160 Exactly.
00:28:40.680 And that's what the left
00:28:41.920 is unwittingly doing
00:28:43.060 by using this
00:28:44.060 but also, of course,
00:28:45.060 it does backfire
00:28:45.920 for the right as well
00:28:47.060 because it allows people
00:28:49.020 to accept
00:28:49.600 less radical politicians
00:28:52.360 when actually
00:28:53.220 they want radical ones.
00:28:54.860 Yes.
00:28:57.140 All right,
00:28:57.980 we've got a few
00:28:58.680 Rumble Rants
00:28:59.440 and Super Chats
00:29:00.240 if you'd like
00:29:00.660 to read through them.
00:29:01.860 Of course.
00:29:02.700 If I can see them.
00:29:04.380 Jake Taylor says
00:29:05.460 on YouTube
00:29:06.060 for £5,
00:29:07.120 thank you.
00:29:07.840 Keep up the great work,
00:29:08.680 boys.
00:29:09.040 I sometimes see
00:29:09.760 some of you in town
00:29:10.840 but too pussy
00:29:11.880 to say hello.
00:29:12.780 Well,
00:29:13.200 if you see me
00:29:13.820 you can say hello to me.
00:29:14.680 I'm always happy
00:29:15.140 to say hello to people
00:29:16.420 and it actually,
00:29:17.820 usually everyone's...
00:29:19.180 You'll often find Josh
00:29:19.400 on street corners
00:29:20.340 so if you've got
00:29:21.060 a little change
00:29:21.900 make sure to throw in some.
00:29:23.240 If you've got some coppers
00:29:24.300 pass them my way.
00:29:25.300 But no,
00:29:26.740 I've never had
00:29:28.040 an interaction
00:29:29.420 with someone
00:29:29.900 from the audience
00:29:30.420 that hasn't
00:29:31.280 made my day better
00:29:32.220 so always come
00:29:33.140 and say hello.
00:29:33.680 I'm always happy for it.
00:29:35.500 JM for $20,
00:29:37.160 thank you very much,
00:29:37.960 says,
00:29:38.300 idea,
00:29:38.740 asking migrant advocates
00:29:40.140 to legally act
00:29:41.180 as guarantors
00:29:41.800 for migrants
00:29:42.360 and accept civil liabilities
00:29:43.720 for all harms
00:29:44.720 they cause
00:29:45.180 can be collective: advocates
00:29:46.940 as a class
00:29:47.860 taking personal responsibility
00:29:50.560 for migrants
00:29:51.320 as a class.
00:29:51.980 I've thought about this before
00:29:53.740 but it's still
00:29:54.320 conceding too much ground
00:29:55.880 I think
00:29:57.040 in that you're still
00:29:58.120 conceding that they
00:29:58.920 should be let in at all
00:29:59.780 I say
00:30:00.280 don't even do that.
00:30:02.220 You're also suggesting
00:30:03.760 that they have the ability
00:30:04.780 to take personal responsibility
00:30:06.320 for the actions
00:30:07.180 of their own groups
00:30:07.900 when the grooming gangs
00:30:09.000 alone should have shown
00:30:09.820 that they won't.
00:30:10.920 Yeah,
00:30:11.360 I mean I like the thinking
00:30:12.640 but I think
00:30:13.760 the realities of the situation
00:30:15.780 will make this impossible.
00:30:16.760 After the second
00:30:18.260 whatever
00:30:19.700 £50,000 fine
00:30:21.240 that somebody has to pay
00:30:22.760 people will sort of say
00:30:24.240 I'm not taking that risk.
00:30:26.600 There is some merit to it
00:30:27.920 but it's too late in the game.
00:30:29.100 Yeah,
00:30:29.360 I'd say so.
00:30:30.660 That's Random Name says
00:30:31.640 blessing my Thursday morning
00:30:33.460 with my favourite eaters
00:30:34.740 of the Lotus.
00:30:35.920 Also the real reason
00:30:36.700 Harry was annoyed
00:30:37.440 by the Irish guy
00:30:38.320 is because he said to him
00:30:39.500 aren't you a bit tall
00:30:40.640 for a leprechaun?
00:30:41.920 That's true,
00:30:42.620 this is true.
00:30:43.620 You were there clearly
00:30:44.680 you were fly on the wall
00:30:45.900 exactly how it went down.
00:30:48.560 All safe,
00:30:48.860 thank you very much.
00:30:50.220 Anyway,
00:30:51.060 so
00:30:51.420 I've spoken about AI
00:30:53.320 a bit recently
00:30:54.460 I was going over
00:30:55.520 in a segment
00:30:56.080 a while back
00:30:56.900 about the anti-white bias
00:30:58.840 of AI
00:30:59.720 how a lot of these programs
00:31:01.620 the LLMs
00:31:02.540 other than Grok
00:31:03.780 specifically
00:31:04.520 have an anti-white bias
00:31:06.300 built into them
00:31:07.760 but AI is becoming
00:31:09.480 something which is
00:31:10.520 getting an air of inevitability
00:31:12.200 about it.
00:31:13.180 We're being told
00:31:13.800 constantly
00:31:14.520 that AI
00:31:15.520 is the future
00:31:16.620 AI is the thing
00:31:17.600 that's going to
00:31:18.220 drag us
00:31:19.020 kicking and screaming
00:31:19.860 into a future
00:31:21.360 and I want to examine
00:31:22.960 the vision
00:31:23.960 of that future
00:31:24.940 as being presented
00:31:25.900 by high-level
00:31:27.320 technocrats
00:31:28.460 because if there is
00:31:29.420 one person
00:31:30.100 I do not trust
00:31:31.440 to have the best
00:31:32.500 interests of
00:31:33.540 humanity
00:31:34.460 or my people
00:31:35.840 in mind
00:31:36.760 it is
00:31:37.320 technocrats
00:31:38.260 who treat everything
00:31:39.160 as if it is
00:31:39.920 a scientific
00:31:40.780 computer-controlled
00:31:41.960 experiment
00:31:42.760 where you can
00:31:43.720 just shift
00:31:44.260 widgets around
00:31:45.280 and adjust
00:31:46.760 conclusions
00:31:47.560 to your liking
00:31:48.740 what you're
00:31:49.160 describing is
00:31:49.720 materialists
00:31:50.360 yes materialists
00:31:51.600 people would know
00:31:52.340 this is a spiritual
00:31:52.880 conflict
00:31:53.400 what you're describing
00:31:54.240 is materialists
00:31:55.040 yes and that is
00:31:56.180 what we are seeing
00:31:56.960 here
00:31:57.300 this is an image
00:31:58.500 from the recent
00:31:59.940 US-Saudi
00:32:01.380 investment forum
00:32:02.780 where Elon Musk
00:32:04.140 was in attendance
00:32:04.840 alongside
00:32:05.420 as we can see here
00:32:06.400 OpenAI
00:32:07.320 President Greg Brockman
00:32:09.000 and NVIDIA CEO
00:32:10.780 Jensen Huang
00:32:12.160 you can see them
00:32:13.200 here
00:32:13.520 in their
00:32:14.300 suits
00:32:14.960 and
00:32:15.480 dress shoes
00:32:17.000 question mark
00:32:17.960 posing for a photograph
00:32:20.320 and a number of
00:32:21.180 statements were made
00:32:22.120 by Elon Musk
00:32:23.160 at this forum
00:32:24.780 so we can see
00:32:25.740 a few of them
00:32:26.520 here
00:32:27.420 if you go out
00:32:30.640 long enough
00:32:31.000 assuming there's
00:32:31.540 a continued
00:32:31.980 improvement
00:32:32.620 in AI
00:32:34.000 and robotics
00:32:34.560 which seems likely
00:32:35.840 the money
00:32:37.760 will stop
00:32:38.980 being relevant
00:32:39.620 at some point
00:32:40.200 in the future
00:32:40.700 there will still
00:32:43.140 be constraints
00:32:44.400 on power
00:32:45.540 like electricity
00:32:46.780 and mass
00:32:48.160 the fundamental
00:32:49.720 physics elements
00:32:50.680 will still be
00:32:51.700 still be
00:32:52.520 constraints
00:32:53.080 but
00:32:54.360 I think
00:32:55.520 at some point
00:32:56.300 currency
00:32:59.260 becomes irrelevant
00:33:00.140 Jensen
00:33:02.940 any thoughts
00:33:03.520 so
00:33:05.060 you can pause
00:33:06.620 it there
00:33:06.900 yeah
00:33:07.080 my understanding
00:33:08.280 of what he's
00:33:08.920 saying there
00:33:09.480 is not necessarily
00:33:10.460 that this is going
00:33:11.180 to be something
00:33:11.680 that's going
00:33:12.220 to happen
00:33:12.580 in my lifetime
00:33:13.180 but the inevitable
00:33:14.140 conclusion of AI
00:33:15.520 is that it's
00:33:16.440 basically going
00:33:17.120 to manage
00:33:17.700 everything for us
00:33:19.620 to the point
00:33:20.100 where there's
00:33:20.480 going to be
00:33:20.840 no human
00:33:21.440 in charge
00:33:22.020 and therefore
00:33:22.580 the economic
00:33:23.740 system as we
00:33:24.360 know it
00:33:24.660 will break
00:33:25.560 down
00:33:25.920 he's talking
00:33:26.860 about a
00:33:27.480 complete revolution
00:33:28.640 of the way
00:33:29.320 that the societies
00:33:30.740 are structured
00:33:31.380 and according
00:33:32.460 to these kinds
00:33:34.100 of projections
00:33:34.780 that he's bringing
00:33:35.620 up here
00:33:36.120 he's saying
00:33:37.160 in 10 to 20
00:33:37.900 years
00:33:38.420 if he is correct
00:33:40.300 so that is
00:33:41.200 within our
00:33:42.140 lifetimes
00:33:42.820 I do not think
00:33:44.040 that he is
00:33:44.680 correct in this
00:33:45.880 but no I
00:33:46.760 disagree
00:33:47.120 it is a strange
00:33:48.600 vision to be
00:33:49.400 presenting to
00:33:50.560 people in the
00:33:51.620 first place
00:33:52.340 this idea that
00:33:53.140 work will be
00:33:53.640 optional because
00:33:54.180 this sounds an
00:33:55.000 awful lot to
00:33:55.620 me
00:33:56.000 of, like,
00:33:58.200 bug man
00:33:58.960 mentality and
00:33:59.920 I do think that
00:34:00.760 Elon is king of
00:34:02.000 the bug men
00:34:02.560 sorry he is
00:34:04.040 and I also think
00:34:05.500 that this is kind
00:34:06.260 of a vision of
00:34:07.280 Marx's ideal future
00:34:09.140 sounds a little
00:34:09.660 bit globalist
00:34:10.980 and weff like
00:34:11.760 as well doesn't it
00:34:12.520 what was it
00:34:12.900 the idea of
00:34:13.700 8 hours leisure
00:34:14.620 8 hours work
00:34:15.680 8 hours arts
00:34:17.160 was what the
00:34:18.940 socialist utopians
00:34:20.260 and Marx
00:34:21.520 who I know
00:34:22.200 wasn't a socialist
00:34:22.880 utopian etc
00:34:23.860 put forward as
00:34:25.640 the end goal
00:34:26.540 of a socialist
00:34:27.260 society
00:34:27.700 and the problem
00:34:28.960 with that is
00:34:29.640 it's this idea
00:34:30.420 that if you
00:34:31.100 unshackle people
00:34:32.860 from the chains
00:34:34.220 of abundance
00:34:34.860 sorry from the
00:34:36.560 chains of scarcity
00:34:37.420 and you have
00:34:38.240 super abundance
00:34:39.300 in this sense
00:34:40.300 administered purely
00:34:41.640 by AI
00:34:42.420 that you will
00:34:43.980 have this
00:34:44.540 flourishing of
00:34:45.340 human creativity
00:34:46.260 and that human
00:34:47.560 beings will not
00:34:48.560 become restless
00:34:49.300 they will not
00:34:50.260 disconnect from
00:34:51.440 all social bonds
00:34:52.600 they will not
00:34:53.720 become lonely
00:34:54.500 and suicidal
00:34:55.320 that instead
00:34:56.160 they will become
00:34:57.060 endlessly creative
00:34:58.780 and do everything
00:35:00.160 that they've always
00:35:00.800 wanted
00:35:01.180 problem with that
00:35:02.280 is we already
00:35:03.040 had that experiment
00:35:04.300 it was called
00:35:05.180 the covid lockdowns
00:35:06.720 from 2020
00:35:07.700 to early 2022
00:35:08.840 most people
00:35:10.680 did they really
00:35:11.500 start pursuing
00:35:12.300 their dreams
00:35:13.140 did they start
00:35:14.640 to pursue
00:35:15.660 those artistic
00:35:16.520 creative endeavours
00:35:17.980 they've been waiting
00:35:18.740 their whole life
00:35:19.440 to get an opportunity
00:35:20.220 for
00:35:20.640 or did they
00:35:22.060 goon
00:35:22.560 play video games
00:35:23.720 and watch Netflix
00:35:25.020 and eat takeout
00:35:26.420 and get fat
00:35:27.400 did they indulge
00:35:28.400 in the WALL-E future
00:35:29.820 I would argue
00:35:30.640 that that's the least
00:35:31.720 severe future
00:35:33.080 that's sort of
00:35:33.600 the best case scenario
00:35:34.720 from something like this
00:35:35.960 and I'm taking it
00:35:37.260 on Elon's terms
00:35:38.300 of what he's suggesting
00:35:39.380 but yes please present
00:35:40.420 the more realistic
00:35:41.720 outcomes
00:35:42.380 I think that
00:35:43.300 with the way
00:35:44.400 that human nature
00:35:45.160 is the struggle
00:35:45.860 to survive
00:35:46.680 is part of
00:35:47.820 what gives life
00:35:48.740 meaning in the first
00:35:49.620 place
00:35:49.960 and if you take
00:35:51.580 that away from people
00:35:52.540 they'll feel like
00:35:53.140 their life is meaningless
00:35:54.020 if they've got nothing
00:35:55.140 to strive towards
00:35:56.040 nothing
00:35:56.320 no reason to
00:35:57.600 better themselves
00:35:58.240 and they're just
00:35:59.860 existing to create
00:36:01.320 with endless opportunity
00:36:03.440 what will actually
00:36:04.260 happen is people
00:36:04.940 will be suffocated
00:36:05.880 by that opportunity
00:36:07.380 one option
00:36:08.340 for Alice
00:36:08.860 this is why
00:36:09.980 the bible says
00:36:10.720 that Adam's role
00:36:12.300 is to labor
00:36:13.540 that now
00:36:15.660 you will have
00:36:16.260 to earn your keep
00:36:17.300 by the sweat
00:36:17.940 of your brow
00:36:18.480 that so long
00:36:20.460 as you are on earth
00:36:21.200 you must be working
00:36:22.080 and this is
00:36:24.240 very much
00:36:25.060 a Christian idea
00:36:25.820 that you must
00:36:26.780 consecrate your work
00:36:27.900 and make your work
00:36:28.660 have value
00:36:29.260 and taking away
00:36:31.300 work from people
00:36:32.360 is in no way
00:36:34.040 going to make them
00:36:34.560 better
00:36:34.780 we saw this
00:36:35.520 with the various
00:36:36.160 experiments on
00:36:37.180 universal basic income
00:36:38.680 which were disastrous
00:36:40.480 across the board
00:36:41.180 they didn't actually
00:36:41.900 help anybody
00:36:42.460 Elon is an advocate
00:36:43.440 for UBI
00:36:44.200 if this future
00:36:45.560 that he's projecting
00:36:46.480 goes forwards
00:36:47.440 he says that
00:36:48.140 well I've got the quotes
00:36:49.240 and I'll go through
00:36:49.980 them in a moment
00:36:50.500 and if it was going
00:36:51.580 to be good for people
00:36:52.820 then the state
00:36:54.300 with the most
00:36:55.000 welfare spending
00:36:56.140 would have the most
00:36:56.880 productive citizenry
00:36:57.840 but everything that we see
00:36:59.500 says that
00:37:00.580 welfare
00:37:01.540 encourages
00:37:02.580 laziness
00:37:03.540 and being indigent
00:37:04.900 it doesn't actually work
00:37:07.120 and there's also
00:37:07.980 another element to this
00:37:08.960 that if AI is able
00:37:11.080 to be
00:37:11.660 you know
00:37:12.820 the thing that's
00:37:13.660 running the economy
00:37:14.560 and there's no human
00:37:15.480 input necessarily needed
00:37:16.600 or very little
00:37:17.440 then who's to say
00:37:19.200 that AI isn't also
00:37:20.500 far exceeding
00:37:21.660 the creativity
00:37:22.180 of human beings
00:37:23.140 making human creative
00:37:24.300 endeavours
00:37:24.680 basically redundant
00:37:25.580 you know
00:37:27.320 not just that
00:37:27.880 I mean for one
00:37:29.060 that's the thing
00:37:29.840 that I always find
00:37:30.700 annoying about people
00:37:31.800 who are kind of
00:37:33.000 proselytising the
00:37:34.160 advances of AI art
00:37:35.660 which is frankly
00:37:36.720 on a personal level
00:37:37.840 I have no interest
00:37:39.480 in consuming
00:37:40.280 or experiencing
00:37:41.540 art that hasn't
00:37:42.380 been created
00:37:43.000 by a human
00:37:43.700 art is an expression
00:37:45.420 of the soul
00:37:46.380 AI robots
00:37:48.520 they do not have
00:37:49.620 souls
00:37:50.060 therefore they have
00:37:51.140 nothing to give for me
00:37:52.340 hear hear
00:37:52.920 but the AI may be able
00:37:55.420 to with the direction
00:37:56.460 of a human being
00:37:57.480 pursue these sorts
00:37:58.780 of things right
00:37:59.480 in a way
00:38:00.120 that might
00:38:02.860 delegitimise
00:38:04.340 human creative
00:38:05.460 endeavours
00:38:05.860 and certainly
00:38:06.280 people will perceive
00:38:07.080 it that way
00:38:07.580 even if it isn't
00:38:08.300 necessarily that way
00:38:09.520 for uncultured types
00:38:11.780 and there is
00:38:12.520 there is the idea
00:38:13.600 of the Greeks
00:38:15.720 which was that
00:38:16.900 leisure
00:38:17.360 the opportunity
00:38:18.500 to be creative
00:38:19.620 is something that
00:38:20.900 really should only
00:38:21.680 be afforded
00:38:22.460 to people who are
00:38:23.300 actually capable
00:38:24.060 of pursuing it
00:38:25.120 and that most
00:38:26.360 other people
00:38:27.140 should be
00:38:29.080 given opportunity
00:38:32.300 to find meaning
00:38:33.220 elsewhere in life
00:38:34.400 there is also
00:38:35.540 an old saying
00:38:36.840 among developers
00:38:37.600 that goes
00:38:38.380 something along
00:38:39.340 the lines of
00:38:39.980 if builders
00:38:41.260 built buildings
00:38:42.260 the way developers
00:38:43.340 wrote code
00:38:44.100 civilization would
00:38:45.020 have been destroyed
00:38:45.560 by the first
00:38:46.100 woodpecker
00:38:46.580 and there is
00:38:49.260 the reality
00:38:49.860 that if you
00:38:50.720 allow machines
00:38:53.060 and code
00:38:53.760 to run
00:38:54.420 your entire
00:38:55.360 civilization
00:38:56.320 and economy
00:38:57.180 well you're
00:38:58.280 just one
00:38:59.000 virus away
00:39:00.180 from being
00:39:00.700 utterly destroyed
00:39:01.440 So this, as an aspiration, is an insane aspiration. It is not grounded in reality, and it is not grounded in humanity. Nobody who loves human beings wants them sitting around idly all day. You want them to feel that they're productive, that they're doing something useful, that they're contributing. If you don't work, you don't plan for the future; if you don't plan for the future, you're not going to produce anything worthwhile. And what he's basically advocating there is a universal nursing home.
00:39:31.620 Yes. And if you've ever seen people in nursing homes, it's not great. It's really not great.
00:39:38.620 But let's carry on. On the whole idea of work being optional, people have poked fun at this and pointed out that certain demographics have already figured out that work can be optional. So that's an interesting observation.
00:39:52.040 One of the best observations people have made, though, is that this is completely and utterly contradictory to the way Elon Musk pursues his own goals with his own companies: this idea that you need rising birth rates, so that people will have bigger families to keep the economy going, but at the same time AI will replace everything. So he's just encouraging people to have children for the sake of it. More mouths to feed for the AI overlords; more people to sit around playing video games into their old age, pursuing no greater meaning in their own lives. Very strange.
00:40:34.800 And also, surely, if AI is going to be that amazing, why do you need so many H-1Bs, Elon? Why is this? As of last Wednesday, this is a statement he made.
00:40:49.120 An account here was responding to SpaceX saying thanks to all of the 1,800-plus members of the SpaceX team, happy Veterans Day. And it points out: SpaceX used fewer than 20 H-1B workers between 2011 and 2024. Fewer than 20. They hire almost 100% American, because defense contractors are heavily restricted from employing visa holders, and it works: we have the people, we have the talent, but companies want cheaper labor.
00:41:18.060 To which Elon responds personally, saying: President Donald Trump is right regarding H-1Bs. We must distinguish clearly between companies that need to hire critical world-class talent from other countries versus companies that simply hire low-cost non-US employees to increase profits. SpaceX has succeeded despite not being able to hire critical world-class talent from other countries due to ITAR laws.
00:41:47.020 You are inviting in potential spies, by the sounds of it to me, Elon.
00:41:52.260 Had we been able to do so, our progress would have been faster.
00:41:56.760 Now, if I was working for SpaceX and saw this, the spit in the face to the work and dedication that you have given to SpaceX, I would go on strike. I might quit. Because what Elon is saying there is: great work, but if I could have replaced you with an Indian, I would have, and I would have expected them to do better than you anyway. That's vile. That's disgusting. And that is, again, this bug man mentality.
00:42:25.960 Human beings do not belong to nations. It's not a mentality that he applies to European countries, for some reason, but he has this New World mentality with America: that America is the place, the global experiment, where anybody from anywhere and everywhere can come and be used as interchangeable widgets, traded out on the basis of min-maxing skills according to credentials. And that's how he sees this. And again, if you work for SpaceX and you're watching this: if I was in your position, I would be very insulted by the implication of this statement.
00:43:00.700 It's also, reading some of the rest of what he's saying here, a strange conflation of the H-1Bs, which run to what, hundreds of thousands of people, like 600,000 or something of that magnitude last year, wasn't it? And he's there talking about getting world-class talent into things like Tesla. So what Elon's basically talking about is: there are a handful of people around the world that I want for SpaceX. Which is not the same order of magnitude as the H-1Bs. Anyway, with or without the H-1Bs, SpaceX is limited, so it doesn't matter.
00:43:37.540 Also, can I just say: when he had his big meltdown last year, people looked into Tesla and found that it was employing hordes of H-1Bs, potentially 10% of its workforce. And if you look here: industrial engineer, base salary of $80,000. I looked into it, and that is basically as low-end as you can pay a US industrial engineer; their base salaries can go up to $103,000. So you can't convince me that Elon is hiring these people purely because they are the best in the world, when instead it looks like he's trying to nickel-and-dime his own company to save on costs.
00:44:16.800 Well, for most businesses, the main expenditure is labour costs, isn't it? So when you're looking at a spreadsheet, the thing that you most want to get down is the labour costs, because it's the largest share of your expenses. But sorry, Firaz, you were going to say something.
00:44:30.520 What I was going to say is that if in 20 years all of this is going to be absolutely pointless, why would you bring anybody in now? I mean, the short-termism of the thinking is absolutely insane. If he believes that in 10 to 20 years' time all of these H-1Bs will be sitting around doing absolutely nothing, why make them the problem of your companies, which are presumably going to be producing the robots that make work irrelevant for everybody else? Why increase the future burden on yourself and your companies just to save nickels and dimes now?
00:45:04.780 Well, perhaps because he believes that AI will somehow eliminate poverty altogether, because of the superabundance that it will generate. Here he is saying as much.
00:45:16.360 But AI and humanoid robots will actually eliminate poverty. And Tesla won't be the only one that makes them; I think Tesla will pioneer this, but there will be many other companies that make humanoid robots. But there is basically only one way to make everyone wealthy, and that is AI and robotics. And we can't talk about robotics without AI.
00:45:38.960 I mean, outside of all of the actual logistical concerns of such things, what would it do to humanity to actually experience something like this, a complete upending of all societal relations as far back as we can remember?
00:45:53.840 Well, it's the whole world being unemployed, isn't it?
00:45:56.240 Yeah. But does it sound realistic? Does it sound realistic, or does this sound like the high-minded utopian musings of a guy who indulges in too much science fiction?
00:46:07.160 Yeah.
00:46:07.820 Because in looking into this, I looked back at some of the stuff to do with the dot-com bubble back in the late 90s, and some of the videos that I watched went back to the 1970s, when the US government was involved in the creation of a very, very early version of the internet. And you can find clips from all the way back then of American scientists saying: I have no doubt that by the year 2000, this technology will make it so that you can have a brain surgeon in Scotland operating directly on a patient in New Zealand.
00:46:43.080 I remember seeing those sorts of things.
00:46:46.100 Does this sound like something that's going to be realistic, or does it sound like that kind of high-minded fantasy? I mean, prove me wrong; experience may prove me wrong. Maybe AI is going to change everything and unleash all of this onto the world. But right now, it just sounds like Elon Musk is indulging in science fantasy. Realistically, this sort of technology he's talking about is hundreds, if not maybe even thousands, of years into the future. It's not feasible, certainly not in the 10 to 20 years that he was talking about. It's not happening in any of our lifetimes.
00:47:25.980 I think they're assuming a continually exponential rate of progress in technology, which is never really something that you can bank on.
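A quick way to feel the difference is a minimal sketch (Python; the growth rate and the ceiling are invented purely for illustration, not taken from anything said here): an exponential extrapolation and an S-curve that saturates look almost identical early on and end up orders of magnitude apart.

    import math

    r = 0.5     # assumed annual growth rate (illustrative only)
    K = 1000.0  # assumed ceiling on progress for the S-curve (illustrative only)

    for t in range(0, 31, 5):
        exponential = math.exp(r * t)
        # Logistic curve: starts at 1 like the exponential, but saturates at K.
        logistic = K / (1 + (K - 1) * math.exp(-r * t))
        print(f"year {t:2d}: exponential={exponential:14.1f}  s-curve={logistic:8.1f}")

Ten to twenty years of "exponential progress" only holds if nothing saturates inside that window, which is exactly the part you can't bank on.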
00:47:35.480 But even if it does happen, human nature will be human nature, and we will find massive differences through which to build up conflict, and we will end up fighting each other over all kinds of things using this kind of technology. So it's not going to be a utopia where everybody sits around idly; it's going to translate into weapons, techniques of control, coercion, et cetera.
00:47:59.500 Imagine having a fully automated police force. It's going to be horrible to be constantly interacting with robots with absolutely no humanity and no sympathy for you. So the dream itself, even if it was realizable in the time frame that he's talking about, is not a good one.
00:48:17.100 If you've got lots of idle hands, you've got lots of people for a war effort. And through the AI's logic: well, these other countries have resources that it could acquire to better achieve its aims, therefore the inevitable conclusion is that we need to invade these countries and take their resources. Because resources are always, to a robotic mind, rational to acquire, and the human cost, the loss of life, is a subjective thing that is felt by human beings, but an AI might not necessarily perceive that.
00:48:48.320 And let's remember again that this is all banking on the idea that somebody who is trying to make his AIs neutral, like Elon Musk, is the one who ends up with the monopoly on this technology, rather than the people who are programming it to be purposefully anti-white and to value the lives of white people far, far below anybody else's.
00:49:08.940 Building on that, remember some of the studies that were done on AIs, which were showing that some of these AIs were valuing the lives of ICE agents a thousand times below the lives of illegal immigrants. Do you want that kind of technology in charge of every facet of human life? I don't think so.
00:49:29.620 Invert this a little bit. Imagine the Chinese have a similar breakthrough, where they decide that the value of a Chinese life is worth, what, ten thousand, one million, ten million times the lives of others. How would that behave in a conflict? How would that behave if you gave it actual autonomy? How would you be able to control it if you gave it autonomy? And how would you stop your own system from being infected with one virus that changes its value system?
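To see how much a weighting like that matters, here is a toy sketch (Python; the thousand-to-one ratio is the figure just mentioned, everything else, including the casualty numbers, is invented for illustration): a planner that scores outcomes with skewed life-weights will cheerfully trade hundreds of one group for one of the other.

    # Toy expected-harm comparison under a skewed value system.
    def weighted_harm(casualties, weights):
        return sum(weights[group] * n for group, n in casualties.items())

    weights = {"group_a": 1.0, "group_b": 0.001}  # the alleged 1000:1 skew

    option_1 = {"group_a": 1, "group_b": 0}    # one group-A casualty
    option_2 = {"group_a": 0, "group_b": 900}  # nine hundred group-B casualties

    print(weighted_harm(option_1, weights))  # 1.0
    print(weighted_harm(option_2, weights))  # 0.9, "preferred" despite 900x the deaths

The point isn't the specific numbers; it's that whoever sets the weights sets the behaviour, which is why the monopoly question matters.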
00:49:59.100 So the ambition itself, regardless of its feasibility, is an absolutely insane one, and absolutely a bad one: an unethical, immoral, unchristian, inhumane one.
00:50:13.460 And even if you programmed an AI and got it perfectly correct, so that it values human life in the same way that a human being might (and even for a human being, or a Christian human being, it's a bit difficult to figure out which way to go), even then, the way in which AI is going to improve and accelerate its improvements is that AI works on itself to improve itself. Therefore it has the ability to change its code, so there's no guarantee that it will remain that way anyway.
00:50:47.140 It might decide that the value of AI is worth all of human lives, and decide: okay, I'm going to Skynet all of you and genocide you.
00:50:55.680 Yeah, this is the future. This is not a good thing; this is not a moral thing.
00:51:03.340 But either way, to carry on, there are other problems with this. Like Elon Musk, when he went on Joe Rogan last month, in October, saying that when AI and robotics like Tesla's Optimus eliminate all work and money, the government should hand out a universal income. Which is contradictory, because if it has eliminated work and money, what's the point of an income, if you can just get the Star Trek future where you press a button and get something brought to you without any charge? He says: we'll have, in a benign scenario, universal high income; anyone can have any products or services that they want, but there will be a lot of trauma and disruption along the way.
00:51:43.920 For one, if everybody has universal high income, that means that nobody has high income, because it levels it all out. Besides, if money isn't relevant anymore, what's the point of income?
00:51:56.380 Two, this is the technocratic problem, the problem with technocrats: the non-human way that they think, the soulless, inhuman minds that they have. What gives you the right to inflict that trauma and disruption on normal people?
00:52:10.860 This would be managerialism on steroids.
00:52:14.980 Absolutely. This would be injecting the existing managerialist, globalist system with steroids and letting it run amok, with absolutely no input from human values, which should always be Christian values. It's just crazy. It's mad scientist territory. It's really mad scientist territory.
00:52:35.100 Certainly is.
00:52:35.680 And then let's move on to the next part of this, and I'll try and get through it as quickly as possible, which is the question of whether AI is a bubble right now. I've spoken about this briefly; our access has been blocked by the Wall Street Journal, but just to summarise the article that I did have up here:
00:52:53.420 It's that Elon Musk's xAI is in advanced talks to raise $15 billion right now, lifting its valuation to around $230 billion, according to people familiar with these plans. The new valuation would represent a significant increase from the $113 billion which was disclosed after xAI acquired the social media site X in March. The terms of the new fundraising were disclosed to investors by Musk's wealth manager. In June, xAI raised $5 billion in equity and $5 billion in debt to help build out its Colossus data center in Memphis, Tennessee, and Musk's rocket company SpaceX invested $2 billion in the company as part of that round.
00:53:42.480 Musk, who is chief executive officer of Tesla, has publicly supported the idea of Tesla investing in xAI as well. At a recent shareholder meeting, Tesla shareholders had a mixed response to a proposal that asked the board to make such an investment, and it is now up to the board to decide. Ahead of the meeting, Tesla chair Robyn Denholm told the Journal that she questioned the logic of such an investment and said the board hadn't done any of the due diligence required to move forward.
00:54:10.260 So it seems like Elon is trying to start shifting his assets, a lot of his financial assets and investment, from his other companies into xAI as well, which will again inflate that huge valuation of the company.
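For a sense of scale, the arithmetic of that round is simple enough to sketch (Python; the dollar figures are the ones from the article summary above, and treating the $230 billion as a post-money valuation is an assumption, since the report doesn't specify):

    # Back-of-the-envelope on the reported xAI round (all figures in $bn).
    raise_amount = 15.0
    post_money = 230.0       # assumed post-money (not stated in the article)
    prior_valuation = 113.0  # disclosed after the X acquisition in March

    print(f"new investors' stake: ~{raise_amount / post_money:.1%}")    # ~6.5%
    print(f"uplift since March: ~{post_money / prior_valuation:.2f}x")  # ~2.04x

Roughly a doubling of the paper valuation in a matter of months, which is the sort of number driving the bubble talk that follows.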
00:54:26.520 This is his new obsessive focus, and this is part of this whole worry that people have right now, that I've seen people talking about, of whether this is a bubble. Which is why I'm glad to be sat on this panel with two guys who would probably know a bit more about such things than I do.
00:54:42.380 Because people have been talking about this for months. This is an article back from the beginning of October.
00:54:47.120 That's going to be really bad.
00:54:48.860 Fears over AI bubble bursting grow in Silicon Valley. Tech giants are spending big on AI in a rush to dominate the boom; that's from the end of October. Companies are being told that they are over-investing in AI; those are warnings from fund management. And Google's boss is currently saying a trillion-dollar AI investment boom has elements of irrationality. What this is talking about is that Alphabet, the parent company of Google, has shares that have doubled in value in seven months, to $3.5 trillion, as markets have grown more confident in the search giant's ability to fend off the threat from ChatGPT owner OpenAI.
00:55:31.540 A particular focus is Alphabet's development of specialized superchips for AI that compete with Nvidia, run by Jensen Huang, who we saw the image of earlier, and which recently reached a world-first $5 trillion valuation. As valuations rise, some analysts have expressed skepticism about a complicated web of $1.4 trillion worth of deals being done around OpenAI, which is expected to have revenues this year of less than one thousandth of the planned investment. The tech giant is also expanding its footprint in the UK: in September, Alphabet announced that it was investing in UK artificial intelligence, committing £5 billion to infrastructure and research over the next two years.
00:56:16.540 This will be alongside, I would assume, Oracle and Larry Ellison working with Tony Blair to try and implement AI chatbots for your GP, because Tony Blair wants to make the entire GP system automated under AI. So you're getting this huge investment boom in Silicon Valley, and now even Jeff Bezos is getting involved as well, putting $6.2 billion, and himself as co-CEO, behind a new AI startup.
00:56:48.840 So you're getting this huge swell of investment into this from people who are already, let's be honest, the heads of huge tech monopolies, and then they are building that infrastructure into the foundations of national infrastructure as well.
00:57:06.120 The problem I see with it, along with a lot of different things, is just this Tuesday, right? If you make your entire system completely dependent on AI, there are the gigantic server farms and chips that are required for it, and the energy needed to power all of this. This is one thing that a lot of green leftists are worried about, not because of the potential issues with fault lines in the infrastructure, more to do with just the amount of energy it uses up, for green climate change purposes. These are hugely energy-intensive things for a power grid.
00:57:39.980 Google itself
00:57:41.500 Alphabet
00:57:42.100 are going back
00:57:43.360 on their green
00:57:44.060 climate energy
00:57:44.940 targets
00:57:45.540 so that they
00:57:46.480 can put all
00:57:47.020 of this
00:57:47.320 investment
00:57:47.800 into AI
00:57:49.060 because it
00:57:49.760 is just
00:57:50.260 that energy
00:57:51.280 intensive
00:57:52.060 and yet
00:57:52.840 on Tuesday
00:57:53.760 cloudflare
00:57:55.100 goes down
00:57:55.760 one website
00:57:57.200 goes down
00:57:58.060 and all of a
00:57:58.880 sudden
00:57:59.040 half of the
00:57:59.960 internet
00:58:00.280 goes down
00:58:00.960 with it
00:58:01.500 and that
00:58:02.460 is
00:58:02.620 these are
00:58:03.360 websites
00:58:03.880 big websites
00:58:04.740 like X
00:58:05.540 like
00:58:06.380 even our
00:58:08.100 website
00:58:08.580 was taken
00:58:09.940 down
00:58:10.340 by it
00:58:11.060 for a little
00:58:11.520 bit
00:58:11.760 right
00:58:12.900 And so you have everything hooked up, and you expect there to be lots of contingencies for these fault lines, but there aren't. There aren't. So you get your entire society, globally, hooked up onto the AI mainframe. What could possibly go wrong? What happens when the equivalent of a Cloudflare issue comes up and the entire global economy, which in Elon Musk's mind is entirely run by AI for the purposes of superabundance, goes down for half a day? What happens then? Do your robots freak out? Do they just shut down? Do they start Skynet-attacking you? I don't know; you tell me what happens in that situation.
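The fragility being described here is easy to put into numbers with a minimal sketch (Python; all of the uptime figures are invented for illustration): however reliable each individual service is, a single shared dependency caps the availability of the whole system.

    # Availability under one shared dependency (a Cloudflare analogue).
    service_uptimes = [0.9999] * 5  # five services, each 99.99% (illustrative)
    shared_layer = 0.999            # the common dependency, 99.9% (illustrative)

    all_services = 1.0
    for u in service_uptimes:
        all_services *= u  # probability all five are up, ignoring the shared layer

    system = all_services * shared_layer  # can never exceed the shared layer
    hours_down = (1 - system) * 365 * 24

    print(f"services alone: {all_services:.4%}")  # ~99.95%
    print(f"with shared layer: {system:.4%}")     # ~99.85%, capped by the 99.9% layer
    print(f"expected downtime: ~{hours_down:.0f} hours/year")  # ~13 hours

Hook "the entire global economy" onto one such layer and that expected downtime stops being an inconvenience and starts being the scenario just described.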
00:59:00.460 And when people talk about the AI bubble: I couldn't get the Bloomberg article, but I got this screenshot from the Bloomberg article that talks about it, pointing to all of this. You may be able to make more sense of this than me, but a lot of it seems to be classic money changing hands. Money goes from one hand to another and back again in a big circle, until it ends up back at the same place, and because that money is changing hands over and over and over again, it looks like there's lots of investment going around. Therefore a magic, separate, invisible number, which doesn't actually correspond to anything practical in reality, keeps going up and up and up. And people are saying that this is very similar to the dot-com bubble.
00:59:46.940 But just for an example, right: Nvidia is giving investment to Intel, who are then producing stuff for CoreWeave, who are then selling it straight back to Nvidia. At which point I ask: why don't these two companies merge? Because it seems to me that Intel is, well...
01:00:08.560 Actually, it raises the point that the Federal Reserve is pretty much printing the money that keeps these share prices up.
01:00:13.900 Yeah. Well, it raises the question to me: what's the point of CoreWeave? What is the point of this company? Are they doing something to the stuff that's going into the hardware or software being produced by Intel? Why can't Nvidia do it? It's stuff like that; you can see the gigantic web. Again, I am not an investment guy, I'm not a money guy, so maybe you guys can explain this a bit better than I can.
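One way to see the worry about that loop is a minimal sketch (Python; the companies are the ones just named, but the dollar amount and the number of laps are invented for illustration): the same pot of cash, passed around a circle, gets booked as activity at every stop while no new money enters from outside.

    # Toy model of circular deal-making: Nvidia -> Intel -> CoreWeave -> Nvidia.
    loop = ["Nvidia", "Intel", "CoreWeave"]
    pot = 10.0  # $bn, invented
    booked = {name: 0.0 for name in loop}

    for _ in range(3):                # three laps around the circle
        for recipient in loop:
            booked[recipient] += pot  # every transfer is recorded as activity

    print(booked)                     # each firm shows 30.0 of "activity"
    print(sum(booked.values()))       # 90.0 booked, from a single 10.0 pot

Ninety billion of headline activity out of one ten-billion pot is a caricature, of course, but it is the mechanism behind the "magic invisible number" description above.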
01:00:37.460 I mean, this is true of a lot of areas of the economy, to be honest, particularly when you get to large multinational companies; they all do trades with each other, so a lot of the economy does just look like this. However, that doesn't mean that it's not a bubble; it just means that a lot of the rest of the economy is a bubble as well.
01:00:58.520 Yes, and a lot of this sort of stuff is incestuous. But I think that these companies are still going to emerge as very important. People are over-investing, creating the bubble, because they're hedging their bets as to which one is going to be the leader of the pack, because you will get so many spoils when a clear front-runner does emerge.
01:01:19.820 Well, the thing is, I looked at this Forbes article that was talking about the AI bubble that isn't there; they're making the argument that it's not a bubble, in response to a lot of people.
01:01:31.540 Forbes says that? I kind of assumed that it must be one.
01:01:35.520 Well, yeah. Plus, this article is just word babble, garbled nonsense, like it's written by AI. It's written by a guy called Jason Snyder, who... let me just find it; there's just a little bit of nonsense in here where he's talking about how he created a lamp.
01:01:58.240 Basically, here it is. So this is the kind of intellectual content that I get from this, and the kind of flowery, poetic language that he uses. He says: civilization itself is the story of arranging energy into meaning. This connection is personal. Years ago I invented the Luki, or Luchi, solar lantern to bring light to communities without reliable electricity. It was an effort to democratize photons, to capture sunlight and convert it into opportunity. In Africa I watched children hold a Luchi lantern, where light became cognition, energy became hope.
01:02:39.960 I joked about it being written by AI, but it does actually sound like it; it has that feeling.
01:02:46.520 It comes across as totally stars-in-his-eyes: this will be infinite expansion forever. GDP brain. Line go up on graph forever equals good.
01:02:57.040 The main argument that he's making, when you sift through all of the waffle, is that what is being built by all of these companies making huge investments is infrastructure, rather than the same thing that happened with the dot-com bubble, where it was just a load of companies being started by nobodies with inflated stock prices, who then went out of business out of nowhere as soon as the stock market took a little hit, when Japan hit a recession and made everybody scared.
01:03:22.620 But it's only infrastructure when it's actually infrastructure. At the minute, most infrastructure is not run with AI, or if it is, AI plays a small role and it's still largely run by people. So if that is his argument, well, it's got to actually come to fruition for it to be a tangible argument.
01:03:44.060 Well, he does contrast it with the dot-com boom and the bubble which then popped, and the argument that he presents is this: that period is remembered as a bubble because thousands of companies failed, but that interpretation misses the larger truth. The infrastructure built during that frenzy created the modern internet. Much of today's economic output is driven by a small group of winners, companies like Amazon, Google and Meta, that emerged from the wreckage and now define the S&P 500.
01:04:13.480 So what I'm getting from that is that the best we can hope for, what we can expect, is for a series of government-backed, censorious monopolies to become the market leaders in control of all of this. And we can guarantee that they will be government-backed if they're building the infrastructure into the governments themselves. And a lot of these we can see here: Amazon, well, Jeff Bezos is investing in it; Google, heavily involved in it; Meta, also heavily involved in it. These are going to be the same monopolies that we have already come to know and love. So that's the best future that we can look forward to, according to this: all of your favorite government-backed monopolies will have even more control over your life, and Elon Musk is hoping that he can get ahead of all of that so that he can give you the pod-person, bug man life that you've always been looking for. This, frankly, no matter how it ends up working, isn't a future that I was looking forward to, and that's the best way that I can put it.
01:05:18.540 Yep.
01:05:20.820 All right, we've got quite a few Rumble rants and Super Chats, so I'll go through them. Counterpoint to Harry on AI: the Will Stancil show is the greatest adult cartoon comedy running.
01:05:31.840 I'm aware of it, but frankly he's a funny enough meme for five minutes; I'm not going to watch a whole show about him.
01:05:35.440 I mean, it's only short.
01:05:36.620 I don't care.
01:05:37.980 It did make me laugh.
01:05:38.680 I don't care.
01:05:39.420 Random Name: you should watch a video called The Four Types of Dystopia; it explains how the merchant class and mercenaries will sell the country for profit, and the bureaucrat class legislates to preserve the system. True.
01:05:50.000 McLeod: software can be as theoretical as possible, but the real-world limits show up. Amazon is hiring nuclear engineers to try to power their stuff, never mind Moore's law with chips. AI crash is coming, potentially. Hapsification, Zinlord Firas.
01:06:02.840 Thank you.
01:06:04.280 Random Name again: programmer here; it is easy to make an AI that is either super smart for hyper-specific tasks or really retarded overall. The things that are currently labeled AI aren't true AI. I think it's a bubble.
01:06:14.700 Amandine512: have you guys seen The Circle, with Tom Hanks and the Harry Potter girl? It feels more predictive every year we advance towards an AI surveillance utopia. I do remember that coming out; my missus watched it, I did not.
01:06:25.680 Not Just A String: if this segment isn't AI generated, please show me your fingers.
01:06:30.180 Everyone, there you go. There you go.
01:06:33.700 Easy E, on Super Chats: also, AI isn't a bubble, because there's no normie at the bottom to take the financial losses; this is just normal corporate inflation of stock price.
01:06:42.020 Yeah, but generally, if everything crashes all at once, with the stock market and investments, that does end up affecting the guy at the bottom.
01:06:49.560 Yes, sadly, just like we saw in the 2008 recession.
01:06:54.860 Chris H: in the future, one solar flare can bring it all down, although ignoring the sun in their models isn't new to these people, cough cough, climate change. True.
01:07:04.620 Easy E: reminder that most scientists don't know more than most people outside their specific area of study; a constant frustration of engineers is explaining to them why their predictions aren't accurate. I just want to be out exploring space.
01:07:15.660 Luke Stewart: also, your argument about COVID kind of fails, because people were locked in their houses; it wasn't that they couldn't work, it's that they couldn't go ahead and live.
01:07:25.180 I disagree, frankly. When given the option to be lazy, people will be lazy. I think that COVID is exactly what a lot of people wanted, which was the excuse to stay at home and play video games and get fat and not do anything with their lives. And if you make that permanent, then that's not great.
01:07:46.540 Harry, I agree it's high-minded fantasy, unless we get the Terminator result: high-minded dystopia.
01:07:52.500 Then yes. I'll go through a few more.
01:07:55.520 England needs strong nationalism. Flags are good memetics, but if people don't simultaneously reject the interlopers while gaining nationalism, the UK will remain a hotel for the world.
01:08:05.900 I don't know what you're so worried about, Harry; I for one welcome our cyberpunk overlords. I find Musk's view of the world endearing in its naivety. Money is currency, and currency is a lever of power for states, one they won't choose to abandon; the game makers are playing the game as well. Good point.
01:08:19.820 Good day, everyone. I like AI art because it will allow me to create art without having to deal with the biases of the creator. Bake the gay wedding cake.
01:08:29.560 I don't know how applicable that is here; you can just make art for the sake of it, you can just make your own art.
01:08:37.480 Meant to say AI, not I, damn. Also, I like the idea of not having to deal with the ideology or conflict of the creator; as someone who sucks a lot but has great ideas, I would like a scribe to write or paint for me.
01:08:46.000 Sorry, bro, AI already hates you; most AI already hates you anyway. Good luck all the same. Yeah, good luck. Thank you.
01:08:54.240 So let's talk a little bit about what socialists get right, because we want to try to be charitable towards these people, and try to sort of...
01:09:04.240 Yeah, yeah, yeah, it's sort of a requirement. Sorry.
01:09:08.820 But before we go there: this is about defending economic pragmatism and about being realistic when it comes to economic questions, and the starting point that I want to talk about is where we are right now in Britain as an economy.
01:09:22.160 Ridiculous energy prices; a completely dysfunctional state that can't achieve anything that it sets its mind to; insane levels of welfare.
01:09:31.460 This is a breakdown of the spending of the British government, and you see pensions at around 225 billion, healthcare 250 billion, education 120 billion, defense, which everybody likes to complain about, at just 70 billion, and then actual welfare, I don't know why pensions weren't included, at 188 billion, and then protection, which again means welfare, at 50 billion; they separate the two out so the number doesn't look huge.
01:10:03.600 So there is this massive level of spending.
01:10:08.400 What's protection versus defense?
01:10:09.900 Protection isn't about the policing; that's included in all other spending. Protection is various social safety nets, so it's another part of welfare, just separated out to make the number look smaller.
01:10:25.920 The only thing I would keep here would be the defense spending and policing, and maybe justice.
01:10:31.520 Yeah. The rest of it, as far as I'm concerned, we could cut and save trillions of government spending.
01:10:36.860 I'm going to disagree with you in a second, but I'll explain to you why.
01:10:40.160 Add to that around three trillion in official debt, which is almost 100% of GDP, and then there are the hidden unfunded liabilities, mostly also to do with pensions, and those might be three or four times as much as the actual debt.
01:10:59.000 So you're dealing with debt to GDP that's at the 500 to 600 percent level, which is common across most of Western Europe when you think of the insane generosity of the welfare system.
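[Aside, for anyone following the arithmetic: a minimal sketch of the calculation being described, using the speaker's round numbers rather than official statistics. With official debt at roughly 100% of GDP and unfunded liabilities at three to four times the official debt, the combined burden comes out at 400 to 500 percent of GDP, roughly the same ballpark as the 500 to 600 percent quoted.]

```python
# Back-of-the-envelope check of the debt claim above.
# All inputs are the speaker's round numbers, not official statistics.

official_debt_tn = 3.0      # trillion GBP of official debt
gdp_tn = 3.0                # implied by "almost 100% of GDP"

for multiplier in (3, 4):   # unfunded liabilities at 3-4x the official debt
    total_tn = official_debt_tn + multiplier * official_debt_tn
    print(f"liabilities at {multiplier}x: total burden ~{total_tn / gdp_tn:.0%} of GDP")

# liabilities at 3x: total burden ~400% of GDP
# liabilities at 4x: total burden ~500% of GDP
```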
01:11:13.120 So there is something fundamentally wrong here, and what is wrong is this massive overspending.
01:11:19.960 The socialists think that the answer must be more social spending and more taxation and more welfare. If this was going to solve anything, it would have already; but it didn't, and it won't.
01:11:37.560 The other issue that the socialists get wrong is immigration: the idea that this can be sustained, or paid for, by bringing in more and more people, who end up being net drains because their tax contributions over their lifetimes are much smaller than what they take in government services.
01:11:56.820 So this is the picture that we have here, and it's worth understanding how we got here.
01:12:02.560 The British government nationalized most industries after the Second World War; then it privatized those industries.
01:12:08.060 The process by which this happened was basically that they took a bunch of successful entities, be they in healthcare, the various charities that were providing healthcare, or in railroads, or in mines, or in energy, or in water, and bundled them together into centralized entities.
01:12:28.540 Twenty years later, they realized that this was a disastrous decision, so they sold them off; but instead of selling them off in smaller chunks that would be competitive, they sold them as is, as these massive conglomerates, and those then ended up capturing the institutions that regulated them.
01:12:46.640 And so you ended up with this weird situation of excessive regulation, excessive concentration, much less distribution of wealth, and therefore more inequality; and then, after the privatizations, an insane level of regulation, and with DEI and Net Zero that regulation truly became incredibly destructive, to the extent that Britain has pretty much lost its industrial base.
01:13:11.520 British Steel is collapsing; it ended up owned by China. Tata Steel bought some steel mills and ended up selling them off. Britain doesn't make anything anymore.
01:13:20.240 So the actual examination of this picture does agree with what the welfare socialists say: there is too much inequality, too much poverty, too many people are suffering. But, as usual, all of the solutions that they prescribe for it are wrong. Except for one.
01:13:42.400 Because there were countries that did get out of being severely de-industrialized and extremely weakened economically and found a way out, and these countries are essentially the various Asian miracle countries: Japan, Korea, China, Singapore, Hong Kong, etc.
01:14:02.380 What they all had in common was a very high level of cooperation between the public sector and the private sector, and the way that this relationship worked was basically that the government would set out a bunch of industries and technologies that they wanted to support and finance and help develop, and set out bits of infrastructure that they wanted to build; they basically made a plan.
01:14:34.540 But they didn't follow the full central-planning approach of early China and the Soviet Union; it was more of a dialogue between the big businesses and the oligarchs on the one side and the state on the other.
01:14:49.500 And so in a country like Japan you had the Ministry of International Trade and Industry, MITI. These guys did an incredible job: they worked on finding markets for Japanese companies, they employed a level of protectionism for their own economy, and they then went with export-led growth in order to develop the economy.
01:15:11.320 And it succeeded, to the extent that Japan was stuck producing third- and fourth-rate goods in the 1950s, and by the 1980s it was the second-largest economy in the world, competing with the United States and causing all kinds of trouble for American manufacturers.
01:15:28.840 The Koreans followed pretty much the same approach. In 1962 theirs was a destroyed economy; they were operating at subsistence level, to the extent that in 1950 Sudan and South Korea had the same living standards. That's how bad it was in South Korea.
01:15:52.280 And then the South Korean government decided: we are going to pursue a policy aimed at building up our industrial capacity. This policy was supported with a combination of loans and financing and market access and so on and so forth, and the industrial policy succeeded; they really did a great job.
01:16:10.320 And now South Korean cars are competing with everybody, South Korean chips are indispensable, and South Korean phones are market leaders. They took their GDP per capita from 87 dollars in the '60s, 87 dollars, to 10,000, 15,000, 20,000 dollars. That was the extent of their success.
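[Aside: to give those South Korea numbers a sense of scale, a small sketch of the implied compound growth rate. The 87-dollar starting point and the end points are the figures quoted in the segment; the forty-year window is an assumption, roughly 1962 to the early 2000s.]

```python
# Implied average nominal growth rate from the figures quoted above.

start_usd = 87.0    # GDP per capita in the early 1960s (speaker's figure)
years = 40          # assumed window, roughly 1962 to the early 2000s

for end_usd in (10_000, 15_000, 20_000):
    multiple = end_usd / start_usd
    cagr = multiple ** (1 / years) - 1
    print(f"${end_usd:,}: ~{multiple:.0f}x, ~{cagr:.1%} nominal growth per year")

# $10,000: ~115x, ~12.6% per year
# $15,000: ~172x, ~13.7% per year
# $20,000: ~230x, ~14.6% per year
```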
01:16:40.260 And it was neither the Soviet nor the early Chinese pure state control, but it was also not free-market libertarianism; it was not economic liberalism that succeeded, that was not what built them. What they did manage to do was create a level of cooperation between industries and the state.
01:17:00.060 You can go and read about this; I've given you a bunch of links here.
01:17:06.780 And China did something very similar. China went through different phases, according to Philip Pilkington six different phases of development. The first one was a complete failure, where all they did was say: we're going to invest in as much infrastructure and industry as we can. And then people starved and died under Mao.
01:17:26.640 And then Deng Xiaoping came over. Basically, he had been kicked out of political life, and then he became the leader of China, and he inverted that and adopted economic pragmatism.
01:17:38.160 And when you look at where Britain is today, it's this level of economic pragmatism that's needed. But it's happening now under very different conditions, because the economic pragmatism of the various Asian giants was predicated on there being an endless expansion of globalism, with more and more market access being given to Asia into Europe and the United States.
01:18:05.500 This phase is done, and any British policymaker has to realize this: the Americans want to prioritize their own market, the EU wants to prioritize its own market, the Japanese are always going to be protectionist, and so are the South Koreans. Tariffs are coming back. Tariffs are coming back.
01:18:23.500 So there is this level of realism that's needed, but this level of realism was very present in Britain's own Industrial Revolution. It was mostly privately led, but when things like the canal mania happened, where Britain pretty much connected all of its industrial hubs with canals and allowed the very rapid transportation of goods between different locations, this was done through Acts of Parliament that helped the financiers of the canals.
01:18:58.840 So, to summarize, it was really three steps. The first step was that somebody very clever, usually self-taught in engineering, would decide on a canal route. They would go to Parliament and get that approved, allowing for the acquisition of land and the acquisition of resources. Then they'd go to investors and get it privately funded.
01:19:18.760 Why? Because Britain is never going to be this kind of centrally led economy; it's just not in the nature of the English to be this subservient to the state. They don't work this way.
01:19:32.440 We're shed tinkerers. Yes, yes. So it was a combination of the shed tinkerers and the government working on supporting them and allowing them to become rich.
01:19:41.640 So it's extremely similar; if you wanted to think of it this way, the British temperament is extremely similar to how the Chinese are operating today: they're defining the technologies that they want, they're defining the objectives that they want, and then the state is providing the loans and the financing and the backing.
01:19:58.520 And this can happen in Britain as well. This kind of semi-planning, with a level of competition, does work, but it requires the shedding of ideology: the shedding of socialist ideology, which says that the state must control all of the means of production, and the shedding of libertarian ideology, which says no, no, the state has no role in the allocation of capital.
01:20:28.200 The state should have a role, and should have objectives, and should have policies, and these policies should be geared at what the state is supposed to do: defending the homeland, being able to raise an army, being able to kill enemies.
01:20:37.320 The thing to remember is that when this happened, when the building of the canals happened, which coincided with the Napoleonic Wars, the total take of the British government out of the economy was between 9 and 20 percent.
01:20:51.800 Now 45 percent of the British economy is controlled by the state, and it's mostly wasted on welfare.
01:21:02.920 So the socialists are partly right: some level of cooperation between the state and the economic leaders is warranted, and the state does have a role in the allocation of capital,
01:21:20.280 but not this insane welfarism, and not this insane taxation.
01:21:27.560 I'm going to preempt two things that I know people are going to comment. Go on.
01:21:33.320 One of them is, I'm sure people are going to point out Japan's very high debt-to-GDP ratio. Yes, that's sort of known. Which came after the '80s. That's true, yes.
01:21:41.560 It came after the economic miracle had passed, and then they had their lost decade of the 1990s, essentially, and they've been in lost decade after lost decade since then, more or less.
01:21:53.080 That's also a function of the collapse in birth rates in Japan and the collapse in demographics, and they've decided to fund the elderly through welfare, not through industrial policy, in order to pay for that. So yes, there are possible traps in that, and the trap is doing exactly what Britain is doing today.
01:22:11.880 And I know someone in the comments is going to point out that, well, some liberal economists, free-market economists, or libertarians argue that you need protectionism when an industry is emerging, and then, when it flourishes in a specific industry, you can open up to the market, which is a limited thing.
01:22:37.800 I can also respond to that, which is: what leading industries do we have? Yeah. We don't have any industries that we can open the markets for; there are none.
01:22:46.200 Because the whole point of that is that you build up your industries so that they are able to compete, and this isn't necessarily libertarian; this is more like Alexander Hamilton, almost guild protectionist politics. Mercantilism, even. Yeah, mercantilism. You build up those industries internally until they can compete on the global stage, then you open up the markets so that you're able to out-compete other industries, hopefully put them out of business, and force them into being reciprocal to you.
01:23:16.760 Look, with the geopolitics of today and with the Western demographics of today, it might end up being that Europe and her daughters, that is Europe, Latin America, North America, Australia, etc., decide that they are forming one trade zone, and they compete against each other within that zone, and within each country there's a level of tinkering, a level of specialization, etc.
01:23:43.400 It's one way of turning the West around: essentially a more unified Christendom, more or less, that accepts that its civilizational enemies are going to be the Muslim world and China, which, to be honest about it, is very much what the civilizational enemies of the West are always going to be in the world that we live in.
01:24:04.920 So there is room for that, there is room for something along these lines, so long as you don't end up putting all of your industries in the United States. Having them all be in China is not much... Much worse, way worse. Exactly, exactly.
01:24:26.520 And notice I don't mention India as a civilizational competitor, for obvious reasons.
01:24:33.800 So there is something to say about how the geopolitics of trade have changed, and how there is a need for Christendom to recognize itself as Christendom, because having Polish immigrants turned out to be a blessing compared to all of the other alternatives.
01:24:47.560 And you've personally experienced that. Comparatively, comparatively; a lot of them still like their welfare. Fair enough.
01:24:54.280 A lot of them still like their welfare, although I'd argue that now it's a bigger problem with the Bulgarians and the Romanians. That's true, that's true.
01:25:04.520 There were still problems even when it's somewhere as culturally close as Poland; there were still huge problems.
01:25:08.600 Josh might remember this: cast your mind back to the mid-2000s. Of course.
01:25:15.480 They would come and they would create their own little ethnic enclaves, and there would be ethnic divisions there. One of the schools in the town that I grew up in had riots because of Polish-versus-English tensions that were going on. Yep.
01:25:26.360 So it can still create problems, even with people as close to us as that.
01:25:33.880 I would say the Scandinavians and Germans and people more along those lines don't go into our country and form their own ethnic enclaves anywhere near as badly. Yep; fair enough, fair enough.
01:25:49.400 I think it's a matter of scale, isn't it? If a large enough number of people turn up en masse, they form their own communities, and that's not to disparage all Poles, like you say; it's just a matter of scale, it's a matter of human nature. We do it in Spain.
01:26:06.520 It's also a matter of human nature; people naturally congregate around people who speak their own languages.
01:26:11.160 But the last point that I want to make, given the time, is that this is only possible, in all of these countries, where there is energy abundance, where there is plenty of cheap and reliable energy.
01:26:24.600 So if you're going to pursue anything like that, you must have an insane explosion in the amount of energy being produced in Britain, and you must have exploitation of oil, gas, and coal in order to transition to nuclear.
01:26:40.040 You need to have national security on that as well; national security has to be a top priority in energy, and you can't outsource your energy production.
01:26:53.320 Britain's standard of living pretty much stagnated right after it stopped being a net energy exporter. You have to have a huge amount of abundant energy.
01:27:01.400 And if you pair that with very basic things, like letting schools be academically selective, like letting schools train for talent as opposed to for ideology, then with this kind of talented population, with energy, and with some cooperation between the state and capital, there can be an industrial revival.
01:27:23.640 And if the right wants to win, it has to address the economic question, not just the national question, and this is what's really missing in the vision of the right.
01:27:38.680 And that's why somebody like Nigel Farage, who is essentially just a Thatcherite, isn't going to be enough. There has to be a full-on attack on the left's economic agenda. They're saying let's just tax and spend; the answer must be: let's build wealth.
01:27:58.920 And part of building wealth, in the countries that have succeeded, has involved a very high level of cooperation between government and capital owners. This is oligarchy to some extent, but so long as you discipline the oligarchs, and they're afraid of the state, you can make this into a winning formula.
01:28:18.600 It needs to be... I said this on a recent podcast: it needs to be guided in the interests of the nation, the nation itself being the people of a particular time, a particular place, and an ancestry.
01:28:32.120 It needs to be to their benefit, rather than to the benefit of the bottom line of these companies. Exactly; solely. Exactly, exactly, exactly, exactly.
01:28:39.240 And that's all the time that we have for this segment.
01:28:45.320 Well, I'll read a couple of comments. Yeah, I think that was really interesting.
01:28:50.040 Okay, so we'll go through... we've got a few Rumble rants. Random Name: great podcast, gents, amazing segments. Thank you very much. Thank you for taking the time to read all super chats. I always try to; I don't want to shortchange people.
01:29:00.360 Here's five dollars monopoly money. Canadian. Oof. Though not for you, Harry; you've got your pot of gold waiting. Thanks.
01:29:07.400 When did Harry become a leprechaun? I missed something.
01:29:13.400 Since people decided that I was ginger. Egregiously? Wrongly. Yeah. No, no, your hair decided.
01:29:23.320 Logan: reminder, forgot the promises of progress. Yeah. House Vacation: I've been using that UK public spending website for years as a way to get myself depressed; you should see the COVID years.
01:29:35.960 I can now inform you guys I'm thoroughly depressed. Yippee. You don't need to go to that website, you've got a podcast. You know, that's progress: you've progressed from one stage of depression to the next, so you're welcome.
01:29:46.760 Do we have any video comments, Harry?
01:29:55.240 We don't have any video comments. Screw it, let's go over by five or so minutes and read a few of these website comments as well. Josh, do you want to go through yours? Of course.
01:30:00.120 Sophie Live says: yeah, these people don't understand that we are now at a point where, if a politician openly states he's racist in favor of white people, that would be a vote winner, as no one in power seems to be in the corner of poor white people at all.
01:30:15.000 It's not just poor white people; middle-class white people in Britain in particular have been squeezed perhaps the most of anyone, and that's not just me being self-interested. Yeah, we could show you our pay slips to demonstrate, but we won't.
01:30:33.000 Kevin Fox says: if I were Nigel, I would explain the comments were made when I was 13, right after Keir Starmer explains why a Ukrainian rent boy set fire to his old house and a car.
01:30:43.400 If he won't, I wouldn't. Yeah, where is the explanation for the rent boys?
01:30:51.080 That story is buried deeper than... something. Demand answers, Keir. Harry demands video footage. I demand the LiveLeaks. Dear me.
01:31:02.520 Jimbo G says: honestly sick of even hearing the word racist at this point; they think literally everything is racist, so who cares. Correct.
01:31:14.360 Like when dealing with colonialism: the only time they acknowledge the English race is when it's being used as a stick to beat you with.
01:31:24.200 Yeah, I like pointing this out: the English as an ethnicity only exist when we're being racist or colonial; so basically when we're being better than other people, or when they're trying to extort us for reparations.
01:31:35.080 That's still related to colonialism, isn't it? I put it under the same umbrella.
01:31:40.760 I always like to think, you know: oh, so you're British, just like that? So we can split the bill then, yeah? Right. And who's that actually going to in that case, if all of these people are just as British as I am?
01:31:52.040 My favorite thing is to bring up, particularly with online Indians, who are perhaps some of the most annoying nationals online, the fact that many Indians participated in colonialism in India. Are you going to demand reparations? It's funny that you don't.
01:32:06.920 This is something that we should remind the Irish of as well, when they're kvetching, as they like to do: the Irish were also heavily involved in colonialism, as redcoats and other administrators.
01:32:20.360 We're very grateful to you for that, and yes, it was glorious, and don't be ashamed; so stop shirking your own responsibility there, you stupid Micks.
01:32:24.840 Ed Miliband harnessing Enoch's spinning grave. Brilliant, brilliant.
01:32:32.120 Bowie wrote a prophetic song about this 50 years ago called 'Saviour Machine': AI will destroy humanity not by killing us but by creating so much peace and abundance that it drives us insane. It begins with 'President Joe once had a dream'. Sleepy Joe, much? I do love some David Bowie.
01:32:50.840 Richard Schmier: Elon is right, whether you think that makes him a Marxist or a bug man or not, or whether you think he's even saying it's a desirable outcome or not. If AI continues to improve...
01:33:02.120 That is a big if, as we've spoken about, and as some programmers in here have said as well: whether AI is actually intelligent is still up for question, and whether it's capable of doing all of the things that he thinks it is, and says it will be, is still a huge question that is unanswered.
01:33:20.840 There will eventually come a point where AI can do literally every job. There'll be nothing for people to do and no reason for them to do it; the economy will collapse, and people will turn to hedonism to get through the day.
01:33:30.200 While Elon keeps making these predictions, I don't recall him ever saying that this is the outcome he wants; in fact, I'm pretty sure he has said he'd rather we didn't have AI at all, but if it must happen, it must be our AI and not China's.
01:33:46.760 I don't believe that this isn't an outcome that Elon wants, because, for one, he is a technocrat, and technocrats all adhere to the same technocratic logic and mentality, so I do disagree with you there.
01:34:01.400 And I've got to say, that is not a rosy picture of the future that we are painting here. There's a reason it is being compared with the Tower of Babel; if it's going to cause all of humanity to collapse, maybe it isn't something that is worth pursuing.
01:34:12.280 There's also one job AI will never be able to take, which is human advocate against AI. It might surprise you. It might surprise you.
01:34:25.960 Sam Western: Harry, talking of socialist utopians with insane ideals and visions, look up King Camp Gillette, the founder of the company that would eventually become Gillette, and his outline for a mega-city called Metropolis in a book he wrote called The Human Drift, available on Amazon.
01:34:42.200 He explained, among other things, that it would not only possess a perfect economical system of production and distribution, run by a united company, but would also be powered by Niagara Falls. That's very interesting.
01:34:53.720 Yeah, interesting. Do you want to read through some of yours? Yeah, sure.
01:34:58.680 Sophie Live says: well, even then, these tactics can't have been that good; you just went ahead and listed the three countries with the lowest birth rates.
01:35:03.080 Yeah, no, I agree with you; it doesn't solve the human problem. The human problem is a spiritual problem, and that's separate from the economic problem. But yes, you're absolutely right.
01:35:13.560 Cumbrian Kulak says: I'd advise one of you to read The Welfare Trait: How State Benefits Impact Personality by Dr Adam Perkins, psychiatrist. So you might like him, Josh. Okay. Or might not like him.
01:35:25.800 I'm not a big fan of psychiatry, but okay; I do admit that sometimes they can have good ideas, so I'm not a zealot, necessarily. Okay.
01:35:31.560 It's a very succinct book, full of data; quite a tiring book, not cheap; Ed Dutton references it a lot. Okay.
01:35:44.680 Grant Gibson: "Let's talk about what the socialists get right"; immediately lists, like, seven things. Come on! Yes. Yeah, you didn't expect me to become a socialist, did you? But they're right that there is an economic problem.
01:35:55.000 Also worth noting what caliber of Poles went to Britain; they have a bad reputation in Poland too. That's a fair point; like I said, I'm not trying to smear all Poles with that.
01:36:06.040 I have heard from a few people, now you're reminding me, that yeah, it was a lot of the scrounging Poles that Poland didn't want who came over. But that's the same when you've got a welfare state that's open to the world.
01:36:13.960 That's the same even with the Islamic countries laughing at us, going: thanks for taking all our criminals, suckers. Yeah, pretty much. It keeps happening. That's why we need to stop giving welfare to the world's criminals. Yeah.
01:36:23.000 Roman Observer says: same thing happened in Italy; they had a mixed economy in the '50s and '60s, and then privatization and de-industrialization in the '90s.
01:36:34.840 It's a real problem; industry, and keeping people employed, should be a top state priority, and nobody on the right is addressing that.
01:36:41.800 And we've got two more honorable mentions, from Daniel Butchers and Zesty King, both saying happy birthday. Zesty says: now you're 30, you're officially old.
01:36:52.760 I felt it before, but now it's nice to have a formal title. I can't believe how young you people are. Well, thank you. You're not making me feel any younger. Yeah, you'll get there, Harry. That's quicker than I'd like as well.
01:37:04.600 Finally your ginger hair will go gray. You guys can just keep coping and telling yourselves that I'm ginger. But what do you identify as? Blonde. Ah, it doesn't matter.
01:37:16.760 I've got lots of redheads in my family; my hair goes a little bit red in the sun; it's nothing to be ashamed of. Just come out loud and proud as a ginger, Harry. I'm not saying I'm ashamed; I'm saying that it's just not true. Oh, you're deep in the ginger closet.
01:37:29.560 And, oh, A Random Name said: it's always better to be a ginger than an anagram of one.
01:37:37.400 And on that note, I think that's where we should end. Thank you all very much for joining us today; I think it's been a good time. So we'll see you again tomorrow, where we can depress you even more.
01:37:49.560 Have a great day, folks.