The Podcast of the Lotus Eaters - October 15, 2024


The Podcast of the Lotus Eaters #1022


Episode Stats

Length

1 hour and 40 minutes

Words per Minute

180.2

Word Count

18,164

Sentence Count

1,473

Misogynist Sentences

21

Hate Speech Sentences

66


Summary

The Lotus Eaters are joined by Stelios and Josh to discuss the use of the word 'retarded' as a pejorative term for people with a low IQ. They discuss the pros and cons of the term and how it can be rehabilitated.


Transcript

00:00:00.000 Hello and welcome to podcast of the Lotus Eaters episode 1022 on the 14th of October
00:00:14.880 2024. I am joined by Stelios. Hello everyone. And also Josh. Hello there. And today we are
00:00:21.380 going to be talking about rehabilitation of the R word. Stelios is going to explain how
00:00:27.840 little rocket man is putting the kaboom back into civil engineering. And how these roads
00:00:33.480 didn't blow themselves up. Yes, they didn't blow themselves up. And I will be talking about how to
00:00:37.780 make you very wealthy if that is something that you're interested in. Also, if you would like to
00:00:42.840 buy a copy of the Islander edition two, you can't because you've left it too late and orders have
00:00:48.120 closed. So you're going to have to wail in purgatory for the next three months waiting for the next
00:00:52.660 edition to come out. And I'd also like to thank the nice people at Robin's Forge for sending us
00:00:59.140 some lovely key fobs. So thank you very much. And also this, which is a stamp. I believe there
might have also been some stationery holders as well. There's a whole bunch of stuff. Yes. Thank
00:01:11.460 you. Just people send us nice stuff, which is which is lovely, including Robin's Forge. So thank you
00:01:17.000 very much. Right. So with that, let's dive into Josh. Yes. So I'm going to be talking about language
00:01:26.000 today. And obviously, language is very political. We talk about that all the time. And the left
00:01:30.840 tries to control language, because if you control language, you control what people think to a
00:01:35.740 certain extent. And that's not really controversial. I think we all understand that. And it's also a
00:01:40.280 sort of bipartisan view of language. And so if we wish not to grant them power, it's beneficial to know
00:01:46.700 how their linguistic subversion operates, which, funnily enough, Stelios and I covered at one
00:01:51.420 point. I can't remember when it was. And Beau as well. Yeah, this was the 20th of July 2023, of course,
00:01:57.040 ahead of the curve as always. So if you want a large amount of detail about this, that is the place to
00:02:02.320 go. But I'm going to talk about a very specific case. The word of the day today is retard. And this
00:02:10.260 has been on the fringes of acceptability over the past 10 or so years. Obviously, to be 100% clear,
00:02:19.560 I do not approve when applied to people with special needs. I don't think they should be teased. I don't
00:02:26.340 think, you know, anyone with any disability deserves to be mocked. And so I definitely don't mean in that
00:02:32.200 sense. However, it is used by, you know, you or I say, if, you know, I were to do something stupid,
00:02:40.000 it would make sense for you to say, oh, you know, you're being a bit of a, right? Okay,
00:02:46.180 wouldn't it? That, that, yes, that makes sense. People use it in that term. And when people use
00:02:50.680 it in that way, they don't mean to denigrate other people. Well, except the person they're calling the
00:02:57.620 the word, they're not insulting disabled people or anything like that. What does the original word
00:03:03.780 actually mean? So it's just a clinical word. And we're going to actually look at some examples. No,
00:03:10.020 it's not. I don't think so. Anyway, Josh, can I ask you something on this? Because there's an equivalent,
00:03:15.060 there's an analogy. So when it comes to flirting, a lot of people say, well, this, this lady isn't
00:03:22.740 rejecting you, she's rejecting your approach. That's the mentality you would need to be going
00:03:27.740 for. Sure, I don't disagree. So obviously, the word was not originally meant to be insulting. It
00:03:35.540 wasn't originally a pejorative term. And I think that you shouldn't necessarily view it as that
00:03:40.860 harsher word, really. There are lots of other words in use today, that had medical backgrounds
00:03:47.360 that just mean intellectually challenged. And that's, you know, that's how we use it in
00:03:52.240 colloquial terms. And I don't think it should be policed, because by policing people's language,
00:03:56.960 then obviously, you're giving up power to someone because you're yielding to their will.
00:04:02.400 But intellectually challenged by what standard?
00:04:05.440 Uh, well, many different standards. It's just a subjective thing, isn't it?
00:04:09.520 Yes. Like, so you need a baseline population
00:04:13.360 to draw the inference that that is the that is the baseline anything below a certain level.
00:04:18.720 But don't you then run? Well, everyone uses that themselves as their own baseline. And so,
00:04:23.040 right, you know, everyone who is less intelligent than ourselves, we view as,
00:04:27.440 you know, there's two standard deviations of IQ below. And that's retarded.
00:04:31.840 You could say that that's that's one potential.
00:04:34.480 That's gonna be like 70% of the population.
00:04:36.080 I mean, except if you're very smart, and you could say, I mean, there are
00:04:39.520 people who are less intelligent than I am and who are still geniuses.
00:04:45.600 That is the most modest as ever subtle brag I've heard today. Well done.
00:04:50.240 So anyway, to bring things back, moron, is another word originally a clinical term coined in the 20th
00:04:58.400 century by the psychologist Henry H. Goddard. So that would be off the table if we follow this
00:05:04.000 same standard set for the R word. Idiot used in the 19th and 20th century is another clinical term
00:05:11.600 that is taken to mean someone who is not very intelligent. Cretin as well, we just had a very
00:05:16.720 specific use. It's a good one, isn't it? It doesn't get as much exposure. So that was originally a medical
00:05:22.720 term for someone with congenital hyperthyroidism. What's that? I don't know. I'm not a doctor.
00:05:29.280 Right. Okay. I'm a psychologist. I'll tell you how it makes me feel. I think it means you have very
fast metabolism. Yeah. People are cretins if they have a fast metabolism. No, no, no. If you
00:05:38.560 are hyperthyroidical. I think in severe cases, it can lead to mental difficulties. I don't understand
00:05:45.840 exactly, but that's the origin of it basically. Somebody in the comments will know. I'm sure they
will. So other ones, simpleton, dunce, you'll like this one, Stelios, feeble-minded, dullard,
00:06:01.440 backwards. They're all words that had clinical backgrounds. These would be off limits, right?
00:06:08.720 And so I feel like if you ban these sorts of words, you lose some of the poetry of the English language,
00:06:14.880 right? Because having options gives you a good amount of nuance, gives you the option to have
00:06:20.800 an almost artistic approach to language because each word has its own sentimental attachment.
00:06:26.720 And so each one has its own sort of distinct character. And you, you know, what makes a good
00:06:32.240 insult is that someone is, has sort of architected, that's not a word, I don't think.
00:06:38.800 Um, they've created it in a very specific and deliberate way. And it reveals something about
00:06:45.360 their thought processes that they've chosen to say, you know, simpleton over, over backward.
00:06:50.720 I remember getting called backward a lot when I was a kid at school, but that was, that was because
00:06:55.920 of my political views. My liberal teachers just did, because a subject would come up, like this would
00:07:01.280 happen all the time in geography. We'd be talking about like groups of people around the world.
00:07:04.400 And I would say, well, yeah, but it's because of this. And then the teachers would be like, well,
00:07:08.640 you're backward for thinking that. It's like, yeah, but it is.
00:07:12.720 So what they were doing there was they were taking a very ableist stance against you.
00:07:18.080 And they were, they were mocking you.
00:07:20.160 Yeah, but I'm far more able than them. The issue is they just didn't like my politics.
00:07:24.160 It's almost like they're using it as a pejorative term, isn't it?
00:07:27.120 Yes.
00:07:27.440 In almost exactly the same way.
00:07:29.360 I came to quite like it.
00:07:30.160 Yeah, it's a good word as well, isn't it? Badge of honor even.
00:07:34.000 But there are some woke safe ones. Stupid is woke safe because it had no clinical background.
00:07:40.400 It entered into the English language in the 16th century via French.
And it originates in the Latin stupeo or stupere, depending on how you pronounce it.
00:07:49.600 I don't know. I don't speak Latin.
Stupor is a good word. We should use that more.
00:07:52.400 Yeah, which I imagine, you know, that's a shared origin to that word.
00:07:56.800 I didn't know it's French though, so I don't like it.
00:07:57.680 It means to be amazed or stunned, but it does originally come from Latin.
00:08:00.880 Right.
00:08:01.200 So it's okay.
00:08:03.360 And also dumb, that's Old English, so that's an Anglo-Saxon word.
00:08:08.160 It means mute or speechless and also relates to Old Norse.
00:08:12.560 I think dumber. D-U-M-B-R. I don't know how it's pronounced.
00:08:17.440 Again, I don't speak any Norse languages.
00:08:19.920 Also, the Dutch D-U-M means stupid and the German D-U-M also means stupid.
00:08:26.560 So there's a sort of similarity here.
00:08:29.760 You'll be inclusive when you're insulting people.
00:08:31.520 Yes. And so dumb and stupid are okay.
00:08:35.680 Any of the others, apparently not.
00:08:38.400 But we would lose so much of the poetry of the English language if we got rid of them.
00:08:41.360 Who decided this? Who said that we're not allowed to?
00:08:43.320 Well, anything that is ableist. So any word that has a clinical background that has now become colloquially pejorative is bad because it mocks people with disabilities.
00:08:57.120 Which it doesn't, by the way, because I think 99% of all people, particularly people who use these words, don't want to mock those people because it's seen as very mean, which it is.
00:09:08.000 And this has even made its way into sort of YouTube terms of service and stuff like that, isn't it?
00:09:13.240 Yes.
00:09:13.900 Right.
00:09:14.360 I mean, it informs a lot of things.
00:09:16.060 And what I thought interesting was Vox recently wrote an article which they've got a sort of great paywall of China up.
00:09:24.460 So we can't actually read it.
00:09:26.760 But this lady, Constance Grady, wrote about it coming back.
00:09:32.460 And here it is. That's the thing.
00:09:34.480 So basically, there are a few things here that I thought were interesting.
She starts off by saying around 15 years ago, a new campaign took off across the young social media ecosystem.
00:09:45.400 People with learning disabilities and intellectual disorders were asking everyone else to stop using the R word to describe them or even to make jokes.
00:09:52.920 But it's not. No one ever uses it to mean them because that's mean.
00:09:56.980 Right. Everyone understands that that's unnecessary and uncouth pointing out to someone, you know, their problems.
00:10:02.800 That is that isn't funny.
Well, that's just mean-spirited.
00:10:05.980 You might not have problems. You might just belong to a group that has a different bell curve.
00:10:10.860 But point being that it's sort of being falsely equivalent, you know, falsely attributed to these people when that's not what people mean when they use them.
00:10:21.380 But someone did say something that I did agree with.
00:10:23.920 Apparently, the linguist Caitlin Green says, what you're doing when you use a slur is you're telling people in your audience, this is the kind of person I am.
00:10:33.600 And this is the kind of attitude I have towards the normies that don't use that word, which is kind of true, isn't it?
00:10:38.980 I don't like normies.
00:10:39.660 Exactly. And so we use language that signifies we're above it, right?
00:10:45.540 We don't agree with the programming.
00:10:47.840 And so we use certain words that signal to everyone else that, yes, we don't care for this moral standard.
00:10:53.960 We're not a part of it, which I think is actually the case.
00:10:56.780 Sounds encouraging.
00:10:57.300 Yes. So another outlet, this one I think was published on the 10th of October.
00:11:03.080 This is the 10th of September.
00:11:05.720 It's only about a month apart here.
00:11:08.560 Why are people using the R word again?
00:11:10.520 And they're noticing it as well.
00:11:11.940 They're talking about X and how lots of people on there are using it, particularly on social media, actually.
00:11:20.540 We had one of our presenters called a retard.
00:11:23.560 By Constantine Kissin, that's right, yes.
00:11:25.680 Not very nice of Constantine.
00:11:27.400 No.
00:11:27.680 Does he not know the YouTube guidelines?
00:11:29.360 He's not allowed to say that about people.
00:11:32.440 And Harry's a lovely chap.
00:11:34.580 Doesn't deserve it.
00:11:35.420 But here is Business Insider also talking about it.
00:11:38.100 This was back in August.
00:11:39.480 So there have been a repeated number of articles talking about it making a comeback, which I find interesting because I imagine that they're not all going to be talking about it without just cause, right?
00:11:50.000 Particularly when it's sort of a topic like this.
00:11:53.880 And so what it suggests is that if a word enters woke purgatory, it can come back.
00:11:59.860 And if we look at how it's come back and why it's come back, then perhaps other things can be rehabilitated as well.
00:12:08.120 And there is...
00:12:09.400 Where are you going with that?
00:12:10.960 Well, because I must say, I did think this segment was about the hard R, so I was slightly confused going into it.
00:12:20.900 Well, my position on that is I think it's okay to say every word.
00:12:23.980 I don't think any noise your mouth can make should be taboo and...
00:12:29.140 So is this going to come back in a way that only if you are that thing...
00:12:33.500 So, I mean, are we going to get to the stage where Democrats are allowed to describe themselves with the R word, but nobody else is?
00:12:41.340 And if you do, they just like...
00:12:42.680 No, I think it's...
00:12:44.340 We've sort of returned to the 1990s standard, it seems, whereby both left and right are using it.
00:12:50.420 It's sort of a bipartisan word now, apparently.
00:12:53.040 Right, okay.
00:12:53.660 And so it's acceptable, which I find interesting.
00:12:56.320 It's sort of...
00:12:56.960 People are coming together kind of thing.
00:12:58.280 There's some evidence that perhaps a little bit of woke is being put away, that people are fed up of this tone policing,
00:13:04.660 and people actually kind of like being able to say what they want.
00:13:08.360 And Business Insider alleges cancelled words of making a comeback on the left.
00:13:13.320 Some say it's a reaction to tone policing.
00:13:15.440 So, yes, they're explicitly saying that, yeah, it's not just evil right-wingers that are terrible and mean,
00:13:21.320 it is the left as well that are using these words, which is interesting.
00:13:26.540 And, in fact, one of our new MPs had an interesting tweet back in the day.
00:13:34.340 There she is, struggling to scroll down.
00:13:37.260 But she had a tweet in 2009 saying,
00:13:41.500 effing Estonian R-words.
00:13:46.140 That's very specific.
00:13:48.220 It was, seems to me, a massive non-sequitur,
00:13:51.640 because I can't think of any reason that you can be that annoyed with Estonia,
00:13:56.480 or the people from Estonia, to tweet that out.
00:13:59.900 Oh, she wanted them out of her flat, apparently.
00:14:02.720 Why were they in her flat?
00:14:03.440 I don't know why there were so many Estonians that she got driven to that.
00:14:08.320 How many Estonians is too many, do you reckon, in a flat?
00:14:11.900 Well, if it's your flat, I would, one?
00:14:14.340 One?
00:14:15.320 Well, I mean, if they're uninvited, sure, but I get the...
00:14:17.980 You've invited them round, I mean, that's different, but...
00:14:21.440 I think we need context here.
00:14:23.360 I know, and the funny thing is, there was no context provided.
00:14:26.340 She just apologised and didn't explain it.
00:14:28.500 And I want to know, how many Estonians were in there for you to react in this way?
00:14:34.240 I feel there must be a backstory behind this.
00:14:37.160 Yeah.
00:14:38.460 Please tell us, if you're watching.
My curiosity has just been piqued here.
00:14:44.020 So, yes, even left-wing members of British Parliament have used the word,
00:14:49.240 and it has not cancelled their career.
00:14:51.100 She just said, yeah, I'm sorry about that, and moved on, and no one cared.
00:14:54.800 And it's interesting, isn't it?
00:14:57.280 Were it a different word, perhaps, then there might have been a much stronger reaction.
00:15:01.440 She might have even had to have, you know, stepped down or something.
00:15:05.780 But we even get it in headlines here.
00:15:08.160 So, here's the Irish Times.
00:15:09.600 My teacher told my parents I was retarded.
00:15:12.340 My grandmother taught me to read.
00:15:14.620 That's a heartwarming story, isn't it?
00:15:16.540 Yeah.
00:15:17.700 Throwing their grandmother under the bus a bit there, aren't they?
00:15:20.340 Well, maybe the grand...
00:15:23.380 Well, which is the cause and the effect here?
00:15:25.980 Did the grandmother teach the kid to read, and then they got called retarded,
00:15:30.440 or did they get called retarded, and then the grandmother...
00:15:32.560 This is like the chicken or egg.
00:15:33.700 We'll never know.
00:15:34.680 They are doing this because they want to argue against homeschooling.
00:15:38.660 Ah, yes, right.
00:15:41.040 It's probably true, but it is a lady here.
00:15:44.340 This is back in June.
00:15:45.320 And so, if a mainstream publication is able to use the word in a headline and not censor it,
00:15:51.380 it's also a sign of how far we've come, isn't it?
And so, it's also been used in a film that you wouldn't catch me dead at.
00:15:59.560 Deadpool and Wolverine.
00:16:00.860 I want to see that.
00:16:01.540 They dropped, apparently, the R word, and then there are loads of people complaining about it.
00:16:08.000 It's like, once was enough, and there's other ableist jokes as well.
00:16:10.960 I super didn't appreciate it.
00:16:13.800 Just dropping the super in there is just, you know that they're an American leftist now, don't you?
00:16:19.000 But the left's whole thing is that anybody who doesn't agree with them is a mental midget.
00:16:27.200 So, why do they have an issue with this?
00:16:30.680 Because this is basically their go-to all the time.
00:16:32.260 Because it's ableist, and therefore bigoted, because you're talking from a...
00:16:36.700 No, they don't.
00:16:37.640 That's what they miss.
00:16:38.580 It's just a bludgeon, really, to beat the right with.
00:16:41.600 But if we just say, yeah, actually, it doesn't mean that, and also you can use it.
00:16:46.120 So, when they're attacking the right, what is the...
00:16:51.260 So, they don't actually mean it?
00:16:54.140 As in, they're doing it not because they actually care about the word,
00:16:57.740 but because attacking the right is a moral good in their eyes, all on its own.
00:17:01.720 Do they think the right are actually stupid or not?
00:17:04.360 Because if they are, then why would they have a problem with the retard word?
00:17:08.500 Because that's what they actually think the right is.
00:17:10.380 You're imposing logical things that don't apply here.
00:17:16.000 It's...
00:17:16.880 I mean, you call some people, when you want them to feel bad,
00:17:21.120 you call them what they don't want to be called.
00:17:23.120 Also, this, I think, was back in March.
00:17:28.380 This is some sort of...
00:17:29.580 I don't know what it is.
00:17:30.480 It's Instagram.
00:17:31.140 I don't ever go on that.
00:17:32.800 But they're saying, it doesn't matter if you're autistic
and have had the R slur used against you,
00:17:37.400 you still can't reclaim it.
00:17:38.800 So, it's not like a certain pass that begins with N.
00:17:43.400 But hang on, hang on.
00:17:44.760 Autistic doesn't mean you're low IQ, does it?
00:17:48.220 No, they're just...
00:17:48.800 Well...
00:17:49.560 Very high IQ in autistic.
00:17:50.860 I know lots of them.
00:17:52.740 I know.
00:17:53.420 Lotus eaters wouldn't function without them.
00:17:55.020 No.
00:17:55.440 Well, we wouldn't have any audience either.
00:17:58.200 Of course, yeah.
00:17:59.060 Yes.
00:18:00.460 Yeah, our entire business operation is powered by autism, so...
00:18:04.660 High IQ autists.
00:18:05.760 Exactly.
00:18:06.480 This is...
00:18:06.960 Right.
00:18:07.980 When we use this word, we're not denigrating people like this.
00:18:11.300 We're celebrating them, if anything.
So, they're trying to make an issue of it,
00:18:16.820 and then there are a bunch of leftists in the comments,
00:18:18.800 which I'm going to save you from looking at.
00:18:20.860 But there are basically people complaining about it coming back,
00:18:24.640 and people saying,
00:18:25.940 well, hang on a minute, this qualifies for me.
00:18:28.280 And here's Reddit talking about it as well.
00:18:31.940 And this is Reddit, so it's a hive of leftism.
00:18:34.380 Let's be real, ableism is the most normalised joke.
00:18:37.660 Blah, blah, blah, blah, blah.
00:18:38.460 Also, the BBC tried to normalise it with a disabled comedian
00:18:43.260 having a show that had the R word in the title.
00:18:47.700 What was the disability?
00:18:48.700 Is it gums or something?
00:18:50.260 I can't remember what she's got.
00:18:52.360 But there was this.
00:18:54.800 I saw this list of ableist words you can cut out.
00:18:58.740 So, idiot, crazy, differently abled, dumb, insane, lame, moron, psycho, specially abled.
00:19:04.340 I thought that was one of their words, personally, specially abled.
00:19:07.200 And stupid.
00:19:08.300 These are all ableist words that are not allowed to be said.
00:19:11.860 And I noticed an absence of a word.
00:19:13.800 This is all the way back last year.
00:19:16.360 And I shared this.
00:19:19.780 If you're listening, it is the orc from the Two Towers.
00:19:23.520 That's an Uruk-hai.
00:19:24.800 It's an Uruk-hai, yes.
00:19:26.000 Thank you.
00:19:26.540 Sorry.
00:19:27.360 Fact check.
00:19:28.220 That's all right.
00:19:28.720 I'm a big Lord of the Rings fan.
00:19:29.940 I just saw Samson correcting his glasses there as well.
00:19:32.320 Are they the smarter kind of orcs?
00:19:34.520 Yes.
00:19:35.020 The non-hard word orcs.
00:19:36.440 But it looks like that word is back on the menu, boys.
00:19:41.120 So, yeah.
00:19:41.540 It's not on the list.
00:19:43.000 Apparently, it's okay now.
00:19:45.600 So, I also wanted to tie this into the fact that it's just being dropped in political discourse now.
00:19:51.860 As in Trump.
00:19:53.040 This, of course, was something that was leaked to try and make Trump look bad.
00:19:57.100 But it doesn't make Trump look bad.
00:19:58.800 It points out a flaw in Kamala Harris more than anything.
00:20:01.760 And, of course, Rolling Stone prints this very offensive word in the title, Uncensored.
00:20:06.180 So, it can't be that bad.
00:20:07.560 Even though the Daily Mail is clutching its pearls here, Donald Trump makes horrific remark.
00:20:12.560 Oh, it was just terrible.
00:20:13.660 I know you said at the beginning that we're not allowed to call anybody the R-word.
00:20:18.920 But if we're now starting to talk about Kamala Harris, I'm going to have a real hard time from this point onwards.
00:20:26.640 Trump alleged that she is this thing.
00:20:29.760 Oh, so we don't need to say it because we can just report on the fact that Donald said it.
00:20:33.720 And also, this is a very wild headline.
00:20:36.720 Trump called Harris, an R-word, railed against Jews.
00:20:41.720 Interesting.
00:20:42.640 It sounds like Trump just went on a berserk.
00:20:45.420 You know, he was turning over tables and all sorts of things.
00:20:49.220 Obviously, he was railing against them because some of them supported her.
00:20:52.880 But let's watch this in a completely unrelated note.
00:21:00.000 There is a very important relationship, which is an alliance with the Republic of North Korea.
00:21:05.480 And it is an alliance that is strong and enduring.
00:21:10.100 And today, there were several demonstrations of just that point.
00:21:14.660 Hang on, we've got a strong alliance with North Korea.
00:21:21.260 Are you sure about that?
00:21:21.760 Famous, famous alliance between North America, you know, the United States, and North Korea.
00:21:29.240 I don't know, you know, if you were to name the one country that is the least aligned, that would probably...
00:21:34.420 But she genuinely is.
00:21:37.280 A very intelligent lady.
00:21:39.660 That's what you wanted to say, wasn't it, Dan?
00:21:41.560 Yes.
00:21:41.780 I mean, on a side note, I think she's about to go into a massive doom loop.
00:21:47.380 Yes.
00:21:47.760 Because her internal polling is so bad that she's thinking of doing Joe Rogan.
00:21:52.440 I saw that.
00:21:53.020 And other interviews.
00:21:53.700 If she does interviews, people will realise what an R word she is.
00:21:58.360 And then her internal polling will get worse.
00:22:01.160 And then she'll do more interviews.
00:22:02.260 And she'll just go into this doom loop.
00:22:04.580 And the people you've got to feel sorry for, the three-letter agencies, are going to have to try and...
00:22:09.780 Rehabilitate her image.
00:22:12.660 No.
00:22:13.000 No, no, no, no, no, no, no, no, no, no, no.
00:22:14.240 We can't say that.
00:22:15.540 All right.
00:22:16.060 Not allowed.
00:22:17.720 You'll get us kicked off of YouTube, Dan.
00:22:19.400 I don't enjoy censoring people.
00:22:21.640 However, I don't want to lose our entire channel.
00:22:24.760 Right.
00:22:25.160 I thought I could say the F word.
I mean, it was in a bloody New York Times article.
00:22:30.060 Fortification.
00:22:30.820 Yes.
00:22:31.100 Yes.
00:22:31.520 That's what you meant.
00:22:32.620 Yes, that's what I was going to say.
00:22:33.960 They're building castles.
00:22:34.740 That's what they're doing.
00:22:35.260 And anyway, there are articles like this, just basically calling her stupid, which, of course, as we know, is one of the woke words that are okay.
00:22:46.040 But the fact that this is being said is catastrophic.
00:22:49.320 She's too stupid to do most jobs, let alone president.
00:22:53.780 Yes.
00:22:54.080 And it's also interesting that Trump comes out, calls her the R word, and then loads of media outlets are just like, yes, of course she is.
00:23:01.740 And it's also worth mentioning as well that she can't claim the moral high ground here because, as the Daily Mail reported, in 2019, she had to apologize for laughing when a man in New Hampshire described Trump as mentally R worded and claims she hadn't heard the words he used in that moment.
00:23:19.640 By the way, the chat is claiming that Trump has agreed to go on Rogan, which I hadn't heard.
00:23:24.880 Have you heard that?
00:23:25.420 I have heard that as well.
00:23:27.160 I think a lot of these things go around.
00:23:30.500 We'll have to wait and see until it's nailed down and dates given.
00:23:35.580 Wouldn't it be really good if Rogan had Trump and Camel Laugh at the same time?
00:23:40.640 That would probably be the best debate going, wouldn't it?
00:23:44.620 Because Joe Rogan would actually be reasonably fair, I think.
00:23:48.640 Yes, that would be good.
00:23:50.400 But my point here is that it's interesting to see things sort of reversing a little bit and that language can indeed become acceptable again.
00:24:00.780 And although I don't think you should use it liberally, I think have a bit of class and a bit of taste.
00:24:08.580 It's heartening to know that once something becomes off limits, it doesn't always have to be.
00:24:15.400 That's heartwarming.
00:24:16.100 A nice heartwarming segment about retards.
00:24:19.120 That's nice.
00:24:20.320 We have the chat.
00:24:21.340 Oh, yes.
00:24:22.680 Speaking of retards, let's read some comments, shall we?
00:24:25.480 I'm joking, by the way.
00:24:29.300 So, retard is one of my favourite words.
00:24:32.000 Part of its appeal is that it combines very well with other words.
Libtard, commietard, leftard.
00:24:38.520 Yes, that is true.
00:24:42.240 Yeah, good point.
00:24:43.080 Good point.
00:24:43.660 That's a random name.
00:24:44.300 The beauty of the language is that it evolves.
00:24:46.300 If the leftoids are trying to restrict the use of certain words, then I'll simply invent new words that serve the same function.
00:24:53.180 What a bunch of femgroids.
00:24:55.600 Nice creativity there.
00:24:56.900 Lofar Truther says, trying to artificially get rid of language just leads to people mocking it.
00:25:04.180 For example, Reddit bans retard.
00:25:06.580 So, WallStreetBets starts calling everyone highly regarded individuals.
00:25:10.620 Nature finds a way.
00:25:11.440 I like people calling each other regarded.
00:25:14.000 I think it's funny.
00:25:15.480 And you see it in YouTube comments quite a lot as well.
00:25:18.920 That's Random Name says, I think it's more of a gendered issue than it seems.
00:25:23.300 I think most of us men love insulting one another, my friends.
00:25:26.880 I can't.
00:25:28.440 I think that's funny.
00:25:30.980 I do the same, but I can't read that out.
00:25:33.960 What was that, sorry?
00:25:35.260 It's okay.
00:25:35.920 Carry on.
00:25:36.760 Oh, Stelios was wiggling the mouse, were you?
00:25:41.580 But yes, That's Random Name.
00:25:43.100 I think that's funny.
00:25:43.800 I just don't want to read that out because it'll make me look bad.
00:25:46.560 But anyway, who are we on to now?
00:25:50.780 Stelios.
00:25:51.500 Do I have control of the mouse?
00:25:55.260 Good.
00:25:56.320 There you go.
00:25:57.920 Right.
00:25:58.460 So, we will go to the Korean Peninsula and talk about roads.
00:26:02.320 And roads that didn't blow themselves up.
00:26:05.700 And we are going to talk about the significance of this event because a lot of people are thinking
00:26:10.920 that it is a harbinger of things to come, of bad things, that this is a bad moon rising.
00:26:16.700 So, I want to show you exactly what happened.
00:26:19.640 Is that a Credence Clearwater Revival reference?
00:26:22.240 Yeah, I think so.
00:26:23.180 I see a bad moon rising.
00:26:24.960 Yeah, I love that song.
00:26:25.860 Trouble on the way.
00:26:26.660 Yeah, of course.
00:26:27.360 So, why did these roads definitely not end themselves?
00:26:34.240 Well, you shall see, Josh.
00:26:37.060 Right.
00:26:37.520 So, we have here an article from The Independent with the title,
00:26:41.280 North Korea threatens to turn South Korea into piles of ashes after drones fly over Pyongyang.
00:26:49.360 So, what happened was that lately there have been several accusations from both sides.
00:26:57.460 And the North Korean side has said that the South Korean side has sent over several balloons,
00:27:05.100 several drones, and several propaganda messages.
00:27:10.560 So, I have a video.
00:27:12.960 We'll show you exactly what is going on with the balloons.
00:27:15.920 But that was the accusation.
00:27:17.780 And they said that it happened three times on the 3rd, the 9th, and the 10th of October.
00:27:22.320 And the drones were carrying leaflets that were filled with inflammatory rumors and rubbish.
00:27:29.040 So, that was their propaganda.
00:27:31.040 And you'll see that they were basically throwing a lot of trash on North Korea.
00:27:38.080 To be fair, the South Koreans should have figured out by now that if they just dumped their rubbish over the North Korean border,
00:27:45.460 all the North Koreans can benefit by going through the rubbish and actually finding food.
00:27:49.900 Because, of course, a lot of them are starving in North Korea.
00:27:53.940 They had a lot of problems actually sourcing food because the climate isn't necessarily suitable for agriculture.
00:28:00.520 Well, maybe, you know, they could be exporting some manure for free to help them with agriculture.
00:28:07.500 Right. So, what we have here is suspicious activity because, initially, the South Korean side denied the allegations.
00:28:17.060 But soon afterwards, they said that we don't know what happened exactly.
00:28:22.120 Well, it almost sounds like they're fly-tipping.
00:28:24.340 Like, they're just, you know, I don't know what to do with this mattress that I've got.
00:28:28.780 It's all stained.
00:28:30.120 I found it in a tunnel in New York.
00:28:32.140 What am I going to do with it?
00:28:33.200 This is all a bit Monty Python, where they're just, like, catapulting their rubbish into their enemy.
00:28:38.800 Well, throwing feces at each other is something that has been going on for a long time on this peninsula.
00:28:46.940 I'd like to think that, as a species, we've moved beyond the point of just throwing our feces at each other.
00:28:52.700 Have you tried it? It's good fun.
00:28:54.640 No, I don't know.
00:28:55.720 I mean, in Swindon, there are people who do this.
00:28:58.720 That's true, actually.
00:28:59.560 Yeah, so, but the issue here, and this is why a lot of voices are saying that this may be the harbinger of something like World War III,
00:29:10.180 is that this is the first time North Korea has accused its rival of flying drones to drop propaganda leaflets into the country.
00:29:18.000 So they upgraded.
00:29:19.720 Okay.
00:29:20.540 It's not just throwing feces at each other.
00:29:23.940 They're actually throwing propaganda.
00:29:26.560 The only new bit there is using the drones, because, of course, during, like, say, the Second World War, for example,
00:29:31.820 bombers would drop leaflets all over the enemy, you know, with messages on it.
00:29:35.760 Yeah.
00:29:36.400 But they're just getting drones to do it now.
00:29:37.600 So this has been perceived as an escalation of the situation.
00:29:44.640 And we have here, Jürgen Naudit says, Third World War, some are already writing.
00:29:50.280 And he says something about a German military expert who writes,
00:29:53.360 we live in a time where once in a century events happen every three months.
00:29:58.180 China and North Korea have attracted the attention of the world today.
00:30:01.200 And he says media citing intelligence services, North Korea may blow up roads to South Korea today
00:30:07.580 and conduct military operations near the demilitarized zone.
00:30:12.160 And he's talking about earlier that the North Korean side had troops close to the borders
00:30:17.460 to combat, to have them as combat ready.
00:30:22.820 And he's also talking about China launching a large-scale military exercise near Taiwan with no end date.
00:30:30.180 Ships and aircraft are approaching the peninsula from different directions.
00:30:34.300 And we have all the other wars going on at the moment.
00:30:38.860 So they're saying that perhaps blowing up these roads is going to be the final straw that breaks the camel's back.
00:30:48.220 Maybe.
00:30:48.480 So, first and foremost, obviously World War III is not going to happen, no matter how much the German wants it.
00:30:55.720 Secondly, he is kind of right that North Korea should be a little bit annoyed
00:31:02.860 because dropping propaganda across the border isn't particularly neighborly.
00:31:07.880 So the South Koreans have done something and they are known to do this, right?
00:31:12.220 Well, the question is who cast the first stone?
00:31:14.300 Because you could say that, you know, in the same way with the propaganda, maybe North Korea did it first.
00:31:20.440 I'm certainly no fan of North Korea, don't get me wrong.
00:31:22.840 A lot of people are saying that essentially what is going on is that there are North Korean defectors
00:31:28.120 who are functioning like activists in South Korea sending messages with those balloons in the months from April to late fall
00:31:40.140 when there are winds blowing north that carry all these balloons into North Korea.
00:31:44.520 They should have used Chinese lanterns, then they've got plausible deniability.
00:31:47.820 It can't be us, it was the Chinese.
00:31:50.060 They share a land border.
00:31:51.140 But I want to ask both of you a very important question.
00:31:55.180 Now, look at this thing here.
00:31:57.140 I want to ask you, what do you notice?
00:32:00.380 Is that a net?
00:32:01.480 Are you noticing?
00:32:03.520 I'm noticing, yeah.
00:32:05.240 What are you noticing?
00:32:06.440 With the net?
00:32:08.080 It stops at a certain point.
00:32:12.080 So, boom.
00:32:13.360 Oh, right, I see.
00:32:14.120 Blew it up.
00:32:14.300 Okay, is that a road getting blown up, I take it?
00:32:18.580 Yeah.
00:32:18.800 It looks like it.
00:32:19.600 Yeah, the road is exploding.
00:32:22.040 Because everyone knows the way you stop drones is you blow up roads.
00:32:25.540 Yeah.
00:32:26.260 Flawless.
00:32:27.120 Flawless.
00:32:27.820 Hang on, isn't the road blowing up retaliation for the drones?
00:32:33.480 I think so, yes.
00:32:34.660 Is North Korea doing the blowing up, is it?
00:32:36.640 It is, yeah.
00:32:37.080 North Korea is doing the blowing up.
00:32:39.980 According to several reports.
00:32:41.980 It was very precise, right behind the other net, wasn't it?
00:32:43.800 According to several reports, these roads aren't in use because economic activity has ceased.
00:32:51.060 Well, yeah, except you can see bunches of people on it.
00:32:54.280 Yeah, so essentially, we need to understand what's going on here because it's like North
00:33:00.120 Korea saying, you're provoking us, you're engaged in aggression, so we are going to retaliate
00:33:09.860 by blowing up our own roads.
00:33:12.000 And they have it here, you see the goodbye.
00:33:15.260 Wait, so that's the North Korean side?
00:33:16.740 Yes.
00:33:18.140 Yeah.
00:33:19.860 And they blow up their own, right?
00:33:22.160 They blow up the roads at the places where these roads are connecting North Korea to South
00:33:28.460 Korea.
00:33:29.700 And it's...
00:33:30.000 Yeah, it's fairly simple.
00:33:31.560 It's just there are roads connecting them that they don't use and they blew them up because
00:33:36.620 they can.
00:33:37.540 Yeah.
00:33:37.640 And it doesn't actually change very much because it's just a symbolic gesture, right?
00:33:41.900 Right.
00:33:42.260 So we have here this account next to us saying North Korea has blown up sections of roads
00:33:46.820 and railroad lines on its side of the fortified border between the two countries, South Korea's
00:33:52.380 military said.
00:33:53.100 In response, the South Korean military fired warning shots toward the demarcation line separating
00:34:00.160 the neighbors.
00:34:00.160 So I also find it interesting that it says goodbye in 10 meters in that previous sign.
00:34:06.360 So it's almost like it's giving you a warning that there's going to be an explosion.
00:34:10.360 Yeah.
00:34:10.540 Goodbye.
00:34:11.260 Goodbye.
00:34:12.040 You're going to blow up.
00:34:12.760 So it says there Pyongyang pledged last week to completely shut down inter-Korean roads
00:34:18.240 and further fortify areas on its own side of the border.
00:34:22.500 And according to South Korea's Joint Chiefs of Staff, the military is already planting mines
00:34:27.880 and barriers along the border and additional work using heavy equipment was seen on Monday.
00:34:33.040 So a lot of people were saying that intelligence says that they are going to do this and it actually
00:34:38.260 happened.
00:34:38.580 So the North Koreans are behaving as if they expect to get invaded quite soon?
00:34:42.760 Well, we don't know exactly, but they are saying that they are engaging in suspicious
00:34:48.980 activity and we are going to monitor the situation.
00:34:54.120 That's essentially what they said.
00:34:55.380 But they also replied by firing some shots, and they are escalating their rhetoric.
00:35:01.400 Whenever the North Korean side escalates in their rhetoric, they're also doing the same.
00:35:06.600 Now, there have been reports of military councils in North Korea about
00:35:12.760 plans of how to move forward and how to respond to the incident with immediate military action.
00:35:19.860 We have here Kim Yo-jong, the sister of North Korean leader Kim Jong-un, who warned South Korea
00:35:26.120 of a horrible disaster if Seoul sends more drones into its territory.
00:35:31.100 Now contain yourself, Dan.
00:35:32.320 I do like Kim Yo-jong.
00:35:33.700 And we have here the South Korean Joint Chiefs of Staff, the Admiral Kim Myung-soo, who told
00:35:41.380 sailors recently at a naval sector defense command near Seoul that if North Korea launches a
00:35:47.560 hostile provocation, they should retaliate immediately, strongly and until the end.
00:35:52.660 So basically, they're escalating rhetoric at this point.
00:35:55.400 Right, so I want to show you some context here because it seems that, you know, a lot of
00:36:02.500 these things are happening every now and then.
00:36:04.880 There have been escalations, but aggression isn't unheard of in the region.
00:36:10.160 It's a constant phenomenon.
00:36:11.780 So I want to give you some kind of context because I was searching for it.
00:36:15.120 And I found that in early January, Kim Jong-un called
00:36:26.420 for some changes in the constitution of North Korea.
00:36:30.520 Sorry, I'm distracted by the fact this man needs a hairstylist urgently.
00:36:34.640 And he said that they are going to make some changes and they are going to name their primary
00:36:43.620 foe and principal enemy, not the US, but South Korea.
00:36:50.460 That's quite significant because don't they have a museum just illustrating how evil the
00:36:56.320 United States is and they've got that one boat that they've captured that they always
00:37:02.020 parade around, and they've sort of demonized them as the big bad enemy.
00:37:07.200 And the fact that they're changing that now is quite significant because, of course, the
00:37:11.520 regime relies on the fear of the external enemy to maintain its power to a certain extent,
00:37:17.740 which is pretty basic stuff these days.
00:37:20.120 And for those who would question this or doubt this, we have here from January 2021, Kim Jong-un
00:37:29.960 calling the United States the biggest enemy and principal enemy of North Korea.
00:37:35.400 So what happened now?
00:37:36.800 And it's a bit more realistic though, isn't it?
00:37:38.460 I think it's unlikely that North Korea could militarily defeat South Korea, but it's a bit,
00:37:42.960 it's a more achievable goal than the United States, I think.
00:37:46.300 I think.
00:37:49.800 Yeah, I feel like the largest military on earth is going to be more difficult to defeat than
00:37:55.880 South Korea.
00:37:56.960 With a smaller military.
00:37:58.340 Well, at least you were connected to until you blew up the roads.
00:38:00.520 Yeah.
00:38:00.720 So what happened afterwards is that they ceased economic cooperation.
00:38:04.800 So the roads that they blew up were in use.
00:38:10.140 So they were in use?
00:38:11.800 No, they weren't in use.
00:38:12.820 Okay.
00:38:13.100 North Korea ended all economic cooperation with South Korea.
00:38:17.620 I didn't even realize they were cooperating to begin with.
00:38:21.660 It must have been pretty limited then.
00:38:23.240 Yes.
00:38:23.740 I didn't know this either, to be frank.
00:38:27.740 And it can explain how some, how North Koreans can sometimes send stuff over to the South Korean
00:38:36.660 side with...
00:38:38.520 Normally refugees, right?
00:38:40.140 Yeah.
00:38:40.320 So it's, and they are realigning.
00:38:43.460 I mean, they're not realigning.
00:38:44.720 They have always been pro, they have always been closer to Russia.
00:38:48.800 They say Kim Jong-un said Russia is the most honest friend and partner and called Putin
00:38:52.800 the dearest friend of the Korean people.
00:38:54.660 That's because they fell out, they fell out of the, with the Chinese a little bit, didn't
00:38:58.880 they?
00:38:59.300 Because the Chinese are fed up of them being sort of a toddler that throws their toys out
00:39:04.180 of the pram sometimes and upsets things for them.
00:39:07.100 Yes.
00:39:07.880 And perhaps we should also look back into the 70s, what happened, because also Nixon
00:39:14.140 and Kissinger were working to split the communist bloc and they approached China at the expense
00:39:21.500 of Russia.
00:39:22.260 And that was something that could also have implications for North Korea and its foreign
00:39:28.260 policy against the US for decades.
00:39:30.960 Right.
00:39:31.320 So we have here some photos of the balloon wars, because I saw that this whole balloon wars
00:39:37.740 thing is really interesting.
00:39:39.780 What they're doing is that they are, they have these balloons and they have these plastic bags.
00:39:46.500 They're filling them with goldfish.
00:39:48.900 It looks like, it looks like the plastic bags that if you go to a fair and you win a prize,
00:39:53.340 they give you like a goldfish.
00:39:54.880 Yeah.
00:39:55.500 Palming off their old pets.
00:39:57.260 I think.
00:39:58.520 This is a bit petty though, isn't it?
00:40:01.040 This is what a bad neighbor is.
00:40:03.980 Well, to be fair, if your bad neighbor just sends over balloons full of presents, it's
00:40:10.320 like I've got you some reading material.
00:40:12.040 Well, look at these presents that don't have, I mean, you could say that they have an exchange
00:40:17.640 card because they're taking the same trash and they're throwing it the other side.
00:40:21.580 So it's endless.
00:40:22.700 A little trash exchange program.
00:40:25.360 Yes.
00:40:25.620 And we have here the balloon wars.
00:40:27.640 South Koreans respond by launching balloons into North Korea with flash drives loaded with
00:40:32.560 movies, music, and also dollar bills.
00:40:34.780 They're assuming that they have computers.
00:40:37.020 Yeah.
00:40:37.220 So things have been escalating.
00:40:39.320 So January 2024, the North Koreans changed the constitution and announced South Korea as their
00:40:45.760 main enemy.
00:40:46.700 Then they ceased economic cooperation.
00:40:49.040 Then they started the balloon wars throwing feces at each other and also cigarette butts.
00:40:54.840 So that's a lot funnier.
00:40:57.440 I like that.
00:40:58.160 It reminds me when I first started in the city.
00:41:00.160 We used to, it's not really a thing anymore, but there always used to be a May Day protest
00:41:03.760 where like all the commies would come out and go into the city and protest and the guys
00:41:08.380 would basically flick 50 pound notes at them from the window and then watch them all scrabble
00:41:12.140 around fighting between each other to get the 50 pound notes.
00:41:14.960 So it's a bit like that.
00:41:18.400 Yes.
00:41:18.880 And it says here, so there is a clear escalation that's in June.
00:41:22.880 So balloon wars is something that is attaining increasing sophistication.
00:41:31.820 One side sending trash and the other side sending like iPads and stuff.
00:41:35.580 Yeah.
00:41:36.620 Soon they'll be sending Zeppelins across.
00:41:38.980 Right.
00:41:39.320 So I want to show you what a group of activists say about how they use GPS trackers to time
00:41:46.500 the distribution of leaflets and electronic speakers as their balloons travel hundreds of
00:41:51.720 kilometers into North Korea.
00:41:53.500 The smart balloons, it's like smartphones, as we say, can cost up to a thousand dollars
00:41:59.140 each and are launched up to twice a month from spring to autumn, when favorable winds
00:42:04.880 blow north, aiming to reach North Korea's capital, Pyongyang.
00:42:07.680 One balloon has flown as far as China.
00:42:10.580 That's like some wedding reception level cost balloons, isn't it?
00:42:14.560 Earlier this year, a secret group of activists in South Korea quietly released a so-called smart
00:42:18.020 balloon into the night sky, with a course charted over the border into reclusive North Korea.
00:42:24.060 It carried a sophisticated 15-pound payload.
00:42:27.880 When the group launches them, they may include mechanical leaflet dispensers, bundles of speakers,
00:42:33.180 or GPS trackers.
00:42:34.840 The activist group has been developing these intricate devices since 2016, with regular launches
00:42:40.040 since 2022.
00:42:41.800 The group has not previously discussed its activities with the media.
00:42:45.360 Once or twice a month from spring to fall, when the wind blows north, they launch the balloons
00:42:50.460 with an aim to go deeper into the north, dropping thousands of leaflets and blasting recordings
00:42:55.960 critical of Supreme Leader Kim Jong-un in a North Korean accent.
00:43:01.100 This one is saying Kim Jong-un is a traitor that opposes the people and reunification.
00:43:07.220 So that's how they're operating.
00:43:10.300 I mean, that's quite funny, but I'd be very annoyed if I was some North Korean peasant who
00:43:15.360 was pretty hungry, and then a speaker dropped on my head and started complaining about the
00:43:19.340 dear leader.
00:43:20.520 Imagine it landed on your roof and you couldn't get it, and then all of the communist soldiers
00:43:24.660 come to your house, and they're like, why is there a speaker saying blasphemous things
00:43:30.180 about our dear leader?
00:43:31.140 But if there was a book that introduced people to Austrian economics?
00:43:35.340 That'd be good.
00:43:35.920 Would that be good?
00:43:37.360 They should just drop copies of Thomas Sowell's basic economics on them.
00:43:40.400 Exactly.
00:43:40.640 Or have them ask Brokonomics.
00:43:43.180 Yeah, exactly.
00:43:43.880 Yeah.
00:43:45.040 And say, you know, five pounds a month subscription to Lotus Eaters.
00:43:50.080 Yeah, we need to get more North Koreans signing up.
00:43:52.980 Yes, but also things have been escalating even more because this isn't just throwing balloons
00:43:58.060 here or there with a lot of unpleasant substances.
00:44:02.700 They are disrupting the Korean airspace, and it says there that they have shut down runways
00:44:12.280 at two of its main airports multiple times since June because of balloons carrying trash
00:44:18.520 launched by North Korea.
00:44:20.840 I mean, this is quite a cool hobby.
00:44:22.340 I mean, I kind of, I think it's unfortunate that I don't live in one of the Koreas, so my
00:44:27.700 hobby could be building balloons to fly things into the, well, mind you.
00:44:32.900 If only there was a territory to our north that was an inhospitable, frigid land full
00:44:38.300 of people utterly hostile to their southern neighbours.
00:44:41.460 I mean, I have got near my house two towns, North Badersley and South Badersley.
00:44:47.340 I want to see if I can get them doing this.
00:44:50.180 Why would you dox yourself as well?
00:44:52.460 Never mind.
00:44:53.200 Yes.
00:44:53.580 Surely you could just send them up to Scotland.
00:44:55.960 I mean.
00:44:56.520 Yeah, but you'd have to, you'd have to wait a really long time to find out with the blue.
00:45:00.780 If you did it, if you did it with like two towns next to each other, a bit more instant
00:45:05.520 gratification.
00:45:06.200 I don't think I've got the patience to build something to go to Scotland.
00:45:08.860 I think what you could do to really troll the Scottish would be send them Irn-Bru and
00:45:13.400 then they open it and it's not orange.
00:45:15.400 It's a different colour.
00:45:16.640 It's like white or something.
00:45:17.600 They're like, what on earth is going on here?
00:45:19.860 Mess with their whole worldview.
00:45:21.400 Yeah.
00:45:21.600 So have you heard of the butterfly effect?
00:45:23.680 Dan, I'm sure you're going to love this.
00:45:25.300 So there's a theory that says that a butterfly flapping its wings somewhere could essentially cause a huge catastrophe
00:45:31.940 on the other side of the world.
00:45:33.060 I don't think it's a very good theory, but I've heard it.
00:45:34.680 Okay.
00:45:34.960 Do you think that this could be something like this?
00:45:37.780 No.
00:45:38.220 Well, a butterfly flapped its wings and then somebody in South Korea sent iPads to North
00:45:42.660 Korea.
00:45:43.220 No, caused something like an Armageddon or something.
00:45:46.380 North Korea sends some trash to South Korea and it starts World War Three.
00:45:50.800 It's not even like Gavrilo Princip assassinating.
00:45:53.920 I'm not talking about, I'm talking about blowing the roads up because a lot of people and
00:45:58.360 a lot of media are worried about World War Three about this.
00:46:03.240 But the balloon wars are fine.
00:46:04.160 I like the balloon wars.
00:46:05.660 Yeah.
00:46:06.500 Yeah.
00:46:06.660 But now they're escalating.
00:46:08.080 So we have here escalation.
00:46:10.120 South Korea is warning to North Korea regarding nuclear weapons.
00:46:13.160 And we have the president of South Korea saying that if you're going to use nuclear weapons
00:46:19.300 against South Korea, it's going to be the end of you.
00:46:22.540 It's probably true.
00:46:23.780 Yeah.
00:46:24.120 I imagine between the entire world and North Korea, I think the entire world wins.
00:46:30.440 And I think the Russians or the Chinese are going to be like, oh, no, North Korea, that
00:46:34.120 slightly annoying thorn in our side.
00:46:36.380 I'm pretty sure at that point China will be like, yeah, you're on your own, lads.
00:46:39.140 Yeah.
00:46:39.440 Almost certainly.
00:46:41.980 Right.
00:46:42.340 So I wanted to show you an article here that I think is basically blowing
00:46:48.160 this out of proportion.
00:46:49.900 Oh, definitely.
00:46:50.600 From Daily Mail here.
00:46:51.800 It essentially says that things are pushing us to the brink of World War Three and Armageddon.
00:46:57.600 And they're adding North Korea here.
00:47:00.320 It's from yesterday.
00:47:01.460 And they're talking also about these events.
00:47:03.860 I think that, yeah, that's a bit of a stretch.
00:47:07.060 Well, what all of these sorts of articles are doing, other than scaring people to get
00:47:11.780 more views, is encouraging people to support the military-industrial complex and to let
00:47:19.140 the military companies, these private weapons manufacturers, keep ripping off the taxpayer.
00:47:26.020 Spending sprees.
00:47:27.000 Exactly.
00:47:27.340 Military budget increases.
00:47:28.800 Yeah.
00:47:29.400 So that's the thing here.
00:47:31.300 It seems to me that blowing up roads that nobody has used is more like a very symbolic
00:47:37.260 presentation of, you know, we just don't like you and we don't care about reuniting with
00:47:44.280 you.
00:47:44.760 It's more like symbolic.
00:47:46.160 It doesn't seem to me to be the escalation a lot of people fear.
00:47:50.120 There does seem to be an escalation, but I think that it's very premature to talk about
00:47:55.380 this as just being the beginning of World War Three.
00:47:58.440 Yes, we're one of the few outlets that says, actually, no, there's not going to be World
00:48:02.460 War Three.
00:48:03.800 I like the balloon wars.
00:48:05.000 I think that's funny.
00:48:05.740 I want to do it.
00:48:06.140 Yeah, I want to carry that on.
00:48:07.840 We should do that to France.
00:48:09.600 Thank you.
00:48:11.080 You've got some comments there, Stelios.
00:48:13.320 Right, let me.
00:48:16.020 So, that's a random name.
00:48:18.160 Years ago, a North Korean defector fled through the demilitarized zone and the NK soldiers who
00:48:24.380 shot at him crossed the DMZ for, like, 30 seconds.
00:48:28.860 30 seconds.
00:48:29.280 30 seconds, yeah.
00:48:30.260 This was technically cause for war, yet nothing happened.
00:48:33.600 Chances of war are almost 0% in my opinion.
00:48:37.660 This sounds about right.
00:48:40.800 Phil, this is now a monthly supporter.
00:48:43.520 Thank you.
00:48:44.020 Thank you.
00:48:44.380 Thank you.
00:48:45.060 Ryan Rambles 1993.
00:48:46.820 Was it acceptable for Kisin to call Harry a retard, then question Carl's judgment in
00:48:53.040 employing him?
00:48:55.220 I'd rather have Harry than Konstantin.
00:48:58.060 That's a random name.
00:49:00.060 So, with regards to those tweets, I was, and I discussed it with Carl as well, we were
00:49:05.380 kind of both tempted to jump in, but Harry was doing such a good job that he just didn't
00:49:10.740 need our help.
00:49:12.540 He was slaying it there.
00:49:13.860 I think it's a good discussion to have.
00:49:18.420 Yes.
00:49:19.120 It's a good discussion to have.
00:49:20.640 Yes.
00:49:21.320 And I think Carl should have the discussion.
00:49:25.080 I'm not talking about Harry.
00:49:28.560 Of Lucy.
00:49:29.600 But Harry is a bit retarded.
00:49:31.480 If you're watching Harry, it's true.
00:49:33.720 No, I think Carl has good arguments.
00:49:36.380 Yeah, but in the funny way.
00:49:39.260 Okay, that's a random name.
00:49:41.680 Trade between Koreas looks something like this.
00:49:44.780 North Korea gets K-pop, Twinks, and rubbish.
00:49:48.600 South Korea gets parasite-infested defectors and explosions.
00:49:53.200 Peace is around the corner.
00:49:54.660 Lol.
00:49:56.460 To be fair, I had Korean for lunch today, in solidarity.
00:50:00.880 With which Korea?
00:50:02.120 You'll never know.
00:50:04.120 Which way, Korean man?
00:50:06.020 Is that all of the comment things?
00:50:07.500 Yeah, it is, yeah.
00:50:08.280 Oh, right.
00:50:08.660 Now, yeah.
00:50:09.380 Tell us about the robots, Dan.
00:50:10.820 Yeah, so, yes.
00:50:13.700 So, this segment is about how you can become personally wealthy.
00:50:17.940 So, if that is of interest to you, then do watch on.
00:50:20.840 If it's not, then, you know, go do something else.
00:50:23.900 It's going to be like a mini Brokonomics, this one.
00:50:25.940 In fact, I started realising when I get into it,
00:50:28.460 I could have got a Brokonomics episode out of this,
00:50:31.080 but I'm going to do it here anyway.
00:50:32.640 And basically, what it is, is I'm going to look at disruptive technologies,
00:50:37.160 about how industries get disrupted and the opportunities that spring up from that happening
00:50:42.520 and how you can use it to make lots and lots of money, which is nice.
00:50:46.920 So, previous examples of this would include things like, you know, when PCs came along,
00:50:51.300 when the internet came along, when smartphones came along,
00:50:53.480 all of them disrupted industries and created new entrants that could do very large returns
00:51:00.880 if you invested into them.
00:51:02.720 So, that's what I'm going to do.
00:51:03.640 So, it's less about the robots themselves and more about the opportunities that it throws up.
00:51:07.220 But nevertheless, I do need to talk about the robots themselves,
00:51:10.300 because, obviously, that's what we're here to talk about.
00:51:13.020 So, Mr. Musk basically did his robot presentation the other day.
00:51:20.100 And so, let's just watch some of this to give us a flavour of the kind of thing that he's thinking about.
00:51:24.380 So, there, if you...
00:51:26.240 We need to turn this to Musk's sound.
00:51:30.680 I've muted, Elon.
00:51:31.700 Yeah, be quiet, Elon.
00:51:32.540 So, these are the robo-taxis.
00:51:36.980 So, you might think you're looking at, you know, a standard Tesla car,
00:51:40.420 but no, those things don't have any steering wheels or pedals.
00:51:42.500 I don't really like the look of it.
00:51:44.440 I imagine the technology is very impressive,
00:51:46.220 but they sort of look like, in a video game,
00:51:49.240 when the textures haven't loaded in.
00:51:52.760 Like, the wheels don't have any stuff on them.
00:51:58.380 He does like his cars to have as few polygons as possible.
00:52:02.540 I think they're quite smart.
00:52:04.620 But, yes, so, anyway, so, robo-taxis, that's a thing that he is working on.
00:52:10.800 Cars, like I said, with no steering wheels or pedals.
00:52:12.660 You just get in and it, you know, takes you to wherever it is that you feel that you need to be.
00:52:18.280 So, that was one thing.
00:52:21.820 Let's skip ahead to...
00:52:24.860 Oh, here we go.
00:52:26.660 Let's watch this bit.
00:52:27.880 I like how they showed someone watching football while driving there.
00:52:30.580 That's always responsible.
00:52:32.840 A little bit more.
00:52:33.260 Here we go.
00:52:33.780 Robo-bus.
00:52:35.660 That's the other thing.
00:52:36.580 That looks like, you know, in the Death Star, in the original Star Wars,
00:52:40.060 they had those little boxes that went around the floor.
00:52:43.620 It looks like one of those, but bigger.
00:52:44.840 Yes, it's much bigger.
00:52:46.000 That actually reminds me of Total Recall.
00:52:48.580 The taxi he gets in the beginning.
00:52:51.700 I like the styling.
00:52:52.800 It does look cool, actually.
00:52:54.200 I quite like that.
00:52:55.480 You know 1930s Art Deco?
00:52:57.800 Yeah, I grew up in a 1930s house.
00:53:00.900 Oh, okay, right, yeah.
00:53:01.560 A lot of it was Art Deco.
00:53:03.060 Yeah, so that is good.
00:53:04.320 So, anyway, robo-bus, and that can be configured for people or goods.
00:53:08.540 So, you can have, you know, stuff being delivered in that.
00:53:13.860 So, and he's going to make these.
00:53:16.860 Where's the beauty in it?
00:53:18.900 I think it's quite...
00:53:19.560 It's functional, but...
00:53:20.340 I like it.
00:53:21.280 I think it's very modern.
00:53:23.100 It's modern and sanitized and clean.
00:53:25.980 You know, eventually there'll be variation, won't they?
00:53:28.120 But it's...
00:53:28.900 You don't like that?
00:53:30.260 I think it's lovely.
00:53:31.080 It's very, um, Bioshock.
00:53:32.960 You know, modern taxis, normally they smell like stale cigarettes and sick, so...
00:53:38.900 I think people can puke inside this as well.
00:53:42.080 No, they're going to be shocked.
00:53:43.180 The seats have electrodes in there.
00:53:44.700 If you vomit in there, you actually get...
00:53:46.380 So, mildly off topic, but, yes, as you can see, there's not even a front window there,
00:53:52.280 let alone any pedals or steering wheels or anything.
00:53:54.340 So, yes, the robo-bus is on its way.
00:53:57.720 And, oh, yeah, the most exciting bit was this.
00:54:04.540 This is not CGI, Owen.
00:54:05.620 I think he's actually building himself a robot army, which I think every gentleman should aspire to.
00:54:11.380 He is sort of Bond villain maxing now, isn't he?
00:54:14.480 Yes.
00:54:15.460 He's got his rockets.
00:54:17.240 He's getting a robot army.
00:54:19.880 No, he's come quite a long way.
00:54:21.160 I mean, it was only, what, not even two years ago he first...
00:54:25.000 Look at their hips.
00:54:25.520 So, interestingly, when he was building these, he discovered that...
00:54:30.100 Because they started off doing things the way that you'd think you'd build a robot,
00:54:33.620 and what they found is that they needed to basically replicate the human body,
00:54:38.040 even down to the smallest detail.
00:54:39.420 So, things like where tendons join and joints, how they join into your arm and stuff like that,
00:54:45.100 they found they had to replicate basically everything you do in a body in a robot as well.
00:54:48.760 I'm surprised that that's surprising.
00:54:50.460 Like, you know, we've come a long way to be the animals we are now, and so...
00:54:56.700 Yeah.
00:54:57.900 But you think you just have the actuators on the joints and stuff, but no, it doesn't work like that.
00:55:01.480 Anyway, so here's his vision of what he thinks it will be doing.
00:55:04.760 Now, I disagree with his vision.
00:55:06.120 I'll explain why in a moment.
00:55:07.740 But, you know, there's a robot playing with your children and wiping the table.
00:55:12.220 I mean, I might do a bit of that and the old bit of, you know, handing out drinks.
00:55:16.200 So, anyway, that's their vision of what they're trying to do.
00:55:19.840 Now, so actually the main thing that I want to focus on is I'm less interested in what has happened here.
00:55:27.720 What I'm far more interested in is here is what is going to happen as a result of this.
00:55:34.500 So, I'm kind of looking for the second and third order effects.
00:55:38.340 Well, you're going to need a lot less low-skilled people if, you know, America's going to need a lot less Mexicans if they can create robot Mexicans.
00:55:48.480 I hadn't thought about it from the Mexican angle.
00:55:52.060 But, yes, well, certainly it has a huge impact on the need for mass immigration.
00:55:56.860 That is true.
00:55:57.980 Because why are you mass importing people if a robot...
00:56:00.440 And I think a robot will do a lot of this.
00:56:01.740 Well, this is surprising no one.
00:56:03.020 We knew about robots being the future, like, in the 60s and 70s at least.
00:56:08.660 Yeah, but we didn't actually have them.
00:56:10.080 No, but we knew of them and we knew it would probably be possible.
00:56:13.940 We imagined it, yes.
00:56:14.940 But now we've actually got them and I think we're sort of on the cusp of this.
00:56:18.460 But, anyway, I do think that this is an opportunity to make considerable personal wealth.
00:56:25.520 Now, my return on investment is about 40% since leaving the city.
00:56:29.380 And if you know what a return on investment is, you'll...
00:56:31.020 That's very good.
00:56:31.740 I think that, yes, actually, that is something you might want to listen to.
00:56:35.780 So, basically, what I want to build on is to start with the idea that when a disruptive technology comes along,
00:56:41.100 it eviscerates certain industries and provides massive opportunities.
00:56:47.560 So, if we start with a PC, for example, the big opportunity would be something like Microsoft and other companies like that.
00:56:53.500 You could have seen that and thought, okay, well, rather than having a team of accountants doing everything in a ledger,
00:56:59.740 I can have a PC with some Microsoft Excel on it and I can do the job of 10 accountants with one.
00:57:09.440 You know, significant advantages there.
00:57:11.940 When you've got the internet, that enabled things like, you know, Google and Facebook and so on.
00:57:17.120 You know, very disruptive industries that, again, allow other industries to die, new ones to sort of emerge in its place.
00:57:24.780 And these all become, well, the ones I've mentioned so far, they're all trillion-dollar companies.
00:57:28.940 Smartphones, it might not have been obvious when you first got them and I think more people listening might have experienced that in their lifetime
00:57:38.180 when we kind of went from, you know, well, first of all, the wall-tied phones to then the dumb Nokias to then the smartphones.
00:57:47.520 And I bet a lot of people didn't look at it at the time when it first started to emerge and think, okay, well, this will cause an incel crisis.
00:57:53.320 But that is exactly what it did because the third...
00:57:57.300 Well, I don't think it's just mobile phones and nothing else.
00:58:02.340 Well, I think it's a strong driver because the second...
00:58:04.620 What else could it be?
00:58:05.580 The second-order effect would be that it enables new ways of working and new workflows and new networks and connections,
00:58:16.440 which enable things like Tinder so that women are now not choosing from the best guy who happens to be in the pub that night
00:58:22.980 but the best guy within a 50-mile radius who's got all their pre-prepared snaps lined up on the phone
00:58:29.880 and because of female nature, there's usually, I want that one.
00:58:33.360 And then they're basically all like that.
00:58:35.020 So it creates a sort of incel...
00:58:36.320 So the third-order effects is a result of the second-order effects, which, you know, based on the smartphone.
00:58:41.180 So there's that.
00:58:42.940 Smartphones also massively boosted Facebook, so it took a company that was doing very well
00:58:47.140 and then kind of really turbocharged it and got it up from, you know, hundreds of billions into the sort of trillions valuation.
00:58:53.780 So it went online, what was it, like 2005, 2007, something like that?
00:58:57.280 Yeah, well, it sort of predates...
00:58:59.180 Yeah, it did predate the smartphone bit, but then it sort of turbocharged it.
00:59:03.780 Uber, that's another one.
00:59:04.920 I bet when you first got a smartphone, you didn't look at it and think,
00:59:07.720 well, this is going to disrupt the taxi industry.
00:59:10.340 I thought it would have helped it, if anything, because, of course, it makes calling a taxi easier.
00:59:14.840 Yes, but then instead, you can just have a thing that, you know, you can, you know, select your journey,
00:59:21.140 you know, watch the taxi arrive, have all the advantages of that.
00:59:24.500 So it's really those sort of second and third order effects is where you can identify the money-making opportunity
00:59:31.340 in something like this.
00:59:32.840 So let's start with the robo-taxis.
00:59:34.880 What are the...
00:59:35.880 And I'll just give a very quick skim here of the sort of thing that it's going to disrupt.
00:59:40.580 So, you know, what is a robo-taxi going to disrupt?
00:59:43.320 Well, traditional autos is a fairly obvious one.
00:59:46.760 A normal car you use for about 10 hours a week, so about 5% of the week you're using it.
00:59:53.000 A robo-taxi can be used something like 60% of the week.
00:59:57.140 You know, it does need a little bit of time for cleaning, being idle, and recharging.
01:00:02.080 But still, you can get 60% usage out of them, which means that one of those replaces about 12 cars.
01:00:09.040 That's pretty big, isn't it?
01:00:10.420 Yes, that's fairly disruptive.
01:00:12.120 And also, they're on their own cost curve decline, which, again, I won't get into too much here.
01:00:16.760 But basically, there is...
01:00:19.500 You know, you've got Moore's Law for chips.
01:00:24.320 That's where they grow at a certain rate.
01:00:28.040 Yes.
01:00:28.600 There's something called Wright's Law, which applies to manufacturing, which does seem to hold.
01:00:32.820 And it's held over, you know, well over 100 years at this point, where if you get a cumulative doubling, you get a certain percentage increase based on the industry.
01:00:41.520 And for cars, it seems to be about 15%.
01:00:43.300 So for every cumulative doubling, you get a 15% cost reduction in whatever it takes to produce it.
01:00:48.100 Now, getting a cumulative doubling of regular cars at this point to get the next 15% saving is incredibly hard because there's like 4 billion cars out there.
01:00:59.060 So if you want to... And it's not just a cumulative doubling of the ones that are out there now.
01:01:03.380 It's a cumulative doubling of all the ones that have ever been produced.
01:01:06.140 So let's just call that 10 billion.
01:01:08.300 You've then got to produce another 10 billion cars in order to recognize the next 15% of savings.
01:01:13.520 Whereas for electric cars, they appear to be on their own Wright curve, meaning that, you know, if there's a couple of million out there, you've just got to produce a couple million more to then get another 15% cost reduction on it.
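[Editor's note: the Wright's Law point above can be sketched as follows. Only the 15% learning rate and the rough unit counts come from the discussion; the starting prices are invented purely for illustration.]

```python
import math

def wright_cost(c0, cumulative_units, baseline_units, learning=0.15):
    """Unit cost after cumulative production grows from baseline_units
    to cumulative_units, falling `learning` per cumulative doubling
    (Wright's Law)."""
    doublings = math.log2(cumulative_units / baseline_units)
    return c0 * (1 - learning) ** doublings

# Legacy cars: call it ~10 billion ever made, so the next 15% drop
# needs another ~10 billion produced (illustrative $30,000 base price).
legacy = wright_cost(30_000, 20e9, 10e9)

# EVs on their own, much younger curve: a doubling from 2 million to
# 4 million units yields the same 15% (illustrative $50,000 base price).
ev = wright_cost(50_000, 4e6, 2e6)
```

The asymmetry is the whole point: one doubling costs the legacy industry ten billion more cars, while the young curve doubles with a couple of million.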
01:01:22.420 So traditional autos is an obvious one, which I think are going to get cleared away by this.
01:01:27.080 That's an easy one.
01:01:27.960 Oil industry will be impacted because a large percentage of the output goes to road freight and road usage.
01:01:36.200 So that will be impacted.
01:01:37.560 It will impact the marginal cost.
01:01:38.720 There's still going to be lots of uses for petroleum, but, you know, that will be impacted.
01:01:43.980 Then getting on to the slightly less obvious ones, but still reasonably obvious, insurance.
01:01:49.600 So huge auto insurance market on that.
01:01:52.880 But if the robo taxis live up to their claims, they'll be, you know, significantly safer because they don't get drunk or irritated.
01:02:00.520 They also don't, you know, the driver doesn't smell like the real taxi sometimes.
01:02:04.400 Yes.
01:02:04.640 Have you ever had a smelly driver?
01:02:05.520 It's horrible.
01:02:06.740 And then you've got to pay them afterwards.
01:02:08.220 It's like, oh, I've been stuck in a box with you and you stink.
01:02:11.080 I'm giving you money for this.
01:02:12.740 Yes.
01:02:12.980 Well, so hopefully they'll smell nicer.
01:02:17.020 That will be true.
01:02:20.100 Legal.
01:02:20.680 Deodorant will sell less.
01:02:21.960 Personal industry, personal injury, that will go down.
01:02:24.760 There's a lot of that going on.
01:02:27.340 Crime in general, because if you move away from the model of having your own car and having a robo,
01:02:32.740 it's actually you're less likely to leave stuff inside it.
01:02:34.500 And they've got cameras all the way around them.
01:02:36.780 So that will go down.
01:02:40.460 Legal, actually, if you can also expand that to the sort of criminal court system as a whole,
01:02:45.280 because a large percentage, especially in the US, a large percentage of criminal cases are initiated from a traffic stop.
01:02:52.920 And that gives them an excuse to search the vehicle and then it leads on to other things and so on.
01:02:57.400 So there's all of these sort of second and third order effects that flow from it.
01:03:01.600 So, yeah, personal injury would be one.
01:03:02.880 A lot of commercial space is turned over to car parking.
01:03:06.860 That will all go.
01:03:08.280 Rental car companies, they can go. Freight, which is already struggling to compete against rail.
01:03:13.140 If the cost reductions of something like a robo bus or robo trucks and stuff becomes a thing,
01:03:17.300 you know, that will go.
01:03:18.940 Airports for short haul flights because, you know, why go to the hassle of going to an airport two hours early,
01:03:24.220 queuing up and all the rest of it when you can use that total volume of space just to sit in the back of a robo taxi and drive?
01:03:30.720 So that's the kind of thing, the kind of process that I'm working through.
01:03:34.960 Now, with personal robots, as I said earlier, I don't think the way that Elon showed it is what will actually happen.
01:03:44.380 Can I ask you something on this?
01:03:45.760 Because I want to understand if I understood you correctly.
01:03:49.580 So, essentially, you claim that a lot of these technologies get inserted into the marketplace and society,
01:03:58.800 and they have several effects that are unintended.
01:04:01.480 Well, I don't know if they're intended.
01:04:04.580 I mean, they're kind of very negative.
01:04:06.100 Yes, I mean, is that the point?
01:04:07.500 So what I'm saying is when you get a disruptive new technology, we've seen, we've got three really good examples.
01:04:13.540 So, say, for example.
01:04:14.200 What do you mean disruptive?
01:04:14.880 You mean a new technology that.
01:04:16.220 Yeah, so I class things like the PC, the emergence of the internet and smartphones as very disruptive technologies
01:04:22.020 because it completely destroys some industry, and it allows people who can operate efficiently at scale with new workflows
01:04:30.620 to capture a huge volume of, you know, the market.
01:04:35.660 So, for example.
01:04:35.960 Similar to how, say, the printing press in the past would have been very disruptive to all the people
01:04:40.260 who made their living writing out books by hand.
01:04:42.220 Yes, the monks.
01:04:42.780 Yeah.
01:04:42.960 Yes, exactly.
01:04:44.340 So, I mean, I grew up in a world where you would have, like, five or six travel agents on every high street.
01:04:50.600 I remember that world.
01:04:51.720 Yes, and you might find one now, but you probably won't.
01:04:55.440 You can normally find a couple, actually.
01:04:57.320 There are still lots of – I'm surprised that there are as many brick and mortar, I suppose so.
01:05:02.080 They're the ones with all the money, aren't they?
01:05:03.380 But, I mean, shopping.
01:05:04.760 I mean, who here realistically actually goes out shopping that much?
01:05:09.580 I mean, you just go on Amazon.
01:05:10.600 I hate shopping.
01:05:11.260 Yeah, but do you just go on Amazon now?
01:05:14.300 Yeah, occasionally.
01:05:15.480 Yeah.
01:05:16.180 I do a decent amount of online shopping, but, you know, for food, I never buy shoes online
01:05:21.740 because you can never know how they fit.
01:05:23.780 There are some things that you've got to go in person and buy.
01:05:26.560 So, this food stuff is one of the things I'm coming to, but, you know, basically what I'm saying is you have to recognize when there's a new technology that comes along which is going to disrupt and make certain old industries go away and change and create the opportunity for a workflow process which can operate at scale based on this new technology.
01:05:45.140 And then if you invest in that thing, so if you had invested in Microsoft when the PC first came out, you would have made an awful lot of money.
01:05:52.700 If you had invested in Amazon, you know, when the internet first came out, you would have made a ridiculous amount of money.
01:05:59.260 Same with, you know, Uber, you know, or Facebook.
01:06:03.960 I mean, all of these companies, there was a significant opportunity that emerged.
01:06:07.220 And I am pretty certain that this is going to be a technology of the same impact as those things, which means that there's going to be disrupted industries,
01:06:17.780 which means that you need to be on the lookout for what are the companies that are going to benefit from this.
01:06:22.500 And in order to do that, you need to understand what are the workflows that are going to be disrupted and the new ones that can emerge.
01:06:28.320 What is the profile of the company that you're looking for so that when you see it, you recognize it.
01:06:33.540 That's what I'm driving at.
01:06:34.360 And if you get that right, you will do very well indeed.
01:06:37.960 So, yes, so I don't think that the version that Elon showed, which is basically robots doing,
01:06:42.500 and I think that's what a lot of people expect when we get the robots, they're just going to do basically exactly what humans do.
01:06:47.820 So they're going to go and load your washing machine and unload it and they'll bring you drinks and stuff like that.
01:06:52.300 Basically like a butler for you.
01:06:53.760 Yes, but working in human ways.
01:06:56.360 And I don't think that's what's going to happen.
01:06:58.320 What I think is going to happen is it's going to be a major driver towards subscription outsourcing and logistic based services.
01:07:05.340 So it's going to be this sort of massive growth in utilizing this tech to deliver these new workflows, like I keep saying,
01:07:13.420 and huge companies are going to emerge out of this.
01:07:15.600 So, you know what you're looking for.
01:07:16.800 So I've done some thinking on this and I've kind of beaten down the profile of what it is that I'm going to be looking for.
01:07:22.460 So, for example, with groceries and household supplies, I'm less inclined to believe that your robot butler will go down to the shops with you,
01:07:29.940 a shopping bag and do your shopping for you and then come back and then cook it for you.
01:07:34.380 So I think that is a lot less likely.
01:07:36.340 I think what's more likely is what you're going to get is you will get the – I mean, it can do – so it can do that.
01:07:47.280 It can do inventory management, for example.
01:07:49.500 So this would be the unclever end of what I'm going for here.
01:07:54.360 The unclever end is that it can just see what you've got and then a company will emerge which will offer inventory management,
01:08:02.360 which will probably be run through whatever home AI system you have.
01:08:05.520 So it would be something like, you know, you open the app and it says, you know,
01:08:10.720 do you want to start from a blank slate and specify what your home shopping requirements are
01:08:17.300 or the robot can basically do an inventory of what you've got and monitor it for a couple of weeks and see what your usage is
01:08:24.460 and then the app will just say, okay, we think that these are the things that you want and you can sort of tailor it or something.
01:08:31.880 So basically taking as much friction away from the online shopping experience as possible to kind of robotise it.
01:08:38.940 So, for example, if I had a robot butler, I'd be like, under no circumstances are we to ever have less than 10 Asahi beers in my beer fridge.
01:08:48.100 That is paramount.
01:08:49.320 Quick tangential fact.
01:08:50.980 Do you know that Asahi is brewed in exactly the same brewery as Peroni?
01:08:56.420 So they're basically the same beer.
01:08:57.200 I also like Peroni.
01:08:58.300 They're both nice, aren't they?
01:08:59.660 There you go.
01:09:00.720 Random fact.
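[Editor's note: the inventory-management idea above, down to the ten-Asahi rule, amounts to a simple reorder-threshold loop. This is a hypothetical sketch; the item names, quantities, and function are all invented for illustration and are not any real product's API.]

```python
# Hypothetical sketch of the reorder rule described above: the robot
# inventories the home, and anything below its minimum gets topped up.

pantry = {"asahi_beer": 4, "milk": 2, "coffee": 1}      # observed stock
minimum_levels = {"asahi_beer": 10, "milk": 4, "coffee": 2}  # user rules

def reorder_list(stock, minimums):
    """Return {item: quantity_to_order} for anything below its minimum."""
    return {item: minimums[item] - stock.get(item, 0)
            for item in minimums
            if stock.get(item, 0) < minimums[item]}

print(reorder_list(pantry, minimum_levels))
# {'asahi_beer': 6, 'milk': 2, 'coffee': 1}
```

The "monitor usage for a couple of weeks" step in the conversation would simply set `minimum_levels` automatically instead of asking the user for them.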
01:09:01.880 So inventory management, but then where I think the robot will come in is that things like –
01:09:08.640 the reason why online shopping at the moment is a hassle and a lot of people can't really be bothered with it
01:09:14.060 is because there's too much friction, too many clicks, and then actually even if you then do it,
01:09:21.260 you've then got to be disrupted by when somebody turns up, which you have to kind of do in the evening.
01:09:26.640 So it disrupts your evening as well and then sort of unload all the boxes and you've got stuff all over the floor because they just don't put it in bags even.
01:09:36.600 They just throw the bloody stuff at you in these crates.
01:09:39.260 With a robot service, not only can you use the inventory management type system there, but the robot, of course, can just take that delivery when you're not around.
01:09:50.080 So, you know, while you're at work or overnight, the delivery can turn up, it can take the delivery, and then it can put it all away for you.
01:09:57.440 So the online shopping experience goes from being something with a lot of friction to something with minimal friction.
01:10:02.140 And, you know, the first you know about it is when you open the fridge to see that you are now back at 10 Asahi beers.
01:10:09.500 So it'd be a good time to invest in alcohol treatment then, you know, alcohol addiction treatment.
01:10:18.620 Yes, that could possibly be a good third order effect.
01:10:22.120 On food preparation, I don't think it's going to be that thing where you, like in the movies, with a robot cutting really fast and then preparing your meal.
01:10:30.360 I think what it's going to enable is the outsourcing of food on scale.
01:10:36.340 So this would be meal kits handling.
01:10:39.860 So what I'm envisaging here is that the robot will receive a meal kit, which it will unpackage, and it will be compartmentalized in such a way that it is very easy for it to produce a very high-quality meal from the sort of package kit that it gets in.
01:10:54.140 And so what that will drive is sort of customizable meal planning and the adoption of meal kit subscriptions.
01:11:02.320 And there's two ways that it can go.
01:11:03.920 One way it can go is you could have a kind of network model like what you see in Uber, which is you can have a whole bunch of sort of kitchens emerge in your local area who produce, instead of producing really good meals for themselves, they do it at some sort of scale.
01:11:22.400 So they produce like 10 or 50 versions of what they're doing.
01:11:27.020 And then you can sort of log on and you can say, yes, I want one of those or something.
01:11:30.380 And then the network model with the autonomous robots and maybe even the drones and the robots will be able to deliver out versions of this at much greater scale than it's possible now.
01:11:41.040 So it would be much cheaper than a current takeaway model.
01:11:46.180 Well, that sounds good.
01:11:47.560 Yes.
01:11:48.260 Or then potentially it would be affordable to have someone cook for you every night, potentially.
01:11:56.040 Yes.
01:11:56.340 So what I'm envisaging is the sort of young man's flat of the future wouldn't even have a kitchen because it wouldn't need one.
01:12:04.180 Maybe like a microwave.
01:12:05.320 Or space.
01:12:05.960 Yes.
01:12:06.360 Yes.
01:12:06.940 Because you would just have your robot, you would just select what you want and then it would turn up very quickly from a local produced hub and your robot would bring it in and lay it out or something like that and then sort of clear it away for you.
01:12:18.100 But that would also, talking from my sort of tinfoil hat libertarian background, wouldn't that then give the government so much power over your ordinary person that you wouldn't even be able to cook for yourself potentially?
01:12:33.820 Yes.
01:12:33.940 You don't even have the means of cooking for yourself.
01:12:36.360 So if you're a naughty state dissident, they can just kick you off of the platform and then you starve.
01:12:42.720 So, yes, I'm not saying that these trends are necessarily good, but we have gone down the route of greater convenience and less friction with online subscription services to various things.
01:12:53.220 So I just think these trends are going to continue.
01:12:54.840 So I'm not making a commentary here on what I think is desirable.
01:12:58.140 Of course.
01:12:58.600 I'm making a comment on how you can invest in a way that is going to make you money.
01:13:03.860 I think that people would have been upset if we didn't point out it sounds a bit scary, but I'm all for making money.
01:13:10.580 Don't worry.
01:13:11.040 I'm not, though, for the chip in the brain thing.
01:13:14.920 Yeah.
01:13:15.360 I don't like the idea.
01:13:17.000 But imagine how much more convenient it would be to communicate with your robot butler if you had a chip in your brain where you could just will stuff and it will do stuff for you.
01:13:26.600 Imagine the possibility of the government taking advantage of this.
01:13:31.460 Yeah.
01:13:31.640 Well, there's lots of ways.
01:13:33.300 So, again, I'm not saying this is necessary.
01:13:34.960 The other way that the food thing can go is you could get, like, an Amazon-like, I mean, it might even be Amazon, but an Amazon-like provider who does the food subscription things at bulk.
01:13:45.120 And actually, the guy who set up Uber is now doing exactly that.
01:13:49.220 He's preparing sort of highly automated, highly standardized, mass-produced meal-type things.
01:13:56.120 So, they have hubs of chefs producing a whole bunch of things.
01:13:59.780 It goes in a sort of standardized package and then it's ready to ship out.
01:14:03.000 And it doesn't really – I don't think it's going to work just yet.
01:14:05.900 But when you've got a robot who can sort of take delivery of it, you know, very frictionlessly, because, of course, when the robo-taxi with the sort of the delivery drone is turning up,
01:14:15.060 it can sort of ping ahead to your robot and make sure that it's met.
01:14:21.080 Well, it could be in constant communication with the robot.
01:14:23.940 It's not like a – you know, it's almost – it would be the equivalent of your delivery driver being on the phone to you the whole time, wouldn't it?
01:14:30.320 And so, your robot could perfectly time its schedule around the delivery.
01:14:35.260 And a whole bunch of other services.
01:14:36.820 So, the other one is, of course, laundry.
01:14:39.100 So, again, I'm not envisaging that the flats of the future, you'll have a robot who will do the laundry for you.
01:14:46.880 What I'm envisaging is that you won't have a –
01:14:49.340 Why are you blackpilling us?
01:14:50.720 No, no, no.
01:14:51.100 It gets better.
01:14:52.200 You just won't have a washing machine and a tumble dryer at all.
01:14:55.060 You just won't have any laundry in your room.
01:14:56.620 What you'll have is you'll have, like, a bin that you put your laundry in before you go to bed.
01:15:01.420 And overnight, the robot will take that, bag it up, and a robo-taxi will pull up out – or a robo-bus will turn up outside.
01:15:09.100 Hand it over, it will go off to some – a laundry hub in the area, and then come back at, say, 5 a.m., freshly, you know, washed and pressed.
01:15:18.220 And the robot – so, from your perspective, you just – you dump the stuff in the laundry bin before you go to bed, and you wake up in the morning, and there's a tray of fresh laundry waiting for you.
01:15:29.380 So, that's going to be very significant as well, because, of course, a lot of these things are operating under the economy of scale.
01:15:35.800 And so, if we're looking at the human race as a whole, just from the implications of this, we're going to be using our resources very efficiently compared to now if we adopt this model.
01:15:48.500 Because, of course, these big industrial washing machines are much more efficient than every person individually having a washing machine.
01:15:56.440 Yes. So, what I'm building on here – and, again, I'm not giving a commentary on whether this is desirable or not – but there definitely is a trend to better –
01:16:05.100 so, you've got to understand that Amazon is not a shop. Amazon is a logistics company.
01:16:09.020 The reason they are one of the biggest companies in the world, if not the biggest of them, they're very close to it, is because they have nailed their logistics.
01:16:15.380 The reason some of the biggest companies, like Netflix, for example, they are that because they have nailed the subscription model.
01:16:22.360 And what I'm saying is these trends have clearly demonstrated that that is the way the market is going, this way that human nature responds to,
01:16:30.260 and therefore there is no reason to believe that the future investment opportunities won't be based on the same underlying principles.
01:16:36.920 So, it's then what does the technology enable, which is all of this.
01:16:39.720 So, laundry, that's one. Package delivery. So, at the moment, you get – you know, you go on Amazon and something's delivered in a cardboard box,
01:16:47.420 and it's a bit inconvenient because, you know, you either have to be in or it might get stolen if you live in America off your porch.
01:16:55.980 There are lots of waste packaging goes into this.
01:17:00.560 And Amazon have been working on things like delivery drones, but they can't quite make it work because, you know,
01:17:06.300 who wants to be, you know, answer their front door to find a little mini helicopter buzzing in front of their face?
01:17:10.680 How do you just take the package?
01:17:12.460 But, again, if you're in sync with a robot, you know, the delivery drone can come over and it can query the five or six robots that it's got a delivery to make to,
01:17:22.860 and it can say, okay, which of you is available?
01:17:24.500 And one of them will say, oh, I'm walking the dog at the moment, and the other one will say, yeah, I'm in a charging cycle, you know, give me 10 minutes.
01:17:29.000 And another one will say, yeah, I'm available now.
01:17:31.280 You know, he can just walk outside, put his hand up, and the drone gives him the package straight away.
01:17:35.740 So there's, again, minimum friction on all of this, which speeds it up, which lowers cost.
01:17:40.520 And you can then get into more durable packaging as well, rather than having a bunch of cardboard boxes all the time,
01:17:45.220 because the drone can very easily, you know, return the packages the next time, because friction has been removed.
01:17:50.080 So your delivery process goes from you order something on Amazon and you get it late the next day or the following day,
01:17:57.260 to you order something and in a few hours it's there, and you don't have any waste packaging to deal with.
01:18:01.980 So there's a whole bunch of stuff like that.
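[Editor's note: the drone-to-robot handshake described above reduces to querying each candidate recipient and delivering to the first one that reports itself free. This toy sketch invents the robot names and status strings purely for illustration.]

```python
# Toy model of the delivery handshake: the drone polls the household
# robots on its route and hands the package to the first available one.

def first_available(robots):
    """Return the name of the first robot reporting 'available', or None."""
    for name, status in robots:
        if status == "available":
            return name
    return None

route = [
    ("robot_a", "walking_dog"),   # busy, as in the example
    ("robot_b", "charging"),      # "give me 10 minutes"
    ("robot_c", "available"),     # takes the package
]
print(first_available(route))  # robot_c
```

A real system would presumably re-queue the busy robots and retry, but the friction-removal argument only needs this first-responder step.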
01:18:03.500 Medical subscriptions, again, if you know the principle of what I'm describing here, you'll see how this works.
01:18:08.500 So basically what I'm describing is there will be downstream companies that emerge that fulfill this criteria,
01:18:14.460 which is basically logistics and subscription-based, so that when you see these companies emerge and you feel that it fits into this space,
01:18:21.600 but you need a mental model of what the trends are that are going on at the moment, what the enabling technology is.
01:18:28.980 It was the PC, then it was the internet, and then it was smartphones, and now it's going to be robotics.
01:18:34.480 And if you identify in your mind what it is that you're looking for, when these companies emerge,
01:18:38.200 you know to invest in them at an early stage, and it will make you an extraordinary amount of money if you do this right.
01:18:44.100 I think I put gardening robots on here, because I don't think your robot's going to be mowing your lawn either.
01:18:50.860 You might have a dedicated little robot thing, and they already exist.
01:18:53.640 They're not very good at the moment, but it will be that sort of stuff.
01:18:56.940 And basically what I'm envisaging is that the robots, I don't think they're even going to work on human schedules.
01:19:03.100 I think they're going to have their own schedules, and actually what I envisage in the future is that the nights belong to the robots.
01:19:12.300 So I don't even think that when you're at home, the robot is going to be particularly visible.
01:19:17.280 You know, that's when it might be on its charging cycle.
01:19:20.960 What's going to happen is as soon as everybody goes to bed, this is when all the laundry robo-vans come out.
01:19:27.020 Creatures of the night.
01:19:28.020 Yes, and the robots start doing the delivery trade-off and the packaging.
01:19:33.040 Yes.
01:19:33.340 Yeah, but I mean, I don't know if the lawnmower is the thing to look for, because it gives a perfect opportunity to people to just be alone by themselves a bit.
01:19:44.580 Well, I suppose you can mow your lawn if you want to.
01:19:46.500 Yeah, if there's a lot of moaning inside the house, it's okay, I'll take a break.
01:19:50.120 On the topic of a lot of moaning, actually, it could potentially, the dawn of robots, put a lot of prostitutes out of business.
01:20:00.220 Okay, I hadn't envisaged, why is that?
01:20:04.080 People are strange.
01:20:04.940 No, actually, maybe don't, maybe don't.
01:20:07.580 No, I won't lower the tone.
01:20:08.900 I mean, they have to improve the external appearance.
01:20:12.120 So, to give a final sort of, you know, this is what I'm kind of envisaging.
01:20:19.260 They're like star troopers.
01:20:21.760 Yes.
01:20:22.340 So, my basic vision is in 2024, let's imagine you're a single man in your flat.
01:20:28.600 You get back from work, you change, you dump your clothes over a chair because you can't be bothered to do the laundry.
01:20:33.600 You might do it at the weekend.
01:20:34.520 You have to take the bins out.
01:20:35.900 You realize that you need to shop.
01:20:38.340 So, you get up, you drive off or you walk off, you go to the shops, you shop, you come back, you unpack.
01:20:44.280 You watch podcasts of the Lotus Eaters with a warm beer because you've just got it back from the shops.
01:20:50.920 You want to get an hour of Helldivers in on your PC because, you know, you want some entertainment before you go to bed.
01:20:59.200 But you end up making it two and then you don't get enough sleep and you wake up in the morning.
01:21:03.280 You try and find a clean shirt and you can't and you have to go to work.
01:21:07.720 Unrefreshed.
01:21:08.620 Not looking forward to your weekend jobs.
01:21:10.660 But in 2035, once you've got your robot, you get back from work, you dump your clothes in the laundry bin.
01:21:16.740 You don't have to worry about anything.
01:21:17.940 You don't have to take the bins out because they've already been done.
01:21:20.580 You realize you don't need to shop because, you know, you just selected something from the app and the robot will deal with it and just serve you up a plate of hot food.
01:21:27.980 You can then sit down and enjoy podcast of the Lotus Eaters with a crisp cold beer because your robot has done the inventory management for you.
01:21:36.060 You get two hours of Helldivers anyway because you didn't have to go shopping.
01:21:40.400 So you get a full night's sleep.
01:21:41.880 You wake up in the morning.
01:21:43.000 You've got a tray of freshly pressed work shirts ready to go the following day.
01:21:48.740 And then you're thinking forward to the weekend.
01:21:51.280 You don't have to do any household chores because they've already been done overnight for you.
01:21:55.400 So you get to watch Lads Hour as well.
01:21:59.240 And in the future, we'll be replaced by robots.
01:22:02.260 So we'll be streaming all night long.
01:22:04.260 So there is that.
01:22:05.760 But I think there is an investment opportunity until the point that we're actually replaced.
01:22:09.780 So, yeah, mark my words on this.
01:22:11.540 Right.
01:22:11.720 Do we have any of those?
01:22:13.440 We do indeed.
01:22:15.200 Got a few actually there.
01:22:16.420 Right.
01:22:16.700 OK.
01:22:17.400 So, That's a Random Name says,
01:22:18.560 jokes aside, for the robots to work as intended, as Dan suggests, they need to be very advanced AI.
01:22:23.520 Yeah, but that's the point that I'm making.
01:22:26.480 So everything that I've done there has moved the bulk of the work to external workflows,
01:22:33.300 which are based on logistics management and subscription services.
01:22:36.920 So actually, all the robot needs to do is basically take laundry from a bucket out to a van and then bring a fresh pressed tray back.
01:22:46.240 So actually, the robots can do that today.
01:22:49.080 So I'm not assuming advanced AI in that.
01:22:52.360 And the same with a lot of those processes.
01:22:53.860 Like the food, if you want your robot to cook a meal from, you know, skinning an onion and doing all the rest of it and mixing the spices, that is advanced AI.
01:23:02.340 If it's handling a prepackaged kit from an Amazon-like food provider, the amount of work on it is significantly reduced.
01:23:10.920 So, yeah, I have thought about this quite a lot.
01:23:13.280 That's a Random Name, oh, no, that's, oh, yes, That's a Random Name
01:23:18.540 also says, this entire model is based on a society where this technology is ubiquitous and universal.
01:23:24.420 How exactly would that get implemented, and who would pay for it? Genuinely curious.
01:23:28.200 Well, no, people will just adopt it, because you'll get an adoption curve, like with the smartphones.
01:23:33.120 There was a few people at the beginning who had smartphones, but not a lot.
01:23:35.700 And then more and more people kind of went over to it and then everybody's got one.
01:23:39.980 In fact, eventually it'll get to the point whereby you'll actually be impeded for not having one because the culture's moved on so much.
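[Editor's note: the adoption curve Dan describes here is the classic S-shaped (logistic) pattern. A minimal sketch, with made-up midpoint and steepness parameters rather than real adoption data:]

```python
import math

# Illustrative S-shaped (logistic) adoption curve: slow start, rapid middle,
# saturation once "everybody's got one". The midpoint year and steepness are
# hypothetical assumptions for illustration only.
def adoption(year, midpoint=2030, steepness=0.9):
    """Fraction of the population that has adopted by a given year."""
    return 1 / (1 + math.exp(-steepness * (year - midpoint)))

for y in (2025, 2030, 2035):
    print(y, round(adoption(y), 2))  # few early adopters, 50% at midpoint, near-total by the end
```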
01:23:45.860 So fascinating statistic.
01:23:47.160 Today, there are more people that have a smartphone in the world than have a toilet.
01:23:54.100 There are people in literal shanties with no toilets who still have a smartphone.
01:23:58.200 So that I think is incredible.
01:24:01.080 Beats finding a hole in the woods.
01:24:03.020 Yes.
01:24:04.260 Dragon Lady Chris says, I do very little online shopping.
01:24:10.080 I prefer bricks and mortar stores unless I need an item I can only get online.
01:24:15.840 So good for you.
01:24:17.000 But most people are choosing frictionless logistic solutions, which is why Amazon is one of the biggest companies in the world.
01:24:26.520 Oh, stuff's appearing at the top and the bottom.
01:24:33.600 I assume the main thing stopping the...
01:24:35.480 Oh, no, go back.
01:24:36.940 Because I started reading one.
01:24:38.380 I assume the main thing that's stopping the nighttime robo-economy will be the unattended robots being harvested in diverse areas.
01:24:47.520 A telepresence industry will also be a thing.
01:24:49.660 Yeah, so they need to have a think about how to integrate it with the droids of diversity.
01:24:54.000 And Bald Eagle says, these robots are going to destroy the nursing home and hospice care industries.
01:24:59.080 I'd trust a robot that wouldn't abuse my grandfather to look after him.
01:25:02.240 Yeah, now that does require a bit more advanced AI, going back on what I said before.
01:25:07.500 But you start the robots with things like the laundry and taking the bins out and the food subscription services.
01:25:12.660 And then, you know, it will clock up more hours in human environments.
01:25:17.660 The learning algorithm will then be fed back and then it can, you know, look after your granddad and stuff.
01:25:22.200 Right.
01:25:23.880 Rattle through all of that.
01:25:25.020 Right.
01:25:25.380 Let's talk about the fat jab, shall we?
01:25:27.680 Here we are.
01:25:31.180 Without any audio, apparently.
01:25:32.720 Get ready to lose some pounds.
01:25:37.180 Get ready to get shot and thrown in a ditch.
01:25:39.920 Get ready to get dipped in acid baths.
01:25:42.640 But first, you're going to get raped.
01:25:45.980 What is that?
01:25:47.680 Oh, it's the hospital.
01:25:49.960 The hospital?
01:25:51.500 Yeah, Tuesday.
01:25:52.640 They burn cripples, the terminally ill.
01:25:55.500 Drag on the state.
01:25:58.260 There you go.
01:25:58.640 What is this?
01:25:59.560 You have a safe trip, son.
01:26:00.740 I think that was a TV show, but...
01:26:04.180 Oh, I can't move there then.
01:26:10.600 That's a shame.
01:26:12.220 But yes, it's interesting that they're going to jab fat people on benefits with a fat-destroying thing.
01:26:21.520 Never thought we'd be in that position.
01:26:26.100 AI.
01:26:26.580 With the advancement of AI, I am finding it difficult to generally tell apart what is real and what is not.
01:26:33.160 Complicate this further with the ability to generate AI imagery, which can be used to alter narratives to such an extent that it's near impossible to tell the difference of seeing what is real and what is not.
01:26:44.660 This is kind of made worse with the advancement of AI-generated voice simulation, which is becoming so advanced that it's hard to tell the difference.
01:26:53.140 And yeah, we have one real dystopian future ahead of us.
01:26:56.240 Yeah, I mean, I'm not a fan of this.
01:27:01.860 I'm not a fan of just having a society where we do nothing.
01:27:07.780 Yeah, so I've thought about that as well.
01:27:09.800 The way that you get around the AI problem is you basically need the good form of digital identity.
01:27:15.100 So which is the, I can't remember the name of it, but there is a type of individual-based digital identity that you have where you control the digital ID.
01:27:27.240 And then you can attach it to things like messages and other things that are genuinely from you that can be verified that it is from you as opposed to,
01:27:35.360 because you don't want some scammer cloning your voice, calling your mother and saying, look, I'm on holiday and I've lost my wallet and you need to transfer me money.
01:27:42.260 Well, my parents wouldn't do that even if it was me.
01:27:44.100 So I'm all right.
01:27:44.820 Well, yes, but you want to be able to attach a token that confirms it is you.
01:27:50.060 So yeah, I can't remember the name of the protocol now, but yeah, that solution is being worked on.
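[Editor's note: Dan can't recall the protocol name, so that stays unnamed; but the general idea of attaching a verification token to a message can be sketched. This is a hypothetical illustration using a symmetric HMAC for brevity; real self-sovereign digital identity schemes use asymmetric keys (e.g. Ed25519), so others can verify your signature without being able to forge it:]

```python
import hashlib
import hmac

# Hypothetical sketch: a secret key stands in for your personal digital ID.
# Assumption: only you hold the key, so only you can produce valid tokens.
SECRET_KEY = b"my-personal-identity-key"

def attach_token(message: str, key: bytes = SECRET_KEY):
    """Return the message plus a token proving it came from the key holder."""
    token = hmac.new(key, message.encode(), hashlib.sha256).hexdigest()
    return message, token

def verify_token(message: str, token: str, key: bytes = SECRET_KEY) -> bool:
    """Check the token; a scammer cloning your voice can't produce a valid one."""
    expected = hmac.new(key, message.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)

msg, tok = attach_token("I'm on holiday and I've lost my wallet")
print(verify_token(msg, tok))             # genuine message verifies
print(verify_token("send money now", tok))  # forged message fails
```

The point of the verify step is exactly the scam scenario described above: the cloned voice can reproduce the words, but not a token bound to your key.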
01:27:55.720 We got any more videos or...
01:27:57.400 Was that it, Samson?
01:27:58.780 That's it, thank you.
01:27:59.820 We got some comments instead.
01:28:01.620 I've got nothing after this, so I think we can go long and read some comments properly.
01:28:05.720 Is that all right, Samson?
01:28:06.800 Always up for that.
01:28:09.280 Okay, we got some honorable mentions there.
01:28:11.400 Do you want to do the comments, Dan?
01:28:12.620 Yes.
01:28:13.020 Well, I'll do my ones and the common ones and I'll let you do your individual ones.
01:28:18.160 So, Hugo Bossman says, my Islanders came today.
01:28:21.820 I'm going to chill later with a beer and give it a good read, but I already love the aesthetic.
01:28:26.080 Yes, the aesthetic is very good.
01:28:28.080 Roy does that.
01:28:28.660 Well done, Roy.
01:28:30.320 Theodore Pinnock says, I have received my copy of Islander 2 while watching the podcast and then adds Splendid, which it is.
01:28:36.940 The best piece is at the end, by the way.
01:28:41.200 Is that you?
01:28:41.800 It is, yeah.
01:28:42.980 Calm Rob says, happy to report that I've now received my Islander magazine.
01:28:46.520 I lost my job during COVID because somebody took offence to my use of the R word.
01:28:52.120 Alongside my stance on COVID.
01:28:53.620 Yeah, well, that might have also made a big difference.
01:28:56.900 So, good.
01:28:58.040 James Roberts says, got some reading to do this afternoon.
01:29:01.260 Just received my copy of Islander 2.
01:29:02.920 Thanks, Lotus Eaters.
01:29:04.140 Nice job with a poem, Josh.
01:29:05.840 Well written and very meaningful.
01:29:08.180 Thank you very much.
01:29:09.200 BattleBat says, white bill for the day, I had a baby scan yesterday and found out that we're having a boy.
01:29:16.540 Yay.
01:29:16.940 Congratulations.
01:29:18.260 Next generation of Lotus Eaters audience in production.
01:29:21.580 Yes, I have to say, actually, of all the things that we talk about, our people replicating is basically well up there.
01:29:30.520 As in, you know, our side of things are having kids and the others are not.
01:29:33.860 Yes.
01:29:34.840 Ultimately, that's what it comes down to over the long term, isn't it?
01:29:37.460 So, well done, Mr. Bat.
01:29:39.440 Top work.
01:29:40.400 I'm sure you did the hard part there.
01:29:43.120 Make sure you make a lot of little bats.
01:29:45.760 Yes.
01:29:47.440 Well, that's you, isn't it, for your...
01:29:48.800 It is me, yes.
01:29:49.780 Captain Charlie the Beagle says, to be fair, I'm not surprised the R word has made a comeback.
01:29:54.080 People, in brackets men, have always looked for the harshest words to call someone.
01:29:58.380 That is true.
01:29:59.080 You know, one thing that I've had to point out to people before is slurs are often the go-to thing
01:30:05.720 because they're the thing that gets the most reaction, but it's not always been the case.
01:30:11.260 There are lots of other words throughout history that have held a lot more significance because they were the most offensive.
01:30:16.320 People use the most offensive word, and that's determined by people's sensibilities.
01:30:20.860 There's nothing inherent in this particular sound the mouth makes.
01:30:24.040 Since it was cancelled by the Wokies, it has made it forbidden fruit, and as such gave the word more potency.
01:30:31.040 That is correct.
01:30:32.780 Sophie Liv says, it's also similar to how YouTubers banned the words killing and suicide, and everybody now says unalived.
01:30:40.580 I quite like silliness, more generally.
01:30:43.360 It gives conversation a more Monty Python-esque character if you say silly, counterintuitive words.
01:30:51.280 So I actually quite like how we've adapted to the censorship, even though I don't approve of the censorship.
01:30:57.320 And she says, so yeah, first, it sounds stupid.
01:30:59.640 Second, we all know what it means.
01:31:01.240 You changed nothing.
01:31:02.180 People are just saying he was unalived.
01:31:04.340 She unalived herself.
01:31:06.860 You didn't protect anyone.
01:31:08.320 That is true.
01:31:08.900 I also like, you know, they roped themselves as well.
01:31:13.380 That's got a good ring to it.
01:31:15.000 Omar Awad says...
01:31:16.400 They were suicided.
01:31:18.040 Well, that has the word in it that YouTube doesn't accept.
01:31:21.280 You can say murdered, I think.
01:31:23.640 I think that's actually better than suicide.
01:31:27.100 That's going to be taken out of context, isn't it?
01:31:29.880 As in, for the YouTube algorithm. Omar Awad says, the euphemism treadmill remains forever undefeated.
01:31:36.820 All roads lead back to the original slur.
01:31:39.480 Given time, return to retard was inevitable.
01:31:43.080 And I'll do one more.
01:31:44.060 David Ferugia, if black people can give you an N-word pass, surely I, as someone with an acquired cognitive defect, can give you the TARD card?
01:31:55.020 Well, actually, because I've got a master's degree in psychology, I can diagnose someone as retarded.
01:32:01.960 And so I have a retarded license.
01:32:04.540 I do like the TARD variation.
01:32:07.320 I think it softens the word slightly, but it makes it more fun.
01:32:12.180 It's like, oh, you TARD, you know?
01:32:13.720 Yes.
01:32:14.320 See, there's a good sort of character to it, isn't there?
01:32:16.960 Yeah, saves a bit of time.
01:32:17.820 Right, so, Jeroen Van Kalkeren, that road used to be a symbol of unity.
01:32:24.640 This is a very symbolic sign that unity isn't possible anymore.
01:32:28.460 Derek Power, blowing up your own road is another Looney Tunes antics turned into reality.
01:32:34.420 We're going to drop anvils from the balloons soon.
01:32:37.140 Screwtape Lasers says, so what you're saying is that I can launch a best-in-class propaganda balloon for only a K.
01:32:44.500 Yes, that's exactly it.
01:32:47.820 And Charles Burgess, the North Koreans do the propaganda balloons as well.
01:32:52.160 I was stationed in South Korea when Obama was screwing up foreign policy there,
01:32:56.920 and one of my co-workers had one of the leaflets fall at his house.
01:33:01.420 Unfortunately, Intel didn't let him keep it.
01:33:05.240 And Furious Dan, a northern wasteland that's hostile to its southern neighbors, Canada.
01:33:12.380 That was what I had in mind, but that could also work.
01:33:16.240 Yeah, and just one last, Arizona Desert Rat.
01:33:19.300 I'm more of a snowball effect type of person.
01:33:21.660 It stays small, then grows until eventually turns into an avalanche.
01:33:27.240 Do you think it's the butterfly that causes a hurricane on the other side of the world?
01:33:32.920 I never liked that analogy in the first place.
01:33:34.940 The pedant in me sort of, you know, is dissatisfied.
01:33:40.700 It's just like, how can a butterfly start a war?
01:33:43.560 Yes.
01:33:43.780 And what if you've got two butterflies at exactly the opposite side of the world who flap their wings at the same time?
01:33:47.680 And what about if you don't know whether the butterfly is flapping its wings or not?
01:33:53.960 And it could be both flapping its wings and not flapping its wings simultaneously.
01:33:58.100 And the act of observing it could affect the outcome.
01:34:01.620 Are you noticing the butterfly? You have the videos, you know.
01:34:03.160 Two butterflies flying, are you noticing?
01:34:08.160 It's okay, Jeff, come on.
01:34:09.440 All right, okay.
01:34:10.440 Michael Brooks says, call me a Luddite, I don't want this, and they are heavily overselling.
01:34:16.440 I work in a food factory, and none of the robots or machines are even half as capable as claimed.
01:34:22.400 Yeah, so if you look at what the guy who, I can't remember his name now,
01:34:25.360 but the guy who came up with Uber, is doing in his food business,
01:34:28.700 it's still human chefs preparing stuff.
01:34:30.660 But what they're doing is they're standardising the packaging and delivery
01:34:34.160 and the fast logistics rollout of it.
01:34:36.740 And that, I think, could be like the Facebook of the, you know,
01:34:41.140 it emerged before the smartphones, but the smartphones really gave it the boost.
01:34:45.440 So I think food logistics subscription services are the thing.
01:34:48.540 We still have human chefs, but it's just, honestly, it's all about the logistics when you go for scale.
01:34:54.620 Nick Taylor says, I manage a 7,500 hectare cropping programme.
01:35:00.140 And between automated vehicles and robots, the agriculture industry will hire almost nobody within a decade.
01:35:05.840 Yeah, this goes to your point about Mexicans, and you just don't need,
01:35:09.660 if this does what I think it's going to do, it's going to look,
01:35:13.660 we're going to look so stupid having imported millions of people.
01:35:16.720 Well, it was stupid to begin with, of course, wasn't it?
01:35:18.640 It's going to look extra stupid.
01:35:19.880 Well, you know, people must have already known that the population we already had
01:35:26.220 would have been too many people to be employed.
01:35:28.700 Without adding more unproductive people that need to be catered for.
01:35:31.860 Exactly.
01:35:32.180 It's mental. And actually, on Nick's point, there is a very interesting,
01:35:35.720 and again, I've forgotten the name of this as well, but a very interesting AI company.
01:35:39.600 What it's doing is it goes out into a field, and it individually classifies every single plant.
01:35:46.460 So you imagine a huge field of wheat.
01:35:48.080 It will classify each individual plant, and then when the machine is going over to water
01:35:52.460 and provide fertiliser and stuff like that, it will adjust the nozzles to give exactly
01:35:57.000 what each individual plant needs.
01:35:59.300 There's no way that a human being could ever compete with that level of efficiency.
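[Editor's note: the per-plant idea described here can be sketched as a lookup from each classified plant to a nozzle dose, instead of one blanket rate for the whole field. The class names and dose figures are hypothetical, purely for illustration:]

```python
# Hypothetical sketch of per-plant variable-rate application: once every plant
# in a row has been classified, the sprayer sets a dose per plant rather than
# a uniform blanket rate. Classes and millilitre doses are made-up assumptions.
DOSE_ML = {"healthy": 2.0, "stressed": 5.0, "weed": 0.0}

def doses_for_row(classified_row):
    """Map a row of classified plants to per-plant nozzle doses (ml)."""
    return [DOSE_ML[plant] for plant in classified_row]

row = ["healthy", "stressed", "weed", "healthy"]
print(doses_for_row(row))  # each nozzle gets exactly what its plant needs
```

The yield gain Dan mentions comes from this targeting: the stressed plant gets more, the weed gets nothing, and no input is wasted on a blanket average.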
01:36:04.120 Yes.
01:36:04.500 So, I mean, it has a, I mean, they're talking about like 30, 40% yield increases,
01:36:09.380 which is enormous.
01:36:10.680 That is insane.
01:36:11.560 Yes.
01:36:12.260 Furious Dan, all this talk about renting robotaxis and butlers conjures the
01:36:16.200 "you'll own nothing and be happy" future.
01:36:17.600 Yeah.
01:36:17.860 So again, I'm not saying that I like this.
01:36:19.280 I'm just saying that the trend is clearly towards non-ownership and subscription-based
01:36:24.460 services.
01:36:25.680 So, it's fine to not like it.
01:36:28.840 I just intend to make money out of it, so that at least I can be one of the people who
01:36:32.720 do own something.
01:36:33.380 It's easier to mitigate the excesses of society if you're rich.
01:36:36.920 Yes.
01:36:38.200 Omar Awad says, if communists are anything to go by, fully automating the human experience
01:36:43.420 is going to massively accelerate the decline.
01:36:46.180 Rather than using their extra free time to improve, create, 90% of people will settle
01:36:50.500 into ultra-consumerism.
01:36:53.300 Yeah.
01:36:53.880 Well, the problem is that logic then gets you into the Yuval Noah Harari view, the useless class.
01:36:59.800 I hate that man.
01:37:00.900 Yeah.
01:37:01.060 But it's going to be like Westworld.
01:37:05.280 As in, you won't be able to tell the difference between, yeah, I love that show.
01:37:08.860 It's great.
01:37:09.280 Yeah.
01:37:09.560 I love the first series, not the rest of it.
01:37:12.000 Yeah, but that's the thing, that people will start imploding.
01:37:16.940 Well, people can go to the Wild West and live out their fantasies and shoot robots.
01:37:21.820 That sounds cool to me.
01:37:22.660 That would be quite fun, but I don't think I'd want to do it in the Wild West.
01:37:25.600 I'd want to do it in...
01:37:27.700 Where?
01:37:28.640 In the jungle.
01:37:30.280 You're more like a jungle guy.
01:37:31.500 Am I allowed to say inner cities?
01:37:33.000 Is that...
01:37:34.240 Moving swiftly on, Grant Gibson says, all this scaling and pooling is just going to turn
01:37:40.620 this into a hive.
01:37:41.700 Yeah, again, I'm not saying I like it.
01:37:42.860 I just think that's where it's going.
01:37:44.260 I'm going to get scabies from the communal laundry service, and I'm not interested in
01:37:48.360 living in that future.
01:37:49.720 Also, they're not going to let you mow your lawn, drive yourself, or anything the robots
01:37:53.560 can do, whether it's the safety of the environment, they'll find a way to take things
01:37:57.080 away from you.
01:37:57.440 Yeah, I think that's going to happen as well.
01:37:58.840 So, for example, you try getting in a lift today and operating it yourself, you know,
01:38:02.720 with the handle that goes...
01:38:03.600 Because we used to have people in lifts that would just manually control it with a lever.
01:38:08.420 And it's just not an option anymore.
01:38:10.900 It's all automated for you.
01:38:12.660 And I think, yeah, he's right.
01:38:14.340 Mowing your lawn and driving are going to be things that will be taken away from you
01:38:17.580 eventually.
01:38:17.840 No, I like driving.
01:38:19.960 Yeah, I think he's going to be...
01:38:20.720 I like entering the elevator.
01:38:23.820 Yeah.
01:38:24.340 Yeah.
01:38:24.740 I don't like either of those things.
01:38:26.300 No, no, driving.
01:38:27.240 I like driving.
01:38:29.220 Yeah, I think it will be taken away from you, though.
01:38:32.120 God.
01:38:33.220 I'd rather be doing something else.
01:38:34.680 Arizona Desert Rat says, we already have driverless taxis in Arizona.
01:38:38.000 They're run by Waymo, and they're prohibitively expensive with very limited range of travel.
01:38:42.720 Yeah, so I have a number of issues with Waymo, which I'd love to get into,
01:38:45.740 but I don't have time here, so I won't.
01:38:49.460 Let me see.
01:38:50.080 What else have we got?
01:38:50.860 Thomas Howell says, I still hold to this maxim, Dan.
01:38:55.160 Robots are going to look more like TARS from Interstellar and less like C-3PO.
01:39:01.460 Okay, so actually, I think most of the robots won't look anything like humans
01:39:04.680 because I classify robo-taxis and robo-buses as robots.
01:39:07.640 They just look like vehicles.
01:39:09.160 So the form will follow the function.
01:39:10.680 But if you've got a robot that operates in the home, it will look like a human
01:39:13.540 because the home is designed around the human form,
01:39:16.260 and therefore the robot needs to be as well.
01:39:17.980 But yeah, there will be a wide diversification,
01:39:20.100 and I don't think you're going to have some weird hybrid of something
01:39:22.040 that doesn't look like a human operating in a human environment.
01:39:24.620 You'll just have something completely different,
01:39:26.340 and a robo-taxi is an example of that and other things as well.
01:39:28.940 I want a robot monkey.
01:39:30.000 Just throwing it out there.
01:39:32.300 Possibly.
01:39:33.920 Justin B says, everybody should read Asimov's robot series.
01:39:37.000 It's a great description of why we should not want robot butlers and drivers.
01:39:40.340 I haven't read that, but I'm sure it is good
01:39:42.440 because old Isaac was good at that sort of stuff.
01:39:45.340 And Monkey Smoke says,
01:39:46.800 no room for a machete fight in these buses.
01:39:49.700 They'll never take off.
01:39:51.000 Yes.
01:39:52.320 Yeah, if we have this brave new world
01:39:54.460 where people are allowed to reach their maximum efficiency
01:39:57.020 because they're not wasting time on shopping or laundry
01:39:59.780 or taking the bins out,
01:40:02.080 it does come back to that problem.
01:40:03.800 What about people who are just fundamentally bloody useless and unproductive?
01:40:08.240 We'll have a lot of time to sort them out.
01:40:10.760 Yes.
01:40:11.600 Possibly.
01:40:12.420 So with that, we're 10 minutes over.
01:40:14.780 Anything we say before we wrap up?
01:40:16.860 No, I'm okay.
01:40:18.300 Right, bye then.
01:40:18.860 No.
01:40:19.380 Yeah.
01:40:20.220 I'm so worried about it.
01:40:21.240 No, I'm okay.
01:40:23.220 Bye then.
01:40:24.080 Bye then.
01:40:26.720 Bye.