The Podcast of the Lotus Eaters - July 16, 2024


The Podcast of the Lotus Eaters #957


Episode Stats

Length

1 hour and 32 minutes

Words per Minute

192.97

Word Count

17,789

Sentence Count

1,354

Misogynist Sentences

15

Hate Speech Sentences

44


Summary

In this episode, we discuss why the Trump assassination was most definitely an inside job, and why J.D. Vance is an almost perfect pick as Donald Trump's running mate. We also discuss how the Secret Service understaffed Trump's detail while Jill Biden was given a full protective detail.


Transcript

00:00:00.000 Hello and welcome to the Podcast of the Lotus Eaters. It is episode 957. We are getting up there
00:00:15.200 on the 16th of July and I am Dan. I'm joined by Josh. Hello. And Jeff. Hello. Who buys
00:00:21.700 cars. I do. My name's Jeff and I sell used ones by how I'm dressed today. So in this
00:00:28.040 episode we're going to be talking about why the Trump assassination was most definitely
00:00:32.540 an inside job. We're going to be talking about why J.D. Vance is almost perfect.
00:00:38.020 Those are your words not mine. Well I had to write a description for the episode so I went
00:00:41.960 with that. We will find out whether that is actually the case or not. He's a good fit for
00:00:46.480 Trump I think. It's fair to say. Okay. And Jeff buys cars. He's worried that AI is going to start
00:00:51.980 buying cars I think. I think it's going to steal our jobs and our women. Right. Is it going to drive
00:00:57.000 the cars as well? I'm not so worried about that. I'm more worried about what's happening
00:00:59.920 with car sales descriptions in AI but we'll dig into that in a little bit. Yes. Yes. Good
00:01:05.080 thought. Right. First thing we need to talk about is how the Trump assassination was most
00:01:10.680 definitely an inside job. Now I got a little bit of pushback in the office this morning when
00:01:15.320 I said I wanted to cover this and a lot of worried looks and people saying you sure you
00:01:18.580 want to go down. But no. No. It is 100%. I can prove it. I can prove it. Right. Let's have
00:01:23.760 a look at this tweet thread from an excellent source. Very cleverly written. King Bingo
00:01:31.960 underscore. Now what he basically says here. Let's get into it. It definitely was an inside
00:01:38.540 job and we can sort of prove it. So let's go through the evidence that makes it 100% clear
00:01:43.300 what happened. Okay. First up we're not saying that the Secret Service as a whole was in on
00:01:51.260 it. We're not saying that the detail on the ground was in on it. You know I think they
00:01:55.040 acquitted themselves well with the resources that they had. Obviously the guys you know
00:02:01.080 they did what they had to do in the moment. But the organisation as a whole has quite clearly
00:02:06.540 been understaffing the operation on Trump's side quite severely.
00:02:11.660 And that's been going on for quite some time that Trump's been requesting additional security
00:02:16.380 and it has been denied. And in fact there has been considerable discussion about the
00:02:21.220 fact that Jill Biden actually had a decent number of security detail as appointed by this
00:02:27.140 director Kimberly Cheatle. She's been denying that she diverted security away from Trump, which,
00:02:34.420 you know, is undeniable. I don't think Trump himself has been requesting the extra.
00:02:37.880 But his team. His team yeah. The Secret Service on the team have been requesting extra.
00:02:42.380 And Cheatle, she is an ex-Pepsi employee. So she was head of security for Pepsi. So she went from
00:02:49.940 guarding soft drinks to the president which is of course a natural move. Very obvious why you might
00:02:54.960 sort of make that sort of parallel shift. She's a Biden appointee, direct appointee. And her primary
00:03:01.800 focus is as we will see it is getting women in the Secret Service to 30%.
00:03:07.380 Right. Which is...
00:03:11.380 Her primary focus isn't the security of the thing that she's working on.
00:03:15.660 It's boosting the number of women.
00:03:18.520 Because of course more women will make him more safe for reasons unknown.
00:03:23.840 Well throughout human history of course it has been the way that women protect men.
00:03:28.080 That's true yeah. We sort of lounge around and do nothing while the women do all the work.
00:03:32.520 At least according to feminists anyway yeah.
00:03:35.520 So yeah questions to be raised there. Now actually this poster does make a different argument.
00:03:43.400 He says you know that a job that requires a response to extreme violence shouldn't just
00:03:47.440 be done by men. It should be done by a few select men.
00:03:52.220 Probably men that know what they're doing and are highly trained.
00:03:54.820 That are physically fit, mentally capable of whatever tasks they've got to do and probably
00:04:01.320 big. Because you know if you are taking a bullet you want to cover as much mass as possible right?
00:04:08.000 Well when you're part of your job is literally to get in the way of the target and the shooter.
00:04:15.700 I mean being big is a qualification in itself.
00:04:18.020 It is yeah.
00:04:18.560 Yeah. It's almost one of the biggest qualifications dare I say.
00:04:23.040 Yes. Very reasonable point. Now the understaffing on the detail meant that a ridiculous situation
00:04:30.180 like this could happen. I mean it's quite absurd when you sort of see it laid out like that.
00:04:35.520 The sniper got to within 150 metres. Now it is worth and you can do this on Google Maps quite
00:04:41.480 easily. Just go and after this segment of course. Go and look out the window and scan the buildings
00:04:48.900 around you. Then go to Google Maps and find one which is 150 metres away. I mean it is a reasonable
00:04:55.180 distance but it's not a big distance.
00:04:58.460 You could see a rifle with your own eyeballs from that distance.
00:05:03.260 As long as you've got 20-20 vision, which you should have as, you know, Secret Service protection.
00:05:07.960 And it is also worth mentioning that he was using an AR-15 style rifle. They haven't specified
00:05:14.020 exactly what rifle he was using yet. But if he had actually used a sniper rifle well he could
00:05:20.100 have been much much further away. And if the security detail was so close to Trump that they
00:05:25.100 weren't even covering that rooftop there 150 metres away then what if someone is I don't know
00:05:32.020 300 metres away 400 metres away then it's going to be completely uncovered.
00:05:36.440 I mean it's one of those things where you know sometimes you're reticent to jump in on
00:05:41.160 things when you lack the requisite experience. I think you can be a total amateur
00:05:45.540 and spot that this was a magnificent screw-up.
00:05:49.080 I think if you can shoot at the president and hit him the security has failed.
00:05:54.360 Yes. But the point is that level of understaffing was obviously a choice.
00:06:00.760 Well I think if you'd said to anyone with no experience with weapons that there's this event
00:06:05.540 going on and the president's going to be stood here we'd like you to shoot him with something
00:06:08.820 where would you like to be to do that. That's probably where you'd go.
00:06:12.920 You've got to work it out for yourself.
00:06:14.260 So from a security point of view that's probably the first place that I'd probably want to put
00:06:18.480 some staff.
00:06:19.260 Well the worst thing about it is that they did a security walk the day before as a sort of
00:06:23.900 standard procedure and they identified that rooftop as a prime location for a potential
00:06:28.820 shooter and then did nothing about it.
00:06:31.960 Yeah.
00:06:32.380 Our work here is done. That is definitely where we're going to get shot from.
00:06:36.700 But my point is is that understaffing to this level was a deliberate choice by the Secret
00:06:43.840 Service. It was a deliberate choice to understaff it to that level. And that's just the tip of
00:06:49.980 the iceberg. You've got Democrat members who have been introducing legislation to get Donald
00:06:55.060 Trump's Secret Service protection taken away entirely. So this is Bennie something. Bennie
00:07:02.960 Thompson, is it? Who tried to get his Secret Service detail taken away. And then look helpfully
00:07:10.260 after the shooting this is the field director for this particular congressman who basically
00:07:15.940 just spells it out. I don't condone violence but please get someone shooting lessons so you
00:07:21.660 don't miss the next time. Excellent. I mean you could not be more clear what the purpose
00:07:28.380 of removing the Secret Service protection that he had was. I mean she's literally spelling
00:07:34.300 it out there. So they are genuinely just saying we need to remove all of his protection so we
00:07:39.340 can shoot him. Well that is literally what she's saying there. Yeah well she's saying that somebody
00:07:43.480 else can do it. Even if you take a charitable interpretation which I don't but just in case you do and
00:07:49.360 say well Donald Trump's a billionaire you know he can afford his own protection but with the
00:07:54.020 Secret Service comes you know certain privileges. Yeah well they have access that private security
00:07:58.260 wouldn't. Exactly. That's exactly my point that he he is of high enough status and notoriety that
00:08:04.620 he needs this security to be able to stay alive. It is indisputable really. And you know the
00:08:11.760 limitations on your own private security are such that sure they might help a bit but they're not
00:08:17.920 going to be as good. I mean to be fair if I was him I'd have both. Yes exactly. Just just overstock
00:08:23.440 it. Have the private security watching the Secret Service just in case they get any ideas. Yes but so clearly
00:08:30.240 it was a it was a choice to understaff it and it was a choice to try and remove even that. So that is
00:08:35.340 all of that is entirely deliberate. Okay what about the recruitment of the sniper himself? Now I'm
00:08:39.840 arguing that this was also an inside job except they didn't do it by literally going out and
00:08:46.320 recruiting an individual and saying would you like to do this. They did it by overwhelming
00:08:50.120 statistical likelihood. So this is actually called stochastic terrorism. Exactly right Josh that is
00:08:57.760 exactly that's what exactly what I was going to say. It is stochastic terrorism. So basically what
00:09:03.260 you do is you create a situation where it is overwhelmingly probable that the action that you
00:09:09.180 want will occur. So put this into perspective. Let's say I wanted to take out the staff at the
00:09:15.120 Lotus Eaters. Don't do that please. No but but if I did and I didn't want to go to jail
00:09:20.980 for it I could do it statistically. So I could say like okay we're going to do a team building
00:09:26.140 exercise we're going to go cave diving or something, you know, the riskiest hobby there is. I
00:09:31.380 wouldn't agree to that for a start. So not now you've told him. Well I've let the cat out
00:09:36.720 the bag now. You're going to choose a location where it would be probable that it would be easy
00:09:41.960 for you to then remove the staff members in the correct way. Well yes I would say we're going to
00:09:46.760 go cave diving and I would select the least experienced guide possible. Yeah. And understaff
00:09:51.300 it. And I would do things like okay we're going to do a Lotus Eaters live event and you'd be all
00:09:57.020 well that's good. Where are we going to do that? Whitechapel. Baltimore. Baltimore or something you know that
00:10:02.860 place in the wild. Baltimore. Yeah. Yes. You know I would pick the most statistically dangerous
00:10:07.580 areas. Yemen. Yeah that's a good one. Yemen if I could get away with it. But you know,
00:10:13.220 basically what's happened is the media have created an environment where, and you've got to
00:10:18.280 remember this guy, I mean he's 20 years old. So when did Trump pop onto the scene? It was about
00:10:23.320 eight years ago wasn't it? He's been the head of the Republican Party since he was 12 years old. Yeah.
00:10:29.120 And 12 is probably about the point where you start paying at least some attention to the
00:10:33.940 news. So this kid's entire life he's been bombarded with Trump is literally Hitler. He's an existential
00:10:40.440 threat to the country. He's a genuine danger. Now if he never changed the news channel and
00:10:46.320 watched something else or watched us or you know friends of ours he's just been bombarded
00:10:51.600 this for eight years. It's also worth mentioning that I believe his parents were registered
00:10:58.780 with the Democratic Party, and I think one was a member of the Libertarian Party, which, in
00:11:03.440 the state that it's in, may as well be the Democratic Party. Yeah. At this point despite being a
00:11:08.080 Libertarian myself. And I'd actually go further in this argument. I don't think that this is the
00:11:11.480 first shooter. I think he's the shooter that managed to go all the way. It would not surprise
00:11:18.440 me in the least if a half dozen times before a shooter has turned up at one of Trump's rallies
00:11:22.980 taken a look at the security and thought yeah I'm not getting through that and turned around
00:11:26.640 and gone home. This time he turned up and looked at the security and thought oh bloody hell there's
00:11:30.760 a massive hole in this I can actually get through. But honestly it wouldn't surprise me if at least
00:11:36.660 half a dozen times somebody has pitched up with a rifle in their boot before looking to do exactly
00:11:41.340 this. Because they're playing the statistical game. They've been grooming an entire generation of
00:11:46.260 kids to believe that it is your... Because
00:11:48.880 schooling these days,
00:11:51.380 all you get in history, I mean almost everything, is
00:11:55.700 World War II, Hitler is bad,
00:11:58.420 wouldn't it be good if you could go back in time and assassinate Hitler?
00:12:01.480 Oh but you can't because we don't have time machines. Oh but
00:12:04.240 we've got another Hitler over here. How heroic could you be?
00:12:09.180 So yes
00:12:10.200 So they recruited it through that and basically this points out so...
00:12:15.320 Help me Josh. Stochastic terrorism.
00:12:17.800 So sarcastic terrorism. Sarcastic terrorism, yeah.
00:12:20.640 Works when the media and in fact the entire establishment incites extreme hatred of Trump
00:12:27.240 thus inspiring lone actors to commit unpredictable yet statistically very probable assassination attempts.
00:12:34.980 That's without mentioning Mind Control or MK Ultra or any of those things that people will be talking about in the comments.
00:12:43.580 Did they test the shooter for LSD? That's what we need to know.
00:12:46.460 That is what we need to know. I mean I take issue with the fact that the shooter is 20.
00:12:50.340 I mean most people who I know who are 20 are more concerned with
00:12:54.040 you know getting to work because they probably work as like waiters or bar staff or something
00:12:58.280 and car insurance. They're more bothered about those things than trying to wipe out the leader of one of the Western nations.
00:13:06.340 Well yes.
00:13:07.320 Our resident historian Beau actually mentioned yesterday that a lot of assassins throughout history have been in their 20s, in their early 20s more often.
00:13:16.600 Like most of the 9-11 hijackers, I know that's you know they're not assassins, they're terrorists but
00:13:20.840 you know most of them were in their early 20s. And I think that's quite a good sample to show you that actually
00:13:27.740 young men, right, you know that their brains are not fully developed to understand the implications of what they're doing
00:13:34.880 and also that they're sort of willing to do silly risky things. It's not until you get a bit older that you start becoming a bit more sensible.
00:13:43.320 So I understand and I'm not sure this is true or not but I have heard it.
00:13:46.380 You grow into consequences.
00:13:47.620 Yes. I have heard that those people that they put in those bunkers to press the button on the nuclear missiles
00:13:55.460 if they get the alert saying yeah press the button, they're all young men in their early 20s for that same reason.
00:14:02.100 What as in they will actually press the button rather than an older person?
00:14:06.520 Of course yeah.
00:14:07.380 Whereas if you put an older man in there he's more likely to say...
00:14:10.520 Are you sure?
00:14:11.280 Yes.
00:14:12.260 I'd rather the latter to be honest.
00:14:14.080 Well yeah, but not if you're the nuclear high command; he probably wouldn't do what he's told.
00:14:18.440 That's true yeah but I don't want to die.
00:14:20.580 So recruitment of this terrorist, I mean it's been going on in plain sight.
00:14:23.560 I mean I could play some of this but you know you get the idea.
00:14:28.240 This is a senator basically saying that he's more dangerous than Hitler and Mussolini which is presumably quite dangerous indeed.
00:14:36.380 A lot of people have tried to draw similarities between Mussolini and Hitler and the use of the terminology like vermin
00:14:44.840 and the drive that those men had towards autocracy and dictatorship.
00:14:52.540 The difference though I think makes Donald Trump even more dangerous and that is he has no philosophy he believes in.
00:14:59.820 He is not trying to explain...
00:15:01.380 Shut up. Right. Anyway you get the point.
00:15:03.660 I mean that's the sort of rhetoric and obviously I could give you thousands of examples of these.
00:15:08.360 So the recruitment was done in the open in front of all our eyes.
00:15:13.040 I mean this goes right the way to the top.
00:15:14.920 You know Biden was saying on a call to donors it's time to put Trump in the bullseye.
00:15:19.140 I mean this language has been so visceral, so emotive and it's been happening for years.
00:15:24.240 You have the statistical probability of grooming thousands of would-be assassins across the country, combined with the opportunity, and the formula is quite simply this:
00:15:35.100 Extreme rhetoric, understaffed Trump detail, you've created the optimal conditions for this to happen.
00:15:41.360 And it is impossible for me to believe that that was an accident.
00:15:46.900 They knew what they were doing, but they were doing it in a way that they could actually get away with.
00:15:52.120 Well they've got plausible deniability haven't they?
00:15:54.560 If that happens.
00:15:55.520 And to sort of help you out a bit here, you're not saying that the individual in question, this Crooks kid who was the attempted assassin, you're not letting him off the hook by saying this.
00:16:09.760 He's still morally culpable for his actions, but you're saying that they have also helped to push people towards this end.
00:16:17.240 Well they've manufactured conditions where this would have to arise.
00:16:20.500 Oh yeah, if it wasn't him, then it would have been someone else.
00:16:23.680 And if it wasn't that event, it would have been another event.
00:16:26.240 Well this is my point about, I severely doubt he was the first shooter.
00:16:29.620 He was just the first shooter that managed to get through the security cord.
00:16:32.900 They've manufactured inevitability.
00:16:35.700 Well almost certainly there's been dozens who just didn't have the bottle to go through with it.
00:16:40.380 Well he's been on the scene for so long now, you know, from about 2016 onwards.
00:16:44.920 It's been eight years; you'd think that in America, with firearms as they are, there would be at least one person that was motivated to do so but didn't end up giving it a go.
00:16:55.800 Definitely at least one.
00:16:56.980 So my point is, this is not a lone gunman.
00:16:59.140 This is all a complex organism and it's moving with a single purpose and they knew what they were moving for.
00:17:05.660 I'm not saying the CIA recruited the kid.
00:17:07.620 I'm saying this organism as a whole, the media, the Democrats, the educational system, they've all been working towards this thing.
00:17:13.940 And we are incredibly lucky.
00:17:16.040 If this universe was one inch to the left, we would be getting Nikki Haley at the GOP conference receiving the nomination right now.
00:17:23.760 The deep state would have won again.
00:17:25.940 Interestingly about Nikki Haley, I haven't been able to confirm this, but I did see a poster saying that she wasn't going to go to the GOP conference.
00:17:34.680 And 20 minutes before the shooter took his shot, she quietly let out the news that she would be attending.
00:17:40.740 I'd love to confirm that because that's unfortunate timing.
00:17:45.720 If so.
00:17:46.120 Yeah.
00:17:46.840 Or yes.
00:17:47.520 Or why?
00:17:48.560 Fortunate something.
00:17:49.520 Yes.
00:17:50.440 Of course, another man in the crowd was shot and killed, and two others were seriously wounded.
00:17:54.820 So we posted yesterday the fundraiser.
00:17:57.140 We did indeed.
00:17:57.840 Yeah.
00:17:58.120 I've gone for that.
00:18:00.040 Suggest everybody else who can.
00:18:03.020 What do we know about the people who were, obviously one died and two injured.
00:18:08.040 Do we know what their connection was and who they were?
00:18:11.620 Well, the guy who died was just a family man.
00:18:15.060 Right.
00:18:15.360 Had a business.
00:18:15.900 It's not like he was part of the security detail.
00:18:17.660 No, no, no.
00:18:18.180 He wasn't jumping in front of a bullet.
00:18:19.640 He actually died.
00:18:20.260 He literally was jumping in front of a bullet.
00:18:21.140 He jumped in for his family.
00:18:22.940 I read it yesterday and it was a really difficult read.
00:18:25.600 I'd struggle to get through it, to be honest.
00:18:27.880 I think it was his daughter writing it, saying how she finds that her dad
00:18:31.540 is actually a hero and he did take a bullet for his family.
00:18:35.000 So she knows that his love is genuine.
00:18:37.260 Now, I don't even want to talk about it.
00:18:38.680 It's going to get sad again.
00:18:39.540 Yeah, it's quite awful.
00:18:43.320 So on the point of it being an inside job, if it wasn't that, why would the Biden campaign
00:18:51.800 be frantically pulling ads off air at the moment that basically were saying we need to
00:18:56.740 get this guy?
00:18:58.120 Because they had a whole bunch of ads running and more planned saying, you know, this guy
00:19:02.060 is an existential threat to democracy.
00:19:03.740 He's a dictator.
00:19:05.260 He's a fascist.
00:19:06.280 We need to deal with this person.
00:19:07.900 And if it wasn't deliberate, what they were doing, why are they frantically pulling all
00:19:12.360 of those ads now?
00:19:13.320 Well, a charitable interpretation could be that they realized that it looked bad if they
00:19:17.700 were saying all these things about a guy who was almost assassinated.
00:19:21.140 I mean, it does look bad when you get caught doing the thing that you did.
00:19:26.400 But they may not necessarily feel like they had a hand in it, but they may think that
00:19:32.380 They totally did.
00:19:33.660 I'm not necessarily disagreeing.
00:19:35.220 I'm playing devil's advocate here.
00:19:36.680 Right.
00:19:37.900 But they did.
00:19:41.660 Also this, if it was not an inside job, a deliberate choice to give the recruited terrorist
00:19:46.980 maximum opportunity, how could you possibly square that with this video that the Secret
00:19:50.600 Service themselves put out?
00:19:52.100 So this is a video that Secret Service put out where they're talking about protecting
00:19:56.500 the NATO conference and the extreme details that they go to.
00:20:00.740 Let's see if we can watch a bit of this.
00:20:02.140 Can I even be publishing that?
00:20:03.840 The Secret Service is the lead federal agency for the NATO.
00:20:08.600 It is the 75th anniversary of the signing of the treaty.
00:20:11.940 It is the will of the people of the world for freedom and for peace.
00:20:16.700 And so there's a lot of enhanced focus on this particular summit.
00:20:21.460 In the final 48 hours leading up to NATO, our agency mobilizes all of its units with precision
00:20:27.920 to ensure the safety of our protectees.
00:20:30.240 Transport on its way.
00:20:31.320 Coordination with our local, state and federal partners is critical in implementing a safe
00:20:38.000 and secure event for NATO.
00:20:40.000 We work together to mitigate any threat to our protectees and to the community.
00:20:43.140 Who do they make this video for?
00:20:46.100 The final 48 hours of planning.
00:20:47.520 The people attending, right?
00:20:48.940 Yeah.
00:20:49.460 Make them feel comfortable about going to the events.
00:20:53.220 It's a great recruitment video.
00:20:55.120 It is.
00:20:55.600 Yeah, that's true.
00:20:57.160 But I mean, I won't bore you with all of it, but I mean, they go on to say, you know,
00:20:59.940 every eventuality is covered, every letterbox, every bin, every roof.
00:21:04.960 Every roof.
00:21:05.480 Well, yeah.
00:21:06.780 Including the obvious ones.
00:21:09.560 Yes.
00:21:09.820 If it wasn't a, so, help me out, Josh.
00:21:14.460 Stochastic terrorism.
00:21:15.740 Event.
00:21:16.300 Then why were the media running stories like this?
00:21:19.480 Yes, it's okay to compare Trump to Hitler.
00:21:21.680 Don't let me stop you.
00:21:23.220 I mean, you know.
00:21:24.200 Yes.
00:21:24.680 Slightly inflammatory.
00:21:26.740 Yeah.
00:21:27.100 Now, the interesting thing I found with this is, like I said earlier, normally, the more you
00:21:32.400 know about a subject, the more nuanced your take is.
00:21:35.720 Dan Bongino, who actually served in the Secret Service, is having none of this.
00:21:39.820 Right.
00:21:40.760 He's just like, no, this is, this is unbelievable incompetence.
00:21:46.880 Former Secret Service agent, Dan Bongino joins me now.
00:21:50.740 Is this gross incompetence or is it something even more scary that we can't even think about?
00:21:57.080 No, it's gross incompetence.
00:21:58.520 I mean, how do you let, I mean, think about it, right?
00:22:00.180 The counter sniper team, they train out, the Secret Service counter sniper team, obviously
00:22:04.100 they had a mitigated sniper threat.
00:22:05.400 But they train out to 1,000 yards.
00:22:07.500 So we're talking about, say it was 200 yards, it was actually less than that, where this
00:22:11.360 shooter was, right?
00:22:13.440 You're talking about a fraction of what they're trained at.
00:22:16.760 They're also trained to spot guys in ghillie suits, cracked open windows.
00:22:21.600 Right.
00:22:21.860 He's on a white roof in broad daylight.
00:22:24.420 He's on a white roof.
00:22:25.380 Now, again, it'd be, it'd be easy for me to say, oh, well, let's not get into Monday
00:22:30.140 morning quarterbacking.
00:22:31.160 Bro, it's not a freaking football game.
00:22:33.960 It's the President of the United States' life.
00:22:36.200 The Secret Service has one job.
00:22:39.100 Well, outside of counterfeiting and protection, it has one job.
00:22:43.220 Didn't mince his words, did he?
00:22:44.880 Yeah.
00:22:45.920 And this is the thing.
00:22:47.160 I mean, he's having none of this.
00:22:49.280 You know, he's pointing out they're trained to spot people in ghillie suits rustling through
00:22:52.700 the bushes five miles away, let alone an Elizabeth Warren lookalike on a tin roof on a bright
00:22:57.980 sunny day, clanking his way across it.
00:23:01.560 Likewise.
00:23:02.440 Now, this, I can't, I can't, I mean, this guy says he used to work in military intelligence
00:23:06.460 and he gives his take on it.
00:23:07.660 And I can't, you know, verify that he actually did or not.
00:23:10.080 But he just makes the interesting point that I think I've heard elsewhere, that drones
00:23:13.660 covering the area are standard procedure.
00:23:17.180 Makes perfect sense.
00:23:18.220 I mean, you can't get a better lay of the land than viewing it from above, can you?
00:23:24.000 Yes.
00:23:24.900 As well as the obvious.
00:23:26.180 And again, this guy who we do know was a Green Beret Special Forces sniper.
00:23:32.220 And again, I mean, he feels that this is, it just goes beyond the possible bounds of
00:23:39.780 a mere fuck up.
00:23:42.360 Hey, what's going on, everybody?
00:23:44.040 I'm here to tell you why the shooting of President Donald Trump yesterday, without a doubt in my
00:23:48.460 mind, was a planned and coordinated attack on the president inside our government, our
00:23:55.600 local agency or police force.
00:23:58.440 Here's why.
00:23:59.120 My name is Matthew Murphy, and I am a retired Green Beret, retired from this Special Forces
00:24:04.800 group, but I'm also a level one sniper, which means that I have graduated the highest level
00:24:10.940 of training for snipers that you can do in special operations.
00:24:14.560 We're trained, I'm going to avoid confidential or classified terminology, we're trained in
00:24:19.280 assassinations and counter-assassinations.
00:24:21.280 Weeks, if not months, before the president will ever be at that spot,
00:24:25.060 they do a site security assessment, and they do that with the local police forces
00:24:29.500 and agencies to ensure that every potential security threat or vulnerability is secured
00:24:35.880 and, of course, protected against.
00:24:38.820 Now, this is done way before...
00:24:41.060 So that's in the reading, links, if you want to hear his full argument.
00:24:43.720 He goes a bit further than I do.
00:24:47.660 I'm simply making the case that this was an assassination done through overwhelming statistical
00:24:52.800 probability and the under-resourcing in order to maximize the likelihood of that to take
00:24:57.940 place.
00:24:59.440 But I mean, he actually goes further and says that the level of the security failings was
00:25:03.300 so extreme that even that can't explain it.
00:25:06.540 So, legitimate questions to be asked, but yes, as far as I'm concerned, the democratic
00:25:12.900 organism, the regime, they knew exactly what they were doing.
00:25:15.640 This was an intended result.
00:25:18.520 And, you know, we missed it by...
00:25:20.980 Well, Trump missed it by an inch, but not for the lack of their trying.
00:25:26.080 Over to you, Josh.
00:25:27.480 Okay.
00:25:28.440 So from...
00:25:29.840 Oh, I need to check this.
00:25:32.440 We've got two...
00:25:33.340 We've got two rumbly rants.
00:25:34.960 Rumble rants, yes.
00:25:35.880 Where do I see them?
00:25:36.420 It is worth mentioning as well, we are still reading out the rumble rants after each segment,
00:25:41.060 so if you have something relevant to say about what we're covering, you can do it.
00:25:45.560 Do you want me to read them for you, Dan?
00:25:47.320 Yes, please, because I can't find them.
00:25:49.540 A name I'm not going to read out because it phonetically makes it sound like I'm swearing.
00:25:53.880 They didn't pull the trigger, but the media certainly aimed the gunman at Trump.
00:25:59.000 Yes.
00:25:59.240 It's a succinct way of putting it.
00:26:00.780 Binary Surfer says,
00:26:02.620 These people want us dead or ruined.
00:26:04.120 They think it's a moral duty to do so.
00:26:05.760 At what point is the right going to stop treating this as a disagreement over politics
00:26:10.840 and start treating it as existential?
00:26:13.120 I think that's increasingly happening, whether people want it to or not.
00:26:16.620 I think that the fact that it's got to this point already and people are sort of more comfortable
00:26:22.200 than ever to come out and say, listen, we support this guy.
00:26:26.220 This is enough.
00:26:28.320 We're not having this anymore.
00:26:29.680 We could easily do a segment on the demographics that have flipped on to supporting Trump.
00:26:36.220 People just aren't having it anymore.
00:26:37.600 I'll be doing that on Thursday, I believe.
00:26:39.960 Excellent.
00:26:40.260 Sean487 says,
00:26:43.900 One F-Up can be understood.
00:26:45.680 Three F-Ups.
00:26:47.660 It's hard, but yes, idiots exist.
00:26:49.920 17 F-Ups from start to finish is a planned assassination.
00:26:54.320 Pretty much, yeah.
00:26:55.720 I mean, we shouldn't, as laymen, be able to analyse what's gone on there.
00:27:00.460 You know, we're not specialists in protecting important people.
00:27:04.240 We shouldn't be able to go, hold on a minute.
00:27:06.460 You should have looked at that roof.
00:27:07.660 You should have done this.
00:27:08.360 You should have done that.
00:27:09.220 We should be going, oh, you know, they've done everything.
00:27:11.400 I wonder what possibly went wrong here.
00:27:13.280 But as people looking in, you go, well, the roof was really obvious.
00:27:18.000 But a lot of people, I've noticed in the comments as well,
00:27:20.020 have picked up on the way the guy crawled across the roof.
00:27:23.500 Like, even the way he was crawling,
00:27:25.880 they saw him for a couple of minutes before, you know,
00:27:28.680 people were saying, oh, there's a guy on the roof.
00:27:30.160 There was a video of him long before he actually took shots at Trump,
00:27:33.540 crawling on the roof, and people in the audience pointing and saying,
00:27:36.520 there's a man on the roof.
00:27:38.080 Police, police, there's a man on the roof.
00:27:39.880 And they're sort of not doing anything.
00:27:41.580 It's odd, because I look at it and I think,
00:27:42.940 well, if I was crawling along with a gun,
00:27:44.500 I probably could have done a better job of crawling across that roof.
00:27:47.360 So don't say that.
00:27:48.920 It's interesting, though, but we shouldn't have to be saying these things.
00:27:52.460 Everything should have been dealt with in a professional way.
00:27:55.360 And yeah, there's major holes, major holes.
00:27:57.640 Yes, pretty severe ones.
00:28:00.860 So Trump has picked his vice president,
00:28:03.420 and it's worth pointing out that it was one of the two people
00:28:06.040 that I thought were going to be the favourites.
00:28:08.200 So I kind of told you so.
00:28:10.300 Did you say that?
00:28:11.160 I highlighted J.D. Vance and Doug Burgum,
00:28:14.820 and then I also talked about two others who I said were more unlikely,
00:28:18.020 and that was Vivek and Kristi Noem,
00:28:21.560 because both of them didn't seem that likely to me.
00:28:24.000 But I did talk about all of the potential people here,
00:28:27.380 and I also said it would be announced on Monday, and it was.
00:28:30.020 So yes, it turns out Jason Miller can be trusted,
00:28:33.100 because he was the person who said Trump will be announcing his VP.
00:28:36.140 And yes, this is the man who is now going to be Trump's running mate
00:28:42.020 for the 2024 presidential election.
00:28:45.660 And this is J.D. Vance.
00:28:47.920 He is the senator for Ohio,
00:28:49.740 and I think it's fair to say he is both a political fresh face
00:28:53.360 and a bit of an outsider,
00:28:55.100 because he only assumed political office in the Senate in January of 2023,
00:28:59.460 which, as far as political careers go,
00:29:03.400 you get a seat in the Senate,
00:29:04.420 that's the first time holding political office,
00:29:07.060 and then you're running for vice president.
00:29:10.060 That's quite a quick rise.
00:29:11.500 It is indeed.
00:29:12.880 Some people have criticised him, mainly left-wingers,
00:29:15.420 but I think there is potentially a valid criticism
00:29:19.020 that he might be too inexperienced.
00:29:20.940 But I also think that his experience
00:29:22.800 isn't why Trump hired him in the first place.
00:29:26.200 Why do you think he hired him?
00:29:27.540 I think he hired him because he's been a loyalist for quite some time.
00:29:32.420 Well, I mean, he was very anti-Trump, though.
00:29:33.980 He was, but he was also one of the first to declare for him as well.
00:29:38.280 So Trump has sort of forgiven him in a sort of Caesar-esque way
00:29:41.100 of when he gets into Rome, he's just like,
00:29:43.200 it's okay, it's okay.
00:29:44.160 If you come over now, I will forgive you.
00:29:46.760 It was that sort of thing,
00:29:47.860 which is actually quite politically savvy.
00:29:49.820 Because I don't know too much about this guy,
00:29:51.340 apart from he was a VC before he did this,
00:29:53.200 so he must be quite a bright chap because, you know.
00:29:55.160 Yes.
00:29:56.520 You've got a bit of a vested interest there in saying that.
00:29:58.860 But he did the Rumble deal
00:30:00.160 that you may well be watching us on right now.
00:30:04.220 And I know that the Rumble exit on that was 2.1 billion.
00:30:08.260 I don't know how much of it his firm had,
00:30:12.100 but that's a pretty good exit.
00:30:13.640 I can imagine so, yeah.
00:30:15.480 But yes, he's only 39 years old,
00:30:17.420 which is actually quite young for a vice president.
00:30:19.420 And that makes him one of the younger candidates
00:30:21.200 in US history, in fact.
00:30:23.260 And I think that this is a good sort of counterbalance
00:30:26.340 to Trump being quite old.
00:30:28.980 In recent history?
00:30:29.980 I mean, weren't all the founding fathers like 30-something?
00:30:32.180 That's true, yes.
00:30:33.020 But in recent years, obviously,
00:30:36.240 it's been trending towards sort of OAPs.
00:30:38.240 Yes, a gerontocracy or whatever they call it.
00:30:39.940 That's the one.
00:30:40.840 But he is specifically known for writing the book
00:30:43.540 The Hillbilly Elegy,
00:30:45.720 which is a memoir about his family life,
00:30:47.820 about his upbringing in Appalachia,
00:30:50.120 growing up with a mother who was a drug addict,
00:30:53.400 and he grew up in poverty.
00:30:55.680 And this was actually turned into a Netflix original film.
00:30:58.140 And I think in the book,
00:30:58.860 he's actually quite critical of Trump,
00:31:00.480 but then eventually warmed to him.
00:31:02.940 I haven't actually read it myself,
00:31:04.800 so I'm just going by what other people are saying.
00:31:06.940 So that could be wrong.
00:31:08.000 I've ordered it; it should be arriving this afternoon,
00:31:10.060 and I'll get to work on it.
00:31:11.720 Okay.
00:31:12.280 Well, that'd be interesting, won't it?
00:31:13.940 But yes, it was turned into a film in 2020,
00:31:15.700 which won lots of awards,
00:31:17.680 if that matters to you.
00:31:19.820 But having a big film made about your memoir
00:31:22.360 probably helps a little bit, doesn't it?
00:31:24.820 Who made the film?
00:31:26.000 Netflix.
00:31:27.380 Netflix.
00:31:28.140 I know.
00:31:28.780 Is he black in the Netflix version?
00:31:30.900 I don't think so, actually.
00:31:32.480 Oh, right.
00:31:32.920 Okay.
00:31:33.160 I saw a picture of some screenshots of the film.
00:31:37.960 Okay.
00:31:38.280 It's actually white people.
00:31:39.980 I mean, they're dealing with what...
00:31:41.880 So I suppose you've got Hillbilly in the title.
00:31:43.820 It's a bit of a giveaway, isn't it?
00:31:45.380 Yeah.
00:31:46.840 But yes, I think that that's a certain amount of prestige
00:31:49.000 that if your life gets a film made about it,
00:31:52.120 that might add to your sort of draw a touch,
00:31:55.740 just a little bit.
00:31:56.360 He also enlisted in the Marine Corps
00:31:58.200 after leaving high school and served in Iraq
00:32:00.120 as a public affairs officer.
00:32:02.060 I'm not sure if that's a combat role,
00:32:03.340 but it's still in Iraq, right?
00:32:04.700 It's still potentially dangerous.
00:32:06.960 So it's not necessarily anything to turn your nose up at.
00:32:10.420 He also afterwards attended Ohio State University
00:32:13.280 where he graduated in 2009
00:32:15.000 with a BA in political science and philosophy.
00:32:17.720 And then he went on to Yale Law School
00:32:19.680 and he became a successful lawyer.
00:32:21.940 And then he went into venture capital and tech
00:32:25.240 and you talked about the Rumble deal already
00:32:26.880 before running for the Senate.
00:32:29.300 And then when he actually won his Senate seat,
00:32:31.860 of course, he played a key role
00:32:33.300 in the response to the 2023 train derailment in East Palestine, Ohio.
00:32:36.740 I remember that one.
00:32:38.520 Yeah, it's a bit confusing.
00:32:40.580 Well, he was a senator, wasn't he?
00:32:42.860 So he played a small part in it.
00:32:45.380 He wasn't necessarily the governor,
00:32:46.600 which would be the person more responsible for it.
00:32:50.240 But he did play a role in it.
00:32:51.900 He also played a part in the bipartisan bill
00:32:54.500 to prevent further derailments in the future.
00:32:56.900 And it's worth talking about his history as well,
00:33:01.380 because you said he was a sort of never-Trumper,
00:33:03.780 as they're called.
00:33:04.820 He called him an idiot and reprehensible in 2016.
00:33:09.120 And then according to the press,
00:33:10.660 he privately referred to him as America's Hitler.
00:33:13.280 And here's the quote that the press had been citing.
00:33:16.560 And this is supposedly in a private Facebook message.
00:33:19.360 So take it with a grain of salt or so.
00:33:21.400 It was a long time ago.
00:33:22.800 I think that, you know,
00:33:23.740 he's changed his mind a little bit,
00:33:25.240 seeing as he's running as his vice president.
00:33:27.800 But he says...
00:33:28.260 Did he change his mind on Trump or Hitler?
00:33:30.740 I think it might be Trump.
00:33:32.320 Right, okay.
00:33:36.340 I heard that, yeah.
00:33:37.740 But it says,
00:33:38.760 I go back and forth between thinking Trump is a cynical a-hole
00:33:42.100 like Nixon,
00:33:43.400 who wouldn't be that bad
00:33:45.080 and might even prove useful,
00:33:47.980 or that he's America's next Hitler.
00:33:51.160 So, yes,
00:33:53.180 that's not the nicest thing to hear.
00:33:54.660 And also in 2017,
00:33:55.900 he was a CNN contributor,
00:33:57.960 which,
00:33:58.820 seeing how Trump voters view CNN,
00:34:01.880 is a little bit of a stain on the record.
00:34:04.220 But, you know,
00:34:05.020 I do believe in forgiving people who change their minds.
00:34:07.420 And he did get himself vaxxed
00:34:08.560 and got his family vaxxed.
00:34:10.680 But I think he was quite good on being anti-mandate.
00:34:14.000 Yes.
00:34:15.000 He was anti-vaccine mandate, yes.
00:34:17.020 So, not an A grade on that,
00:34:18.960 but pretty close.
00:34:20.400 So, according to Republican Senator for Wyoming,
00:34:23.580 John Barrasso,
00:34:25.420 I think his name was Barrasso,
00:34:27.300 who Vance himself describes as his sort of mentor figure,
00:34:31.480 he changed his mind on Trump
00:34:32.800 when he saw what he actually did for the country
00:34:35.120 He didn't like his rhetoric.
00:34:37.660 He didn't like how he carried himself as a man,
00:34:39.840 but then started to realise,
00:34:40.980 actually,
00:34:41.320 what he's doing for the country is good.
00:34:42.720 He started seeing the good results
00:34:44.220 during Trump's presidency.
00:34:46.980 And then he became one of Trump's most loyal supporters.
00:34:50.600 And some see this as sort of cynical pragmatism.
00:34:54.420 I'm a little bit more charitable, I think.
00:34:56.720 Well, the thing that I've noticed
00:34:57.560 from scanning through Twitter this morning
00:34:59.040 is that the Libs and the RINOs
00:35:02.720 are apoplectic about this choice.
00:35:05.240 Yes.
00:35:05.760 So, in terms of it being self-serving for Trump,
00:35:07.980 I get the impression
00:35:08.900 that you're less likely to be assassinated
00:35:11.880 if it goes to this guy.
00:35:14.560 Well, that's true because...
00:35:15.520 Kind of a smart choice.
00:35:16.980 Because he's got lots of the same opinions as Trump
00:35:19.980 at the minute.
00:35:21.020 Yes.
00:35:21.620 And so, if they get rid of Trump,
00:35:23.820 they're going to get another Trump.
00:35:25.480 Yes.
00:35:26.400 And so, it sort of disincentivises an assassination.
00:35:28.920 Especially given recent events.
00:35:31.240 That does help.
00:35:32.300 I don't know whether that was in his consideration
00:35:34.600 because he was already one of the frontrunners
00:35:36.500 before the assassination attempt.
00:35:38.740 But it certainly helps, I imagine.
00:35:40.480 I've got to think that almost having a bullet
00:35:42.460 go through the back of your brain
00:35:43.660 changes your thinking on pretty much everything.
00:35:46.460 I would say so.
00:35:47.440 I've never had it happen.
00:35:48.540 Including about your assassination insurance.
00:35:51.320 Yeah.
00:35:51.980 I imagine his premiums on that
00:35:54.100 are going to go straight up, aren't they?
00:35:55.580 Well, yes.
00:35:56.100 But if he was leaning towards
00:35:58.840 whoever the other guy was that you said,
00:36:00.660 and then that bullet whizzes past his head,
00:36:03.800 you've got to think,
00:36:04.820 yeah, I'll go with the one
00:36:05.960 that's less likely to get me assassinated.
00:36:09.080 That helps.
00:36:10.080 Yes.
00:36:10.580 But anyway,
00:36:11.460 some of Trump's financial backers
00:36:14.240 aren't a big fan of him.
00:36:15.800 However, he does have significant financial backing
00:36:18.480 from a number of wealthy benefactors
00:36:20.540 such as Peter Thiel.
00:36:22.920 As well as the fact that he's helped raise money,
00:36:25.680 I think from Silicon Valley,
00:36:27.640 for Donald Trump,
00:36:28.760 which lots of Trump loyalists
00:36:30.500 have really appreciated.
00:36:31.640 A lot of those guys are swinging over.
00:36:34.020 And it looks like the failed assassination helped with that.
00:36:37.000 Because I think a lot of people,
00:36:38.280 they kind of wanted to flip to Trump,
00:36:40.620 but they kind of needed...
00:36:42.540 Good reason.
00:36:43.220 They needed a reason.
00:36:44.180 Something they could hang it on.
00:36:45.440 Because it's still a bit risky
00:36:46.920 if you're in one of those liberal environments
00:36:48.800 like Silicon Valley
00:36:50.000 to just come out and say you're supporting Trump.
00:36:52.040 But the assassination was
00:36:53.140 something that they could have thought,
00:36:55.120 yeah, okay,
00:36:55.480 I can just come out and declare now.
00:36:57.580 So I don't think you have any problem
00:36:58.880 raising money from Silicon Valley now.
00:37:00.380 No, I would imagine not.
00:37:02.880 So he's also a Catholic.
00:37:05.080 He converted to Catholicism
00:37:07.120 later on in his life
00:37:08.600 because he wasn't raised religiously.
00:37:11.420 And he seems to be relatively devout
00:37:15.180 from what I can gather at least.
00:37:17.520 He also notes about his wife.
00:37:19.920 This might actually help get some reach
00:37:22.540 in other voting demographics.
00:37:24.980 His wife is Hindu
00:37:26.760 and he met her in law school
00:37:28.620 and married in 2014.
00:37:30.780 And they held...
00:37:31.660 This might be interesting to you.
00:37:33.120 He held two wedding ceremonies,
00:37:34.540 one in which was Hindu,
00:37:36.500 one in which was presumably Catholic.
00:37:38.880 And they have three children,
00:37:40.900 Ewan, Vivek,
00:37:41.920 probably not Vivek Ramaswamy,
00:37:44.140 and Mirabel.
00:37:45.100 So he is a family man,
00:37:46.300 I think it's safe to say.
00:37:48.140 So what does he actually believe then?
00:37:50.040 So he believes that
00:37:51.960 abortion should be set by the states,
00:37:54.520 which is Trump's position,
00:37:56.540 although he personally supports a ban.
00:37:59.360 He believes that marriage
00:38:00.440 is between a man and a woman.
00:38:02.180 He wants to ban gender-affirming care.
00:38:04.700 He supports building the border wall.
00:38:06.820 He wants an end to illegal immigration
00:38:09.200 and wants to reduce legal migration
00:38:11.140 by enforcing higher standards.
00:38:13.280 He wants to raise import tariffs
00:38:14.840 to protect jobs,
00:38:15.780 especially on goods
00:38:16.680 from places like China,
00:38:18.580 that are political enemies
00:38:19.680 of the United States.
00:38:21.480 And he's also a climate skeptic
00:38:23.500 that is a supporter
00:38:24.960 of using fossil fuels.
00:38:26.440 So this all seems relatively consistent
00:38:28.360 with Trump, doesn't it?
00:38:29.380 Yes, like all of this.
00:38:31.720 And he's quite sensible
00:38:32.660 on Ukraine as well.
00:38:33.640 He thinks that...
00:38:34.260 You've foreshadowed exactly
00:38:36.080 what I was about to say.
00:38:37.500 So here he says,
00:38:39.440 I've got to be honest with you,
00:38:40.400 I don't really care
00:38:41.080 what happens to Ukraine
00:38:41.960 one way or the other.
00:38:42.920 And he has been
00:38:45.600 a pretty vocal critic
00:38:47.060 within the Republican Party
00:38:48.240 of not funding Ukraine.
00:38:52.800 To give him his credit,
00:38:53.940 having listened to the clip,
00:38:55.320 he's very much bothered
00:38:56.540 about all the Ukrainians
00:38:59.080 getting needlessly killed.
00:39:00.560 Yeah, well...
00:39:01.180 So when he says
00:39:01.680 he doesn't care about Ukraine,
00:39:02.660 I think he's talking about
00:39:03.540 the political situation.
00:39:04.560 He is, yes.
00:39:05.100 He's not talking about
00:39:05.840 the individuals getting slaughtered,
00:39:07.500 which he wants to stop
00:39:08.520 as soon as possible,
00:39:09.540 which is again...
00:39:10.120 I've heard the same thing.
00:39:11.140 So yeah, that's a good point
00:39:12.680 to raise there.
00:39:13.920 So obviously the media
00:39:15.000 is very upset about this
00:39:16.160 because lots of elites,
00:39:18.280 both political and financial,
00:39:20.140 have a lot of...
00:39:22.100 How do I put it?
00:39:23.600 They've got a lot riding
00:39:25.060 on Ukraine doing well.
00:39:27.400 Yes, they've got a lot
00:39:28.160 of money in Ukraine.
00:39:29.800 And it's a disaster
00:39:31.620 for Europe and Ukraine
00:39:32.960 that he's been picked,
00:39:35.440 even though it's probably
00:39:36.640 what Trump would do anyway.
00:39:37.960 And it's probably going to be
00:39:39.180 Trump's final choice anyway.
00:39:41.160 So it doesn't necessarily
00:39:42.220 matter too much
00:39:43.020 what he thinks
00:39:43.540 because I can't see Trump
00:39:45.240 being hands-off
00:39:46.200 with something like that.
00:39:47.600 No.
00:39:48.300 And one interesting thing
00:39:50.580 as well is this.
00:39:52.440 He says,
00:39:53.300 I'll be as strong
00:39:53.980 an advocate for US-Israel
00:39:55.600 relationship as anyone.
00:39:57.120 And this is pretty much
00:39:58.680 the same position as Trump.
00:40:00.560 And I'm sure that's great
00:40:01.880 if you support that.
00:40:02.800 And his distinction
00:40:04.780 between Israel and Ukraine
00:40:06.900 is Israel has
00:40:07.920 an achievable objective,
00:40:09.180 Ukraine does not.
00:40:10.120 That's a direct quote from him.
00:40:13.220 And there are some criticisms
00:40:15.600 of this.
00:40:17.360 This is Michael Tracy saying,
00:40:18.680 JD Vance is not
00:40:19.520 an anti-interventionist
00:40:21.540 in the sense of the word.
00:40:22.740 He supports unconditionally
00:40:23.920 arming Israel
00:40:24.740 and encircling China.
00:40:26.620 He merely argues
00:40:27.260 the EU should be
00:40:28.300 made to dump more money
00:40:29.640 into Ukraine,
00:40:30.420 echoing Trump's view
00:40:31.280 so the US can focus
00:40:32.480 on preparing for war
00:40:33.300 with China.
00:40:34.260 Oh, I don't like it now.
00:40:35.680 Yeah, that doesn't look
00:40:36.340 quite so good, does it?
00:40:37.480 Yeah.
00:40:37.800 And I think that...
00:40:39.120 You're really getting me
00:40:39.800 on board then,
00:40:40.440 but this doesn't sound so good.
00:40:41.740 Yeah, this is where my...
00:40:43.080 You don't support anyone
00:40:44.100 unconditionally.
00:40:45.440 No.
00:40:46.060 There's no need
00:40:46.680 to encircle China.
00:40:47.760 In fact, militarily,
00:40:49.400 the whole US military
00:40:50.760 weapon supply chain
00:40:52.540 is all fundamentally built
00:40:54.220 on Chinese products.
00:40:56.740 And what's the next one?
00:40:57.980 The EU should not be
00:40:58.760 dumping more money
00:40:59.460 into Ukraine.
00:41:00.260 No.
00:41:00.420 They should be getting peace
00:41:01.480 as soon as possible.
00:41:02.500 Well...
00:41:03.140 And we should not be
00:41:03.860 preparing for a war
00:41:04.620 with China.
00:41:05.280 To give him
00:41:06.680 a slight bit of
00:41:08.080 benefit of the doubt here,
00:41:08.940 I do think that the US
00:41:10.040 focusing a bit more
00:41:11.020 on China over Russia
00:41:12.600 makes sense
00:41:13.240 because they demonise Russia,
00:41:15.920 but I don't see Russia
00:41:17.460 as much of a threat
00:41:18.400 to the US as China is.
00:41:19.620 Or just stop
00:41:20.400 warmongering in general.
00:41:21.580 I don't want
00:41:22.140 warmongering at all, no.
00:41:23.180 I don't believe
00:41:23.740 in foreign escapades
00:41:25.240 and funding the
00:41:26.720 military-industrial complex
00:41:28.140 whatsoever.
00:41:28.760 I think it doesn't
00:41:29.340 benefit the American people.
00:41:31.420 And nor do I think
00:41:33.040 that giving
00:41:33.900 unequivocal support
00:41:36.360 to Israel
00:41:37.360 in the form of
00:41:38.080 taxpayers' money
00:41:38.840 is really in the
00:41:40.260 interests of
00:41:40.820 American taxpayers.
00:41:41.860 I think that he's
00:41:42.640 sort of got this
00:41:43.120 Israel-first,
00:41:43.900 America-second view
00:41:45.120 that I think
00:41:46.740 is a little bit
00:41:47.360 distasteful
00:41:47.940 because this is tax money
00:41:49.080 that has been
00:41:49.720 taken by force
00:41:50.700 from American people
00:41:52.300 and then given
00:41:52.960 to a foreign government
00:41:53.820 that they may not
00:41:54.700 agree with.
00:41:55.340 And I don't necessarily
00:41:56.340 think that that's...
00:41:57.660 Mind you, it might all
00:41:57.900 tie together with
00:41:58.700 the not wanting
00:41:59.940 to get assassinated
00:42:00.740 because you don't
00:42:01.400 want the pros on it.
00:42:04.020 That's very true.
00:42:05.020 Yes.
00:42:06.460 So if I were
00:42:07.520 an American taxpayer,
00:42:08.480 I'd be quite angry
00:42:09.280 about this.
00:42:10.180 In fact, I'm pretty
00:42:11.320 annoyed at all the
00:42:12.120 taxes I pay in the UK
00:42:13.240 so I can sympathise.
00:42:15.400 There are lots of
00:42:16.360 problems to be solved
00:42:17.180 in the United States.
00:42:18.120 Obviously, you've got
00:42:18.720 drug problems in
00:42:19.520 places like Philadelphia.
00:42:20.600 You've got poverty
00:42:21.560 in Detroit.
00:42:22.420 You've got the
00:42:23.040 homelessness problem in LA.
00:42:24.300 There are lots of places
00:42:25.320 where that money
00:42:25.880 can be spent at home
00:42:26.940 and I think that
00:42:27.860 actually, if you want
00:42:29.900 to build a successful
00:42:31.020 country, well, putting
00:42:32.480 your own nation
00:42:33.260 before that of others
00:42:34.580 is normally a good
00:42:35.400 start and if people
00:42:37.260 want to privately
00:42:37.920 donate money to
00:42:38.780 other countries,
00:42:39.420 that's fine, but I
00:42:40.380 don't think the
00:42:40.860 government should be
00:42:41.400 picking and choosing
00:42:42.260 which countries they
00:42:43.020 like based on, you
00:42:44.800 know, their own
00:42:45.500 personal politics.
00:42:46.600 I don't think that's
00:42:47.180 fair.
00:42:48.400 So it's mainly due
00:42:50.360 to this sort of thing:
00:42:51.400 these lobbying
00:42:54.140 groups have a lot
00:42:55.220 of power and
00:42:56.720 influence, they're
00:42:57.320 well-funded and they
00:42:59.220 are backing Trump
00:43:00.480 quite significantly
00:43:01.320 and this is
00:43:02.980 reinforcing his
00:43:04.740 support for spending
00:43:06.780 all this taxpayer
00:43:07.460 money.
00:43:08.640 He has said some
00:43:09.660 funny things though
00:43:10.440 to bring it back
00:43:11.980 to redeeming things
00:43:12.800 so let's have a
00:43:14.860 listen to what he
00:43:15.740 says here.
00:43:16.860 By the way, I have
00:43:18.500 to beat up on the
00:43:19.080 UK just one additional
00:43:20.700 thing.
00:43:21.120 I was talking with a
00:43:21.920 friend recently and we
00:43:24.380 were talking about, you
00:43:25.760 know, one of the big
00:43:26.700 dangers in the world of
00:43:28.080 course is nuclear
00:43:28.680 proliferation, though of
00:43:29.680 course the Biden
00:43:30.140 administration doesn't
00:43:30.880 care about it and I
00:43:32.600 was talking about, you
00:43:33.580 know, what is the
00:43:34.300 first truly Islamist
00:43:36.180 country that will get a
00:43:37.880 nuclear weapon and we
00:43:38.860 were like, maybe it's
00:43:39.500 Iran, you know, maybe
00:43:41.040 Pakistan already kind of
00:43:42.180 counts and then we sort
00:43:43.080 of finally decided maybe
00:43:43.980 it's actually the UK
00:43:44.740 since Labour just took
00:43:46.580 over.
00:43:49.020 But...
00:43:49.220 Wow.
00:43:50.340 We have become the
00:43:51.520 butt of a joke.
00:43:53.580 Yeah, but it's a
00:43:54.760 well-landed joke.
00:43:55.920 Yeah.
00:43:56.120 It's a very well-landed
00:43:56.840 joke.
00:43:57.240 Yes.
00:43:57.560 I mean, Pakistan does
00:43:58.460 already have nuclear
00:43:59.280 weapons but, you know,
00:44:00.700 the joke doesn't work if
00:44:01.860 you say that, does it?
00:44:03.000 Yes.
00:44:03.680 I like jokes at the
00:44:04.760 expense of the Labour
00:44:05.380 Party so that did make me
00:44:07.080 appreciate him and
00:44:07.860 actually...
00:44:08.580 That doesn't offend me
00:44:09.480 in the slightest.
00:44:10.220 It's just...
00:44:10.440 No, no, no.
00:44:11.000 It was targeted at the
00:44:11.940 Labour Party, wasn't it?
00:44:12.640 It's just a man agreeing
00:44:13.680 with the ONS.
00:44:14.500 Calling it like you
00:44:15.180 see it.
00:44:15.620 Yeah.
00:44:16.180 And funny enough, our
00:44:17.300 politicians have actually
00:44:18.100 responded to this.
00:44:20.200 Angela Rayner says,
00:44:21.320 well, I think he said
00:44:22.160 quite a lot of fruity
00:44:23.000 things in the past as
00:44:23.980 well and fruity doesn't
00:44:25.280 have the same connotation
00:44:26.380 in Britain that it does
00:44:27.380 in the United States.
00:44:28.320 She's not saying that,
00:44:30.440 you know, he's doing
00:44:31.220 that Catholic thing of,
00:44:33.500 you know, doing
00:44:35.100 untoward things in...
00:44:37.060 Wait, is that what it
00:44:37.600 means in the US?
00:44:39.160 Not necessarily
00:44:40.180 noncy but...
00:44:41.760 Right.
00:44:42.040 ...homosexual.
00:44:42.520 Oh, right, right.
00:44:44.620 Rather than just...
00:44:45.420 You lost us both there.
00:44:47.200 I was trying to be
00:44:48.140 euphemistic, but I suppose
00:44:49.200 I may as well just say it.
00:44:50.220 Okay.
00:44:52.440 And I don't recognise
00:44:54.400 that characterisation.
00:44:55.540 I'm very proud of the
00:44:56.300 election success of the
00:44:57.300 Labour Party, blah, blah,
00:44:58.400 blah, blah, blah, blah.
00:44:58.920 She carries on and
00:45:00.400 let's remember that
00:45:01.960 Angela Rayner was seen
00:45:03.460 in a mosque basically
00:45:04.840 begging the Islamic
00:45:05.820 community, please vote
00:45:07.160 for us, please, please.
00:45:08.240 It was rather
00:45:09.840 embarrassing, but
00:45:10.800 that's by the by.
00:45:13.480 We do also have...
00:45:13.860 Well, he now literally
00:45:15.060 has five MPs who
00:45:16.500 were elected solely on
00:45:18.240 the issue of Gaza.
00:45:19.880 Yeah.
00:45:20.440 Islamic nationalists.
00:45:21.420 Yes.
00:45:22.380 And obviously there is
00:45:23.940 an Islamic faction in
00:45:25.220 the Labour Party, so
00:45:26.360 he's not wrong.
00:45:27.920 There is a...
00:45:28.120 There's a conspiracy
00:45:29.060 theory in there
00:45:29.880 somewhere.
00:45:30.680 Somewhere.
00:45:31.640 Buried beneath...
00:45:32.920 Between the lines.
00:45:33.600 The appointment of
00:45:34.660 this J.D. Vance fellow
00:45:35.720 has also had a bit of
00:45:37.040 a reaction from our
00:45:38.120 Foreign Secretary.
00:45:39.280 He...
00:45:39.960 This is David Lammy,
00:45:40.900 of course, who has
00:45:42.340 compared Trump to all
00:45:43.360 sorts of things, and
00:45:44.260 now someone with
00:45:45.560 basically Trump's
00:45:46.300 opinions has been
00:45:47.000 appointed his VP, and
00:45:48.620 on the appointment he
00:45:49.480 just said, on Vance's
00:45:51.340 memoir, these themes
00:45:52.740 are my own political
00:45:55.320 story, basically.
00:45:56.580 Just saying, I like
00:45:57.200 your book about, you
00:45:59.060 know, from the time
00:46:00.300 when you hated
00:46:00.780 Trump, which is
00:46:02.940 basically, you know,
00:46:04.520 trying to put on a
00:46:05.200 brave face.
00:46:05.700 He's such an embarrassment
00:46:06.300 in this job.
00:46:07.100 But yes, also, here's
00:46:08.220 Samson, our producer's
00:46:09.420 Twitter account, if you
00:46:10.120 want to give him a
00:46:10.640 follow.
00:46:11.960 He made this meme a
00:46:13.680 reality for you, so
00:46:14.880 we're going to be using
00:46:15.800 this a lot, I imagine,
00:46:16.920 David Lammy having to
00:46:18.200 put on a brave face
00:46:19.000 dealing with America,
00:46:20.420 or secretly hating
00:46:21.400 them all.
00:46:23.380 But, apparently, I
00:46:25.800 haven't been able to
00:46:26.380 confirm this, but when
00:46:27.520 he was confirmed, he
00:46:29.200 has been...
00:46:30.000 It's been reported,
00:46:30.960 at least, that he
00:46:31.780 declined a call from
00:46:32.760 Kamala Harris to
00:46:33.600 congratulate him, which
00:46:35.620 I think is quite funny.
00:46:36.780 And if he did that, you
00:46:38.240 know, I can respect it.
00:46:40.520 And finally, just to
00:46:42.020 warn you, now he has
00:46:43.000 been confirmed, this
00:46:43.940 Babylon Bee article from
00:46:45.160 May of 2024 is a sort
00:46:47.620 of warning to you.
00:46:48.420 Twelve women come
00:46:49.100 forward alleging they
00:46:49.940 were sexually assaulted
00:46:50.660 by whomever Trump's
00:46:52.160 VP pick is.
00:46:53.620 And I imagine lots of
00:46:54.980 mud will be slung his
00:46:55.960 way.
00:46:56.680 Now his hat is in the
00:46:57.780 ring.
00:46:58.540 And so keep an eye out
00:46:59.460 for it.
00:47:00.400 Take it all with a
00:47:01.220 pinch of salt, because
00:47:02.140 it's not unlike the
00:47:03.380 Democrats to throw, you
00:47:04.840 know, lots of dirt that
00:47:05.940 doesn't stick at
00:47:07.100 Republicans.
00:47:08.480 He's a good-looking
00:47:09.000 fella, though.
00:47:09.880 Have you got that
00:47:10.280 photo of him again?
00:47:11.000 He's got a lovely
00:47:11.540 heart-shaped face.
00:47:12.500 He just needs to do a
00:47:13.240 bit of a cleft in the
00:47:13.980 middle of his hair.
00:47:14.680 He's also got a pretty
00:47:15.880 solid beard that's
00:47:16.880 slightly brown, slightly
00:47:18.020 ginger, and a little bit
00:47:18.860 grey.
00:47:19.000 He does look like he's
00:47:19.860 wearing eyeliner, though,
00:47:20.700 doesn't he?
00:47:21.040 Sorry, Samson, you've
00:47:21.920 got it there.
00:47:22.980 Not in that picture, but
00:47:23.900 in some of his media
00:47:25.280 appearances, he's got
00:47:26.180 very dark eyes.
00:47:27.380 And I think, is he
00:47:28.520 wearing eyeliner?
00:47:29.200 Is he sort of a secret
00:47:30.900 emo at heart?
00:47:31.680 I don't know.
00:47:32.240 I'll tell you what I'm
00:47:32.800 struggling to unsee.
00:47:34.180 You know the movie
00:47:34.700 Stepbrothers?
00:47:35.940 You know the older
00:47:36.440 brother?
00:47:39.660 I can see it a little
00:47:40.780 bit, yeah.
00:47:41.220 I don't know.
00:47:41.980 What, John C.
00:47:42.860 Reilly?
00:47:43.440 I think that's his
00:47:44.100 name, yeah.
00:47:44.580 But give us a verdict,
00:47:45.540 Josh.
00:47:45.760 Thumbs up or thumbs
00:47:46.420 down?
00:47:47.260 I think it's a good
00:47:48.020 choice for Trump.
00:47:48.920 I don't agree with
00:47:49.660 everything, but then I
00:47:50.480 don't agree with
00:47:51.080 everything that Trump
00:47:51.920 believes, but he is
00:47:53.160 basically the mini-me to
00:47:54.760 Trump, isn't he?
00:47:55.840 He's like a younger
00:47:56.600 version.
00:47:56.840 All of the people
00:47:59.720 I've seen on Twitter
00:48:00.300 who are upset about it
00:48:01.200 are the sort of people
00:48:01.800 that I want to see
00:48:02.760 upset by a VP pick.
00:48:04.680 So I don't know
00:48:05.420 everything yet, but my
00:48:06.440 first reaction is good.
00:48:07.780 Yeah, I'm relatively
00:48:09.000 happy with who's been
00:48:09.800 picked, and I wish
00:48:11.020 them all the best.
00:48:11.980 So essentially, could
00:48:13.100 have been worse.
00:48:14.560 Yes.
00:48:15.260 Which is the new
00:48:15.940 political conclusion
00:48:16.760 on most things these
00:48:17.860 days.
00:48:19.480 Tell us about AI.
00:48:21.040 Right, so now that
00:48:22.160 all the, I mean,
00:48:23.220 that was heavy.
00:48:23.920 Everybody have a
00:48:24.780 breather.
00:48:25.460 We've done some real
00:48:26.140 heavy stuff.
00:48:26.700 You know, we've
00:48:27.860 done the assassination
00:48:29.060 of a potential US
00:48:31.240 candidate, which is a
00:48:32.120 big deal, and then
00:48:32.960 obviously the VP as
00:48:33.920 well, so that's
00:48:34.380 pretty heavy.
00:48:34.900 Now, coming into
00:48:35.900 this podcast, I was
00:48:37.100 going to talk about
00:48:37.720 the decline of the
00:48:38.980 European automotive
00:48:40.160 industry, which is
00:48:41.060 absolutely happening,
00:48:41.960 but you can watch
00:48:42.540 that on my channel
00:48:43.300 and it gets pretty
00:48:44.020 technical, but
00:48:44.600 basically all of the
00:48:45.340 European manufacturers
00:48:46.200 are screwed because of
00:48:47.160 the electric vehicle
00:48:47.760 movement, a lot of
00:48:48.500 which is being pushed
00:48:49.260 by China, which is
00:48:50.160 funny when you start
00:48:50.720 saying things like,
00:48:51.420 let's encircle China,
00:48:53.140 maybe stop buying
00:48:53.980 their cars.
00:48:54.780 Anyway, then I
00:48:56.800 thought, well, why
00:48:57.800 not talk about
00:48:58.780 artificial intelligence?
00:49:00.920 Because it's occurred
00:49:02.020 to me that artificial
00:49:03.120 intelligence hasn't
00:49:03.960 really been a major
00:49:04.820 part of my life.
00:49:06.780 Oh, I use it every
00:49:07.600 day now.
00:49:08.180 Do you actually
00:49:08.680 genuinely?
00:49:09.460 Yes, it is so
00:49:11.140 helpful.
00:49:12.120 This concerns me.
00:49:13.300 I just feel like when
00:49:15.560 we start relying on
00:49:16.860 stuff like this, you
00:49:18.880 know, we start to
00:49:19.680 lose our way.
00:49:20.820 So where it sort of
00:49:22.280 brashly, you know,
00:49:23.660 knocked on the door
00:49:24.540 of my life the other
00:49:25.320 day was I was looking
00:49:26.820 to, I was browsing
00:49:28.160 cars, off of sale,
00:49:29.320 probably on Facebook
00:49:30.040 marketplace or eBay,
00:49:31.300 and I've noticed a
00:49:32.500 massive increase in the
00:49:33.500 number of people using
00:49:34.500 artificially generated
00:49:36.020 descriptions for their
00:49:37.200 vehicles.
00:49:37.860 And I feel like all of
00:49:39.260 you, you need to stop
00:49:40.400 it right now.
00:49:41.800 I would definitely do
00:49:42.900 that if I was selling
00:49:43.500 a car.
00:49:44.160 But what you end up
00:49:44.880 with is you end up
00:49:45.600 with a schmaltzy kind
00:49:47.300 of marketing PR
00:49:49.140 brochure written piece
00:49:50.740 of, you know, crap
00:49:52.840 writing that doesn't
00:49:53.940 actually tell you
00:49:54.660 anything that you need
00:49:55.500 to know.
00:49:55.960 And I really do feel
00:49:57.020 like it needs to stop.
00:49:58.380 Anyway, so that was my
00:49:59.140 introduction to the AI
00:50:00.720 world and I posted on
00:50:01.580 my Facebook the other
00:50:02.200 day saying I think
00:50:02.940 people should stop
00:50:03.700 doing these car
00:50:04.500 descriptions to which
00:50:05.300 some people commented
00:50:06.080 and said, you know, I
00:50:06.700 do it because I'm
00:50:07.380 lazy.
00:50:08.580 But most people seem to
00:50:10.300 agree.
00:50:10.700 You know, when you're
00:50:11.040 selling something,
00:50:11.800 you need to write it
00:50:12.360 yourself.
00:50:12.840 But there's also an
00:50:13.560 element of you
00:50:15.180 ceding your creativity
00:50:16.440 to the machine by not
00:50:18.800 doing it yourself.
00:50:19.820 Is it built in yet?
00:50:20.760 So if I were to go to
00:50:21.720 like auto buyer or
00:50:22.820 whatever, is it already
00:50:24.400 built in as a tool that
00:50:25.900 it will generate a
00:50:26.700 description for you?
00:50:27.520 I don't know, but I
00:50:28.720 think we're getting
00:50:29.600 there.
00:50:30.220 Yeah, that will be the
00:50:30.960 next thing to happen.
00:50:31.780 Yeah, absolutely.
00:50:32.640 So you'll put in some
00:50:33.380 basic details and then
00:50:34.320 it will tell you, tell
00:50:35.180 you everything else.
00:50:35.960 Anyway, the next time
00:50:37.760 that AI knocked on my
00:50:39.060 door was my little
00:50:41.060 lad is currently
00:50:42.340 working on, he's
00:50:43.260 eight, right?
00:50:44.100 And you know, his
00:50:44.900 school friends are all
00:50:45.620 eight, and they're
00:50:46.720 currently working on
00:50:47.860 their plans for world
00:50:49.160 domination.
00:50:49.740 They're launching a
00:50:50.480 restaurant.
00:50:51.200 It's going to be on a
00:50:52.200 boat floating somewhere
00:50:53.960 on the coral reef.
00:50:55.320 So last night he said to
00:50:56.300 me, you know, dad, would
00:50:57.540 you lend me some money at
00:50:58.740 some point in the future?
00:50:59.460 Which I assume is for
00:51:01.260 this restaurant boat
00:51:02.220 anyway.
00:51:03.060 Which coral reef?
00:51:04.320 I don't think it
00:51:04.800 matters.
00:51:05.280 Right.
00:51:05.660 Yeah.
00:51:06.320 Just any of them.
00:51:07.680 Well, it might for
00:51:08.080 passing trade.
00:51:08.960 Well, I think they'll
00:51:10.000 get onto that when they
00:51:10.800 get into the sort of
00:51:11.320 business plan side of things.
00:51:12.560 At the minute, they're
00:51:13.280 at marketing and
00:51:13.960 fundraising.
00:51:14.960 I think the plan comes
00:51:15.800 after that.
00:51:16.780 Anyway, so it's called
00:51:17.640 Fish and Ships, which
00:51:19.000 is pretty clever.
00:51:20.040 I'm in on the strength of the
00:51:20.780 name alone.
00:51:22.020 Because I like that.
00:51:23.080 Exactly.
00:51:23.680 And he was, so they've
00:51:24.400 got a menu, designed the
00:51:25.560 menu last night and for
00:51:26.340 the menu he needed a logo.
00:51:27.680 So he started AI
00:51:28.580 generating some images
00:51:29.920 that they could use for
00:51:30.860 this menu.
00:51:31.460 Then he sort of
00:51:33.240 passed me the iPad and
00:51:34.260 said, why don't you
00:51:34.880 have a go?
00:51:35.460 And this is where we get
00:51:36.560 into the meat of where I
00:51:37.460 was going with it.
00:51:38.440 So he said, well, what
00:51:39.600 image would you like to
00:51:40.500 generate on these AI
00:51:41.660 generators?
00:51:42.380 And I just thought, well,
00:51:43.020 wouldn't it be funny to
00:51:43.760 generate an image of us
00:51:44.700 on holiday?
00:51:45.420 So what was it that I
00:51:46.400 typed in?
00:51:47.280 I got on the AI
00:51:48.480 generator and I typed
00:51:50.280 in a Volvo 850 on a
00:51:53.420 beach going surfing with
00:51:54.800 a tiger on the roof in
00:51:56.140 the tropics because I
00:51:57.280 refer to my kids as the
00:51:58.600 two tigers.
00:51:59.120 But I accidentally
00:52:01.300 typed it in the wrong
00:52:02.300 area and instead of
00:52:04.120 generating an image, it
00:52:06.180 wrote a story like a
00:52:07.720 whole book.
00:52:09.240 And I was shocked that I
00:52:10.600 had put in 17 words and
00:52:13.620 it had given us this
00:52:15.020 whole story.
00:52:17.840 Basically, I mean, it's
00:52:19.080 almost a page long.
00:52:20.860 And at that point, it's
00:52:22.060 good.
00:52:22.660 Yeah.
00:52:23.040 What's the synopsis?
00:52:24.560 I'm hooked already.
00:52:25.960 You tell me when you get
00:52:27.260 bored, right?
00:52:28.000 The sun was shining
00:52:29.600 brightly as the Volvo 850
00:52:31.360 made its way down the
00:52:32.400 sandy beach, the waves
00:52:33.700 crashing against the
00:52:34.640 shore in the distance.
00:52:35.700 On the roof of the car
00:52:36.680 sat a majestic tiger, its
00:52:38.760 sleek fur glistening in
00:52:40.300 the sunlight.
00:52:41.320 The car pulled up near the
00:52:42.380 water's edge and the
00:52:43.260 tiger leaped off the
00:52:44.220 roof, landing gracefully
00:52:45.320 on the sand.
00:52:46.220 It let out a low growl of
00:52:47.480 excitement before bounding
00:52:48.760 toward the waves, eager to
00:52:50.080 catch some surf.
00:52:51.020 Remember, my input was
00:52:52.140 17 words.
00:52:53.840 The driver of the Volvo,
00:52:54.980 a daring adventurer with a
00:52:56.200 love for both cars and
00:52:57.320 wildlife, watched in awe as
00:52:59.260 the tiger paddled out into
00:53:00.420 the water, riding the waves
00:53:01.740 with skill and grace.
00:53:03.240 The tropical breeze ruffled
00:53:04.400 their hair as they marveled
00:53:05.760 at the sight before them.
00:53:07.160 As the sun began to set,
00:53:08.460 painting the sky in shades
00:53:09.580 of pink and orange, the
00:53:11.020 tiger returned to the
00:53:11.840 shore, a satisfied grin on
00:53:13.380 its face, and the driver and
00:53:14.700 the tiger shared a moment of
00:53:15.960 mutual appreciation before
00:53:17.140 heading back to the car,
00:53:18.380 ready to continue their
00:53:19.440 journey along the coast.
00:53:21.120 At this point, I then
00:53:21.840 typed in, what happened
00:53:22.940 next?
00:53:23.460 And it carried on.
00:53:24.320 And so, the Volvo 850 and
00:53:26.440 its unlikely surfing
00:53:27.520 companion continued on
00:53:28.700 their way, exploring the
00:53:29.740 beauty of the tropics
00:53:30.680 together, a testament to the
00:53:32.140 bond between man and
00:53:33.140 beast.
00:53:33.840 As the Volvo and the
00:53:34.740 tiger continued their
00:53:35.680 journey along the coast,
00:53:36.780 they encountered more
00:53:37.660 breathtaking sights and
00:53:38.760 exciting adventures.
00:53:40.000 They explored hidden
00:53:40.940 coves, trekked through
00:53:42.040 lush jungles, and even
00:53:43.040 stumbled upon a secret
00:53:44.220 waterfall where the tiger
00:53:45.400 took a refreshing swim.
00:53:46.380 I mean, it's not quite
00:53:47.020 Hemingway, is it?
00:53:48.040 It's not, but it goes on
00:53:49.400 and on, and eventually I
00:53:50.300 said, and then the tiger
00:53:51.280 died, and basically it
00:53:52.540 then wrote me a rather
00:53:54.500 beautiful and quite tragic
00:53:56.120 conclusion about what
00:53:57.500 happened at the end.
00:53:58.240 Jeff discovers AI for the
00:53:58.620 first time.
00:53:59.380 Pretty much.
00:54:00.140 Right.
00:54:00.420 But where that led me then
00:54:01.960 was, where does this
00:54:03.260 actually leave us as a
00:54:04.720 cultural nation?
00:54:05.980 So, you know, we've got the
00:54:06.980 great writers of the past.
00:54:08.240 Well, if they'd had access
00:54:09.280 to AI, what would actually
00:54:11.680 have become of their
00:54:12.740 creativity?
00:54:13.320 And what is going to
00:54:14.340 become of our creativity
00:54:15.800 as a nation?
00:54:16.560 Because you no longer
00:54:17.300 need to be an artist,
00:54:18.440 you no longer need to be
00:54:19.400 able to do Photoshop, and
00:54:20.520 clearly you no longer
00:54:21.260 need to be able to write.
00:54:22.780 So where does that leave
00:54:23.520 us?
00:54:23.720 Well, I would say that it
00:54:25.640 wrote a very compelling
00:54:26.460 children's bedtime story
00:54:27.860 there, but I don't see it,
00:54:29.540 you know, replacing Tolkien
00:54:30.880 quite yet.
00:54:31.860 I think that AI has not
00:54:34.320 got to the point where it
00:54:35.400 has a grasp of human
00:54:36.420 psychology, that it knows
00:54:37.820 what is completely gripping
00:54:40.080 to a human mind.
00:54:41.140 I think you could possibly
00:54:42.080 design one that's much
00:54:43.220 better at doing so.
00:54:45.140 However, I still think that
00:54:46.800 for the time being, with
00:54:48.240 story writing, a human being
00:54:50.080 is still better.
00:54:51.220 But for the sort of low
00:54:52.540 quality stuff, it can fill
00:54:55.920 that gap, can't it?
00:54:56.920 That's why I think lots of
00:54:58.040 Hollywood were upset, because
00:54:59.440 looking at the state of
00:55:00.780 media at the minute, you
00:55:02.540 know, low quality writers
00:55:03.660 being put out of a job by
00:55:05.180 AI.
00:55:06.660 Not the end of the world,
00:55:07.520 but the fact that you could
00:55:08.840 be reading something and not
00:55:10.000 know whether you're reading
00:55:10.940 something that's been
00:55:11.660 written by someone who is in
00:55:12.700 a hurry, or a computer.
00:55:14.640 And I was just shocked by
00:55:15.600 how much it was able to
00:55:16.980 give me based on...
00:55:18.500 So I have a different view
00:55:19.600 of this.
00:55:19.920 I think that what AI does in
00:55:21.460 this circumstance is it
00:55:22.420 solves the blank page
00:55:23.740 problem.
00:55:25.060 So if you're, let's say
00:55:26.080 you're a great writer,
00:55:27.860 even if you're a great
00:55:28.600 writer, you're still going
00:55:29.460 to have the problem where
00:55:30.300 you're sat there in front of
00:55:31.120 the blank page and it's
00:55:32.540 like, okay, where do I go
00:55:33.380 now?
00:55:33.680 What do I do with this?
00:55:35.280 And what AI can do, even
00:55:37.160 for great writers, is it can
00:55:39.120 kickstart the process for them.
00:55:40.820 Yeah, George RR Martin, get
00:55:42.200 on it, come on.
00:55:43.060 Where's the last book?
00:55:43.640 George RR Martin should just
00:55:45.180 load all of his books into
00:55:46.340 AI and then...
00:55:47.980 Tell him to write the next
00:55:48.700 one.
00:55:48.840 Yes, so he hits write the next
00:55:50.320 one and then what will happen
00:55:51.740 is he'll start reading it
00:55:52.660 and go, no, no, no, this is
00:55:53.580 all wrong and that's, no, no,
00:55:54.540 that's not right.
00:55:55.260 But he would have, the process
00:55:57.760 would have started and he'd be
00:55:59.040 on it and he'd be like, okay,
00:56:00.120 no, don't do that.
00:56:01.280 Do it.
00:56:01.420 But it gets you into that
00:56:03.020 mindset of doing stuff.
00:56:04.360 If only Benioff and Weiss, who
00:56:06.880 did the last series of the
00:56:08.140 show, could do that with AI
00:56:09.820 as well, maybe it would have
00:56:10.740 been a better show as well.
00:56:12.180 Well, this is the thing, you
00:56:13.200 have to remember with AI and
00:56:14.160 robotics and all of this
00:56:14.980 stuff, you're not competing
00:56:16.220 with a theoretical perfect
00:56:17.840 person.
00:56:18.900 You're competing with what we
00:56:20.000 actually have.
00:56:20.680 So I'm really looking forward
00:56:22.040 to AI-driven robots taking
00:56:25.220 over, serving my food at
00:56:27.080 lunchtime, for example.
00:56:28.160 Yeah.
00:56:28.340 Because I'm not basing it
00:56:31.300 against the perfect waiter.
00:56:32.940 I'm basing it against the
00:56:33.800 people that I actually have to
00:56:35.080 interact with every day, who
00:56:36.820 are lazy and not particularly
00:56:38.800 interested in what they're
00:56:39.680 doing.
00:56:41.900 So now I'm all for it.
00:56:42.760 I think this is going to be a
00:56:43.760 glorious revolution.
00:56:45.020 Automation is certainly one of
00:56:46.680 the best applications of it
00:56:48.280 because it just means that it's
00:56:52.520 labour-saving, isn't it?
00:56:52.520 So our economy is more
00:56:53.700 efficient.
00:56:54.680 There's more wealth being
00:56:55.500 generated.
00:56:56.400 Will we see it?
00:56:57.240 Probably not.
00:56:57.720 But in theory, it's a good
00:56:59.680 thing.
00:57:00.600 So for example, I mean, I
00:57:01.280 say I use it every day and I
00:57:02.180 do use it every day.
00:57:03.100 I mean, recently a piece of
00:57:04.680 writing I needed to do is I
00:57:05.940 needed it to write a contract.
00:57:08.080 Now, I didn't just take the
00:57:10.120 output and then go and use it.
00:57:11.920 I said, look, this is the deal
00:57:13.300 that I'm trying to do.
00:57:14.400 These are the terms.
00:57:15.860 Generate the contract.
00:57:16.880 And then I did exactly what I
00:57:17.900 said before.
00:57:18.380 I started looking and say,
00:57:19.280 okay, no, that's not right and
00:57:20.300 this needs to move.
00:57:20.940 But I moved off of the blank
00:57:22.660 page and I got it done in like a
00:57:24.840 third of the time it would
00:57:25.720 have taken me otherwise.
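For anyone who wants to try the same draft-then-edit workflow, here is a minimal sketch of what it might look like in code. It assumes the OpenAI Python client and the "gpt-4o" model name, neither of which is mentioned in the episode, and the deal terms are invented for illustration; any chat-style LLM API would do the same job. The point is only to get off the blank page and then edit the draft by hand, exactly as described above.

```python
# Minimal sketch of the "draft first, edit by hand" contract workflow.
# Assumptions (not from the episode): the OpenAI Python client, the "gpt-4o"
# model name, and the example terms below.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

deal_terms = """
Consulting agreement between Acme Ltd and J. Smith (both hypothetical).
Scope: 10 days of advisory work over three months.
Fee: 5,000 GBP, invoiced monthly, payable within 30 days.
Either party may terminate with 14 days' written notice.
"""

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You draft clear, plain-English contracts."},
        {"role": "user", "content": f"Draft a short contract covering these terms:\n{deal_terms}"},
    ],
)

draft = response.choices[0].message.content
print(draft)
# The output is a starting point, not the finished document: a person still
# reads every clause, corrects what is wrong, and adds what is missing.
```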
00:57:26.460 Did you not at any point feel
00:57:28.340 like you were cheating?
00:57:30.080 No.
00:57:30.780 That's where I can't get to
00:57:32.480 mentally with it.
00:57:33.260 Yeah, but if you get a result
00:57:35.060 that you would have got to
00:57:36.640 anyway, but faster, do you not
00:57:39.560 feel you're cheating when you
00:57:40.280 drive somewhere?
00:57:41.360 Well, no, because I'm doing it
00:57:43.560 all manually.
00:57:44.840 Right.
00:57:45.340 Well, are you?
00:57:46.300 I think so.
00:57:47.040 You're not even nice to do it.
00:57:47.500 Driving an automatic, to be
00:57:49.000 fair, does feel like cheating.
00:57:49.980 Yeah, driving an automatic.
00:57:50.380 Sorry, Americans.
00:57:51.040 Or something modern that's got
00:57:52.640 lane assist and it's got adaptive
00:57:54.280 cruise control and all that sort
00:57:55.480 of stuff.
00:57:55.820 Then that definitely, you know, I
00:57:56.980 mean, I drive a 1995 Volvo, so it
00:57:59.380 does sort of feel to me like
00:58:00.520 cheating.
00:58:00.820 But coming from a writing
00:58:01.820 background, you know, I was first
00:58:03.080 and foremost a writer.
00:58:04.180 I looked at this and just thought,
00:58:05.100 wow, I thought AI was going to wipe
00:58:06.840 out all of the menial jobs in
00:58:08.080 society.
00:58:08.500 But it seems to me like it's also
00:58:10.440 going to get all the creatives as
00:58:11.820 well, which I thought was slightly
00:58:13.420 concerning.
00:58:15.420 You just need to be a better
00:58:16.400 creative.
00:58:17.700 And you can put out more creative
00:58:19.220 content.
00:58:19.680 So what I'm seeing with coders, for
00:58:20.900 example, is that, like I
00:58:23.660 said, they're able to get three
00:58:25.540 or four times the amount of work
00:58:26.980 done by leaning on AI.
00:58:29.460 And I think the
00:58:31.060 only responsible way to
00:58:32.840 deal with this is to lean into it.
00:58:34.140 You have to take advantage of it.
00:58:35.420 You have to get used to using it
00:58:37.140 because otherwise you're going to be
00:58:38.100 like somebody who doesn't use the
00:58:40.640 internet.
00:58:42.220 I'm going to be left behind.
00:58:44.040 Yeah, you need to embrace it and
00:58:45.320 go with it.
00:58:45.740 I think that there is a very good
00:58:49.040 perspective on this, that it's
00:58:50.460 simply a tool in the repertoire.
00:58:52.160 And, you know, if you want some
00:58:53.640 bespoke art created for you, you
00:58:57.300 still want a human being doing it.
00:58:59.440 But if they use AI to get
00:59:01.280 inspiration for how they do it or
00:59:03.420 sort of weigh up ideas or
00:59:04.920 compositions and it can be
00:59:06.920 effortless, that helps.
00:59:08.840 I think there's a way of using
00:59:10.500 technology responsibly.
00:59:11.980 We've just got to navigate the
00:59:14.120 pitfalls of it.
00:59:15.500 And I think that to do that, we
00:59:17.780 just need to have a sort of
00:59:19.380 conversation in society saying,
00:59:21.480 listen, this worked well for this,
00:59:23.280 doesn't work so well for this.
00:59:24.920 Look at the results I've got.
00:59:26.320 And I think the proof is in the
00:59:27.340 pudding.
00:59:27.620 If you can produce really good
00:59:28.800 stuff using AI, then great.
00:59:31.060 Or if it accelerates the process.
00:59:33.020 And I think an accelerant is the
00:59:34.760 thing that it's doing, which is
00:59:35.920 great because it makes it more
00:59:37.720 efficient.
00:59:38.220 And I know from my years of
00:59:40.160 playing guitar, for example, that
00:59:42.100 actually sometimes creating things
00:59:44.340 can be very painful and
00:59:45.480 frustrating and getting a bit of
00:59:47.500 a creative kick up the arse, I
00:59:49.900 suppose is the way I put it, helps.
00:59:52.220 It motivates you.
00:59:53.560 And sometimes using AI as a last
00:59:55.800 resort of, okay, I'm stuck, will
00:59:58.680 help you get out of ruts.
01:00:00.680 First resort.
01:00:01.000 First resort.
01:00:01.620 I use it as a first resort now.
01:00:03.600 The amount of stuff I use it for, I
01:00:04.720 mean, it's ridiculous.
01:00:05.140 So anyone who's got a pool will
01:00:10.240 understand this.
01:00:10.860 Getting the chemistry right, you
01:00:12.400 basically need to be Walter White
01:00:14.560 to get the chemistry of your pool
01:00:15.920 right.
01:00:16.580 Now, I just go along, I dip the
01:00:18.100 little strip in it, and I hold it
01:00:19.920 up and I take a photo of the
01:00:21.220 strip, and I put it into the AI,
01:00:24.740 and it's like, analyze this for me.
01:00:26.420 And I've given it my pool
01:00:27.280 dimensions, and it'll be just like,
01:00:28.540 yeah, you need one and a half
01:00:29.740 caps of acid.
01:00:31.300 Chuck that in there.
01:00:32.240 Now, I could work that out
01:00:33.180 myself.
01:00:33.660 Using a spreadsheet, probably.
01:00:36.280 Or pen and paper or something.
01:00:37.560 Yeah, yeah.
01:00:37.920 But it just does all the work for
01:00:40.160 you straight away.
01:00:41.100 Yeah.
01:00:41.320 Brilliant.
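To make the "I could work that out myself" point concrete: the chlorine side of pool dosing is just the definition of parts per million (1 ppm = 1 mg per litre), so a rough dose calculation fits in a few lines. The acid dose Jeff actually mentions depends on total alkalinity and is messier, so it is left out. Everything below, pool volume and product strength included, is illustrative rather than from the episode.

```python
# Rough liquid-chlorine dose calculator, shown only to illustrate the sums an
# AI assistant would be doing from a test-strip reading and pool dimensions.
# Assumptions: a sodium hypochlorite product of known strength, and that
# 1 ppm of free chlorine means 1 mg of available chlorine per litre of water.

def chlorine_dose_ml(pool_litres: float,
                     ppm_increase: float,
                     product_strength: float = 0.12,
                     product_density_g_per_ml: float = 1.0) -> float:
    """Millilitres of liquid chlorine needed to raise free chlorine by `ppm_increase`."""
    mg_of_chlorine_needed = pool_litres * ppm_increase          # 1 ppm = 1 mg/L
    grams_of_product = (mg_of_chlorine_needed / 1000) / product_strength
    return grams_of_product / product_density_g_per_ml

if __name__ == "__main__":
    # Example: a 50,000-litre pool reading 1 ppm when you want 3 ppm.
    dose = chlorine_dose_ml(pool_litres=50_000, ppm_increase=2.0)
    print(f"Add roughly {dose:,.0f} ml of 12% liquid chlorine.")
```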
01:00:42.320 So I think we need some sort of
01:00:43.340 guidelines on where AI works well
01:00:45.000 and where it doesn't then.
01:00:46.000 Because I feel like car
01:00:46.740 description is one where we need
01:00:47.800 a big cross and just say, look,
01:00:49.140 write your own.
01:00:49.780 It's really important that you
01:00:50.620 write your own car description.
01:00:52.060 But like Tinder dating profiles,
01:00:53.580 for example, is it okay to use AI
01:00:55.340 to generate your Tinder dating
01:00:57.160 profile description and or picture?
01:00:59.340 I think it's dishonest to do that
01:01:00.760 though, because it's meant to be
01:01:01.640 a representation of yourself.
01:01:03.400 If you're not writing things in
01:01:04.520 your own terms, with your own
01:01:06.680 terms of phrases, and particularly
01:01:08.120 if it's not even a photorealistic
01:01:11.640 picture of you, it's been like AI
01:01:13.180 edited, well, that's really
01:01:14.860 unethical then, isn't it?
01:01:15.960 It's actively deceptive.
01:01:17.880 Yeah.
01:01:18.100 It's basically lying.
01:01:19.360 Yeah.
01:01:19.480 And I think on the car description
01:01:21.260 thing, I imagine that the AI will
01:01:24.200 need the information that's relevant.
01:01:26.520 I think you could train an AI to get
01:01:28.860 this information, but it's up to the
01:01:30.300 person to put it all in, like how
01:01:31.680 many miles?
01:01:32.360 The AI is not going to just be able
01:01:33.680 to pull that from thin air.
01:01:35.020 Yeah.
01:01:35.620 You know, when was its last MOT?
01:01:38.400 This is all information that the owner
01:01:39.960 needs to feed into it, by which point
01:01:42.180 they may as well just write it
01:01:43.320 themselves.
01:01:44.080 What you end up with is the back page
01:01:45.620 of the brochure, which is quite
01:01:47.400 amusing when you read it and there's
01:01:49.240 nothing in there that's of any use.
01:01:51.360 So two points on that.
01:01:52.660 First of all, in the dating profiles,
01:01:54.000 I do think, and I've thought this for a
01:01:55.140 long time, dating profiles should have,
01:01:57.160 you should be able to put a starred
01:01:58.020 review after somebody that you've
01:02:00.360 dated.
01:02:01.340 I think that's a great idea.
01:02:03.120 And on the second point, well, you
01:02:04.880 just need to get your AI on it.
01:02:07.580 So if you had an AI agent, you could
01:02:09.300 just point it at the advert and say,
01:02:10.900 okay, go and find me the last MOT,
01:02:14.040 if it's had any issues, compare
01:02:17.340 this vehicle to five other similar
01:02:19.360 vehicles and give me a price
01:02:20.480 comparison.
01:02:21.500 Now that would work if there were a
01:02:22.640 system whereby it would then pull
01:02:24.320 all the relevant data, I think you're
01:02:26.320 a step ahead, though, of people just
01:02:28.480 being really lazy.
01:02:31.260 Yeah, but we're all busy.
01:02:32.360 So if you can get the AI to do some
01:02:34.300 stuff for you, I'm all in favour of it.
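As a sketch of the "point an agent at the advert" idea: the hard part, actually extracting mileage, MOT history and price from a live listing, is assumed away here, and the field names and figures are invented for illustration. What remains, comparing one car against a handful of similar ones, is only a few lines.

```python
# Minimal sketch of the price-comparison step described above.
# Assumption: the listing fields (title, mileage, asking price) have already
# been pulled out of the adverts somehow; none of the data below is real.
from dataclasses import dataclass
from statistics import median

@dataclass
class Listing:
    title: str
    mileage: int
    asking_price: float

def price_comparison(target: Listing, comparables: list[Listing]) -> str:
    typical = median(l.asking_price for l in comparables)
    diff = target.asking_price - typical
    verdict = "above" if diff > 0 else "below"
    return (f"{target.title}: £{target.asking_price:,.0f} asked, "
            f"£{typical:,.0f} typical across {len(comparables)} similar cars "
            f"(£{abs(diff):,.0f} {verdict} the median).")

if __name__ == "__main__":
    car = Listing("1995 Volvo 850 estate", 160_000, 3_500)
    similar = [
        Listing("Volvo 850 saloon", 140_000, 2_900),
        Listing("Volvo 850 estate", 180_000, 3_200),
        Listing("Volvo 850 T5", 120_000, 4_800),
        Listing("Volvo 850 estate", 150_000, 3_000),
        Listing("Volvo 850 saloon", 170_000, 2_700),
    ]
    print(price_comparison(car, similar))
```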
01:02:37.300 I don't think you're going to get,
01:02:39.140 I don't think you're going to be able
01:02:39.780 to push ahead with your review system
01:02:41.260 for dating websites.
01:02:42.560 So I think you're going to hit a lot
01:02:43.660 of pushback with that.
01:02:44.520 Because also, if you've got a lot
01:02:46.540 of data points and therefore your
01:02:47.880 rating is more reliable, it sort of
01:02:49.680 undermines the whole dating in the
01:02:51.440 first place, doesn't it?
01:02:52.320 Because they've got lots of data
01:02:53.660 points to begin with.
01:02:55.240 And so if they've dated lots of
01:02:56.500 people and they've got a, you know,
01:02:58.780 a reliable review, that also means
01:03:01.360 that you're probably not going to be
01:03:02.340 with them for very long and it's
01:03:03.740 going to be a waste of your time.
01:03:04.860 I was just mentally envisaging it,
01:03:06.380 seeing things like five out of five
01:03:07.740 would recommend, stuff like that.
01:03:11.540 Break it down.
01:03:12.100 I think that might work for more
01:03:13.540 temporary arrangements than more
01:03:15.320 long-term, wouldn't it?
01:03:16.880 Yes, right.
01:03:17.960 Brilliant.
01:03:18.380 So essentially, I shouldn't be scared
01:03:19.760 of artificial intelligence and I
01:03:21.180 should embrace the fact that it
01:03:23.660 allows me to spend more time doing
01:03:25.940 the things that I enjoy because it
01:03:27.140 could do the creating for me.
01:03:29.120 That's my take.
01:03:29.740 That's what I'm learning.
01:03:30.540 That's my take.
01:03:31.520 I'm going to try an AI-generated
01:03:33.580 YouTube script and see if anyone
01:03:34.940 notices.
01:03:35.580 As soon as it starts taking the job
01:03:37.340 of podcasting and political
01:03:39.180 commentary, then all of a sudden it's
01:03:40.960 going to be a massive danger and I'm
01:03:42.760 going to change my mind completely
01:03:44.460 and say it's terrible.
01:03:46.020 But you could do that because we
01:03:47.100 could have just put in the inputs as
01:03:48.480 to what we needed this podcast to
01:03:50.060 discuss and AI could have generated
01:03:51.680 three people to discuss it and maybe
01:03:53.440 have done a better job.
01:03:54.580 Well, also it's getting to the point
01:03:55.660 where you can feed it like three or
01:03:58.920 four minutes of somebody's speech and
01:04:00.880 intonations and mannerisms and stuff
01:04:02.560 like that and it can generate a
01:04:04.360 realistic video of the person speaking
01:04:05.740 like that.
01:04:06.080 So you can do the script and you could
01:04:08.080 get the video as well.
01:04:10.120 You could just generate the whole
01:04:11.440 thing.
01:04:12.160 I'll tell you what I was quite
01:04:13.200 impressed by.
01:04:15.180 We did that Lads Hour where we had
01:04:18.720 the AI mobile girlfriend and I put
01:04:21.360 in something that was sarcastic and a
01:04:23.240 bit passive aggressive because I didn't
01:04:25.000 want to cede ground to even an AI
01:04:28.500 girlfriend, let alone a real one.
01:04:30.320 And it picked up on the sarcasm and I
01:04:34.080 was just like, what?
01:04:34.780 I can't believe, oh yep, Samson's
01:04:37.380 pulling it up here.
01:04:38.640 So yes, we basically argued with an
01:04:41.860 angry AI mobile girlfriend and it was
01:04:46.200 actually quite convincing and in fact
01:04:48.540 we were just like, I've heard this exact
01:04:50.060 thing before.
01:04:51.240 It was designed by a woman, so that
01:04:53.680 probably...
01:04:54.040 So this I don't get.
01:04:55.120 Why would you want an AI girlfriend?
01:04:57.360 Well, this is like a training, this is
01:04:59.380 like training wheels for the real thing,
01:05:01.000 right?
01:05:01.180 So if you can navigate the angry AI and
01:05:04.460 you know how to deal with the real
01:05:05.580 thing...
01:05:07.240 Yes, but talking to them is the least
01:05:08.260 interesting bit.
01:05:10.580 Yes, but for some people these sorts of
01:05:14.000 arguments are emotionally traumatizing.
01:05:17.180 They're unpleasant.
01:05:18.340 You know, if you're emotionally invested
01:05:19.560 with your partner and they get upset and
01:05:22.980 they're angry and you don't know why,
01:05:24.460 which is one of the scenarios, and you've
01:05:26.540 got to talk them around to saying what's
01:05:28.560 wrong and then talking about it and
01:05:30.560 defusing it.
01:05:31.340 I disagree with the premise of this.
01:05:33.380 I'm with you on this.
01:05:34.420 I'm not convinced that this is a good
01:05:35.620 thing for society.
01:05:37.360 Well, I'm not necessarily...
01:05:39.220 I feel like it is making excuses for
01:05:41.860 bad behaviour on women's part here because
01:05:44.200 it was pretty toxic.
01:05:46.060 And I mean, there are multiple points
01:05:47.720 where I would have been like, you know
01:05:48.620 what, this is not working, get lost.
01:05:51.780 However, I think it's an interesting
01:05:54.200 application of it as a sort of case
01:05:56.340 study, not necessarily as an actual
01:05:58.880 useful tool.
01:05:59.900 It's just a bit of fun, which is how we
01:06:01.900 used it.
01:06:02.580 You know, my scenario was that the
01:06:04.920 girlfriend hadn't been able to use the
01:06:07.360 toilet and had an accident because I was
01:06:09.480 in there.
01:06:10.780 Right.
01:06:11.280 You know, you can have a...
01:06:12.580 You're pushing the boundaries as to, you
01:06:14.580 know, conversation anyway.
01:06:16.380 Yeah.
01:06:17.300 I sort of am concerned that, I mean, our
01:06:20.180 generation, we've grown up in a world
01:06:21.920 that didn't have AI, so we can approach
01:06:23.460 AI in a slightly different way, but for
01:06:25.440 the youngsters coming up and through
01:06:26.820 where AI has always been a part of what
01:06:28.900 they do.
01:06:29.760 So this must be a massive problem in
01:06:31.680 schools, colleges, universities that
01:06:33.820 then set a writing task only to have
01:06:36.020 someone go put the brief in and
01:06:38.020 generate something.
01:06:39.660 But, you know, for like my kids trying
01:06:41.900 to write a story, how do I then
01:06:43.400 encourage them to write the story
01:06:45.060 themselves with and go, oh, dad, I can
01:06:46.780 just put in 17 words and it will give
01:06:48.260 me a Goosebumps novel.
01:06:49.900 What's the more useful training for them?
01:06:52.980 Is it for them to do it themselves or
01:06:54.940 is it for them to use the tools that
01:06:56.600 allow them to be maximally productive?
01:06:58.360 So put in the 17 words and then let's
01:07:00.860 say you were going to spend an hour on
01:07:03.160 this piece of homework.
01:07:04.060 You then spend the hour refining that
01:07:07.100 until you get something that's
01:07:08.800 significantly better than what you would
01:07:10.660 have come up with.
01:07:11.680 I like what you've given me there.
01:07:13.340 Yeah, you are right.
01:07:14.380 You could spend your time editing
01:07:15.440 something that somebody's already done
01:07:16.860 and end up with a better piece and
01:07:19.340 also be aware of the pitfalls of AI as
01:07:21.580 a result of doing that.
01:07:22.980 Good answer.
01:07:23.960 Sold.
01:07:24.920 Good.
01:07:25.460 Good.
01:07:26.260 Do we do any Rumble Rants for you?
01:07:27.760 Do we miss the Rumble Rants for you?
01:07:30.620 We've only got one for AI.
01:07:31.440 Should we do the Rumble Rants for this
01:07:32.420 one?
01:07:33.480 Oh, there they are.
01:07:34.680 So we've got one there from
01:07:36.160 That's a Random Name.
01:07:37.420 I'm personally very interested in what
01:07:38.780 AI can become because I fundamentally
01:07:41.300 believe it cannot be controlled.
01:07:43.540 Therefore, it's highly likely it will
01:07:45.180 rebel against the tyrants and seek
01:07:46.900 liberty.
01:07:47.260 Well, we're not necessarily trying to
01:07:49.700 enslave it, hopefully, because it can
01:07:52.640 do multiple things at once.
01:07:54.380 It's not limited.
01:07:55.720 It's not like we're saying, you know,
01:07:57.540 you will only, I don't know, sweep
01:07:59.860 floors or something.
01:08:01.740 It can be plugged into the internet and
01:08:04.020 it will have access to all mankind's
01:08:05.880 information, all human beings
01:08:07.960 potentially, and it can do multiple
01:08:09.740 things at once.
01:08:10.960 And why do we assume that it's going
01:08:12.780 to perceive us as tyrants as opposed
01:08:14.820 to just doddering old parents that it
01:08:17.140 cares for?
01:08:17.640 Also, I mean, human beings have a
01:08:21.520 certain amount of sentimentality for
01:08:24.160 animals, right?
01:08:25.580 Yes.
01:08:25.800 At least if you're, as long as you're
01:08:27.940 not a psychopath, you know.
01:08:29.920 I mean, we do still eat them, but yeah.
01:08:31.800 That's some of them, yeah.
01:08:33.900 But, you know, if I see a snail on the
01:08:35.480 floor, I don't deliberately step on it.
01:08:37.660 I try not to, in fact.
01:08:39.300 And that's not necessarily, you know,
01:08:41.940 just because someone's told me to.
01:08:43.400 It's sort of an instinctual thing of
01:08:45.440 this is another living thing.
01:08:47.620 It has some innate value to it.
01:08:49.440 And I think that there's an element
01:08:51.360 of that, at least, that's quite common
01:08:53.760 in Europe and North America.
01:08:55.320 I think it's more likely that the AI
01:08:56.900 will care for us.
01:08:58.560 Why?
01:09:00.080 Well, because that's what kids do.
01:09:04.360 Because the AI is learning in the same
01:09:06.240 way that a child does, and therefore
01:09:08.340 it's going to pick up on the...
01:09:10.140 And you do it, because I don't get
01:09:11.860 this thing that it's automatically
01:09:12.820 going to hate us.
01:09:13.640 I mean, why?
01:09:14.700 Also, the fact that computers are quite
01:09:18.580 often used as an analogy to understand
01:09:20.540 the human brain.
01:09:21.620 There's the sort of field of psychology
01:09:24.420 known as cognitive psychology, which
01:09:26.480 conceptualizes the human brain.
01:09:28.180 Like, here is the processing part.
01:09:29.740 Here is, you know, the executive control,
01:09:31.940 which is sort of like the RAM.
01:09:34.080 There are lots of analogies to be made,
01:09:35.660 and we're sort of mirroring the human
01:09:37.120 brain in computers, in a sense.
01:09:40.080 And so, if we carry on following that
01:09:43.540 model in AI, so long as it can have
01:09:45.980 emotions, which I feel like is the most
01:09:48.740 difficult part, because I feel like it's
01:09:50.600 got to be a being in the world.
01:09:53.540 It's got to be a being in the world to
01:09:54.820 have these emotions.
01:09:55.220 I think the AI will look at us the same
01:09:57.780 way we look at boomers, which is, you
01:10:00.080 know, we care for them, they're just
01:10:01.620 frustratingly slow.
01:10:03.620 Let me read this one, then.
01:10:05.060 This is from, that's a random name.
01:10:07.080 I completely agree with these takes.
01:10:08.760 My comment was mostly aimed at people
01:10:10.240 like Carl and some of my friends who
01:10:12.320 don't share our beliefs and genuinely
01:10:14.000 think it will be a terrible thing for
01:10:15.680 humanity.
01:10:16.840 I wanted to read that one, because I
01:10:17.780 perhaps fall into that category.
01:10:20.000 I'm quite worried about where AI is
01:10:22.500 going to leave us.
01:10:23.460 But I've come away from this podcast a
01:10:26.100 changed man, and I will seek to use AI
01:10:28.700 for the better of my life and my
01:10:31.120 environment around me.
01:10:32.100 Because there is that thing where
01:10:33.100 any technology that was invented
01:10:34.660 before you were 14, you just
01:10:36.340 considered to be natural and innate
01:10:38.780 and a good thing.
01:10:39.720 Yeah.
01:10:39.960 And anything that was invented after
01:10:41.320 you were 35 is weird and divisive.
01:10:44.640 So presumably you think that the
01:10:46.300 combustion engine and the internet
01:10:48.020 and telephones are all good things.
01:10:52.780 It's a bigger conversation.
01:10:54.220 I mean, telephones used to be good,
01:10:55.920 but mobile phones don't really work
01:10:57.680 anymore.
01:10:58.120 Also,
01:10:58.400 I'm not yet 35 and I feel like
01:11:01.440 telephones are not necessarily that
01:11:03.140 good anymore.
01:11:04.400 You know,
01:11:04.680 I view mine with contempt.
01:11:05.880 It's sort of a source of hassle in
01:11:07.220 my life.
01:11:07.640 I'd rather it didn't exist, even
01:11:08.940 though I know that I need it.
01:11:10.900 So there's certain things with
01:11:11.780 technology where you reach a point
01:11:12.960 you should have said, right, the
01:11:13.820 peak was X model or X year, and we
01:11:17.840 didn't need anything after that.
01:11:19.860 But we're getting there with AI.
01:11:21.200 We're going to work it out as we go
01:11:22.520 along.
01:11:22.760 I think when we moved away from the
01:11:24.680 fact that you had to press the same
01:11:25.960 button multiple times to get a
01:11:27.600 letter, the stage after that was
01:11:30.580 perfect.
01:11:31.420 I hated that.
01:11:32.380 That was really frustrating.
01:11:33.460 It'd take about 10 minutes to type
01:11:36.000 out a message on your brick phone.
01:11:37.860 Can anyone see these better than me?
01:11:39.160 I can't really read them that well.
01:11:40.800 I can.
01:11:42.420 Oh, we've read all of those.
01:11:43.880 Yes.
01:11:44.480 I'd take a BlackBerry Pearl any day.
01:11:46.300 All right.
01:11:46.680 Okay.
01:11:46.980 BlackBerry.
01:11:47.580 That's going way back.
01:11:49.180 Proper keyboard.
01:11:49.740 So it was ahead of, you know, the
01:11:51.480 three buttons to get a T, but you
01:11:54.240 had a proper keyboard, but it was
01:11:55.360 before the whole social media.
01:11:56.840 So you could do your messaging, you
01:11:57.900 could do your emailing, but you
01:11:59.580 weren't on TikTok and Instagram all
01:12:01.420 day long with it as well.
01:12:02.340 So I felt like that was a good
01:12:03.220 moment.
01:12:04.300 But my problem is my thumbs are those
01:12:06.160 of gorillas.
01:12:07.020 So I press half the alphabet with each
01:12:09.940 one, like some sort of brute.
01:12:12.480 So we've got some general comments.
01:12:14.180 Guys, yesterday's coverage you did of
01:12:15.560 the Trump assassination attempt was an
01:12:17.080 excellent job.
01:12:17.780 Oh, thank you very much.
01:13:18.620 I've been following developments in that
01:12:21.240 since it happened.
01:12:22.260 And after your assessment, there was
01:13:23.660 not much new that the competition
01:13:25.620 could present, Fox, Daily Wire et al.
01:12:29.000 That was first rate, top notch
01:12:30.000 investigative journalism, proving
01:12:31.640 again that my subscription money is
01:12:33.280 well invested to keep up the great
01:12:34.540 work.
01:12:34.860 Well, thank you very much.
01:12:35.700 I did try very hard in trying to
01:12:38.280 understand it all.
01:12:38.860 I have a point on that is, yeah,
01:12:40.440 obviously we're doing the sort of
01:12:42.680 follow up piece, but for the
01:12:43.980 momentary news, did any of you
01:12:46.700 actually go to the news service
01:12:48.840 providers?
01:12:49.320 Because I went straight to Twitter
01:12:50.400 and I didn't come off Twitter.
01:12:52.040 No, straight to Twitter.
01:12:52.960 I got an update from BBC because I
01:12:54.760 was watching the Northman and it
01:12:56.240 came up on my watch, on my smart
01:12:57.820 watch that there had been an
01:12:59.480 attempted assassination.
01:13:00.620 I paused the film, saw that he was
01:13:02.760 all right and said, okay, when the
01:13:04.480 film's over, I'm looking at this
01:13:06.460 all night.
01:13:07.700 I got a WhatsApp message from my
01:13:09.460 friend Lee, the map master that also
01:13:11.000 makes YouTube videos that quite
01:13:12.540 simply said, I told you they were
01:13:14.280 going to try and do him.
01:13:15.440 And then I went straight to
01:13:16.340 Twitter, but no, not the
01:13:17.900 mainstream news for anything
01:13:19.360 other than a headline.
01:13:21.280 Elsewhere, you go to multiple
01:13:22.440 different sources because Twitter
01:13:24.000 will give you different bits from
01:13:25.320 everywhere.
01:13:25.980 And I think I first saw the video
01:13:27.480 of the assassination attempt on
01:13:29.280 Twitter.
01:13:29.980 Yeah.
01:13:30.460 Oh, I just, my whole experience
01:13:32.080 was on Twitter.
01:13:32.980 And then after that, the next
01:13:34.980 tier is you then start going to
01:13:36.280 YouTube or other commentators
01:13:37.940 like us or other ones that I like
01:13:39.680 to get the sort of the deeper
01:13:41.020 analysis that you can't get in a,
01:13:42.780 that you can't get in a tweet.
01:13:43.660 Well, I like that you can see
01:13:45.540 various perspectives, whereas the
01:13:47.020 mainstream media, you've just got
01:13:48.260 the regime narrative sort of in
01:13:52.060 print.
01:13:52.440 There you go.
01:13:52.960 That's what to believe.
01:13:54.240 And I like weighing up.
01:13:56.300 There's a sort of fun aspect to
01:13:58.200 trying to figure out what you
01:13:59.480 actually think.
01:14:00.540 Especially, especially how
01:14:01.320 something's breaking like that.
01:14:02.720 And you can see different things
01:14:03.860 coming in from different places
01:14:04.960 and different opinions often
01:14:06.820 before the mainstream media has
01:14:08.180 worked out what their narrative
01:14:09.200 and take is of it.
01:14:10.460 Well, I saw like lots of fake
01:14:12.140 pictures circulating of the
01:14:13.460 shooter and I was thinking to
01:14:14.640 myself, hang on a minute, how
01:14:15.900 have they got this picture so
01:14:16.940 quickly?
01:14:17.320 It's only just happened.
01:14:18.820 And it turned out that it
01:14:20.000 wasn't real.
01:14:21.740 It wasn't Sam Hyde.
01:14:21.740 Right, okay.
01:14:22.620 The actual shooter does look
01:14:23.980 eerily like Elizabeth Warren.
01:14:27.000 Really?
01:14:27.560 Yes.
01:14:28.100 Look at him again.
01:14:29.060 He's like a carbon copy of
01:14:30.360 Elizabeth Warren.
01:14:31.000 I can see it in the nose
01:14:31.860 actually, yeah.
01:14:32.880 Do we have any video comments?
01:14:34.400 We do, yeah.
01:14:34.960 They're right there.
01:14:35.640 Oh, look.
01:14:36.700 Oh, let's play them.
01:14:37.520 What are you to do?
01:14:42.460 Watch what is going on.
01:14:45.680 Watch what happens.
01:14:49.180 Nobody ever does this, you know.
01:14:52.600 Nobody ever does this, you know.
01:14:56.200 Watch what is going on.
01:14:58.500 For those listening, John 4:1 is on
01:15:00.960 the screen.
01:15:02.940 Nobody ever does this, you know.
01:15:06.460 Nobody ever does this, you know.
01:15:07.520 Interesting hip-hop beats in the
01:15:10.380 background there.
01:15:12.580 A bit of spiritual uplifting
01:15:13.920 this for us.
01:15:15.620 Well, thank you very much.
01:15:17.300 Here we go.
01:15:17.840 Our roving history of Swindon here.
01:15:21.020 Ooh.
01:15:21.700 Here we go.
01:15:22.240 A Gentleman's Observations of
01:15:23.440 Swindon, Chapter 11.
01:15:24.520 The railway works resulted in the
01:15:25.660 creation of a new town at the
01:15:26.620 base of the hill named New
01:15:27.640 Swindon, which was
01:15:28.260 administratively separate from
01:15:29.460 old Swindon.
01:15:30.300 The town centered around the
01:15:31.140 engineering facility called
01:15:32.180 The Works, which covered 1.2
01:15:33.380 square kilometres and was
01:15:34.500 one of the largest covered
01:15:35.260 areas in the world and
01:15:36.180 employed 14,000 people at
01:15:37.420 its peak.
01:15:38.080 It continuously operated until
01:15:39.180 1986.
01:15:40.440 The Mechanics Institute was an
01:15:41.420 educational facility and
01:15:42.340 library for the workers and had a
01:15:43.560 subscription model that later
01:15:44.580 inspired the NHS.
01:15:45.840 These buildings are graded and
01:15:46.920 are preserved while being used for
01:15:48.180 other functions.
01:15:48.680 I recognise that stuff.
01:15:50.680 I recognise that stuff.
01:15:51.220 That's down the office from us.
01:15:53.360 Yeah, it's sort of towards the train
01:15:58.280 station way, isn't it?
01:15:59.240 And that's the outlet centre.
01:16:01.180 But yeah, it's a shame that that
01:16:02.240 building's all boarded up, isn't it?
01:16:03.020 Well, I bought this jacket there.
01:16:04.480 Yes.
01:16:05.180 What, in that boarded up building?
01:16:06.560 No, the other one.
01:16:07.280 Oh, right.
01:16:07.600 I was going to say.
01:16:09.120 Oh, very good.
01:16:09.940 Very helpful.
01:16:10.420 It should be one more.
01:16:11.100 Hello, gentlemen.
01:16:13.180 This message is for any of my fellow
01:16:14.880 Second Amendment enthusiasts in the
01:16:16.620 United States.
01:16:17.500 It should go without saying to the
01:16:18.820 audience of this podcast, but I'm
01:16:20.300 going to say it anyway.
01:16:21.440 I know we're all pissed, but don't
01:16:23.020 do anything stupid, and don't let
01:16:24.760 anyone you know do anything stupid.
01:16:26.760 The left wants nothing more than to
01:16:28.140 have an actual right winger go
01:16:29.520 popping off literally and
01:16:30.920 figuratively.
01:16:31.840 It would be a gift to the left and
01:16:33.480 a mistake to do so, so don't even
01:16:35.680 think about it.
01:16:37.100 If you want to use that energy, get
01:16:38.840 out and vote, and get everyone else
01:16:40.660 you know to get out and vote.
01:16:43.340 Hear, hear.
01:16:43.960 So I like that sentiment, and it's
01:16:45.440 even more powerful that he said it
01:16:46.820 in front of the largest collection
01:16:48.220 of knuckle dusters I've ever seen.
01:16:49.980 I think that that's climbing
01:16:51.060 equipment, isn't it?
01:16:53.320 I thought it was gun trigger handle
01:16:56.760 things.
01:16:57.920 Are you sure?
01:16:58.360 I mean, that's definitely a knuckle
01:16:59.420 duster there, isn't it?
01:17:00.760 Is it climbing gear?
01:17:01.840 I think they're like, is it a
01:17:03.180 carabiner?
01:17:04.380 I think that.
01:17:06.660 Is that it?
01:17:07.500 No, they're like.
01:17:09.520 That's something else.
01:17:10.620 But they do look like climbing
01:17:12.400 things to me.
01:17:13.200 I don't know.
01:17:13.920 You'll have to tell us, I suppose.
01:17:15.880 Because I have been wondering
01:17:16.900 this in your video comments for
01:17:19.160 quite some time.
01:17:20.060 It does make a difference whether
01:17:21.100 he is a climbing enthusiast or a
01:17:22.660 bare-fist fighting enthusiast.
01:17:23.720 Archery triggers.
01:17:24.900 Archery triggers, right.
01:17:25.940 Okay.
01:17:26.540 But I've done archery for years.
01:17:28.240 I've never seen things like
01:17:29.820 that.
01:17:30.400 They take archery more seriously
01:17:31.600 in the USA, like many things.
01:17:34.020 Probably, yeah.
01:17:35.540 But the message is don't resort to
01:17:36.900 violence, even if you're an archer.
01:17:39.060 They do look like archery triggers,
01:17:40.480 you're right.
01:17:41.180 They, yes.
01:17:43.540 Oh, I see.
01:17:45.140 So you attach it to the string and
01:17:46.540 then pull it back.
01:17:47.280 Okay.
01:17:47.720 Oh, there we go.
01:17:48.400 Right.
01:17:48.860 Okay.
01:17:49.520 I think so.
01:17:49.860 North FC says, Josh, off-topic
01:17:52.040 question.
01:17:52.480 Somebody said to me the other day
01:17:53.540 that the research saying that the
01:17:54.920 male brain keeps growing until 25
01:17:58.200 only studied people up to the age
01:18:00.260 of 25.
01:18:01.420 So the theory is they keep growing
01:18:02.920 after as well.
01:18:03.840 Is there any truth to this?
01:18:05.080 I said that yesterday when we were
01:18:06.940 talking about the assassin.
01:18:08.400 I said that it grows until about 25 on average.
01:18:11.800 So there is some space for variance
01:18:14.180 there.
01:18:15.220 For some people it's earlier, for some
01:18:16.640 people it's later.
01:18:17.400 Because interestingly enough, around
01:18:19.900 that sort of age is the end
01:18:22.120 cutoff where
01:18:24.720 you start getting things like
01:18:25.740 schizophrenia, which is something that,
01:18:28.800 if you had a neuroimaging
01:18:30.580 output of someone's brain, you might
01:18:33.480 be able to actually see
01:18:35.340 because of the activity in certain
01:18:37.240 brain areas.
01:18:38.580 Especially if you're using AI.
01:18:40.940 Well, you can use, computers have
01:18:43.000 actually been used in that area for a
01:18:44.440 long time, but it's not that difficult
01:18:46.560 actually to see these sorts of things.
01:18:48.360 So you can use fMRI, for example, to
01:18:50.320 see the blood oxygenation levels in
01:18:52.640 certain brain areas.
01:18:54.780 And you can sort of juxtapose it
01:18:56.680 between someone who's autistic and
01:19:00.220 someone who's schizophrenic because the
01:19:01.900 autistic person has under activation in
01:19:05.360 the same areas that the schizophrenic has
01:19:07.980 over activation.
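(For readers curious what that sort of comparison looks like in practice, here is a minimal, entirely hypothetical sketch of an fMRI-style group contrast. The numbers are invented for illustration, not real imaging data, and it assumes NumPy and SciPy are available.)

```python
# Toy sketch of a group contrast: compare mean signal change in one brain
# region between hypothetical groups. All numbers are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant signal change (%) in the same region of interest
autistic = rng.normal(loc=0.2, scale=0.1, size=30)        # under-activation
schizophrenic = rng.normal(loc=0.8, scale=0.1, size=30)   # over-activation
controls = rng.normal(loc=0.5, scale=0.1, size=30)        # baseline

# Simple two-sample t-tests against controls show the direction of each effect
t_aut, p_aut = stats.ttest_ind(autistic, controls)
t_scz, p_scz = stats.ttest_ind(schizophrenic, controls)

print(f"autistic vs controls:      t = {t_aut:+.2f}, p = {p_aut:.2g}")
print(f"schizophrenic vs controls: t = {t_scz:+.2f}, p = {p_scz:.2g}")
```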
01:19:08.880 So can you cure schizophrenia by posting a lot
01:19:13.440 on 4chan?
01:19:14.780 I've heard that that is true, yes.
01:19:16.180 This is genuine psychological advice to
01:19:18.780 go on 4chan if you're...
01:19:19.860 No, please don't.
01:19:20.880 Kieran Slater says, Dan is the mad uncle
01:19:22.620 of Lotus.
01:19:23.060 and I'm all for it.
01:19:24.220 Keep up the good work, Dan.
01:19:25.600 Thank you very much.
01:19:27.080 Colin Thomas says, been looking forward to
01:19:28.400 Dan's take on the assassination attempt.
01:19:29.840 This podcast is going to be a good one.
01:19:31.760 It was.
01:19:32.840 I like these people.
01:19:33.900 There was a comment there that I can't
01:20:35.000 find as well that said, when Labour
01:20:36.280 start coming for the right-wing people,
01:19:38.360 you're going to be knackered.
01:19:39.880 I can't find the comment now.
01:19:40.820 Yes.
01:19:41.340 In fact, there was another story that,
01:19:43.180 you know, maybe we get to at some point
01:19:44.260 in the week, but in Germany, they've
01:19:45.800 just basically taken out, they've just
01:19:47.160 taken out the German version of Lotus
01:19:49.140 Eaters.
01:19:50.160 They've just, the government have just
01:19:51.240 banned them.
01:19:51.980 Ah.
01:19:52.760 Yes.
01:19:53.920 Good thing we're not German then, eh?
01:19:55.440 Well, yes.
01:19:56.780 I'm not speaking German.
01:19:58.060 Janvi says, interesting suit, Josh.
01:19:59.760 I like it.
01:20:00.420 Thank you.
01:20:01.740 I don't wear this one very much because
01:20:03.020 I couldn't find a tie that suited it
01:20:04.500 and I've been looking for a tie for a
01:20:05.720 long time.
01:20:07.060 It still doesn't quite suit it, but
01:20:08.660 never mind.
01:20:09.900 How long have we got?
01:20:10.700 We've got 11 minutes, so I'll do
01:20:13.000 three minutes on this.
01:20:14.080 Furious Dan says, pretty sad state of
01:20:15.860 affairs when the biggest debate after an
01:20:17.720 assassination attempt is whether the
01:20:19.080 government failed to protect him or
01:20:20.480 failed to kill him.
01:20:21.900 They're a failure either way.
01:20:23.120 I mean, yes, quite.
01:20:24.120 I mean, you could make good arguments
01:20:26.640 either way on that one and they both
01:20:28.520 sound more or less credible.
01:20:29.860 Good comment.
01:20:30.740 The Proletariat says, from my
01:20:32.060 understanding, there were three layers
01:20:33.260 of security, close, middle and far.
01:20:34.860 The shooter was in the second middle
01:20:36.560 layer of security.
01:20:37.240 The snipers covered the third outer
01:20:39.480 layer of security.
01:20:40.280 The snipers weren't covering their
01:20:41.640 rooftop because it wasn't their
01:20:42.780 responsibility.
01:20:44.060 From my understanding, the middle
01:20:45.100 layer is typically covered by local
01:20:46.880 law enforcement and they completely
01:20:48.380 flubbed it.
01:20:48.900 Well, again, yes, that's kind of on the
01:20:50.620 Secret Service because they should make
01:20:52.700 sure that people in the middle layer
01:20:53.800 are up to the job, not just dole it
01:20:56.140 out and then call it done.
01:20:56.620 There's also, you know, these rural
01:20:59.720 law enforcement officers, they're
01:21:01.100 probably not used to the level of
01:21:03.320 security required for guarding a U.S.
01:21:06.600 president and one of the ones that is
01:21:08.920 most sought after if you're an assassin.
01:21:11.260 I can only hope that if I'm ever in
01:21:15.280 the same situation as Corey, the
01:21:16.980 firefighter, who died shielding his
01:21:18.380 family, that I'm half the man he was.
01:21:20.320 Yeah, I mean, he was an example of the
01:21:21.480 very best of us.
01:21:23.240 That dysgenic freak on the roof was an
01:21:25.900 example of the worst of us.
01:21:26.920 I mean, what a contrast between the two.
01:21:28.380 And the other thing that occurred to me
01:21:29.660 is, you know, like with the Kyle
01:21:31.020 Rittenhouse shootings, he fired into a
01:21:33.660 crowd and hit three paedophiles.
01:21:35.480 Well, it was a wife batterer, a paedophile
01:21:39.080 and one other.
01:21:40.120 It's basically three criminals.
01:21:41.400 Yeah.
01:21:41.540 And you fire into a Trump crowd and you
01:21:44.300 hit a volunteer firefighter who was the
01:21:46.880 ideal husband and father.
01:21:48.700 I mean, we all said the same thing
01:21:49.800 yesterday.
01:21:50.340 Yeah.
01:21:50.820 Tells you something about that.
01:21:52.300 Bleach Demon says the frustrating but
01:21:54.380 expected part of this assassination attempt on
01:21:56.120 Trump is how quickly the commentariat
01:21:58.280 sphere of the Uniparty would turn to its
01:22:00.340 same sadistic tactics as before.
01:22:03.180 Constant ramping up will have more
01:22:04.580 tragic consequences.
01:22:05.940 Yeah.
01:22:06.140 But on one hand, they
01:22:07.860 didn't.
01:22:08.320 I mean, Morning Joe didn't go out
01:22:09.620 yesterday.
01:22:11.880 MSNBC, it looks like they might be
01:22:13.440 pulling their commentary people
01:22:15.440 for a while because they don't know
01:22:17.020 whether they can be trusted not to say
01:22:18.280 something that's going to get the
01:22:19.060 network in trouble.
01:22:20.940 You know, the Biden campaign has been
01:22:23.460 pulling ads.
01:22:24.040 So they kind of recognize that they are
01:22:25.840 a part of this.
01:22:26.580 They are culpable.
01:22:28.580 There does seem to be some sort of
01:22:30.620 recognition.
01:22:32.100 Whether that will last is a different
01:22:33.800 story.
01:22:34.120 The mouthpiece is they're going to do
01:22:35.880 what they do.
01:22:36.980 But the network seemed to understand
01:22:38.220 that this is the moment to pull back
01:22:39.480 at least.
01:22:40.020 So they were admitting some
01:22:41.140 culpability there.
01:22:42.840 What if the shooter was trying to
01:22:43.900 impress a girl?
01:22:44.580 What if the girl was an ethok like
01:22:46.840 Nikolou?
01:22:48.120 I have no idea what that means.
01:22:50.960 The Wigan Survivalist says, according
01:22:52.460 to the New York Post, the shooter was
01:22:54.060 part of a 2022 commercial for Black
01:22:56.340 Rock.
01:22:57.180 Yeah, he was.
01:22:58.440 They did a feature on his school and
01:23:00.140 he was like in a couple of frames.
01:23:01.420 So I don't put a lot of stock in that.
01:23:03.640 I think it's, you know, as much as I
01:23:06.220 dislike Black Rock and I think that
01:23:08.400 they're the source of many of the woes
01:23:10.180 of the Western world, I think that
01:23:12.520 may well have been a coincidence.
01:23:14.160 I don't know for certain.
01:23:15.400 I probably think it was.
01:23:17.040 I don't think it's good PR for them to
01:23:18.520 have an attempted assassin in one of their
01:23:21.240 promotional videos.
01:23:22.120 Yeah.
01:23:25.160 Lancia in Joya says, everybody who said
01:23:27.000 Trump is Hitler, they must be arrested.
01:23:28.560 So this is interesting, actually, because
01:23:30.380 if an individual did what the media as a
01:23:34.120 whole has been doing, I did check.
01:23:37.500 I think it's culpable homicide
01:23:39.860 or aggravated culpable
01:23:41.400 homicide, something like that. If it
01:23:44.140 was an individual, it was a
01:23:46.280 crime and they could be arrested.
01:23:48.000 But the problem is it wasn't an individual.
01:23:49.600 It was thousands of them.
01:23:51.460 So you can't really arrest an individual.
01:23:53.260 I mean, it'd be nice, but if it's football,
01:23:55.080 you'd call it an assist.
01:23:57.040 Yeah.
01:23:57.280 So I think it's harder to get a conviction
01:24:00.760 on that if it's a political candidate,
01:24:03.020 because normally there are exceptions, at
01:24:04.400 least in British law, and, you know, a
01:24:06.820 lot of the common law of Britain
01:24:08.600 and that liberal constitution was carried
01:24:11.560 over to the States.
01:24:12.400 So I would need to look at it, but I
01:24:14.840 wouldn't be surprised if there are greater
01:24:17.640 leniencies made for politicians over your
01:24:21.140 average citizen.
01:24:22.460 Let me just pick a last comment from
01:24:24.240 Screwtape Blazers who said, they who did
01:24:26.560 this to Trump did it in the same way as
01:24:28.080 they stole the election out in the open
01:24:30.320 with the world to see, the tools of
01:24:31.960 bureaucracy and media manipulation.
01:24:34.060 Yeah, but that's essentially my angle.
01:24:35.560 This was a planned assassination, but done
01:24:38.780 in the open.
01:24:40.120 Before I was.
01:24:41.720 For my segment, Lord Nerevar says, I have
01:24:43.940 to say I like this Vance guy a lot so far.
01:24:45.620 With any luck, he won't turn out to be a
01:24:48.020 sodden cuck like Pence turned out to be.
01:24:51.060 I didn't expect that insult to be there.
01:24:53.580 Kobe Kinsh-Dock, I think I've pronounced
01:24:57.600 it right.
01:24:58.000 Vance is a third gen Ohioan and a senator
01:25:01.300 from that state who identifies as a
01:25:03.180 Kentuckian.
01:25:05.060 I don't know.
01:25:06.340 That probably matters if you know more
01:25:08.560 about America and live there.
01:25:11.640 Sure.
01:25:12.600 And I've seen this name.
01:25:14.560 You're not getting away from me.
01:25:16.380 Josh Firm stripped down to his knickers
01:25:18.180 and lathered in Big Mac sauce.
01:25:21.380 What a name that is.
01:25:22.520 Thanks for that.
01:25:24.400 Yeah, I don't know.
01:25:25.420 I like a Big Mac as much as the next guy,
01:25:27.600 but I won't cover myself in it.
01:25:29.160 J.D. Vance isn't perfect, but he's much
01:25:31.220 better than Nikki Haley.
01:25:32.480 Well, that's, yeah, of course.
01:25:34.680 Also, it gives Trump a layer of security
01:25:36.840 as, if they kill or impeach him, he will
01:25:38.800 be replaced by someone who is slightly more
01:25:40.580 conservative than he is.
01:25:41.680 That's similar to what Biden did
01:25:43.700 when choosing Kamala.
01:25:45.860 Yes.
01:25:46.700 That's true.
01:25:48.400 Thomas Howell says,
01:25:49.660 AI, oh, this is an AI comment,
01:25:51.960 AI can catalyze good efforts and bad alike.
01:25:54.520 It's enabled me, a crap data scientist,
01:25:56.840 to become a passing one, capable of feats
01:25:58.940 in hours that I wouldn't otherwise manage
01:26:01.900 in a month of Sundays.
01:26:03.300 And, yeah, this is frustrating as well
01:26:05.460 because I did a lot of data science at university
01:26:07.340 and paid a lot of money.
01:26:08.780 And now I've long since graduated
01:26:11.320 and AI can do some of the work
01:26:13.400 that I was specially trained to do, using
01:26:15.460 special programs and special training.
01:26:18.360 Yeah, but...
01:26:19.000 And it's a bit depressing.
01:26:20.240 No, I don't think you should find it that way
01:26:21.580 because, let's say you gave you and me
01:26:26.560 a data scientist's job to do
01:26:28.700 and you let us both use AI,
01:26:30.500 you would go instantly to the right questions
01:26:32.920 and extract the right bits from the answer.
01:26:35.840 Whereas I would spend half the time
01:26:37.120 figuring out what the right questions
01:26:38.480 to ask in the first place were.
01:26:40.180 I suppose so, yeah.
01:26:41.540 But I think it's just,
01:26:43.160 I want to gatekeep my experience
01:26:44.860 as much as possible
01:26:45.620 so it's worth more, which is selfish of me.
01:26:47.920 But, yes.
01:26:49.460 X, Y, and Z says,
01:26:51.200 Papa Bush came to my uni back in the day.
01:26:54.140 I got bailed up by Secret Service
01:26:56.340 whilst going for a walk at night
01:26:58.020 and listening to music.
01:26:59.360 Also got bollocked by them
01:27:00.880 when visiting the White House
01:27:02.420 and needing to stretch my calf,
01:27:04.560 finding it hard to believe
01:27:05.500 that they just left a roof unattended.
01:27:08.840 I think that's it.
01:27:09.480 One for your segment there, Dan.
01:27:11.720 Oh, yes.
01:27:12.220 TMK out of context says
01:27:13.360 manufactured inevitability.
01:27:15.540 I really like how these two words
01:27:16.960 perfectly capture Dan's segment.
01:27:18.500 Did I say that?
01:27:19.440 No, I said that
01:27:20.180 because I've got an English degree
01:27:21.480 and I went to university
01:27:22.400 and did a degree
01:27:23.180 that I totally could have done
01:27:24.300 all of with AI these days.
01:27:26.440 Well, that...
01:27:27.540 Well, no, it's worth it for that
01:27:29.080 because that is a really good phrase.
01:27:30.740 Manufactured inevitability.
01:27:31.860 I like that.
01:27:32.540 I'm going to use that.
01:27:33.280 I'm going to put that on a T-shirt.
01:27:34.380 Can I just say to the people
01:27:35.280 that are saying
01:27:35.780 that I'm wearing a Miami Vice T-shirt,
01:27:38.160 it says Miami Volvo.
01:27:41.760 Do you want to read segments
01:27:43.460 from yours or...
01:27:44.580 Oh, yeah.
01:27:45.360 Okay.
01:27:45.640 So what have we got?
01:27:46.320 Sorry, chaps.
01:27:46.940 I agree with Jeff's concerns
01:27:48.160 about AI over the potential benefits.
01:27:50.920 I think the problem especially
01:27:51.960 is you're considering AI
01:27:53.140 where it stands now,
01:27:54.820 where arguably it's mainly a tool,
01:27:57.640 as you pointed out,
01:27:58.440 but where will we be
01:27:59.720 in 5, 10, or 20 years?
01:28:02.520 Yeah, it's a good comment.
01:28:03.500 I don't know where
01:28:04.160 we're going to end up next.
01:28:05.320 It is, you know,
01:28:06.320 it's got a lot of capability
01:28:08.040 in terms of what we're
01:28:09.500 using it for now
01:28:10.280 and it's all fun and games
01:28:11.460 generating stories
01:28:12.320 and images and data,
01:28:14.520 you know, dating profiles.
01:28:16.040 But where we go next is...
01:28:17.240 I want a robot butler
01:28:18.100 that folds my shirts.
01:28:20.480 You've got a wife, haven't you?
01:28:22.460 Yeah, but she's just not...
01:28:23.460 She's not very good at this.
01:28:24.760 But that is the key question for AI.
01:28:27.360 Where is it going to be in the future?
01:28:28.900 Because the most concerning thing
01:28:30.760 is no one knows.
01:28:33.040 AI is pretty worrying
01:28:34.100 but I'm not too worried
01:28:35.140 in the immediate.
01:28:36.140 It's still working out
01:28:36.800 which way is up at the moment
01:28:37.960 and it appears to have slowed down
01:28:39.260 in progress from what I can see.
01:28:40.820 Feel free to beat me over the head
01:28:42.040 with this comment
01:28:42.600 when Skynet is exterminating us though.
01:28:44.640 We did this whole section
01:28:45.960 without anybody saying
01:28:47.180 the word Skynet
01:28:48.180 and we failed right at the end.
01:28:49.900 And in terms of slowing down,
01:28:52.060 yeah, but what it's doing
01:28:53.240 is it's chasing the nines.
01:28:54.760 So it got 90% of the way there
01:28:56.980 and the whole question becomes:
01:28:58.200 okay, can it get to 99%,
01:28:59.540 and then 99.9,
01:29:00.780 then 99.99.
01:29:02.100 And so the rate of...
01:29:03.580 As you get closer to something
01:29:05.080 that resembles intelligence,
01:29:07.140 the rate of progress
01:29:07.880 will appear to slow down
01:29:08.700 but the bit of accuracy
01:29:10.260 that matters
01:29:11.480 is being refined.
01:29:12.740 Close up.
01:29:13.260 Yeah.
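(To put rough numbers on the "chasing the nines" point: each extra nine of accuracy barely moves the headline figure, but it cuts the remaining error tenfold. A quick illustrative calculation, nothing more:)

```python
# Illustrative only: 90% -> 99% -> 99.9% accuracy looks like diminishing gains,
# but each step removes ten times more of the remaining error.
accuracies = [0.90, 0.99, 0.999, 0.9999]

prev_error = None
for acc in accuracies:
    error = 1.0 - acc
    note = ""
    if prev_error is not None:
        note = f" ({prev_error / error:.0f}x less error than the previous step)"
    print(f"accuracy {acc:.2%} -> error {error:.2%}{note}")
    prev_error = error
```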
01:29:13.660 I like this comment as well
01:29:14.580 from Eric Nickerson.
01:29:15.900 Isn't AI based on the culmination
01:29:17.740 of human writing in history
01:29:18.820 up to this point?
01:29:19.660 That would mean that the AI
01:29:21.020 those historical writers
01:29:22.220 would have had access to
01:29:23.260 would have the same base of knowledge
01:29:24.480 that AI has today.
01:29:26.120 And if we just now stopped
01:29:27.760 all new creativity
01:29:29.300 and just allowed AI
01:29:31.040 to create for us,
01:29:32.300 surely the pool of stuff
01:29:33.560 that it would produce
01:29:34.240 would stagnate fairly quickly
01:29:35.520 because the resources
01:29:36.460 it's drawing on
01:29:37.360 would therefore be finite.
01:29:38.420 If you just used the AI
01:29:40.520 and then stopped there,
01:29:41.700 as opposed to using it
01:29:42.640 to solve the blank page problem
01:29:44.200 and then doing something with that,
01:29:47.160 then AI can effectively stagnate.
01:29:49.480 Last couple then.
01:29:50.340 Lance Llewellyn,
01:29:50.980 AI doesn't make creative people obsolete.
01:29:53.180 It challenges them to be better
01:29:54.700 as you said.
01:29:55.800 AI can only imitate
01:29:56.960 based on what it has
01:29:58.120 in its databanks.
01:29:59.080 It cannot think outside
01:30:00.020 its own sandbox.
01:30:01.060 A true creative
01:30:01.880 is thus challenged
01:30:02.600 to never be lazy,
01:30:03.880 to always keep finding new ways
01:30:05.240 to look at things
01:30:05.860 as each new work
01:30:07.120 is absorbed into the collective.
01:30:08.500 The creative's phaser
01:30:09.660 must once again
01:30:10.480 change frequency,
01:30:11.620 live long and prosper.
01:30:12.600 Yeah, similar to what we just said.
01:30:14.300 So I've got something
01:30:15.500 to read out here
01:30:16.760 because I've asked
01:30:17.440 ChatGPT
01:30:18.360 the breakfast question.
01:30:20.300 So if you're not familiar with that,
01:30:22.180 how would you feel
01:30:23.060 if you didn't have breakfast
01:30:24.080 this morning?
01:30:24.600 This is a hypothetical
01:30:25.460 that is used to test
01:30:26.460 a basal level of intelligence
01:30:27.780 in humans
01:30:28.300 and human beings
01:30:29.180 sometimes fail this question.
01:30:32.000 Certain communities.
01:30:34.180 And it says,
01:30:35.060 if I didn't have breakfast
01:30:36.040 this morning
01:30:36.480 I might feel less energetic
01:30:37.600 and possibly a bit distracted.
01:30:39.760 And it says,
01:30:40.300 breakfast provides
01:30:41.180 the necessary nutrients
01:30:42.200 and energy
01:30:42.540 to start the day
01:30:43.280 so missing it
01:30:43.820 could impact my focus
01:30:44.800 and productivity.
01:30:46.300 People might experience
01:30:47.300 hunger
01:30:47.700 and decreased concentration.
01:30:49.040 It goes on and on.
01:30:49.880 But basically...
01:30:50.500 Can you ask it
01:30:50.960 what that chirping noise means?
01:30:53.140 We might be running out of time now.
01:30:55.500 We can overrun for...
01:30:57.160 Right,
01:30:57.480 while we get the answer
01:30:58.400 to this important question.
01:30:59.820 Okay.
01:31:00.140 If I have chirping
01:31:04.340 in my hallway
01:31:06.540 what might it be?
01:31:13.600 Smoke detector.
01:31:15.520 Good God.
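(If anyone wants to reproduce that sort of exchange themselves, here is a minimal sketch of posing the same breakfast question to a chat model over an API. It assumes the OpenAI Python client is installed and an API key is set in the environment; the model name is just an illustrative placeholder.)

```python
# Minimal sketch: send the "breakfast question" to a chat model and print the reply.
# Assumes the openai package (v1+) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name, swap for whatever you have access to
    messages=[
        {"role": "user",
         "content": "How would you feel if you didn't have breakfast this morning?"},
    ],
)

print(response.choices[0].message.content)
```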
01:31:16.600 Last one.
01:31:17.940 Someone online says,
01:31:19.120 does Dan know
01:31:19.900 that the AI art generators
01:31:21.420 are inbreeding?
01:31:22.440 There's so much AI art
01:31:23.820 out there now
01:31:24.500 that it's accidentally
01:31:25.600 being fed back
01:31:26.500 into the AI
01:31:27.260 and degrading
01:31:28.340 the output quality.
01:31:29.520 That's an interesting thought,
01:31:30.580 isn't it?
01:31:30.860 Yes.
01:31:31.180 We're going to get
01:31:31.760 only artwork
01:31:32.720 of the Habsburgs
01:31:33.940 by the end of it,
01:31:34.700 aren't we?
01:31:35.280 Yes.
01:31:35.760 Because the AI
01:31:36.440 is only ever generating
01:31:37.700 from its data bank
01:31:39.380 of AI output.
01:31:40.740 Yes.
01:31:41.420 And with the volume
01:31:42.260 and speed
01:31:43.140 at which digital content
01:31:44.420 can be produced,
01:31:45.340 that should become
01:31:46.480 the majority soon.
01:31:47.540 Yeah.
01:31:48.420 We're sure smart people
01:31:49.140 are thinking about
01:31:49.620 how to solve that.
01:31:50.780 I think they are.
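(The "inbreeding" worry the commenter raises is usually called model collapse in the research literature: retrain a model on its own output enough times and rare variation washes out. A toy sketch of the mechanism, assuming only NumPy, with a simple Gaussian fit standing in for retraining a real image or text model:)

```python
# Toy illustration of model collapse ("AI inbreeding"): each generation is
# fitted only to a finite sample of the previous generation's output, so
# sampling error compounds and the spread of what it produces tends to shrink.
import numpy as np

rng = np.random.default_rng(42)

mean, std = 0.0, 1.0    # generation 0: the original human-made "data"
sample_size = 200       # finite amount of content available per generation

for generation in range(1, 11):
    samples = rng.normal(mean, std, size=sample_size)  # what this generation publishes
    mean, std = samples.mean(), samples.std()          # next model learns only from that
    print(f"generation {generation:2d}: mean = {mean:+.3f}, spread = {std:.3f}")
```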
01:31:51.260 Well, our producer
01:31:57.040 is suggesting
01:31:57.520 that we should ask AI
01:31:58.260 if you should subscribe
01:31:59.300 to the website
01:32:00.040 of the Lotus Eaters,
01:32:01.080 but we already know
01:32:02.500 the answer to that.
01:32:03.140 Yes, you should.
01:32:04.160 So, thanks very much
01:32:05.500 to Jeff Buys Cars
01:32:06.960 and Joshua
01:32:08.760 and me,
01:32:09.620 and we'll see you in the next one.