Western Standard - April 14, 2023


Twitter CEO Elon Musk slams BBC reporter in surprise interview


Episode Stats

Length

2 minutes

Words per Minute

209.12

Word Count

486

Sentence Count

52


Summary

In this episode, I speak to a man who has been on the job for 6 months and has seen a lot of hateful content on his feed, but he can't name a single example of it. So how does he know it's hateful content?


Transcript

00:00:00.000 Content you don't like, or hateful? What do you mean? Describe a hateful thing.
00:00:04.680 Yeah, I mean, you know, just content that will solicit a reaction, something that may include something that is slightly racist or slightly sexist, those kinds of things.
00:00:15.800 So you think if something is slightly sexist, it should be banned?
00:00:19.420 No, I'm not saying anything.
00:00:21.600 I'm just curious. I'm trying to understand what you mean by hateful content.
00:00:24.840 And I'm asking for specific examples. And you just said that if something is slightly sexist, that's hateful content.
00:00:34.900 Does that mean that it should be banned?
00:00:36.300 Well, you've asked me whether my feed, whether it's got less or more, I'd say it's got slightly more.
00:00:42.580 That's why I'm asking for examples. Can you name one example?
00:00:45.760 I honestly don't. Honestly, I can't name a single example.
00:00:49.260 I'll tell you why, because I don't actually use that For You feed anymore, because I just don't particularly like it.
00:00:53.640 And actually, a lot of people are quite similar. I only look at the people I follow.
00:00:57.040 Well, hang on a second. You said you've seen more hateful content, but you can't name a single example, not even one.
00:01:02.600 I'm not sure I've used that feed for the last three or four weeks.
00:01:06.040 Well, then how did you see the hateful content?
00:01:08.300 Because I've been using Twitter since you took it over, for the last six months.
00:01:11.900 Okay, so then you must have at some point seen, in For You, hateful content. I'm asking for one example.
00:01:16.400 Right.
00:01:16.880 And you can't give a single one.
00:01:17.820 And I'm saying...
00:01:19.540 Then I say, sir, that you don't know what you're talking about.
00:01:21.900 Really?
00:01:22.180 Yes, because you can't give a single example of hateful content, not even one tweet.
00:01:27.680 And yet you claimed that the hateful content was high.
00:01:30.300 Well...
00:01:31.140 That's false.
00:01:32.300 No, what I claim...
00:01:33.140 You just lied.
00:01:33.800 No, what I claim was there are many organizations that say that that kind of information is on the rise.
00:01:41.100 Now, whether it has increased on my feed or not...
00:01:43.180 Give me one example.
00:01:43.960 I mean, right.
00:01:44.780 Literally can't name one.
00:01:45.600 I mean, like the Strategic Dialogue Institute in the UK, they will say that.
00:01:51.100 So...
00:01:51.420 Look, people will say all sorts of nonsense.
00:01:53.320 I'm literally asking for a single example, and you can't name one.
00:01:56.360 Right.
00:01:56.540 And as I've already said, I don't use that feed.
00:01:58.840 But let's...
00:01:59.120 Then how would you know?
00:01:59.820 I don't think this is getting anywhere.
00:02:00.820 You literally said you experienced more hateful content and then couldn't name a single example.
00:02:05.640 Right.
00:02:05.760 And as I said, I haven't...
00:02:06.560 That's absurd.
00:02:07.200 I haven't actually looked at that feed.
00:02:09.760 Then how would you know this is hateful content?
00:02:11.340 Because I'm saying that's what I saw a few weeks ago.
00:02:14.220 I can't give you an exact example.
00:02:15.640 Let's move on.
00:02:16.700 We only have a certain amount of time.
00:02:19.260 Wow.