TRIGGERnometry - March 17, 2024


Inside the Bubble of Public Broadcasting - Josh Szeps


Episode Stats

Length

1 hour and 22 minutes

Words per Minute

186.6

Word Count

15,387

Sentence Count

256

Misogynist Sentences

5

Hate Speech Sentences

54


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, we're joined by Josh Szeps from the ABC in Australia to talk about diversity and culture in the country, and why it's important to keep kicking the hornet's nest. We also talk about the Australian political landscape, and how it compares to the UK and US.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.440 AI and risk-related compliance isn't tomorrow's idea, but rather today's edge.
00:00:05.760 Moody's combines advanced AI with one of the most comprehensive data estates,
00:00:09.640 and automated workflows provide faster insight, reduce bottlenecks, and drive more strategic action.
00:00:15.620 Moody's helps banks, corporates, and financial institutions navigate today's challenges and seize tomorrow's opportunities.
00:00:21.980 Stay ahead in an era of exponential risk.
00:00:24.220 Visit moodys.com/kyc/ai-study and get in touch today.
00:00:30.860 Some producers who I've worked with have been required by management in various media organisations
00:00:35.740 to literally keep a spreadsheet, literally guessing whether or not a guest is gay,
00:00:40.260 because you're not going to go up to them and go, hey mate, are you a bit queer?
00:00:44.460 But there's no column for what's the economic background of this person?
00:00:49.200 What's their class? What's their ideology?
00:00:51.860 So you'll end up turning on the TV, and you'll have a panel,
00:00:55.760 and there'll be a Sikh woman, a transgender woman, a black woman, and an indigenous woman,
00:01:02.440 all saying the same fucking thing.
00:01:04.260 Well, how's that diversity?
00:01:06.560 Josh Szeps, all the way over from Australia, you haven't been there for a while.
00:01:10.520 I've just come back from there, so that will be an interesting chat for us to get into.
00:01:15.300 This is hilarious as well, because we've been going back and forth and back and forth for like 18 months,
00:01:18.860 and every single time I'm in the UK, you're in the US or somewhere, and then I was in Australia.
00:01:24.380 That's what happens when you're a star, mate.
00:01:26.740 And then finally, and now then I was here while you were in Australia,
00:01:30.140 and now this is like the one day, I'm going back to Australia tomorrow,
00:01:33.380 and then the overlap, the stars have finally aligned, and we're here for 24 hours in the same city.
00:01:38.720 That's right.
00:01:39.080 Well, so you were just in America, and Francis was there.
00:01:41.560 Yes.
00:01:41.840 I was just in Australia, I came back yesterday morning, and you're flying there tomorrow.
00:01:45.020 Exactly.
00:01:45.800 It was meant to be.
00:01:46.860 It was meant to be.
00:01:47.660 So it's good to have you on, man, and so much to talk about.
00:01:50.240 You had some very interesting things happen in your career with the equivalent of the BBC in Australia.
00:01:56.480 Yes.
00:01:56.740 We'll talk about that as well.
00:01:58.120 But just, I was curious to get your perspective on where you think Australia is in terms of comparing to the UK and the US,
00:02:05.480 in terms of the cultural stuff that we often talk about,
00:02:08.120 because my takeaway from having spent a couple of weeks there was like traveling 10 years into the past,
00:02:14.040 and in the past, that would have sounded like a hack joke about Australia being a backward place.
00:02:20.540 Today, I think it's kind of a compliment.
00:02:22.580 Like, things aren't as bad.
00:02:25.680 Do you know what I mean?
00:02:26.700 Well, I'm hoping that we don't have to follow exactly the same trajectory.
00:02:29.500 Me too.
00:02:29.840 I'm hoping there are some icebergs that we can avoid.
00:02:31.820 I mean, this is one of the benefits of being a medium-sized country that nobody pays attention to.
00:02:36.180 You get to learn from the mistakes of other countries,
00:02:38.200 and you get to see what the US and the UK are doing, and hopefully not do exactly that.
00:02:42.520 But, I mean, Australians are quite, we're pretty relaxed.
00:02:47.280 I mean, we're mostly pretty chilled out.
00:02:48.920 I feel like the volume on everything in America is turned up to 11.
00:02:52.500 I'm less familiar with where things are at at the moment right now in the UK.
00:02:56.460 I'm pretty familiar with, I mean, I've lived here when I was in my teens,
00:03:00.160 and it struck me that there's a kind of a,
00:03:02.380 there's a stiff upper lip sense of English compliance that is reminiscent of the Australian attitude.
00:03:08.300 And the United States just seems to have gone to a degree that is,
00:03:14.160 I don't know how you wind it back, especially in an era of social media
00:03:18.360 and coming artificial intelligence and, like, political polarisation
00:03:22.440 and the two fantastic candidates who we have for president,
00:03:26.740 both remarkable gentlemen in their own right.
00:03:29.480 But I think Australia has managed to kind of, you know,
00:03:34.280 tack as something of a middle course.
00:03:36.160 There is a wild social justice fringe, but it's not nearly as powerful or as vocal.
00:03:43.120 There is a right-wing, quasi-alt-right, semi-libertarian strain of moderate craziness,
00:03:51.000 but it's not as significant.
00:03:52.280 But what there is in Australia that I think goes unremarked upon there
00:03:57.080 is a kind of a soft, fluffy, kind consensus that is hard to shatter.
00:04:03.920 Like, why would you have conversations like the ones that you guys do?
00:04:07.880 Like, why not just let things be?
00:04:10.200 Why do you want to keep ruffling feathers?
00:04:11.920 Why do you want to keep making trouble?
00:04:13.340 Why do you want to keep kicking a hornet's nest?
00:04:15.280 You know, like, we know what the proper way of thinking is,
00:04:18.540 so let's just think the proper things.
00:04:20.120 And that's quite pervasive, I think.
00:04:22.720 And is this why you're no longer at the ABC?
00:04:25.200 It may have something to do with it.
00:04:27.040 It may have something to do with it.
00:04:28.280 So tell everybody the story of what happened with you.
00:04:30.000 Well, look, there is no precipitating incident,
00:04:32.640 and I love the public broadcaster.
00:04:34.300 I think it's indispensable, I think.
00:04:35.960 We used to talk like this about five years ago.
00:04:37.960 Yeah, yeah, yeah.
00:04:39.260 You think I'm going to come around at some point?
00:04:41.780 Oh, I think you will.
00:04:42.480 Oh, I think you will.
00:04:42.500 I mean, I think.
00:04:43.100 It's not about you coming around.
00:04:44.460 It's about them changing.
00:04:45.820 That's what will happen.
00:04:46.580 Well, that is interesting.
00:04:47.720 I mean, you know, so how do you get that change,
00:04:50.200 and how do you restore trust in mainstream media institutions,
00:04:53.220 and how do those institutions and organisations earn the trust?
00:04:57.120 I mean, there has come to be a way of thinking about big subjects
00:05:02.720 where the people who are involved in reporting
00:05:05.160 don't necessarily even know that they're inside a bubble, right?
00:05:08.920 So it's not that there's, like, I think sometimes people
00:05:12.500 who are on the outside of media might suspect that there's, you know,
00:05:16.140 some kind of nefarious conspiracy that people know
00:05:18.660 that they're avoiding certain stories,
00:05:20.160 and we might be talking about, you know, transgender issues,
00:05:22.880 or we might be talking about, I don't know, the gender pay gap,
00:05:25.660 or we might be talking about race or Indigenous rights
00:05:27.900 or something like that.
00:05:29.100 And there can be a perception, oh, well, everyone's just saying
00:05:31.000 the same thing because they're all in on it,
00:05:32.680 or they're all, you know, they're not touching things
00:05:34.540 because they're co-opted in some way.
00:05:37.840 And it's less that and more, like, you know the old joke
00:05:42.520 about the two young fish who are swimming through the ocean
00:05:44.960 and an older, bigger fish swims past them and says,
00:05:48.160 how's the water today, guys?
00:05:50.220 And they swim on, and one of the young fish looks at the other one
00:05:53.060 and says, what's water?
00:05:55.460 You don't know the water unless you're outside of the water.
00:05:59.320 Like, you have to actually jump out of the pond
00:06:01.460 in order to see what you're swimming in.
00:06:03.200 And for me, there was just an increasing conflict
00:06:06.760 between the kinds of conversations that I want to have
00:06:09.020 on Uncomfortable Conversations, which is the name of my podcast
00:06:12.320 and which, by definition, are Uncomfortable Conversations,
00:06:15.640 which doesn't necessarily mean that I'm making the guest uncomfortable
00:06:18.320 or that we're going to have an argument
00:06:19.680 or that I'm going to confect some kind of controversy,
00:06:22.180 but it does mean that we're going to touch subjects
00:06:23.940 that would make people uncomfortable if people were to raise them
00:06:26.440 at a party or, you know, at the pub or at a barbie,
00:06:29.980 as the case would be in Australia.
00:06:31.180 Things where there's a certain kind of orthodoxy on both sides,
00:06:35.760 and you know that once you start talking about them,
00:06:38.120 there are eggshells that you have to tread on,
00:06:39.980 there are tripwires that you have to be careful not to trigger.
00:06:42.300 Triggernometry, thank you very much.
00:06:44.600 And so my desire to sort of wrestle with those things on my podcast,
00:06:48.880 occasionally write newspaper articles about those sorts
00:06:51.380 of subjects, but came into conflict with a, you know,
00:06:57.620 justifiably, I would say, risk-averse corporation
00:07:01.000 and a cautious corporation that is struggling at the moment
00:07:04.980 in a battle between objectivity and diversity.
00:07:09.860 So, like, there's a whole diversity mantra that's taking place
00:07:13.240 at the moment in newsrooms where you want more diverse people
00:07:16.760 to come in and bring their whole selves to the story
00:07:19.920 and use their own lived experience and so on.
00:07:22.820 Well, how does that mesh with the quest
00:07:24.980 for an objective reporting, right?
00:07:28.280 I've always felt that in a position like the one that I had,
00:07:31.480 I hosted a three-hour-a-day talkback radio show
00:07:34.860 where we took calls, we interviewed politicians,
00:07:37.240 we interviewed cultural figures and so on,
00:07:39.980 I felt there should be some space for a kind of certain amount
00:07:43.160 of rambunctiousness, a certain amount of pushing the edges,
00:07:46.500 a certain amount of kind of playful interrogation
00:07:49.260 that sometimes rubbed people the wrong way.
00:07:53.460 I would want to see a public square that's as capacious as possible,
00:07:56.920 that's as vigorous as possible.
00:07:58.580 I think it's the only way we're going to survive the 21st century
00:08:00.860 if we manage to talk to each other in ways that are sometimes provocative
00:08:04.360 and sort of wrestle our way towards the truth.
00:08:06.500 But that does sometimes run counter to the mandates of objectivity,
00:08:12.420 which can sometimes get confused with mandates
00:08:15.580 for intellectual or ideological orthodoxy.
00:08:18.980 So when you say, how did it happen?
00:08:20.340 That's all a very long-winded way of saying,
00:08:22.340 basically, I wanted to write some newspaper articles
00:08:25.480 about various subjects that would consistently get knocked back
00:08:28.540 by management.
00:08:29.560 The canary in the coal mine for me was during gay pride last year.
00:08:37.220 Now I'm married to a guy.
00:08:39.020 I think I have my bona fides there.
00:08:41.360 That makes me pretty gay.
00:08:43.120 I'm a card-carrying member.
00:08:46.280 And yet I've always been a bit off about pride.
00:08:51.080 I mean, you know, it's 2024.
00:08:54.000 Will there ever come a point in the future where we can just say,
00:08:56.560 stick a fork in it, the turkey is done.
00:08:58.460 Like, it's done.
00:08:59.680 We won.
00:09:00.740 We won everything we asked for.
00:09:02.700 God bless all the people in the 60s and 70s,
00:09:05.240 the gay rights pioneers who endured police beatings
00:09:08.720 and campaigned for everything.
00:09:10.220 And now we have effectively total equality.
00:09:13.420 My life is miraculously boring, right?
00:09:16.340 You know, I have a mortgage.
00:09:17.420 I have kids.
00:09:18.080 I have a husband.
00:09:19.080 That's what they wanted.
00:09:20.900 And so I was invited by one of the broadsheet newspapers
00:09:24.100 in Sydney to write a piece about this
00:09:26.080 because the editor knew how I felt about it.
00:09:28.900 And unfortunately, when you are a host or presenter
00:09:32.500 on the public broadcaster and you're the public face,
00:09:35.500 everything has to go through management to get approved.
00:09:38.280 And this piece that was somewhat critical,
00:09:42.140 mildly critical of the idea of pride,
00:09:44.260 went up the chain and it was refused to be published.
00:09:47.740 Not even in the public broadcaster, right?
00:09:50.180 But published in a different publication.
00:09:52.700 So it wouldn't even have their imprimatur on it.
00:09:56.280 And the explanation was that, you know,
00:09:59.380 hosts on the network are not allowed to hold
00:10:02.100 or express opinions about controversial cultural issues.
00:10:07.180 This was at a time when the broadcaster was the official sponsor
00:10:13.940 of World Pride, which was the gay pride that was taking place.
00:10:18.480 There were huge rainbow flags hanging in the lobby.
00:10:21.460 Every other host on the station, all of whom are straight,
00:10:24.960 apart from me, are going rah, rah, rah, rah, rah, gay pride.
00:10:28.140 Every second promo in the ad breaks is for gay pride.
00:10:32.060 So I was like, you are allowed to have an opinion about pride.
00:10:37.520 It just has to be management's opinion about pride.
00:10:40.640 It can't be a dissenting opinion about pride.
00:10:43.380 And that's when I thought, okay,
00:10:45.520 we're going to run into some trouble here at some point.
00:10:48.360 And that trouble came to a head in various ways
00:10:51.100 towards the end of last year,
00:10:52.340 where I think the risk aversion of management was just like,
00:10:56.400 you know, pick a team, either you're with us or against us.
00:10:59.280 And I was like, okay, I guess I have to be against you, grudgingly so.
00:11:03.000 Josh, do you think that would have happened 10 years ago?
00:11:06.500 Do you think they would have been as strong on,
00:11:09.760 you need to adhere to the public message?
00:11:12.920 I don't know, because I had the good fortune
00:11:15.720 of not being in an organisation that robust.
00:11:19.820 It's hard to say.
00:11:20.880 I mean, look, echo chambers have always existed, right?
00:11:24.380 Like, you know, groupthink has always existed.
00:11:26.400 It's not like we've had a particularly courageous set
00:11:30.180 of contrarian institutions or organisations
00:11:33.360 going back into the past.
00:11:34.800 And there's certainly groupthink on the right as well.
00:11:38.060 I mean, I could be making the same criticisms
00:11:39.640 of other media outlets.
00:11:42.040 And I think on the whole, it's critically important
00:11:44.180 for us to still have the newsgathering institutions
00:11:46.360 of some of these places.
00:11:47.700 But would it have happened 10 years ago?
00:11:49.720 I mean, I think there'd probably be different blind spots.
00:11:53.200 I think the blind spots, I think it wouldn't be specific to that one
00:11:56.660 because the orthodoxy of what the elite establishment
00:11:59.380 regard as being taboo has shifted.
00:12:02.560 But there would have been other things
00:12:03.860 and there would have been other, you know,
00:12:05.720 how easy was it in the 1970s to, you know,
00:12:09.920 come out in favour of, I don't know,
00:12:12.300 name your social justice issue of the 1970s.
00:12:15.580 You know, you would have had, when the Life of Brian was released,
00:12:19.340 you know, the Christians were going nuts
00:12:20.800 and saying that it should be banned.
00:12:21.980 There's always people who believe
00:12:23.960 that they're on the right side of history
00:12:25.220 who want to silence other people.
00:12:27.940 And I'm not saying that that's what happened here.
00:12:30.260 I'm just saying that there's an effect
00:12:33.200 where when people feel like they're pretty certain
00:12:36.360 what the right solution, what the right answer is,
00:12:38.860 they don't want the questions to be asked.
00:12:40.640 And I think I believe, and you guys clearly believe,
00:12:43.760 that those questions still deserve to be asked
00:12:45.780 because we haven't settled on a final answer
00:12:48.360 to what's right and what's not.
00:12:50.080 I think as well, and push back on this if you disagree,
00:12:53.780 and the more I've been thinking about this,
00:12:55.620 mainly to do with the BBC,
00:12:57.400 the more I've come to this conclusion,
00:12:59.860 which is if the state-funded broadcaster
00:13:02.300 is not willing to entertain other points of view,
00:13:07.320 let's just call them a heterodox points of view,
00:13:09.240 then why should they be funded by the state?
00:13:13.220 Because there are people in that state
00:13:16.240 who have those opinions,
00:13:17.640 whether it's being pro-Brexit a few years ago,
00:13:20.840 or thinking that actually what we need
00:13:23.740 is some kind of more populist government.
00:13:26.320 We need stricter controls on immigration.
00:13:29.300 But if the mainstream broadcaster
00:13:31.860 is not prepared to entertain those points of view,
00:13:35.100 then is it really serving its public?
00:13:37.700 It's a tricky question.
00:13:38.820 I mean, the simple answer to that
00:13:40.760 is in news gathering, it should be.
00:13:43.420 I mean, in news gathering,
00:13:44.900 so you can divide kind of all media organizations
00:13:47.540 into the news department
00:13:48.800 and then what is sometimes called content,
00:13:51.220 which would be entertainment documentaries,
00:13:53.180 factual, and stuff like that.
00:13:55.140 I think it is absolutely indispensable
00:13:57.920 for us all to establish uniform facts about the world
00:14:01.700 and for us to be getting our news
00:14:03.440 from news organizations.
00:14:05.160 I think people who get their news
00:14:06.980 solely from podcasts and blogs
00:14:10.060 are at huge risk of being misled.
00:14:14.440 Subscribe now.
00:14:15.020 Subscribe now.
00:14:15.920 And subscribe to Uncomfortable Conversations,
00:14:17.520 which is launched on Substack
00:14:18.620 and a new YouTube page.
00:14:20.840 But, you know, I do think...
00:14:21.940 I so agree with you, by the way.
00:14:23.220 Yeah, it's like people who haven't worked in journalism
00:14:26.160 may not understand the sheer rigor of a real newsroom.
00:14:29.960 If you're at the New York Times
00:14:31.180 or if you're at the New Yorker
00:14:32.900 or if you're at the BBC or the ABC,
00:14:35.340 you will have an editor above you
00:14:38.540 if you're in a newsroom
00:14:39.400 who, when you come with a story,
00:14:41.280 will say,
00:14:41.940 great story,
00:14:43.020 you don't quite have it yet.
00:14:44.680 You need to get another two sources.
00:14:46.920 We need to fact check it.
00:14:48.000 We need to go through line by line
00:14:49.320 and make sure that it's true.
00:14:51.160 And does that always work?
00:14:53.120 No.
00:14:53.500 Do they sometimes make mistakes?
00:14:54.780 Of course they do.
00:14:55.620 And then they will correct those mistakes.
00:14:57.080 And in an ideal world,
00:14:58.760 you're aspiring towards a scenario
00:15:00.440 where you've set up an institution
00:15:02.820 that functions as a self-correcting mechanism
00:15:05.300 to try to arrive at the best possible representation
00:15:08.700 of the truth,
00:15:09.420 however hard that may be.
00:15:11.080 Then when you're talking about the content side of things,
00:15:13.420 so you might be talking about a BBC panel show,
00:15:15.440 for example,
00:15:15.900 you're talking about a chat show,
00:15:17.240 you're talking about a morning show
00:15:18.480 or something like that,
00:15:19.460 where it's not where you're going for your news
00:15:21.320 and it doesn't have editorial oversight in the same way,
00:15:24.200 it's more just like,
00:15:25.060 let's talk about the issues of the day,
00:15:26.560 then I completely agree with you.
00:15:28.360 Then you need the largest number of possible voices.
00:15:31.860 And what's happened is that you've got at the moment
00:15:33.960 this diversity mantra,
00:15:35.460 which focuses incredibly intensely
00:15:37.700 on the ethnicity and the religion
00:15:40.560 and the race and the sex and the sexuality
00:15:43.700 and the gender orientation of guests
00:15:45.960 to try to make sure that the net is cast
00:15:48.340 as widely as possible
00:15:49.600 with no focus on ideological diversity
00:15:52.160 or political diversity.
00:15:54.020 I mean, there would be panel shows.
00:15:55.300 So some producers who I've worked with
00:15:57.700 have been required by management
00:15:59.180 in various media organisations
00:16:00.840 to literally keep a spreadsheet
00:16:03.080 of all of the guests
00:16:04.780 with columns for their sex,
00:16:08.520 you know, sexual orientation.
00:16:10.220 I mean, they're literally guessing
00:16:11.640 whether or not a guest is gay
00:16:12.760 because you're not going to go up to them
00:16:13.800 and go, hey, mate,
00:16:14.780 are you a bit queer?
00:16:16.400 So, you know, they're guessing sexual orientation.
00:16:18.680 They're writing down that.
00:16:19.760 But there's no column for
00:16:21.480 what's the economic background of this person?
00:16:24.340 What was their class?
00:16:25.840 What's their ideology?
00:16:27.380 What's their political affiliation, if any?
00:16:29.940 So you'll end up turning on the TV
00:16:31.820 and you'll have a panel
00:16:33.500 and there'll be a Sikh woman,
00:16:36.200 a transgender woman,
00:16:37.820 a black woman and an indigenous woman
00:16:40.000 all saying the same fucking thing.
00:16:42.600 Well, how's that diversity?
00:16:44.360 So, yes, you know,
00:16:45.740 expand that as much as possible.
00:16:48.380 But I would never go so far
00:16:50.320 as to say that you shouldn't be funding,
00:16:51.740 you know, state funded media.
00:16:54.040 I think that's indispensable.
00:16:54.980 I think that the original idea,
00:16:56.600 if you want to get all philosophical,
00:16:57.840 like Habermas was the first guy
00:16:59.760 who kind of came up with this notion
00:17:01.980 that anytime you have corporate media,
00:17:04.640 it's going to be influenced by advertisers
00:17:06.380 and by big corporate money.
00:17:07.820 And anytime you have state media,
00:17:09.500 it's going to be influenced by politicians.
00:17:11.440 So you create a public sphere,
00:17:12.840 which is in the middle.
00:17:13.640 Which is not funded by corporations.
00:17:15.860 And although it's funded by the state,
00:17:17.300 it's completely free from interference by the state.
00:17:19.840 And the government has no ability
00:17:21.180 to instruct journalists or editors
00:17:22.700 about what they should or shouldn't be saying.
00:17:24.780 That's the ideal.
00:17:25.780 And I still think that that has merit.
00:17:27.680 I think that's a really good distinction,
00:17:29.120 actually, about the news side
00:17:30.460 and everything else,
00:17:31.620 because I think that's where
00:17:32.520 the BBC's got into trouble.
00:17:34.520 And I said that you sound like us five years ago.
00:17:37.820 And I think you misunderstood what I meant,
00:17:39.720 because the change that's happening,
00:17:42.060 certainly for us,
00:17:43.440 it's not that we have changed our opinion particularly.
00:17:45.680 It's just as you've seen the content side of the BBC
00:17:50.340 drive it consistently in the direction
00:17:53.100 that you're talking about.
00:17:54.200 At this point, it's kind of hard for us to argue
00:17:56.740 that that is something that should be funded by the public.
00:17:59.500 And I think from my conversations with people
00:18:02.400 who do still work at the ABC
00:18:04.300 and your experience too,
00:18:05.660 that is a direction that it seems to be heading in as well,
00:18:09.900 whereby the content side
00:18:11.540 is undermining people's trust in the news side.
00:18:14.940 I mean, the New York Times is a good example of this,
00:18:17.960 where, I mean, the New York Times
00:18:19.900 has been caught lying endlessly at this point.
00:18:23.400 And I agree with you that a podcast-
00:18:25.200 I'm not sure about that.
00:18:25.900 I'm not sure I would sign on to lying endlessly.
00:18:29.360 Okay.
00:18:29.800 I suppose it depends on what your definition of lying is.
00:18:32.440 My view would be that the New York Times
00:18:34.920 is consistently representing a slanted view of the world
00:18:39.640 by pushing stories very hard that fulfill that view
00:18:43.280 and suppressing and not publishing stories that don't.
00:18:46.100 The end result of which is people are presented
00:18:48.220 with a false vision of what the truth is overall.
00:18:51.120 And that is intentional because that is what the people
00:18:54.120 in the organization want to push to the world, right?
00:18:57.900 So, I mean, well, there are two things going on.
00:18:59.420 One is the editorial direction
00:19:01.040 that they want a newspaper to have.
00:19:02.640 So the Wall Street Journal
00:19:03.520 has a self-consciously conservative op-ed page.
00:19:07.380 All of their columnists are, you know,
00:19:09.140 deeply, deeply conservative.
00:19:10.600 The New York Times has traditionally been on the left
00:19:12.280 and it's, you know, editorials will be slanted
00:19:15.500 in that direction.
00:19:16.660 Then there's the news side of things,
00:19:17.840 just to divide news and, you know,
00:19:19.300 opinion or content again.
00:19:21.120 And I'll take you, I'll grant to you
00:19:23.240 that certainly during the racial reckoning
00:19:25.880 of 2020 to 2021, there was an enormous amount
00:19:29.440 of trying to do the right thing
00:19:32.060 throughout their news reporting
00:19:33.900 at the New York Times.
00:19:35.080 So, you know, oh, they weren't riots.
00:19:36.860 They were, you know, mostly peaceful protests
00:19:38.700 or, you know, in the early days
00:19:40.780 of the transgender issue,
00:19:42.240 it was all just about, you know,
00:19:43.600 respecting people's identity and so on.
00:19:45.340 But I think in the past 12 to 18 months,
00:19:47.420 the New York Times has done a good job
00:19:49.220 of bringing on board a bunch of dissenting voices
00:19:51.640 in the editorial side of things
00:19:53.320 with, you know, columnists
00:19:54.520 who are much more heterodox.
00:19:56.020 And even on the news side of things,
00:19:57.520 you've seen three big feature pieces
00:19:59.680 in the New York Times
00:20:00.500 in the past eight months or so
00:20:01.940 about the controversy
00:20:03.300 over transgender pediatric care
00:20:05.020 and things like that
00:20:05.740 you would never have seen
00:20:06.840 three or four years ago.
00:20:08.100 So I do think there's an ability
00:20:10.040 to right the ship.
00:20:11.360 I don't think you can call it lying.
00:20:12.900 Hold on, but this is precisely my point.
00:20:15.080 Look at transgender stuff, right?
00:20:18.100 The reason they are now doing this
00:20:19.840 is that people on podcasts
00:20:21.780 and everywhere else
00:20:22.680 have been basically saying,
00:20:24.420 guys, this is like,
00:20:25.400 this is a real issue
00:20:26.180 and it's serious
00:20:26.800 and people are being hurt
00:20:27.800 and blah, blah, blah.
00:20:28.880 And the New York Times
00:20:29.800 would have been writing pieces
00:20:31.020 at the time saying,
00:20:32.660 oh, these people are evil, right?
00:20:34.760 Yes.
00:20:35.040 And now three years later,
00:20:36.460 they're like,
00:20:36.960 oh, actually, this is a big problem.
00:20:38.960 So if I'm one of the people
00:20:41.180 who's like Abigail Shrier
00:20:42.720 or Bari Weiss
00:20:43.600 or someone like that
00:20:44.320 who's been raising this issue
00:20:45.600 and being dismissed,
00:20:47.360 I think it's quite reasonable
00:20:48.300 in that situation to go,
00:20:49.860 well, it's great
00:20:50.400 that they've woken up to it,
00:20:51.880 but if every time an issue
00:20:53.480 doesn't fit their narrow vision,
00:20:56.140 the people who raise that issue
00:20:57.520 get destroyed
00:20:58.200 and then three years later,
00:20:59.160 they're like,
00:20:59.420 oh, actually, this is a real issue.
00:21:00.900 That's not how it should be.
00:21:02.240 Well, absolutely,
00:21:03.000 it's not how it should be.
00:21:03.940 And clearly,
00:21:04.700 I have faith in the ability
00:21:06.040 of podcasts
00:21:06.900 to bring to the mainstream media's attention
00:21:08.740 things that they ought
00:21:09.560 to be covering.
00:21:10.240 Otherwise, I wouldn't have gone out
00:21:11.320 on my own
00:21:11.760 and I wouldn't now be doing a podcast.
00:21:13.040 So you're right.
00:21:14.600 I think we have the freedom
00:21:15.640 in the independent media space
00:21:18.160 to notice things
00:21:19.300 that perhaps
00:21:20.000 the structures
00:21:21.000 of an editorial newsroom
00:21:22.180 aren't going to notice
00:21:23.040 and to push on them
00:21:24.360 in a way
00:21:25.100 that they wouldn't feel
00:21:26.060 comfortable pushing on them.
00:21:27.760 And I would just say
00:21:28.840 if they come around late,
00:21:30.580 then that's better
00:21:31.100 than never coming around at all.
00:21:33.020 I don't think it means
00:21:33.980 that they were lying
00:21:34.660 in the first place.
00:21:35.380 I think it means
00:21:35.820 that they had
00:21:36.380 a misallocation of focus.
00:21:38.060 It's also worth
00:21:39.900 bearing in mind as well
00:21:40.940 that I think
00:21:42.220 this is something
00:21:43.020 that happens
00:21:43.600 to every institution
00:21:45.040 where every institution
00:21:46.300 right-leaning media
00:21:47.300 will have blind spots
00:21:48.460 with issues
00:21:50.020 that the left
00:21:50.700 think are more important.
00:21:53.120 And also as well,
00:21:54.660 the things that can,
00:21:55.520 I would love to have
00:21:56.180 your opinion on this,
00:21:57.180 which is
00:21:57.680 the podcast space.
00:21:59.880 It's beautiful,
00:22:01.240 it's brilliant,
00:22:01.940 it's exciting,
00:22:02.780 but it's not also
00:22:04.500 without its pitfalls,
00:22:05.760 particularly when it comes
00:22:07.240 to things like
00:22:07.780 audience capture.
00:22:08.860 Yeah, absolutely.
00:22:09.860 I mean,
00:22:10.420 look at what's happened
00:22:11.600 over the course of,
00:22:12.440 since prior to COVID
00:22:13.800 to many people
00:22:15.200 who I regarded
00:22:15.780 as friends and colleagues
00:22:16.840 prior to that.
00:22:18.340 There is,
00:22:19.180 I mean,
00:22:19.440 you guys are at risk
00:22:20.220 of this as well, right?
00:22:21.100 I mean, you know.
00:22:21.600 No, we're not.
00:22:23.400 Do you know what?
00:22:24.260 We are genuinely
00:22:25.120 far too,
00:22:25.760 far too contrarian assholes
00:22:27.840 to be captured.
00:22:29.300 We like pissing
00:22:30.980 our audience off regularly.
00:22:32.460 Become part of our group.
00:22:33.520 No, fuck off.
00:22:34.180 I'm not part of any group.
00:22:35.080 Yeah, exactly.
00:22:35.900 It's the tribe.
00:22:36.380 I call my people
00:22:37.040 the tribe of the tribalists,
00:22:38.600 right?
00:22:38.720 You know,
00:22:39.080 we are proudly tribalists,
00:22:40.320 a tribalist tribe.
00:22:41.660 But, you know,
00:22:43.160 prior to,
00:22:44.140 back in,
00:22:45.100 what was it,
00:22:45.440 maybe 2018,
00:22:46.200 2019,
00:22:47.060 I moderated a live event
00:22:49.440 in Australia,
00:22:50.900 which was Sam Harris,
00:22:52.540 Douglas Murray,
00:22:53.760 Maajid Nawaz,
00:22:55.080 Eric Weinstein,
00:22:56.020 and Brett Weinstein,
00:22:57.060 right?
00:22:57.980 Now,
00:22:58.260 of that group,
00:22:59.200 and I shan't mention
00:23:00.220 any names,
00:23:01.920 several are still
00:23:03.560 completely sane,
00:23:04.480 and several have gone
00:23:06.200 batshit crazy.
00:23:07.220 Do you know what's
00:23:07.680 interesting about this?
00:23:08.460 Sorry to interrupt,
00:23:09.120 Josh,
00:23:09.240 is Joe Rogan
00:23:11.100 has a great routine
00:23:12.500 about what happened
00:23:13.800 during COVID.
00:23:14.420 He says,
00:23:15.420 we lost so many people
00:23:16.360 during COVID
00:23:17.040 and most of them
00:23:17.620 are still alive.
00:23:18.940 Yeah.
00:23:19.000 And what's interesting
00:23:20.180 is which of those
00:23:23.400 five people
00:23:24.180 you would describe
00:23:25.320 as having been lost
00:23:26.540 and which you wouldn't
00:23:27.600 really depends on
00:23:28.820 where you've ended up
00:23:29.640 as a result.
00:23:30.200 There are some people
00:23:30.920 who would have
00:23:31.640 the exact opposite view
00:23:32.760 to you.
00:23:33.060 Those people would be wrong.
00:23:34.840 Only I am right.
00:23:37.000 I probably will find
00:23:38.660 myself agreeing
00:23:39.340 with your list
00:23:40.640 of who has lost
00:23:41.880 and who hasn't,
00:23:42.780 but I think
00:23:43.580 it's an interesting point
00:23:44.660 and this speaks actually
00:23:45.620 to the podcast capture thing.
00:23:46.980 Yeah, I mean,
00:23:47.720 there is something
00:23:48.480 that happens
00:23:49.220 in independent media
00:23:50.640 where if you're prone
00:23:52.880 to a certain type
00:23:54.080 of conspiratorial thinking
00:23:55.260 and you're prone
00:23:56.740 to appreciating
00:23:58.180 the feedback,
00:23:59.140 what you regard
00:23:59.720 as the feedback
00:24:00.220 of your audience,
00:24:01.560 which actually,
00:24:02.320 of course,
00:24:02.620 is the feedback
00:24:03.160 of the minority
00:24:04.100 of your audience
00:24:04.720 who can be bothered
00:24:05.460 emailing you
00:24:06.560 because they're sufficiently
00:24:07.300 outraged or passionate
00:24:08.260 and have enough time
00:24:09.040 on their hands to do so,
00:24:10.700 then you can find yourself
00:24:11.900 veering off
00:24:12.760 into cloud cuckoo land
00:24:15.100 and you can sometimes
00:24:16.180 also just find yourself
00:24:17.100 through guest selection
00:24:18.220 getting a bit misled.
00:24:20.840 Like, you know,
00:24:21.520 there's a challenge
00:24:22.240 about how much
00:24:22.920 do we use our platforms
00:24:24.720 to interrogate people
00:24:26.380 and how much
00:25:26.900 do we just have
00:25:27.340 a comfy chat?
00:24:28.700 Joe Rogan ran into this
00:24:29.920 in 2022 during COVID
00:24:31.920 when he was,
00:24:33.360 was that 2022
00:24:34.160 or 2021 when,
00:24:35.680 2022,
00:24:36.540 when he joined Spotify
00:24:37.680 and there was that backlash
00:24:38.740 and some musicians
00:24:39.640 were threatened to,
00:24:40.420 you know,
00:24:41.000 pull their music
00:24:41.720 from Spotify
00:24:42.340 because of vaccine
00:24:43.300 misinformation,
00:24:44.460 supposedly,
00:24:45.040 on Joe's show.
00:24:46.180 And part of the problem
00:24:47.080 there I think
00:24:47.580 was that Joe's style
00:24:49.180 and I'm familiar
00:24:50.200 with this
00:24:50.520 because I was on his show,
00:24:51.840 I've been on his show
00:24:52.560 like seven times
00:24:53.540 but the last time
00:24:54.320 was exactly during this
00:24:55.980 and he and I
00:24:56.500 had a bit of a,
00:24:57.240 a bit of biffo
00:24:58.340 about COVID
00:24:59.140 and about vaccines
00:24:59.940 at the time
00:25:00.480 that momentarily
00:25:01.020 went viral
00:25:01.640 but, you know,
00:25:03.020 his style
00:25:03.780 is just to have
00:25:04.620 a convivial conversation
00:25:05.660 with somebody.
00:25:06.120 He's not an interrogative journalist,
00:25:07.480 he's not an investigative journalist,
00:25:08.940 he just wants
00:25:09.360 to have a chat
00:25:10.080 and sometimes
00:25:11.860 if you have a chat
00:25:12.880 and give a platform,
00:25:14.860 although I kind of hate
00:25:15.520 the idea of platforming people,
00:25:18.340 you know,
00:25:18.800 if you just have a chat
00:25:20.100 with somebody
00:25:20.740 who actually does need
00:25:22.820 pushing back on,
00:25:24.280 then you can find yourself
00:25:25.480 increasingly listening
00:25:26.520 to people
00:25:27.080 who aren't necessarily
00:25:28.260 representing the best science,
00:25:29.620 who aren't necessarily
00:25:30.220 representing the best policy
00:25:31.460 and, yeah,
00:25:33.420 then you can find yourself
00:25:34.700 in both audience capture
00:25:36.080 and guest capture
00:25:37.520 and just I'm-in-the-prison-of-my-own-brain
00:25:40.560 capture,
00:25:41.320 which we all have
00:25:42.320 now that we get
00:25:42.880 so much of our information
00:25:43.940 from, you know,
00:25:45.840 supercomputers in our pocket
00:25:47.080 that are programmed
00:25:48.040 by 22-year-olds
00:25:49.380 with skateboards
00:25:50.080 in Silicon Valley
00:25:50.960 who are focused only
00:25:52.640 on how much time
00:25:53.380 you're spending on the app
00:25:54.720 and whether or not
00:25:55.220 you like it,
00:25:55.880 share it
00:25:56.320 and comment on it.
00:25:57.900 So that inevitably guides you
00:25:59.780 towards things
00:26:00.420 that are going to reinforce
00:26:01.500 what you already believe
00:26:02.760 and demonize
00:26:04.080 the kinds of things
00:26:04.980 and the people
00:26:05.420 who you don't already believe
00:26:06.880 because nuance
00:26:08.380 doesn't really inflame you.
00:26:10.800 It doesn't, you know,
00:26:11.420 it doesn't cause you
00:26:12.020 to engage with things.
00:26:13.260 So I think all of those things
00:26:14.360 are coming together.
00:26:15.240 Social media,
00:26:16.240 soon artificial intelligence,
00:26:18.080 audience capture,
00:26:19.260 the fracturing
00:26:19.860 of the media ecosystem,
00:26:21.480 the polarization
00:26:22.100 of our politics,
00:26:23.680 the polarization
00:26:24.180 of our demographies
00:26:25.300 as people move
00:26:26.220 into just areas
00:26:27.380 where there are
00:26:28.400 more like-minded people
00:26:29.560 and we're in this situation
00:26:30.940 where we're-
00:26:31.620 everything is sort of
00:26:32.460 conspiring
00:26:33.220 to narrow our focus
00:26:36.080 and to push us
00:26:36.940 into tighter
00:26:37.460 and tighter echo chambers
00:26:38.680 and I think the job
00:26:39.940 of Uncomfortable Conversations
00:26:41.260 and the job
00:26:41.780 of many podcasts
00:26:42.460 like yours
00:26:43.080 is to just try
00:26:44.600 to edge open
00:26:45.620 the-
00:26:46.200 just by 10%
00:26:47.160 the worldview
00:26:48.300 of the audience
00:26:49.660 from wherever they are.
00:26:50.760 I mean,
00:26:50.900 I don't expect my audience
00:26:51.700 to agree with me
00:26:52.420 about things
00:26:53.240 but I want them
00:26:54.080 to listen to me
00:26:54.680 and go,
00:26:55.740 well,
00:26:56.000 Josh is being fair.
00:26:57.320 He's not caricaturing
00:26:58.620 the other side.
00:26:59.480 He's not fighting straw men.
00:27:01.320 You know,
00:27:01.560 he-
00:27:01.800 I-
00:27:02.140 he disagrees with me
00:27:04.440 but at least he's-
00:27:05.360 he understands
00:27:06.020 the point of view
00:27:06.560 that I'm coming from
00:27:07.380 and, you know,
00:27:08.540 now I sort of understand
00:27:10.000 10% a bit better
00:27:11.180 what the other side thinks.
00:27:12.660 And it's also as well
00:27:13.820 about that idea
00:27:15.880 of embracing discomfort
00:27:17.320 which is why
00:27:18.060 the title of your podcast
00:27:19.060 is so good
00:27:19.840 because some conversations
00:27:21.640 by their nature
00:27:22.600 are going to be uncomfortable
00:27:23.960 and we need to embrace that
00:27:26.320 because it's only
00:27:27.260 by embracing that
00:27:28.140 that we hope
00:27:28.680 to actually understand
00:27:29.920 the subject or issue
00:27:30.940 much better.
00:27:32.060 You can have someone
00:27:33.080 brilliant on
00:27:33.840 from the right
00:27:34.520 who will explain an issue
00:27:36.280 from a conservative
00:27:37.300 point of view
00:27:38.120 and everything they're saying
00:27:39.400 may be perfectly valid
00:27:40.480 but unless you get
00:27:41.340 somebody else on
00:27:42.620 from the left
00:27:43.400 to actually go,
00:27:44.840 well, look,
00:27:45.600 that may be right
00:27:46.420 but here's my opinion,
00:27:48.380 that's when you come
00:27:49.280 to a real understanding
00:27:51.000 of the issue.
00:27:51.780 Yeah.
00:27:52.160 Yeah.
00:27:52.440 I mean,
00:27:52.940 as I say,
00:27:54.180 that interaction
00:27:55.260 doesn't necessarily
00:27:55.820 have to be uncomfortable
00:27:56.800 but finding a way
00:27:58.880 to have those conversations
00:27:59.960 in a conciliatory fashion
00:28:02.000 and, you know,
00:28:03.320 without just allowing people
00:28:04.300 to sort of filibuster
00:28:05.080 on their own points
00:28:06.060 is kind of the goal.
00:28:07.680 Yeah.
00:28:07.900 Well, you know,
00:28:08.380 it's interesting.
00:28:09.740 Obviously,
00:28:10.540 everyone is subject
00:28:11.400 to the pressures of this.
00:28:13.380 Except for me.
00:28:14.060 I'm completely impartial.
00:28:15.460 I'm something of a god.
00:28:16.540 Exactly.
00:28:17.060 Well, me too.
00:28:17.600 That's where I was going.
00:28:18.940 But, you know,
00:28:19.760 I think Francis and I,
00:28:21.180 first of all,
00:28:21.480 the fact that there's two of us
00:28:22.580 really helps
00:28:23.440 and we have very different
00:28:24.440 perspectives on things
00:28:25.420 and also we're both,
00:28:26.860 I'm certainly
00:28:27.740 very highly disagreeable
00:28:29.460 and so we've had
00:28:30.540 several situations
00:28:32.040 where we spend
00:28:33.440 the entire summer
00:28:34.340 of BLM
00:28:36.020 or the racial reckoning
00:28:36.420 as you call it,
00:28:37.140 basically saying
00:28:37.140 this is completely wrong,
00:28:38.140 this is outrageous,
00:28:38.980 et cetera
00:28:39.280 and in the process
00:28:42.260 attracting a lot of
00:28:43.340 Trump fans to the channel
00:28:44.380 and then January the 6th
00:28:45.800 happened
00:28:46.140 and we were like
00:28:47.320 this is completely wrong
00:28:48.400 in exactly the same way
00:28:49.560 and outrageous
00:28:50.120 and people got pissed off
00:28:51.560 with us
00:28:51.860 and we had absolutely
00:28:53.540 no problem with that
00:28:54.400 because it's what
00:28:55.480 we believe to be wrong
00:28:56.380 or right
00:28:57.120 depending on
00:28:57.860 what the situation is
00:28:59.080 and I think
00:28:59.520 I always
00:29:02.020 I know so many people
00:29:03.780 now in the podcast space
00:29:04.880 who absolutely resent
00:29:06.020 and hate their own audience
00:29:07.060 because they have built one
00:29:08.920 and that doesn't really
00:29:10.360 reflect who they are
00:29:11.540 it reflects where
00:29:12.600 they think the clicks are
00:29:13.940 and you know
00:29:14.760 how do you avoid
00:29:15.520 getting negative comments
00:29:16.640 on your YouTube channel
00:29:17.540 yeah that's right
00:29:18.320 don't read them
00:29:18.940 the comments
00:29:19.340 is the answer to that
00:29:20.280 yeah don't worry about
00:29:21.120 how many negative comments
00:29:21.940 there are
00:29:22.280 and it's not really
00:29:23.160 reflective of
00:29:23.920 like if you look
00:29:24.760 at a video
00:29:25.240 that people will
00:29:26.720 destroy with comments
00:29:29.440 so to speak
00:29:29.940 you'll still have
00:29:30.940 85-90%
00:31:32.440 like-to-dislike
00:31:33.320 ratio,
00:29:33.820 right
00:29:34.140 so the overwhelming
00:29:35.240 majority of your audience
00:29:36.420 actually still enjoyed
00:29:37.500 the conversation
00:29:38.180 that you had
00:29:38.880 it's just a minority
00:29:40.580 of very angry people
00:29:41.760 who as you say
00:29:42.480 had a lot of time
00:29:43.460 on their hands
00:29:43.980 who may be creating
00:29:45.400 that false impression
00:29:46.260 it's really important
00:29:47.000 not to fall for that
00:29:48.000 I mean one of the good
00:29:48.780 things in a sense
00:29:49.520 about being a public figure
00:29:50.900 is you don't have
00:29:52.200 the luxury
00:29:53.020 of falling into
00:29:53.760 the trap
00:29:54.400 of caring about
00:29:55.480 what people
00:29:55.960 on social media think
00:29:57.080 because it would drive
00:29:57.980 you absolutely
00:29:58.820 bloody insane
00:30:00.020 I had Jimmy Carr
00:30:01.860 on the show
00:30:02.340 the comedian
00:30:02.880 yesterday
00:30:04.160 which will come out
00:30:05.020 in a few weeks
00:30:05.960 when his Netflix
00:30:06.540 special drops
00:30:07.280 he's coming up
00:30:07.800 on your show
00:30:08.220 as well I know
00:30:08.740 and you know
00:30:10.400 I was talking to him
00:30:10.960 about how much
00:30:11.720 how much crap
00:30:12.860 he gets on
00:30:13.600 on social media
00:30:14.940 he was like
00:30:15.440 I'm still selling out shows
00:30:17.040 who cares
00:30:18.140 why would I be caring
00:30:19.160 about what
00:30:19.800 buttface 77
00:30:20.780 has to say
00:30:21.560 you know
00:30:22.260 on Twitter
00:30:22.800 when I'm selling
00:30:23.840 tickets to my show
00:30:24.760 and I was like
00:30:25.480 yes absolutely
00:30:26.200 you do get to a point
00:30:27.640 at which
00:30:28.120 don't worry about it
00:30:29.300 it's you know
00:30:29.840 water off a duck's back
00:30:30.880 and if you're uncancellable
00:30:32.300 which is the good thing
00:30:33.020 about being independent
00:30:33.880 then it doesn't matter
00:30:35.820 the problem is
00:30:37.140 some of us
00:30:38.460 in the independent
00:30:39.400 media landscape
00:30:40.100 don't have that
00:30:41.600 fortitude I suppose
00:30:43.100 and are still
00:30:44.080 kind of focused
00:30:45.100 even in a subconscious
00:30:46.120 way maybe
00:30:46.880 about like
00:30:47.600 I'm wondering
00:30:48.900 what happens
00:30:49.520 to some of the people
00:30:50.560 who I think
00:30:50.940 have gone off
00:30:51.380 the rails
00:30:51.940 and whether
00:30:52.520 it's actually
00:30:53.100 that they're
00:30:54.100 reading tweets
00:30:55.240 and YouTube comments
00:30:56.300 or whether it's
00:30:56.960 just that
00:30:57.560 they notice
00:30:58.860 the numbers growing
00:31:00.260 when they do
00:31:01.080 certain types of things
00:31:02.240 when they have
00:31:02.940 certain types of
00:31:03.540 conversations
00:31:04.080 when they interview
00:31:05.340 certain types of people
00:31:06.440 and there's just
00:31:07.820 almost a subconscious
00:31:08.780 thing
00:31:09.720 of course
00:31:10.500 you kind of veer
00:31:11.420 towards the sunrise
00:31:13.140 you know
00:31:13.640 of course
00:31:14.060 you veer
00:31:14.540 you go towards
00:31:16.080 the flower bed
00:31:16.920 and if that flower bed
00:31:18.520 is full of
00:31:19.280 toxic bullshit
00:31:20.320 then you're going
00:31:21.580 to find yourself
00:31:22.300 off the rails
00:31:22.980 pretty quickly
00:31:23.700 to mix metaphors
00:31:25.240 but I think
00:31:25.840 there's a lesson
00:31:26.600 in this
00:31:26.920 even for people
00:31:27.860 who aren't
00:31:28.200 in the public eye
00:31:28.740 and who don't
00:31:29.080 have podcasts
00:31:30.400 that I guess
00:31:32.540 hitching your own
00:31:33.560 sense of validation
00:31:34.540 and what you should
00:31:35.500 be thinking
00:31:35.980 and what you should
00:31:36.620 be saying
00:31:37.140 to the opinions
00:31:38.720 of others
00:31:39.340 is extremely perilous
00:31:42.200 and is going to
00:31:43.240 lead you absolutely
00:31:43.820 nowhere
00:31:44.220 and I think
00:31:45.280 we can all feel
00:31:46.080 over the past
00:31:46.620 few years
00:31:47.100 that there's been
00:31:47.740 more and more
00:31:49.260 of that
00:31:49.600 because the opinions
00:31:50.360 of others
00:31:50.800 have become
00:31:51.340 so potentially
00:31:52.780 inflammatory
00:31:53.320 like it used
00:31:54.160 to be the case
00:31:54.680 that if you
00:31:55.160 said the wrong
00:31:56.260 thing at a party
00:31:57.920 then maybe
00:31:59.680 the person
00:32:00.100 you were talking
00:32:00.600 to just wouldn't
00:32:01.280 talk to you anymore
00:32:02.040 now there's a sense
00:32:03.680 that if you say
00:32:04.420 the wrong thing
00:32:05.280 they are going
00:32:06.040 to really believe
00:32:06.920 that you are evil
00:32:07.660 and they may
00:32:08.760 you know
00:32:09.160 maybe the mob
00:32:09.920 will come for you
00:32:10.800 at least in an
00:32:11.900 online fashion
00:32:12.560 or you might
00:32:13.100 lose your job
00:32:13.760 or something
00:32:14.040 like that
00:32:14.540 like there's
00:32:15.060 become this
00:32:15.560 increasing
00:32:16.020 censorious
00:32:16.660 and increasing
00:32:17.320 hysteria
00:32:18.240 whereby
00:32:19.200 you might
00:32:20.740 not even have
00:32:21.440 offended them
00:32:22.220 on precisely
00:32:22.840 the thing
00:32:23.200 that they care
00:32:23.740 about
00:32:24.060 but you said
00:32:24.640 something that
00:32:25.280 puts you
00:32:25.640 in a column
00:32:26.240 that they think
00:32:27.020 oh that means
00:32:27.680 that you're
00:32:28.100 in that particular
00:32:28.740 tribe
00:32:29.340 you know
00:32:29.900 it's sort of
00:32:30.260 crazy that
00:32:31.080 if you tell me
00:32:32.400 what you think
00:32:33.200 about climate
00:32:34.300 change
00:32:34.980 I can probably
00:32:36.100 predict what
00:32:36.720 you think
00:32:37.100 about corporate
00:32:37.660 taxation
00:32:38.280 well those two
00:32:40.500 things don't have
00:32:40.920 anything to do
00:32:41.360 with each other
00:32:41.940 right
00:32:42.460 so you know
00:32:43.360 you say one
00:32:44.120 thing the person
00:32:44.700 goes oh well
00:32:45.200 that person's
00:32:45.720 either you know
00:32:46.740 a Trumper
00:32:47.940 or a Brexiteer
00:32:49.000 or whatever
00:32:49.460 they might be
00:32:50.180 and all of a
00:32:51.200 sudden everything
00:32:51.500 else that you
00:32:51.900 have to say
00:32:52.500 just hits up
00:32:53.380 against a brick
00:32:54.460 wall
00:32:54.760 we have to find
00:32:55.560 a way to talk
00:32:56.020 to each other
00:32:56.540 in in
00:32:57.760 it's it's both
00:32:59.200 nuanced and also
00:33:00.020 heterodox where
00:33:00.800 you're like well
00:33:01.500 I'm just going to
00:33:01.860 take the most
00:33:02.280 reasonable position
00:33:03.000 on each issue
00:33:03.720 regardless of
00:33:04.600 where that leaves
00:33:05.220 me and if that
00:33:05.700 leaves me in a
00:33:06.340 flaming hodgepodge
00:33:07.740 then so be it
00:33:09.240 I just have to
00:33:09.700 sort of cop the
00:33:10.280 flack
00:33:10.640 not pay too
00:33:12.160 much attention
00:33:12.620 to feedback
00:33:13.180 and have you
00:33:14.600 know integrity
00:33:15.340 authenticity and
00:33:16.700 credibility as my
00:33:17.740 lodestars and
00:33:18.740 pursue that
00:33:19.260 I've been thinking
00:33:20.020 about this a lot
00:33:20.900 do you think part
00:33:22.000 of the problem is
00:33:22.700 social media but
00:33:23.460 not social media in
00:33:24.300 the way where they're
00:33:24.780 discussing it but
00:33:25.500 social media in that
00:33:26.460 it's turned all of
00:33:27.780 us into celebrities
00:33:29.380 with our own little
00:33:30.880 followings and
00:33:31.640 audiences you know
00:33:32.660 however you know
00:33:33.640 hundreds of followers
00:34:35.200 or however many there may
00:33:35.200 be and as a result
00:33:36.420 of that we now need
00:33:37.480 to have public
00:33:38.700 opinions or
00:33:39.480 stances on every
00:33:41.180 single issue when
00:33:42.660 the reality is we
00:33:44.500 don't have time to
00:33:46.660 research into every
00:33:47.980 issue dive into the
00:33:49.760 nuance of it read
00:33:50.960 about it and then as
00:33:52.280 a result of that come
00:33:53.280 out with a balanced
00:33:54.140 opinion so instead of
00:33:55.840 that we just go oh
00:33:56.920 let's take this opinion
00:33:57.820 off the shelf because I
00:33:59.100 need to have a public
00:34:00.180 stance on Israel
00:34:01.060 Palestine
00:34:01.540 yes absolutely it's
00:34:03.080 like everybody has to
00:34:04.240 give a decree about
00:34:05.720 what their opinion is
00:34:06.660 about something it's
00:34:07.540 like we're all little
00:34:08.240 Roman emperors like
00:34:09.360 you know announcing
00:34:10.100 what our position is
00:34:11.100 on everything nobody
00:34:11.960 gives a shit like
00:34:13.140 nobody gives a shit
00:34:13.900 apart from the
00:34:14.540 facile people who
00:34:15.560 gobble up that sort
00:34:16.500 of nonsense yeah I
00:34:18.500 completely agree the
00:34:19.520 amount of people who
00:34:20.960 have wildly passionate
00:34:22.440 opinions about Israel
00:34:24.240 and Palestine I saw a
00:34:25.340 screenshot of something
00:34:26.580 that someone had
00:34:27.140 posted of themselves in
00:34:28.320 Australia recently on
00:34:30.180 Instagram where they'd
00:34:31.620 taken this proud
00:34:32.440 stance they were at a
00:34:33.160 panel event in a
00:34:34.180 public forum and they
00:34:35.160 held up a card that
00:34:36.280 said ceasefire now
00:34:37.980 and they were doing
00:34:38.620 this as like a really
00:34:39.600 courageous thing and I
00:34:41.300 thought oh that's
00:34:42.860 great no one's thought
00:34:43.720 of that we never
00:34:45.060 thought of there being
00:34:45.840 a ceasefire thanks for
00:34:47.380 your enlightening
00:34:48.040 fucking insights you
00:34:49.140 know nobody ever
00:34:49.800 thought about might be a
00:34:50.800 good idea if you know
00:34:52.100 Hamas released the
00:34:53.260 hostages and Israel
00:34:54.340 stopped the shelling
00:34:55.280 that no no one's been
00:34:56.280 talking about that for
00:34:57.560 months and months and
00:34:58.500 months and months and
00:34:59.280 months behind the
00:35:00.000 scenes in negotiations in
00:35:01.460 Qatar nobody's been
00:35:02.740 trying to pursue exactly
00:35:04.100 that you know as
00:35:05.440 foreign dignitaries and
00:35:06.620 foreign ministers and
00:35:07.480 secretaries of state have
00:35:08.640 been flying around the
00:35:09.380 world trying to sort out
00:35:10.200 this mess like chill out
00:35:12.480 you're not an expert on
00:35:13.760 this, you don't know what you're
00:35:14.620 talking about like it's
00:35:16.820 similar to like I heard
00:35:17.980 you talking about the
00:35:19.380 the Tucker Carlson
00:35:20.540 interview with Putin right
00:35:21.860 and you made a very good
00:35:23.440 point which was like if
00:35:24.940 you haven't watched the
00:35:25.720 entire two and a half
00:35:27.080 hours or whatever it is
00:35:28.120 don't have an opinion
00:35:29.440 about it and I was I was
00:35:31.460 like a breath of fresh air
00:35:32.440 for me to hear because I
00:35:33.580 didn't watch it I don't
00:35:34.600 want to watch it I don't
00:35:35.480 care about it I know what
00:35:36.720 I think of Tucker I know
00:35:37.900 what I think of Putin I'm
00:35:39.120 not that interested like go
00:35:40.280 go go at it but you will
00:35:41.760 never hear me comment about
00:35:43.560 why Tucker was right or
00:35:45.040 why Tucker was wrong you'll
00:35:46.360 never hear me tweet about
00:35:47.900 it you're not gonna hear me
00:35:48.800 comment on it because I
00:35:49.880 don't know like we need a
00:35:51.840 bit more of I don't know I
00:35:53.140 don't have an opinion about
00:35:54.140 this I have opinions about
00:35:55.460 the things that I know
00:35:56.200 about and if someone in
00:35:57.480 front of me has has a lot
00:35:58.520 of knowledge then I'll
00:35:59.700 interrogate them I'll tease
00:36:00.760 it out and I'll try to use
00:36:01.600 my bullshit detector to
00:36:02.700 figure out what's true
00:36:03.460 and what's false hopefully
00:36:04.780 my audience gets a lot of
00:36:05.960 you know insight out of
00:36:07.560 that but you're right why
00:36:09.460 do we all have to be
00:36:10.380 constantly broadcasting our
00:36:11.620 opinions to the world about
00:36:12.800 things that we're ignorant
00:36:13.540 on and not only is it kind
00:36:16.000 of corrosive to the
00:36:17.040 conversation because we're
00:36:18.920 being fed so many different
00:36:20.200 opinions hard to sort
00:36:21.220 through what is actually an
00:36:22.320 informed opinion and what
00:36:23.680 is just someone trying to
00:36:24.660 follow their tribe but I
00:36:26.740 think it's also corrosive to
00:36:28.220 the people who are
00:36:29.680 producing the opinions
00:36:30.960 aka all of us to
00:36:33.760 constantly be feeling like
00:36:35.340 every thought we have is
00:36:38.100 potentially auditioning as
00:36:40.340 being a piece of content
00:36:41.400 you know I mean this is sort
00:36:43.160 of a more spiritual or woo
00:36:44.480 woo thing if you want to go
00:36:45.520 there but I do feel like
00:36:46.680 there's a there's a way of
00:36:49.620 being in the world that we're
00:36:51.500 losing which is just being
00:36:53.400 present listening appreciating
00:36:58.200 contemplating musing like at
00:37:04.860 the risk of sounding like the
00:37:06.100 middle-aged guy when I was
00:37:08.600 young you made a plan to go
00:37:11.100 and meet a friend outside the
00:37:12.440 movie theater and you didn't
00:37:14.740 have a phone so you just stood
00:37:17.880 there if they were running
00:37:18.860 late and you watched people go
00:37:21.280 by and you looked at the
00:37:23.020 clouds you didn't pull a
00:37:24.940 supercomputer out of your
00:37:25.900 pocket to see what people
00:37:26.880 are arguing about on the
00:37:27.960 other side of the world and
00:37:29.940 there's something that
00:37:31.220 happens to your head when
00:37:33.260 you're constantly punctuating
00:37:34.700 time every time you sit down
00:37:36.440 on the toilet you're looking
00:37:37.620 at this stuff and you're not
00:37:39.500 just consuming it you're a
00:37:41.740 content creator all of us are
00:37:43.360 now so it's like you're
00:37:44.960 walking around with this
00:37:46.740 little version of you on
00:37:47.800 your shoulder who's saying
00:37:49.620 that's a nice sunset I
00:37:51.920 wonder if you could take a
00:37:52.920 photo of it and if it would
00:37:54.000 get some likes on Instagram
00:37:55.140 oh that's an interesting
00:37:56.000 thought I wonder if you
00:37:57.180 should broadcast that to the
00:37:58.680 world on Twitter oh that's
00:37:59.860 an interesting argument that
00:38:00.800 you could have you could
00:38:01.580 probably dunk on someone
00:38:02.620 really well by parachuting
00:38:04.300 back into that Facebook
00:38:05.180 comment thread and arguing
00:38:07.200 about it it's like instead
00:38:09.080 of living our lives we're
00:38:11.460 curators of an avatar of
00:38:14.060 ourselves that is an
00:38:15.680 artificial version of
00:38:16.740 ourselves that we're
00:38:18.040 producing for other
00:38:20.180 people like what does
00:38:21.960 that do to your head
00:38:22.800 it's not healthy
00:38:24.260 no it's not healthy
00:38:26.060 and there are some times
00:38:27.900 where you have a thought
00:38:29.240 and you go I'll tweet
00:38:30.340 that and then you go let
00:38:32.480 me just think about it for
00:38:33.860 a second and then an hour
00:38:35.440 later you go I'm really
00:38:38.640 glad I didn't say that
00:38:39.960 thing out loud because we
00:38:41.460 all think stupid things it
00:38:43.040 doesn't matter how smart you
00:38:44.200 are or how principled you are
00:38:46.260 you are human you are
00:38:47.340 fallible you are flawed and
00:38:49.440 you're gonna think dumb
00:38:50.440 stuff we all do and I
00:38:52.460 think one of the things a
00:38:53.480 lot of people haven't
00:38:54.340 considered and the bigger my
00:38:56.320 platform has got
00:38:59.480 the less I post and the more
00:39:01.660 careful I am about saying
00:39:03.040 what I'm saying because the
00:39:05.240 thing that I've really got
00:39:06.680 over the last couple of
00:39:07.940 years is everything you're
00:39:09.940 saying now I mean you don't
00:39:13.260 know where the world is going
00:39:14.720 five years from now no and
00:39:16.760 it's all in public and the
00:39:18.760 positions you take best to be
00:39:20.400 very carefully thought out
00:39:21.680 because three or four years
00:39:23.200 from now what you're saying
00:39:25.000 maybe you know maybe in
00:39:26.580 conflict with things that you
00:39:27.660 then believe and then what
00:39:28.640 are you going to do you know
00:39:29.860 a comedian once said to me
00:39:31.900 like he's always thinking
00:39:33.340 about what what clip could be
00:39:35.280 offensive in the future he was
00:39:36.580 like you know 10 years ago I
00:39:38.800 didn't know that it would be
00:39:39.480 offensive to do jokes about
00:39:40.920 like men dressing up in
00:39:42.580 women's clothes or something
00:39:43.640 like that but now of course
00:39:44.860 that would be considered
00:39:45.460 transphobic he's like you
00:39:46.580 know you fast forward 10
00:39:47.380 years into the future and
00:39:48.480 maybe everyone will regard
00:39:49.720 will treat it as being beyond
00:39:51.040 the pale to make fun of
00:39:52.000 clowns and he was like he
00:39:53.420 was doing jokes about
00:39:54.220 clowns you clown phobe and
00:39:56.240 he's like I didn't know we
00:39:57.060 didn't know in 2024 that it
00:39:58.540 was bad to make fun of
00:39:59.420 clowns it was just a thing
00:40:00.400 that we did yeah yeah and I
00:40:02.180 don't mean it from a being
00:40:03.520 offensive perspective I've got
00:40:05.220 no problem you know owning
00:40:06.620 the fact that this was like
00:40:08.120 you know friends is now
00:40:10.120 considered offensive and if I
00:40:11.360 had written friends I'd be
00:40:12.240 like fuck you I don't give a
00:40:13.140 shit you know look how look
00:40:14.600 how many people watched it at
00:40:15.840 the time it was probably you
00:40:17.620 know a good thing in its own
00:40:18.700 time what I mean is you know
00:40:20.840 I saw this with the war in
00:40:22.400 Ukraine which was a subject
00:40:24.380 that I knew more about than
00:40:25.620 most people in the West I just
00:40:27.060 happened to know more and I
00:40:28.640 had people call me up and ask
00:40:30.280 me my opinion and they didn't
00:40:31.540 know what their opinion was and
00:40:33.180 then I'd see the same person
00:40:34.220 like three days later having the
00:40:36.320 strongest possible opinion
00:40:37.620 about it in public and I was
00:40:38.800 going well if you agreed with
00:40:40.820 me you only agreed with me
00:40:42.420 because you heard me and you
00:40:44.120 heard my opinion but a lot of
00:40:45.660 these people didn't agree with
00:40:46.660 me because obviously they'd
00:40:47.500 gone to someone else and they
00:40:48.420 got a different opinion and I'm
00:40:49.920 going you three days ago you
00:40:51.760 were not qualified and now
00:40:53.760 you're qualified and three years
00:40:55.880 from now this is going to be a
00:40:57.700 whole different situation
00:40:58.520 there'll be another conflict
00:40:59.360 somewhere else and the stuff
00:41:00.280 you're saying is going to be the
00:41:01.240 complete opposite of what you're
00:41:03.200 now saying and it's just that
00:41:05.360 awareness that everything we
00:41:08.360 say in public can and will
00:41:10.820 be used against you rightly so
00:41:12.900 by the way like you're going to
00:41:14.440 be held accountable so because
00:41:16.200 people should think before
00:41:17.860 speaking in public should they
00:41:20.220 yeah they should absolutely yeah
00:41:22.340 isn't this what your argument was
00:41:24.040 about Joe I mean having people on
00:41:26.200 who say things that as part of a
00:41:29.440 conversation may end up hurting
00:41:31.300 people people should think about
00:41:32.840 how they speak in public I'm not
00:41:34.980 sure people should think about what
00:41:36.500 they say I think they should be
00:41:37.940 very careful that sounds a bit
00:41:40.200 Maoist to me no no no it sounds a
00:41:43.120 bit like I didn't no no I think
00:41:45.040 the Maoist part is you
00:41:47.020 should think about what you say in
00:41:48.140 public because you'll be punished
00:41:49.360 that's Maoist what I'm saying is you
00:41:51.700 should think about what you're
00:41:52.520 saying in public because the
00:41:53.380 chances are you're wrong right but
00:41:55.760 earlier you were saying you should
00:41:57.080 think about what you say in public
00:41:58.000 because in three years time people
00:41:59.480 might reevaluate no no you will
00:42:01.880 realize how wrong you were right
00:42:03.560 and then you're going to be in a
00:42:04.840 position where you're like oh shit I
00:42:06.280 said all this crap about this
00:42:07.800 conflict and now I'm I'm yeah yeah I
00:42:11.220 mean and then people will remind you
00:42:12.820 I think it's important for us to not
00:42:14.900 care so much about consistency as well
00:42:17.320 like I'm a big fan of not being
00:42:18.860 consistent over time it's regarded as
00:42:21.280 being the biggest slam dunk against
00:42:23.100 people especially politicians if they're
00:42:25.340 like if someone's like well hang on
00:42:26.980 you're for this policy now but five
00:42:28.660 years ago you weren't well you know as
00:42:31.420 a great man once said when the facts
00:42:33.060 change I change my mind what do you
00:42:35.040 do but that's a different it's like I
00:42:36.580 understand what you're saying that
00:42:37.960 like you know be mindful of the fact
00:42:40.200 that the things that you're saying now
00:42:41.700 are provisional right and that they
00:42:43.680 that facts may change and that you may
00:42:45.500 turn out to have been
00:42:46.840 wrong I agree that you need to sort of
00:42:50.440 pick your battles don't go out you know
00:42:53.640 fighting for some huge social justice
00:42:56.380 cause that you don't know anything
00:42:57.940 about or to save democracy if you don't
00:43:00.380 even understand the contours of
00:43:02.080 the battle especially if it's something
00:43:03.700 complicated and foreign like Gaza or
00:43:06.480 Ukraine on the other hand I do think we
00:43:09.560 need to have greater forgiveness both of
00:43:11.800 other people and ourselves about things
00:43:13.560 that we got wrong in the past and you
00:43:16.560 know we can't be held to account for
00:43:18.740 things that we tweeted eight years ago
00:43:21.220 that was a different climate it was a
00:43:23.140 different place the milieu was different
00:43:25.500 the taboos were different you know we've
00:43:28.340 reached a point where we have this kind
00:43:29.800 of outrage archaeology which is I know
00:43:31.800 that's not what you were defending there but
00:43:33.460 it's a difficult and nuanced thing to
00:43:35.460 tease out like how careful are we to be
00:43:38.280 about the things that we're saying now
00:43:39.980 versus how much justification ought people
00:43:42.420 to have in the future for punishing us
00:43:44.200 for things that we're saying now because
00:43:45.880 those things no longer comport with the
00:43:48.300 norms of 2030 well I don't know the norms
00:43:51.460 of 2030 so I'm going to say what I'll say
00:43:53.320 now and so be it no no I agree with you
00:43:55.680 what I'm talking about is something
00:43:56.820 completely different and then you see
00:43:58.500 it now with the war in Ukraine and I
00:44:02.300 focus on it because it's interesting to
00:44:03.820 me and the war in Iraq I was against the
00:44:06.360 war in Iraq I was against our invasion
00:44:08.520 and there were so many people who were
00:44:10.340 for it who because they were for it and
00:44:13.400 were so disappointed in their
00:44:15.960 cheerleading for that war are now like
00:44:18.360 I'm against all war well anyone should
00:44:20.780 be against all war but there are times
00:44:22.280 when you're going to find yourself in a
00:44:23.700 position where you have to defend
00:44:24.940 yourself you have to help someone else
00:44:26.420 defend themselves but a lot of people
00:44:28.380 Tucker being one one example of this
00:44:31.080 who cheerled that war because of the
00:44:34.440 you might call it you know if you want
00:44:36.060 to go into woo-woo conversation the
00:44:37.960 psychic damage they did to themselves by
00:44:40.260 cheerleading a conflict that they didn't
00:44:42.220 really fully understand because that's
00:44:44.020 what everyone else was doing they're
00:44:46.320 now in a completely different position
00:44:47.700 where they're overreacting in the
00:44:49.020 opposite direction that's interesting so
00:44:50.100 don't in other words don't learn
00:44:51.920 the wrong lesson from your mistakes in a
00:44:54.480 way like I remember at the time of the
00:44:58.060 Arab Spring when Syria was going up in
00:45:02.280 flames remember Obama was seriously
00:45:05.500 considering intervening there he had said
00:45:07.340 that chemical weapons would be a red line
00:45:09.040 and that the United States would do
00:45:10.780 something if Assad the dictator in Syria
00:45:13.340 used chemical weapons on his own people he
00:45:15.880 did Obama tried to get the UK involved and
00:45:19.900 the UK didn't do so so that was basically a
00:45:22.780 you know US didn't want to go it alone
00:45:24.180 thing I was interviewing Phil Donahue the
00:45:26.440 old American talk show host on HuffPost
00:45:28.460 Live when I was living in New York and he
00:45:30.220 was coming out strongly against intervention
00:45:33.160 in Syria I was pretty in favor of
00:45:36.040 intervening I thought the humanitarian
00:45:37.680 catastrophe justified intervention on you
00:45:41.160 know just moral grounds and I was saying to
00:45:43.900 him he was saying like you know Iraq we
00:45:46.780 haven't we learned the lessons of Iraq I was
00:45:48.540 like yeah but aren't we smart enough to
00:45:50.520 make distinctions between different kinds
00:45:52.240 of war and he was like no no and you know it's a
00:45:58.360 similar thing with COVID and with like you
00:46:00.300 know coming back looping that point that
00:46:02.400 you're making Constantine about the you
00:46:04.720 know the derangement of certain people who
00:46:06.820 are in Podcastistan as Sam calls it and
00:46:10.460 the lessons learned from COVID you know there
00:46:12.760 there are certain people who have
00:46:15.760 experiences of being traumatized by the
00:46:18.820 overreach of governments by governments'
00:46:20.900 use you know certainly in the developing
00:46:22.940 world and in the Arab world of emergency
00:46:25.580 decrees in order to introduce you know
00:46:28.580 fundamentally to enslave the population and
00:46:30.900 introduce authoritarian regimes who when
00:46:33.420 COVID happened they were like oh this is
00:46:36.760 shit that I have to oppose you know we can get
00:46:39.940 into all our arguments that we want to about
00:46:41.560 lockdowns and when they went too far and
00:46:43.340 whether Australia went too far and whatever
00:46:44.760 but as public health officials tried to
00:46:46.760 scramble to figure out what to do about a
00:46:48.540 global pandemic there were people who
00:46:50.700 regarded any inhibition on individual
00:46:53.160 liberty as being evidence of a police
00:46:55.740 state that was just around the corner they
00:46:57.660 were making they were learning the wrong
00:46:58.760 lessons from from the past we didn't end up
00:47:02.300 with a police state we are not currently
00:47:03.700 living in a police state Australia is not
00:47:05.240 currently a police state and it unwound all
00:47:07.260 of the restrictions and similarly if
00:47:10.460 you've got an evolutionary biologist who you
00:47:13.640 know starts to sort of talk about how well
00:47:15.820 maybe ivermectin this and maybe the vaccines
00:47:18.400 are doing that or whatever it might be and
00:47:21.040 realizes that he was right about a few of those
00:47:23.840 early things not in the case of ivermectin but
00:47:25.940 in the case of I don't know side effects the
00:47:28.180 vaccines might have or whether or not it's
00:47:29.960 really necessary for young people to get
00:47:31.460 vaccinated then you can go oh because I was
00:47:34.740 right and the establishment was wrong about
00:47:37.180 this little thing that means that establishments
00:47:39.720 are always wrong and that the conspiratorial
00:47:42.120 contrarian point of view is always right
00:47:44.080 it's the wrong conclusion to draw so I mean
00:47:46.460 yes treat every individual you know example
00:47:50.120 that you're faced with in as rational a way
00:47:51.900 as possible learn from your mistakes but
00:47:55.260 don't create a kind of a paradigm or a filter
00:47:58.240 or a prism through which you henceforth filter
00:48:01.060 all of your information thinking well I made a
00:48:03.780 mistake last time therefore I'm never going to
00:48:05.080 make that same mistake again and it also ties
00:48:07.340 into the idea of identity because these
00:48:09.660 people then their identity is I'm the
00:48:12.000 heterodox guy yeah or I'm the guy who is you
00:48:15.440 know the covid guy who will come in and
00:48:17.180 expose the truth and show you what's really
00:48:19.140 going on and once you've got that identity
00:48:21.920 if a fact comes to light which actually
00:48:26.220 shows a lot of your positions to be incorrect or
00:48:31.100 false or lacking in nuance then you're not
00:48:34.020 really going to accept those facts because
00:48:37.320 what it does is damage your identity your
00:48:40.280 brand and the way that you make money and
00:48:43.780 the way that you get invited onto other
00:48:46.200 podcasts or platforms yeah it's funny isn't
00:48:49.220 it we're all becoming sort of it's a little
00:48:52.040 bit like the social media avatar phenomenon
00:48:54.220 that I was talking about a moment ago we're
00:48:56.360 starting to inhabit more and more tribal
00:48:59.180 roles and maybe it was always thus like maybe
00:49:02.420 in prehistory it was more like that you know
00:49:05.220 things were more stratified things were more
00:49:07.440 predictable things were more feudal basically
00:49:10.120 like you knew your place and then post war
00:49:13.520 especially post 1960s and the civil rights
00:49:15.740 movement there was this great kind of flowering
00:49:17.760 of into individualism where everybody could you
00:49:20.460 know find their own jam however they wanted to
00:49:23.320 but it seems like social media is creating an
00:49:26.580 environment in which we're able to revert back
00:49:28.820 into more secure more stable more comforting
00:49:32.420 tribal identities so yeah I'm the covid contrarian
00:49:35.840 or I'm this type of person and we even do it in our
00:49:39.500 individual lives I mean the reason why I originally
00:49:42.380 went on Joe Rogan's show we were talking about this
00:49:45.020 Francis before we were rolling was because
00:49:47.940 I was a host of HuffPost Live which was this
00:49:52.220 you know streaming talk network in in the US and
00:49:57.200 there was a campaign to cancel
00:50:01.260 the Colbert Report remember Stephen Colbert's
00:50:04.160 original show because he'd made a joke that was
00:50:07.260 allegedly racist do you remember the joke I do I do
00:50:12.540 you want me to tell it yeah yeah so it is a little
00:50:14.920 bit of backstory the Washington Redskins is a
00:50:18.020 sports team obviously Redskins is regarded as being
00:50:21.040 offensive so there was a furor about changing the name
00:50:23.340 of that team the owner didn't want to change it but
00:50:25.800 he wanted to prove that he wasn't racist so instead he
00:50:28.240 set up a charitable foundation for educating young
00:50:32.300 Native American people to say see I'm not racist I've got
00:50:36.480 my charitable foundation so Colbert comes on one night and
00:50:40.420 he goes you know I've been accused in the past of racism
00:50:43.940 towards Asians and he runs I believe a fake clip of old
00:50:48.320 pretend versions of him being racist towards Asians like
00:50:52.380 pretending to be an Asian wearing a Chinese pointy hat like all this sort
00:50:56.060 of stuff and he says so as a result I'm going to prove that I'm not racist
00:50:59.720 by creating the Ching Chong Ding Dong Foundation for Orientals or whatever
00:51:04.560 that's very funny yeah
00:51:07.040 so people were trying to cancel him for that so outcomes an Asian American
00:51:12.140 activist she's young she's enthusiastic she creates a petition she
00:51:16.700 creates a campaign Comedy Central should get rid of the Colbert Report and she's lined
00:51:23.200 up to be interviewed on my show on HuffPost live and I try to explain
00:51:28.480 that what he's doing is satirical right he's satirizing this other guy who was doing a
00:51:37.380 thing that might be racist and he's trying to make the point that it's a facile and cynical
00:51:43.360 thing to do in other words he's not actually racist himself and she says something along
00:51:48.960 the lines of well it doesn't surprise me that a white man would have that opinion
00:51:53.780 and and so I interrupt her and I say hang on sorry this has nothing to do with the fact
00:51:57.860 that I'm a white man like I did you know I didn't give up my right to have an opinion
00:52:02.460 about comedy and satire because I was born with balls and white skin
00:52:07.140 and uh she says well you know I would expect a white man to enjoy talking over a woman of color
00:52:13.020 that's you know that's uh that's something that you like to do it went up in flames and like
00:52:18.000 ultimately I was like well I mean if they you know if we're not going to be able to talk to
00:52:21.760 each other then we're not going to be able to talk to each other um and we ended the interview early
00:52:25.620 I was going to say that I feel like it's incredibly patronizing for you to paint these
00:52:29.260 questions this way especially as a white man I don't expect you to be able to understand what
00:52:33.200 people of color are actually saying with regards to cancel Colbert he has a history of making jokes
00:52:38.320 Suey being a white man doesn't give me doesn't prevent me from being able to think and doesn't
00:52:42.620 prevent me from being able to have uh have thought reasoned perspectives on things I don't I didn't
00:52:47.040 give up my right to be able to have an intellectual conversation when I was born I know but oh well
00:52:51.940 white men definitely feel like they're entitled to talk over me they definitely feel like they're
00:52:55.320 entitled to kind of minimalize my experiences and they definitely feel like they are somehow exempt
00:53:00.140 and so logical compared to women who are painted as emotional right no no one's minimalizing your
00:53:05.280 your experiences no one's minimalizing your right to have an opinion it's just a stupid opinion I
00:53:09.380 mean it's it's a it's a misunderstanding of what you just called my opinion stupid you just called my
00:53:16.380 opinion stupid that's incredibly unproductive and I don't think I'm going to enact the labor of
00:53:21.060 having to explain to you why that's incredibly offensive and patronizing explain I just told you I
00:53:27.580 wouldn't enact that labor okay thanks for being with us Suey so Joe Rogan saw that he played it on
00:53:34.860 his podcast and that was how my friendship with Joe began but what reminded me why he reminded me of
00:53:40.280 that Francis was the experience of talking to this person was an experience of talking not to another
00:53:47.180 human being not to another rational mind but to a cardboard cutout of an identity who is treating me
00:53:57.220 as a cardboard cutout of an identity she is woman of color I am white man right you try to talk about
00:54:06.360 the actual thing you're talking about you try to talk about the joke you try to talk about satire you
00:54:09.840 try to talk about where is the boundary where is too far where is not far enough you know do Asians
00:54:13.740 get picked on in particular or if you did it about a black person maybe it wouldn't be acceptable all
00:54:17.760 kinds of interesting things that could be talked about none of which are being talked about because
00:54:21.580 we've got our roles we've got our fucking you know sort of avatars that we have to inhabit like that's
00:54:27.700 no way to live your life I am operating from the place of being a spokesperson for my identity group
00:54:33.740 and the more we do that the more likely it is that the 21st century is going to devolve into some kind of
00:54:40.320 low-grade cultural civil war or not so low-grade cultural civil war and it worries the hell out of me
00:54:46.400 like we have to be able to talk to each other as human beings regardless of where we come from
00:54:49.800 not as a bunch of check boxes on some diversity tick list the reason people do that and it does
00:54:55.940 when you are on the other end of it as you have been feel like you're arguing with a tape recorder
00:54:59.860 because it's just playing specific lines and responses that you know are coming anyway but the
00:55:05.120 reason people do it is it's a very powerful tool it's a weapon they forge this identitarian weapon
00:55:10.080 that they use which takes us back to Australia how do you feel that uh you guys are doing on that front
00:55:18.240 because you just had the voice referendum which was quite comprehensively rejected by the Australian
00:55:23.260 public and the idea of it was you'll correct me if I misrepresent it but it was essentially about
00:55:28.300 um embedding what I would say is identity politics at the constitutional level right saying essentially
00:55:35.160 aboriginal people should have an extra way of being heard and it was quite unspecified as part of the
00:55:41.820 various legal uh conversations that are being had and the Australians rejected it quite overwhelmingly
00:55:47.100 uh and I've just been there it's a very multi-ethnic society uh and uh people talk very
00:55:53.460 proudly in Australia about being the world's most successful multicultural nation do you think
00:55:59.480 what do you think of all that yes I think that's true I mean I think uh it's a source of enormous
00:56:03.840 pride for Australians um that we are one of the most successful multi-ethnic and one of the most
00:56:09.020 multi-ethnic countries in the world we have one of the highest rates of immigration per capita
00:56:12.420 either the number one or number two country in the world for refugee resettlement
00:56:17.520 um the voice um and remind me to come back to the point about multiculturalism because there's an
00:56:23.780 interesting point to be made about immigration and Brexit and Trump and Australia's multiculturalism
00:56:28.260 um but on on the voice so yeah let me give the most generous articulation of the voice just to
00:56:34.780 steel man for a moment so you had in the 1700s the world's most powerful empire in the British empire
00:56:42.380 crash into the greatest traditional civilizations in the world the Australian Aborigines have been
00:56:48.900 around for the longest period of time they're the oldest continuous civilization in
00:56:53.940 the world because even in Africa there have been a number of changes as far as anthropologists are
00:56:58.820 concerned there's something very unique and very special about a bunch of civilizations and they really
00:57:03.920 were a bunch of civilizations speaking different languages with different practices all over a
00:57:08.260 continent that is the same size as the contiguous United States living in incredibly harsh environments with
00:57:13.740 enough wisdom to last for many many many tens of thousands of years but obviously that was going
00:57:20.220 to be an irreconcilable clash between the gunboats of the British empire and those civilizations and as
00:57:27.840 recently as the middle of the 20th century let's say you had policies that were incredibly brutal I
00:57:35.980 mean you had indigenous children being ripped away from their families in order to be raised the proper way
00:57:42.000 by real proper white people by you know often harsh nuns in convents and things like that I mean imagine the
00:57:48.700 experience you know as a father of having your child ripped away from you because of your race
00:57:52.600 so it's been difficult to find progress on a lot of money's been thrown at the problem uh you know
00:57:59.480 a lot of attempts at affirmative action and and equality have been thrown at it the consensus that
00:58:06.540 came out of uh a forum that was held a little over a decade ago was that it would be useful to have
00:58:13.260 a single cohesive body that could articulate the First Nations point of view on legislation that
00:58:22.040 parliament was considering that would affect indigenous people and at the moment it was a
00:58:26.060 bit too haphazard it was a bit too random or the voices were all a bit too scattered uh you needed to
00:58:30.560 coordinate them somehow so the idea that was come up with was you'd create a body called the voice and
00:58:36.160 it would give voice to First Nations people its advice wouldn't be binding parliament would be free
00:58:41.720 to ignore it um but it would be a place of collecting those voices um perhaps foolishly
00:58:50.700 the government decided instead of just creating this thing and these things have been created
00:58:54.700 at a state level in Australia just through legislation Australia is a federation like the
00:58:58.560 United States so you know our hospitals and our um uh you know police and schooling are done on a
00:59:04.460 state level not a national level so some state governments have actually tried this and you could
00:59:08.900 have done this at a federal level just by creating it just by parliament passing it but the government
00:59:14.940 decided instead to try to embed it in the constitution which requires a referendum to literally
00:59:20.560 change the nation's founding document um that's a big ask that requires a majority of states as
00:59:27.220 well as just a majority of voters in other words a majority of voters in a majority of states it didn't
00:59:32.080 even get anywhere close to that because there were legitimate worries i mean why are you embedding
00:59:37.620 something that is supposedly trying to remedy a temporary inequality one hopes that it's a temporary
00:59:43.860 inequality one hopes that in a thousand years time if we're all still here there won't be a
00:59:48.500 disparity between first nations health outcomes and education outcomes and the rest of the population
00:59:53.300 in which case why do you still have this thing that's going to be in the constitution forever
00:59:58.280 and then as you say Constantine there's that sort of egalitarian thing of like well hang on
01:00:04.040 more than half the Australian population has arrived since the Second World War
01:00:08.440 we have this huge multi-ethnic society why does a working-class Chinese Australian shopkeeper
01:00:15.560 not get a say but a First Nations person does just because they're a descendant of people who were
01:00:22.300 wronged so it became a real culture war clash there was a lot of misinformation about about it um
01:00:29.220 and it went down in flames it was an interesting time because it was interesting how blinkered and
01:00:36.860 blinded people were on both sides about well especially on the pro you know on the left progressive side
01:00:43.100 about the reasons why people might have reservations for it I mean so many of my colleagues would just
01:00:48.100 say something like oh it's just bloody obvious don't be a dick you know just vote for it you know
01:00:54.720 throw them a bone you know they've got a hard life uh you know why wouldn't you well maybe people
01:01:02.040 have reservations about changing the founding document if they don't know what the ultimate legal
01:01:06.940 consequences are going to be maybe people have reservations about how much it's going to cost maybe
01:01:10.920 people have reservations about whether or not it's going to be truly representative maybe they don't
01:01:14.500 know where these people are going to be chosen from and whether they're going to come from you know
01:01:18.180 an elite kind of social justice oriented university class or whether they'll actually represent the
01:01:23.280 interests of First Nations people on the ground it was amazing in the wake of the referendum when it
01:01:27.920 went down I still had my radio show at the time I would be interviewing very learned
01:01:32.640 academics and learned journalists about it and you know one of them said to me uh I think the reason why
01:01:37.940 you know it failed was because a lot of Australians who live in big cities uh they don't know a lot of
01:01:43.700 First Nations people now I pointed out to her actually the places in Australia where wealthy elites who
01:01:53.040 don't know a lot of indigenous people live are the ones that voted most in favor of the voice body
01:02:00.140 the places that made it fail were largely rural and regional electorates where in fact I think the
01:02:07.080 two electorates with the largest number of First Nations people were the ones that went most strongly
01:02:12.060 against it now you could say oh well that's because you know white racists live amongst the
01:02:17.340 First Nations people whatever it's clearly not true that it was people who don't know
01:02:23.240 indigenous people who were voting against it so I explained that to her and she said oh yeah but in
01:02:28.320 the big cities where they voted yes they go to art galleries and they appreciate indigenous art
01:02:32.100 so that's why that's probably why they were voting yes I was like bubble much echo chamber much like
01:02:41.520 group think much you don't think it's possible that there are people who are just
01:02:46.000 you know the idea that you had to be a racist in order to have reservations about this is what drove
01:02:53.520 more people against it if you'd been less elitist and sort of dogmatic and condescending towards
01:03:00.420 people who had questions about whether or not this was the right way to address inequality racial
01:03:04.240 inequality in australia then maybe there would have been a potential to cut across the aisle and
01:03:08.480 convince some people um and on the question of multiculturalism i mean it's a really interesting
01:03:13.480 one again in the same way the question of creating a sort of a quorum of support for a particular policy
01:03:21.780 right how do you get the largest buy-in I mean this is something that I'm interested in
01:03:25.160 on Uncomfortable Conversations I want to speak to the winnable middle I want to speak to people who
01:03:30.480 still regard themselves as being rational thoughtful uh you know I'm never going to win over the far
01:03:37.080 left I'm never going to win over the far right I'm hoping that there are people on the fringes who
01:03:41.000 will join us in a kind of a radical centrism so to speak so what Australia's multiculturalism has to
01:03:47.860 teach I think the UK and the US is that you can get enormous public buy-in for very high rates of
01:03:56.500 immigration even into a very white country which Australia was in the 1950s if people feel that they
01:04:03.820 have control over the borders that was the deal after the Second World War the first ever immigration
01:04:10.400 minister Arthur Calwell said you know we need to make sure that the borders are secure in order to
01:04:16.820 reassure Australians that we know exactly who's coming here and that we've got a
01:04:20.980 good selection criterion for it and that has basically persisted the entire way through I mean
01:04:26.140 I know you had Tony Abbott the former Australian prime minister on this show um I'm not a
01:04:31.060 huge fan of Tony's I think his border policies were unnecessarily harsh nonetheless it remains the
01:04:37.340 case in Australia that if you embark on now of course we have the good fortune to be an island so you
01:04:42.000 know it's not exactly the same as the United States you can't walk we have that fortune here too
01:04:45.680 we have tens of thousands of people coming in illegally yeah yeah and so you know what Australia
01:04:50.360 why do you say it was unnecessarily harsh because the year before uh Abbott's government implemented
01:04:56.120 Operation Sovereign Borders you had about I can't remember it was either 12 or 17,000 people
01:05:02.180 come illegally yeah today it's 74 people so it solved the problem uh yes so there are ways and
01:05:09.980 there are ways right like I say it's unnecessarily harsh because so just for people
01:05:14.780 who aren't across the entire thing what Australia basically did um and this was a solution that was
01:05:19.960 actually devised in the early 2000s pre-Tony Abbott by John Howard the basic contours of this
01:05:25.040 were that if you try to come to Australia illegally in other words if you get on a boat
01:05:30.000 there are very sophisticated people smuggling rings or there were through Southeast Asia that would funnel
01:05:34.460 people from South Asia through Indonesia you get on a boat in Indonesia come down into Australia
01:05:39.000 as soon as you're on Australian soil of course then you can declare refugee status and Australia
01:05:44.720 has to process you the idea was find the boats before they get there before they have the right
01:05:49.320 to claim asylum in Australia and make the promise that if you try to come to Australia illegally you
01:05:54.700 will never ever set foot in Australia guaranteed signed sealed and delivered we'll ship you off to a
01:06:00.360 south pacific island nation that we're bribing to build concentration camps to house you in
01:06:04.600 in the hot desert until someone else will take you and then we just sort of find other partner
01:06:11.960 countries that we can disperse those people to it's harsh because do you need to be keeping them
01:06:19.060 in the conditions that they're in which are actually quite opaque and it's very difficult to
01:06:22.560 find out what's going on in those places but they're run by private prison companies and by all
01:06:27.780 accounts they're absolutely awful there have been cases of people on starvation diets there there
01:06:32.280 have been cases of people sewing their lips closed there there have been cases there was one award
01:06:37.020 winning australian podcast that was recorded by an inmate there who was able to smuggle have a
01:06:42.500 recording device smuggled in it's all very cloak and dagger i think that once someone is in your care
01:06:49.280 you have a duty of care like yes you can always say oh but what about the these thousands of other
01:06:55.880 hypothetical people who might have drowned at sea you know trying to get there who were stopping
01:07:00.440 because of the deterrent effect of what we're doing to this small number of people on this pacific
01:07:05.000 island nation well great in some global moral calculus when you're finally at the pearly gates
01:07:10.720 maybe they'll tally up all the lives you saved and you know you'll get to go to heaven but in the
01:07:15.020 meantime you're brutalizing people and you're brutalizing them to make an example of individual
01:07:19.760 human beings including women and children in order to deter other people there's got to be a way of
01:07:24.220 doing it that's somewhat less barbaric but also provides the deterrent effect i think but just to
01:07:29.500 the general gist i do think is that in immigration when you have rich wealthy prosperous countries
01:07:36.660 where it's great to live and you have a lot of countries where it's not so great to live
01:07:41.380 there's gonna be some kind of brutality along the way and filtering out who can come and who can't
01:07:48.120 even if that's just saying you have to stay back in your shithole in bangladesh and australia has
01:07:54.260 chosen the path of we're going to be particularly uh brutal and particularly kind of firm about the
01:08:02.220 border and as a result you have massive public support for immigration now of course there's
01:08:09.040 still a bit of worry about immigration because as you get pressure on infrastructure pressure on
01:08:13.300 public schools pressure on the health system you know and so on and so forth people say do we need to
01:08:17.300 be letting in like half a million people every year could we make it half that or whatever
01:08:20.980 but you don't see brexit and you don't see donald trump and i do think a part of that not to not to
01:08:26.800 blame both of those things entirely on immigration but i think it's a significant i don't think you
01:08:31.780 could get those without a sense from the public of immigration being out of control and they're just
01:08:37.140 being chaos fundamentally definitely the reason it's interesting we've got to wrap up but i'll ask
01:08:43.220 you this last question before we do the usual one is about the multi-ethnic versus multicultural
01:08:47.540 because i asked you about multiculturalism and you immediately went to multi-ethnic which i think
01:08:51.880 is the right way of talking about it and the reason is that in europe during periods of mass legal not
01:08:58.960 illegal but legal immigration uh several people who none nobody would describe as culture warriors or far
01:09:05.340 right or anything like that including people like angela merkel david cameron were forced to concede
01:09:11.620 that multiculturalism has failed in europe um and that's kind of was a little bit of my worry with
01:09:18.280 australia in the sense that i think that there may be we talked earlier about how it's like going back
01:09:24.800 10 years in the past it i just got that little bit of sense the illegal immigration issue is different
01:09:30.380 that there's a little bit of complacency about that because when you have large waves of immigration
01:09:36.620 come in from different cultures from different religious backgrounds and you don't encourage
01:09:41.280 assimilation you are gonna create problems that then will result in a brexit trump style response
01:09:47.700 maybe yeah i don't think it's i don't think it's a non-issue i do you're right that i specifically
01:09:54.660 chose the word multi-ethnic because i'm in europe uh uk man i think it's in europe but you know in my
01:09:59.840 brain it is and i do think the word has different valence here largely because probably of that angela
01:10:05.680 merkel speech where she was talking about multiculturalism um i think i don't believe
01:10:12.100 that we should be aggressively trying to get people to abandon their home cultures i believe in
01:10:17.740 multiculturalism in the sense that i love living in a melting pot i love being close to a neighborhood
01:10:24.600 where i can go go to and you know the signs are in mandarin and then the translation underneath the
01:10:30.880 mandarin is in korean and there's no english translation at all like i i like that i like
01:10:36.340 living in in a place where the food is incredibly authentic and it feels like you're bumping into the
01:10:41.540 chaos of humanity where i think you have to draw the line is that there are fundamental principles that
01:10:48.360 we all agree on in this country right men and women are treated equally gay people it's fine to be
01:10:54.380 gay you know basically sort of universal small l liberal principles and if you come from a culture
01:11:00.940 where that's not the case you don't get to continue to live that culture in australia and continue to
01:11:07.400 insist that your women wear burqas against their will or you know that you don't think those two
01:11:12.920 things are incredibly connected and unavoidably so if people live in a community in which they speak
01:11:18.100 the language they spoke in the country from which their grandparents came if people live in those
01:11:22.560 societies where they're not fully integrated then the cultural heritage including social values
01:11:28.820 will be passed down inevitably and in the uk that's what we see we see second and third generation
01:11:34.740 people who whose parents and grandparents came here who are more socially conservative including
01:11:39.560 on those issues are they yeah that that's not the way that it works in australia i mean or america
01:11:44.060 actually it gets less it gets diluted with each generation maybe because of the different types of
01:11:48.080 people that you allow in it could be i mean i i think i mean if we're talking about
01:11:52.000 conservative muslims which i think is probably the subtext then uh you know the first generation
01:11:56.700 comes over and they have their ways of doing things and in general the the next generation
01:12:01.060 just through by necessity bumps into more people from more different cultures and more local
01:12:07.040 australians they consume the public news they like it there is a there is a first generation
01:12:12.340 there is a filtering effect and gas the jews outside the sydney opera house um i mean uh there there is
01:12:20.580 no way of avoiding the fact that there is going to be a radical uh you know subset a tiny radical
01:12:26.060 subset of people um i don't think that was the overwhelming uh sentiment like is there a so i suppose the
01:12:33.300 question really is is there a is there a maximum kind of cap on the number of people who you would bring
01:12:39.380 in from particular cultures because the pace of dilution of their conservatism is too slow
01:12:45.940 for you know a liberal democracy to be able to sustain assuming that dilution is going to continue
01:12:51.760 over long periods of time as their percentage of the population grows rapidly yes yes uh that's not
01:12:58.760 necessarily the case i mean people are talking in this country about an islamic party now right
01:13:04.260 an islamic party is not going to be socially liberal i don't imagine do you see what i'm saying
01:13:08.940 uh well i don't know i mean if it's an islamist party it's obviously not going to be socially
01:13:12.360 liberal uh you know if it was to represent british muslims broadly then who knows um
01:13:17.400 this isn't just about islam like my my in-laws are socially conservative orthodox christians yeah if
01:13:29.920 there was an orthodox christianity party of britain they would be socially conservative
01:13:34.380 definitely but you didn't say an orthodox muslim party you just said a muslim party an islam an
01:13:40.520 islamic party islamic party yeah i mean i i'm yeah look maybe i know too many moderate muslims in
01:13:46.120 australia and i have rose-tinted glasses about muslims it's it's a conversation about uh whether
01:13:51.140 you are able to integrate people when they're large waves of immigration while encouraging them to
01:13:57.820 retain entirely their language and their culture without really working hard to integrate yeah i
01:14:02.560 mean like look it may be the case that australia just has more covert assimilationist policies than
01:14:07.140 the uk and the us do is that we talk a big game about multiculturalism but actually if you call
01:14:12.400 through to the government helpline you're going to have to have rudimentary english in order to
01:14:15.760 understand it that's very different it's possible that here you know you can press number 16 for arabic
01:14:20.320 and number 17 for uh you know pashtun or urdu or something and you don't have that in australia so i
01:14:26.340 think yeah i think there's a probably a that's interesting there's a there's a balance to be
01:14:29.800 struck between an overt kind of you know philosophy of welcoming multiculturalism but also a practical
01:14:36.500 recognition that in order to get by and in order to be socially tolerated really you're gonna have to
01:14:41.960 join the mainstream will everybody no but the force is strong enough the centrifugal force to pull you
01:14:47.560 out of the the the bunker and into mainstream australia i have sufficient faith in that that i think that
01:14:53.400 we can yeah australia can do it all right we're all moving to australia then excellent come on down
01:14:58.260 come on down so and the question the the question that we always end our interviews with is what's
01:15:04.820 the one thing we're not talking about that we really should be before josh answers make sure to head on
01:15:10.340 over to locals after the interview is over to see this i don't see how you can be so comfortable with
01:15:16.580 australia's response to the pandemic avoiding section 92 of the constitution police brutality detention of
01:15:22.720 citizens who committed no crimes and removal of basic natural rights none of which was based on
01:15:28.000 any evidence some people are talking about it but artificial intelligence is on my mind uh more and
01:15:34.740 more um i think we are about to enter a different world well i know we're about to enter a different
01:15:42.980 world where we're talking all the time and hearing from all the time creatures that are in our pockets
01:15:54.540 that seem to be i'm not implying that they are sentient but they'll they'll land for us as if they
01:16:01.380 are and that's going to be a change at least as big as the change of the smartphone i mean just to put
01:16:07.440 the smartphone in context for this thought experiment when 9 11 happened which doesn't feel like totally
01:16:14.140 ancient history to certain people of a certain age old people like me uh when 9 11 happened the ipod
01:16:23.940 didn't exist yet remember the original white ipod that held like 15 songs and weighed four bricks
01:16:31.500 uh and with the scroll button that was released in october of 2001 so when the 2007 election happened
01:16:39.840 in australia i don't know what the equivalent would be here but that's a big marker in australia
01:16:43.420 because it was a big landslide elections but let's say the erection of the erection let's say the
01:16:48.260 erection of barack obama in the united states right and it was a big one well yeah it was a big one
01:16:53.760 when that happened you didn't have uh like facebook and you know uh you didn't you didn't have
01:16:59.940 mobile uh device in fact you didn't have the iphone so the entire history if you'd told us
01:17:06.820 just 15 years ago there were blackberries right but there weren't there weren't smartphones if you'd
01:17:11.960 told us 15 years ago that it would be completely normalized for people to be walking around with
01:17:18.180 supercomputers in their pockets that they used in every spare moment of the day to parachute into
01:17:23.260 conversations news that was being tailored for them and there were there were as many versions of those
01:17:29.360 news feeds as there were people in the world because there were computer programs that were
01:17:34.720 learning exactly what you liked hovering over and the number of milliseconds that you spent looking at
01:17:39.680 a particular video before moving on and that the chinese communist party had the most popular version
01:17:46.460 of these things and the most young people were getting their news and information from a from the
01:17:50.900 chinese communist party's computer programs that were trying to determine exactly what each individual
01:17:57.140 liked and didn't like that would seem like a weird and dystopian future and if you fast forward the same
01:18:04.240 amount into the future from now i think we'll look back on this moment where we're all sitting here today
01:18:11.180 and go i can't believe that was a time when we weren't just constantly talking to things all around us
01:18:20.440 cracking jokes to them having them laugh having them crack jokes back to us and having basically
01:18:27.200 creatures all around us in artificial form that were helping us do everything we will all have a personal
01:18:33.960 assistant we will all have a lawyer we will all have an accountant and they will all be virtual
01:18:39.840 sooner than we realize and the impact that that's going to have like we talk a lot about job loss or
01:18:45.400 something from ai or misinformation all of that's very important but i think just the psychology
01:18:50.500 of what we're about to embark on social media was basically a gigantic experiment in which none of us
01:18:58.180 enrolled but we will all find ourselves in well that will look like a walk in the park in comparison to
01:19:04.660 the kind of global psychological experiment that's about to happen josh zeps check out uncomfortable
01:19:10.500 conversations and head over to locals for the bonus questions having been brave enough to make his
01:19:16.880 position as clear as possible to those who only see one side of the israel hamas war has josh had
01:19:22.420 feedback from his audience that shows there has been a change in understanding or attitude