Stay Free - Russell Brand - June 30, 2023


EXPOSING THE CENSORSHIP INDUSTRIAL COMPLEX | Part 1 - #158 - Stay Free With Russell Brand


Episode Stats

Length: 1 hour and 8 minutes
Words per Minute: 155.9
Word Count: 10,683
Sentence Count: 562
Misogynist Sentences: 3
Hate Speech Sentences: 6


Summary

Matt Taibbi and Michael Schellenberger join me to talk about the censorship industrial complex, and why it's time to wake up to the fact that the only place we can still speak freely is the Internet, which is itself governed in part by the World Health Organization (WHO) and by commercial and corporate interests that literally prevent free speech. We also look at Elon Musk and Mark Zuckerberg's upcoming cage fight and ask whether their real fight will be against the censorship industrial complex, with the EU threatening to kick, you won't believe this, Twitter right out of Europe, and with Zuckerberg admitting that he regrets taking Fauci's lead, allegedly, and censoring debatable and sometimes true information. It's heating up to be an incredible fight. If our age is to have a journalism that cares about truth and narratives, and about representing the truth to the people who matter most, it has to be willing to tell the truth. This is a humanity crisis, and I just hope we're not too late to fix it. In this special Friday show, we offer a warm round of applause for two fine journalists and ask: Who hates the speech? Who are the elites who hate the speech, what are they really trying to stop us from talking about, and why is it so important that we talk about it anyway? What can we do to stop the censorship, not just to end it, but to make our democracies freer, fairer, and more open? That is a question all of us can have a say in answering: we can't stay silent, and together we can make the world more honest, more transparent, more informed, more accountable, and more free.


Transcript

00:00:00.000 I'm a black man and I could never be a better man I'm a straight up, you watch me move I'm a straight up, you
00:00:10.000 watch me move I bought a Rolls-Royce, so I'm looking for the CEO
00:00:14.000 Looking for the CEO I'm a black man and I could never be a better man
00:00:19.000 I'm a straight up, you watch me move I'm a straight up, you watch me move
00:00:22.000 I bought a Rolls-Royce, so I'm looking for the CEO Looking for the CEO
00:00:27.000 In this video, I'm going to be using a new product. I'm going to be using a new product.
00:00:31.000 In this video, you're going to see the future.
00:00:42.000 Hello there, you Awakening Wonders!
00:00:44.000 Thanks for joining me for an extra special Friday show.
00:00:48.000 You know that we participated in the Censorship Industrial Complex event with Matt Taibbi and Michael Schellenberger.
00:00:55.000 It was a fantastic conversation with incredible participants, surprise guests, and more importantly than any of that, vital information around how a new, unelected series of institutions are coordinating an attempt to shut down free speech
00:01:11.000 worldwide. If you're watching us on YouTube, we're only going to be available for the first 15 minutes,
00:01:16.000 then we'll be exclusively on Rumble. Why?
00:01:18.000 Because Rumble is the home of free speech and we can't speak freely
00:01:22.000 on a platform which I love. I love you 6.4 million Awakening Wonders.
00:01:26.000 By God, I love you, that is still governed in part by the WHO and that has commercial and corporate interests that literally prevent free speech.
00:01:35.000 I know increasingly that Rumble's work is going to be vital.
00:01:39.000 Later on in the show, in our presentation, here's the news.
00:01:41.000 We have an in-depth look at Elon Musk and Mark Zuckerberg's upcoming cage fight and ask is their real fight going to be
00:01:49.000 against the censorship industrial complex with the EU threatening to kick, you won't believe
00:01:54.000 this, Twitter right out of Europe and with Zuckerberg admitting that he regrets taking Fauci's
00:02:01.000 lead allegedly and censoring debatable and sometimes true information, it's heating up
00:02:06.000 to be an incredible fight.
00:02:08.000 First of all though, let's have a look at this live event in London.
00:02:12.000 Me, Matt Taibbi, Michael Schellenberger, those so-called journalists, those that were so vital in breaking the Twitter Files story, here speaking openly in an incredible environment about the censorship industrial complex.
00:02:24.000 If you're watching this on Rumble, click the red button and join us on Locals and let us know where you think we're going to end up.
00:02:31.000 Hello.
00:02:32.000 Please, let's have a warm round of applause for Michael Schellenberger and Matt Taibbi.
00:02:39.000 Who hates the speech?
00:02:41.000 It's the elites.
00:02:42.000 There's this relentless effort to sort people into categories.
00:02:46.000 Around the world, we see censorship.
00:02:50.000 We're meeting for the first time because we believe that free speech isn't just an enabling condition for civilization, for democracy, that it's a fundamental human right.
00:02:59.000 And this is, I think, more than a speech crisis.
00:03:02.000 It's a humanity crisis, and I just hope we're not too late to fix it.
00:03:07.000 What is the nature of these organizations?
00:03:10.000 How have they been granted a power that to any rational, ordinary person would require consensus and democracy to achieve it?
00:03:18.000 How have those safeguards been so expertly bypassed?
00:03:23.000 That's a great question, and actually, this is, well, of course, yeah.
00:03:28.000 Plus the shirt.
00:03:30.000 APPLAUSE CHEERING
00:03:34.000 Hello.
00:03:36.000 Thank you.
00:03:37.000 I'm so excited that you've made the effort to come here to see Michael Schellenberger and Matt Taibbi.
00:03:43.000 If, yeah...
00:03:44.000 If our age is to have a Woodward and Bernstein, ethical journalists that care about truth
00:03:59.000 and narratives and representing the truth to the people that matter most, willing to talk truth,
00:04:06.000 you Yeah, Barber to Bar, who's on this?
00:04:09.000 Barber to Bar.
00:04:10.000 I mean, this is the censorship issue we're discussing.
00:04:14.000 You can't even do a vaguely flattering intro to two fine journalists without the censorship industrial complex.
00:04:23.000 Stepping in, in this hallowed territory established by the Methodists, who had to revivify Christianity after that orthodoxy became draconian and oppressive, to re-evoke once more the divine, to ensure that we can speak freely and openly, for it's our only tool against corruption and hypocrisy.
00:04:43.000 When Michael invited me to participate in this as a facilitator, because watch a minute, it's going to blow your mind how quiet I'm going to be in a second.
00:04:49.000 This is it.
00:04:50.000 I'm going to say this, then I'm going to self-censor like you wouldn't believe.
00:04:54.000 Matt Taibbi is shattered.
00:04:56.000 He's just arrived here.
00:04:58.000 These people care about what they do.
00:05:00.000 Michael Schellenberger and Matt Taibbi are precisely the journalistic voices that we require because, like Wesley's Methodist movement, our movement already has martyrs.
00:05:10.000 I believe we have supporters of Julian Assange in the room right now.
00:05:22.000 We already have people that are willing to sacrifice themselves for a higher good in
00:05:28.000 Edward Snowden is a supporter of course of this event and recognizes the significance of the work that Michael and Matt in particular are doing.
00:05:36.000 This is a conversation that we are facilitating for two men that I believe in, who I believe are working very hard to do a necessary job.
00:05:43.000 Some of the topics that we're covering are: What is the nature of this new centralizing authoritarian system?
00:05:49.000 How long can we allow convenience, safety and security to enable centralized authoritarian systems to shut down communication and free speech?
00:05:59.000 What is the misanthropy that lies at the heart of a discourse that believes our speech needs to be controlled?
00:06:07.000 Where is the moral authority that is entitled to make those decisions on our behalf?
00:06:12.000 Thankfully, there are people in this room that can answer those questions tonight.
00:06:16.000 Please, let's have a warm round of applause for Michael Schellenberger and Matt Taibbi!
00:06:21.000 Thank you, brother.
00:06:28.000 Thank you.
00:06:30.000 Thank you.
00:06:30.000 What a pleasure.
00:06:31.000 Thank you, guys.
00:06:37.000 Wow, thank you so much.
00:06:41.000 People say there is no censorship industrial complex.
00:06:47.000 People say the idea that there's a censorship industrial complex is a conspiracy theory.
00:06:53.000 It's disinformation.
00:06:56.000 And yet we know that Facebook censored what its own executives called often true stories of COVID vaccine side effects.
00:07:08.000 We know that Facebook censored the New York Post in February 2020 when it published an opinion piece that said maybe COVID came from a laboratory.
00:07:23.000 And we know that Twitter and Facebook censored an entirely 100% true accurate story about Hunter Biden's laptop just two weeks before the US elections.
00:07:37.000 Now people say it's not really censorship because censorship is when the government censors you and the government was just flagging misinformation.
00:07:51.000 They were just being helpful, helping Twitter executives and Facebook to correct the misinformation out there.
00:08:01.000 And yet Matt Taibbi and his colleagues at Racket have identified 50 large, powerful organizations around the world that take government funding that are staffed by former government employees that work hand in hand with the U.S.
00:08:21.000 government, the U.K.
00:08:22.000 government, the Brazilian government, the Canadian government, Australia, New Zealand, Ireland, around the world we see censorship.
00:08:33.000 And censorship at the behest of governments is a violation of the First Amendment and a violation of your laws too in the great nation of Britain.
00:08:45.000 So we're here tonight to get into it.
00:08:47.000 What is the censorship industrial complex?
00:08:50.000 How did we allow this monstrosity to take hold in our societies?
00:08:55.000 It's every time we think we get to the bottom of this story, the floor drops out from under us.
00:09:01.000 Most recently, just a few weeks ago, we started getting emails from people around the world.
00:09:07.000 From Australia, from New Zealand, from Canada, the United States, saying, did you know that there's a piece of legislation going through our Parliament, our Congress?
00:09:19.000 It's not being covered by the news media.
00:09:22.000 What is it?
00:09:24.000 In the UK, there's legislation that would allow the government to read your private direct messages on WhatsApp, Signal, Telegram.
00:09:34.000 In Brazil, a single judge on the Supreme Court is demanding the right for the government to read and censor private text messages.
00:09:45.000 In Canada, they're seeking legislation that would promote official government media sources over independent news media sources on social media platforms.
00:09:56.000 In the United States, they've been trying to sneak through legislation that would criminalize the use of VPNs, or virtual private networks, to gain access to forbidden websites.
00:10:10.000 In Ireland, this is the most shocking thing of all.
00:10:15.000 They're trying to get the right to go into your homes, into the homes of people in Ireland, including the staffs of social media companies, search their phones, search their computers without authorization, and presume people guilty until proven innocent of spreading hateful material.
00:10:37.000 Hateful speech should be condemned.
00:10:41.000 We should use our freedom of speech to condemn it.
00:10:44.000 But our societies are more tolerant of racial, religious, and sexual minorities than they have ever been.
00:10:53.000 Think about the attitudes of your grandparents and great-grandparents.
00:10:57.000 In 1958, four percent of Americans approved of marriage between white and black people.
00:11:04.000 Today, over 95 percent do.
00:11:08.000 Who is driving the hate speech?
00:11:12.000 Who is demanding the censorship of hate speech?
00:11:16.000 I would say it's some of the most hateful people in our society.
00:11:20.000 Who hates the speech?
00:11:23.000 It's the elites.
00:11:24.000 And they want to censor the people.
00:11:26.000 They want to censor the authentic voice of the people.
00:11:28.000 And I tell you tonight, they will not succeed.
00:11:31.000 And I know they will not succeed because all of you are here as lovers of freedom to demand your rights to freedom of speech.
00:11:38.000 We have already won.
00:11:41.000 We have put a stake in the ground in this hallowed, sacred space.
00:11:47.000 There's always been a debate about whether or not you need free speech to make democracy work.
00:11:53.000 We know we do.
00:11:54.000 You can't choose your elected representatives if you cannot freely debate who they are, what they stand for.
00:12:00.000 People make the case that you need freedom of speech in order to have free markets work.
00:12:06.000 You can't know what to buy or sell if you aren't allowed to discuss those products freely and openly.
00:12:12.000 We have very few restrictions on our freedom of speech.
00:12:15.000 You can't lie to people to steal from them through fraud, and you can't incite violence against people in the immediate term.
00:12:23.000 But beyond that, our rights are very strong.
00:12:26.000 They're the strongest in the United States of anywhere in the world, but they're very strong in Britain, and they should be stronger.
00:12:33.000 We are here to launch a campaign, a new free speech alliance.
00:12:38.000 We've brought people from around the world.
00:12:42.000 I just met many of them, and these are people who I only see online.
00:12:47.000 These are people fighting for their freedom in Brazil, in New Zealand, Australia, Canada.
00:12:53.000 We're meeting for the first time because we believe that free speech isn't just an enabling condition for civilization, for democracy, that it's a fundamental human right.
00:13:03.000 Free speech is what makes us human.
00:13:05.000 It's tantamount to our ability to breathe and to eat and to love who we want to love.
00:13:12.000 And so I say to you tonight, this is the moment where we start to fight back against the censorship industrial complex.
00:13:19.000 We intend to defund it, dismantle it, and demand a new standard for freedom of speech worldwide as strong as the one that we enjoy in the United States of America.
00:13:29.000 Thank you all for coming.
00:13:31.000 Thank you. Thank you all.
00:13:33.000 The most painful thing, and there's a lot of painful things that one goes through, is
00:13:47.000 losing almost all of your friends as a consequence of using your speech.
00:13:54.000 Thank you.
00:13:56.000 The only positive thing that's come out of that has been to make new friends.
00:14:01.000 And it's not the most obvious thing in the world to lose all your friends in your late 40s.
00:14:08.000 The ones you keep are so dear, and the ones that you make, dearer so.
00:14:13.000 And there's few people in the world that I admire more than Matt Taibbi.
00:14:16.000 I've admired Matt Taibbi for almost 20 years, I think.
00:14:27.000 I think.
00:14:29.000 When I was invited in to work on the Twitter files, meeting Matt Taibbi was one of the most special moments in that adventure.
00:14:39.000 And when we testified in front of Congress, at that moment that we were testifying, the Internal Revenue Service, which is our tax police, visited Matt Taibbi's home and attached a note to his door, which is completely, completely against the standard practice of the Internal Revenue Service.
00:15:00.000 This is a person who has sacrificed significantly and he's seen what life is like for journalists in totalitarian societies.
00:15:07.000 He knows and has been friends with people that have died for the cause.
00:15:10.000 I have few greater pleasures than the opportunity tonight to introduce you to the great Matt Taibbi.
00:15:16.000 I'm going to sit.
00:15:24.000 So let me say one more thing.
00:15:29.000 It is an equal pleasure to be up here with Russell Brand.
00:15:33.000 And truly, truly one of my favorite, favorite comedians and someone that has demonstrated great courage
00:15:49.000 in his own personal recovery and courage in speaking out against the orthodoxy on so many issues.
00:15:57.000 Whether it's COVID or free speech, he's here tonight like we are, without asking for anything in return.
00:16:05.000 We're so blessed to have Russell Brand with us.
00:16:08.000 So please join me again in thanking both of them.
00:16:22.000 I'm going to remain seated because Michael tricked me, actually.
00:16:27.000 Before the event, he texted me and said, we're going to do prepared remarks to begin the event.
00:16:35.000 And I'm not an orator.
00:16:36.000 I'm a writer.
00:16:37.000 So what did I do?
00:16:38.000 I spent the last 48 hours meticulously writing an essay, which I'll publish tomorrow.
00:16:44.000 You can all read it.
00:16:46.000 It's very carefully argued.
00:16:47.000 I think it's pretty eloquent in places.
00:16:50.000 I'm not sure it entirely holds together.
00:16:54.000 But once we get here, Michael tells me, no, I'm just going to wing it.
00:17:00.000 I'm going to go up there and talk extemporaneously.
00:17:03.000 So out of spite.
00:17:05.000 I'm not going to read that entire speech that I had written, but I'll read excerpts of it because there are a couple of important points that I do think we want to make before we get to the larger discussion with Russell, which I know you're all anxious to get to.
00:17:20.000 I originally started by talking, saying something very pretentious about George Orwell.
00:17:28.000 I'll now read it.
00:17:32.000 And then from there it led into sort of an introduction to what the Twitter file story was, and it was full of sort of unforgettable asides about Elon Musk and all these other things.
00:17:44.000 We can skip that.
00:17:47.000 And then there was a quote, and basically the idea here is that I went into the Twitter file story, probably like Michael, bringing my old school, legalistic, kind of Enlightenment era notions of free speech with me.
00:18:04.000 And I was hoping to answer maybe one or two narrow questions about Twitter.
00:18:10.000 You know, for instance, did the FBI maybe once or twice intervene to, you know, get in the middle of a speech question?
00:18:18.000 Quickly, we all realized that it was something sort of bigger, scarier, and weirder than that.
00:18:25.000 And here's what I wrote about that.
00:18:27.000 The quote is, a sweeping system of digital surveillance combined with thousands or even millions of subtle rewards and punishments designed to condition people to censor themselves.
00:18:39.000 So we're going to get into the concrete examples of how they did use government, and how government did work with these companies, to actually censor people.
00:18:49.000 But the larger, scarier issue is the construction, I think, of this gigantic Internet age system that is designed to get people to preempt dangerous thoughts by getting people to avoid having them in the first place.
00:19:07.000 And then there was another pretentious thing about George Orwell.
00:19:12.000 And the idea here was that one of the things that Orwell focused on in 1984 was this notion of binaries, that in the world that he described in 1984, There were no shades of grey.
00:19:29.000 All ambiguities and shades of meaning had been purged.
00:19:32.000 And it wasn't necessary to have words for everything.
00:19:35.000 You didn't need to have words for warm and cold.
00:19:37.000 You could just have warm and unwarm, for instance, right?
00:19:42.000 And this is what we saw a lot of in the Twitter files.
00:19:45.000 We saw a lot of taking very complex issues where there are lots and lots of shades of meaning and finding ways to whittle it down to basically two things.
00:19:55.000 All right, and a great example of this was the Virality Project that was led by Stanford University.
00:20:03.000 This was basically a catch-all program where Stanford took in information from all the biggest internet platforms, Facebook, Google, Twitter, some others, and they aggregated all the things that they were hearing about COVID and their experiences about what content moderation decisions that they made, and they made recommendations to each of the platforms about how they should deal with these things.
00:20:33.000 And the really fascinating thing about this, well first let's start with the headline sort of scary moment in these emails.
00:20:45.000 There was one email in which Stanford suggested to Twitter that you should consider, as standard misinformation on your platform, stories of true vaccine side effects or true posts which could fuel hesitancy, as well as worrisome jokes or posts about things like natural immunity or vaccinated individuals contracting COVID-19 anyway.
00:21:14.000 And basically what they were doing here is they were trying to get into the minds of millions of people through algorithms.
00:21:23.000 If a person was telling a true story about somebody who got the vaccine and got myocarditis, they didn't have to say that they got it because of the vaccine.
00:21:34.000 Even if they just told the story, even if in the next post they said, I'm all for the vaccine, the way the Virality Project interpreted that original post was that this could promote hesitancy.
00:21:48.000 Therefore, even if it's true, it's untrue, right?
00:21:51.000 So you have, in reality, you have shades of meaning there.
00:21:56.000 There's a true story that, you know, suggests that maybe you should be cautious about the virus.
00:22:01.000 The person might be pro-vaccine, but they see it as anti-vax material.
00:22:06.000 So it's vax, anti-vax, right?
00:22:09.000 And this happens constantly throughout.
00:22:12.000 They just took things that were really somewhere in the middle, and they moved them in one direction or another.
00:22:19.000 Another amazing moment was when there was a company called Grafica, which described the dangers of undermining what they called authoritative health sources, like Anthony Fauci.
00:22:34.000 They were very against even the use of puns like FAUXI, F-A-U-X-I.
00:22:41.000 And their quote was, this continual process of seeding doubt and uncertainty in authoritative voices leads to a society that finds it too challenging to identify what's true or false.
00:22:56.000 Basically what they're saying is questioning authority.
00:22:59.000 Who here is old enough to remember the 70s and the VW bugs that had the Question Authority stickers?
00:23:07.000 Questioning authority, which was a liberal value then, is now disinformation.
00:23:13.000 So if you apply these techniques 50, 100 million times, a billion times, a billion billion times, eventually what happens is that people see that they are either going to be defined as approved, having approved thoughts, or unapproved thoughts.
00:23:31.000 There's no middle that they can occupy.
00:23:34.000 They will just naturally self-sort and self-homogenize.
00:23:38.000 And we're doing this all throughout society with politics, entertainment, and everything.
00:23:43.000 That's how you can get, you know, somebody like Russell, who is clearly not a right-winger, but they define him as a right-winger anyway because there are only two categories of people in the current media environment.
00:23:57.000 There are people who believe in everything true and decent and democracy and puppies and all that, and then there's right-wingers who are wrong about everything, right, basically.
00:24:06.000 And so that's what they've been doing.
00:24:08.000 They've been creating binaries over and above the direct censorship that we saw.
00:24:15.000 There's this relentless effort to sort people into categories.
00:24:19.000 And the other thing that I think is really important to point out and is another Orwell concept is double think.
00:24:30.000 And this is the idea.
00:24:31.000 How did Orwell define this?
00:24:35.000 Basically, it's the idea of holding two ideas at the same time.
00:24:41.000 He defined it as the act of holding simultaneously two opposite, mutually exclusive ideas or opinions and believing in both simultaneously and absolutely.
00:24:51.000 Now, We do that.
00:24:54.000 We do that constantly now.
00:24:55.000 With news stories, things that were true yesterday turn out to be completely the opposite tomorrow, and people are totally fine with that.
00:25:03.000 We just completely skip the fixing process.
00:25:06.000 There's no stopping to say, oh, sorry, we got that wrong.
00:25:10.000 We just move to freaking out about the next thing seamlessly.
00:25:14.000 So just to take an example, It wasn't that long ago that we were told in no uncertain terms that the only suspect in the Nord Stream pipeline bombing was Russia itself.
00:25:27.000 And just a couple of weeks ago, we were told by the same U.S.
00:25:31.000 government that they were actually aware since last June that this was planned by Ukrainians with the assent of the highest officials in the Ukrainian military.
00:25:43.000 Now, I don't know what the true story is, but those two stories are completely different.
00:25:47.000 And they don't stop and say, oh, well, we're sorry.
00:25:50.000 Let's resolve that.
00:25:51.000 Let's square this discrepancy.
00:25:54.000 They just want you to forget.
00:25:56.000 And there's no way for people to live normally with these contradictions and stay sane.
00:26:03.000 The only thing they can do is live continuously in the moment, because that way you don't have to think about the past, you don't have to think about the future.
00:26:12.000 You are sort of charged affirmatively to forget everything that you were told before, because it might turn out to contradict something they want to tell you tomorrow.
00:26:21.000 So we live in the present, continually, and in the present, there are only two choices.
00:26:26.000 So we're living in this very, very narrow intellectual world, and this is over and above the problem of authority that would come in later.
00:26:36.000 If you somehow manage to get past all these obstacles and actually be an independent, free thinker, like I think most of the people in this room are, then they're going to have censorship and other obstacles to try to stop you.
00:26:49.000 But their aim is to prevent that from ever happening.
00:26:53.000 And we saw that over and over in the Twitter files.
00:26:55.000 I think that's the lesson that I ended up taking away from it.
00:26:58.000 And this is, I think, more than a speech crisis.
00:27:01.000 It's a humanity crisis.
00:27:04.000 And I just hope we're not too late to fix it.
00:27:06.000 So thank you very much.
00:27:07.000 Thank you very much, and I hope you have a great day.
00:27:09.000 Thank you.
00:27:22.000 I think in the 20 minutes that you've both been speaking, it's already become plain that we are dealing with an issue
00:27:28.000 of considerable, perhaps even unprecedented, scale.
00:27:35.000 And thanks mostly to some of the territory that Matt outlined, there is an entirely subjective experience also that will individually affect all of us in ways that seem to be more rooted in behaviouralism than politics, even in the most dystopian technocratic version of that, that attention itself, the experience of being you, is being curated and directed in ways that could only have been theoretical at the time of B.F.
00:28:09.000 Skinner, for example.
00:28:11.000 To return to the broader framing of our conversation, Michael, seeing as how you seem to be in charge.
00:28:22.000 Making poor Matt write a whole essay.
00:28:26.000 And then publicly redact his affiliation with George Orwell.
00:28:31.000 Plainly a device used to curry favour with the British.
00:28:39.000 Can you tell us how something as vast as the censorship industrial complex can possibly exist and come into being when it necessarily requires the participation of numerous agencies that one would assume would not be explicitly connected?
00:28:54.000 I know Matt has done work in revealing 50 NGOs that participate in this censorship industrial complex, which I believe is a phrase that you have coined.
00:29:03.000 Can you tell us how both state and private authorities, be they media or governmental, are participating in the creation of and execution of this new idiom that you have coined.
00:29:19.000 Sure, I mean, I think we have these various moments on the Twitter files where you would just get really creeped out, like you would discover something and you would just get chills up your spine.
00:29:29.000 And for me, it was when we discovered that the Aspen Institute had organized a workshop they called a tabletop exercise that had New York Times, Washington Post, CNN, Facebook, Twitter, you know, 12 to 20 people there.
00:29:46.000 All talking in the summer of 2020 about how to debunk a story about Hunter Biden and Burisma.
00:29:53.000 This is three months before the Hunter Biden laptop had come out.
00:29:58.000 And it was just like, what is going on here?
00:29:59.000 We sometimes ask, you know, is this a conspiracy or is it just a culture?
00:30:06.000 Because there is a way in which you know that looked like a like conspiracy that looked like a secret coordinated effort that was obviously not public to pre bunk a story that would come out several months later.
00:30:22.000 On the other hand, it's also a culture.
00:30:24.000 You know, these are people that are in Washington, D.C.
00:30:26.000 together.
00:30:26.000 They go to the same parties.
00:30:27.000 They went to the same prep schools.
00:30:28.000 They went to the same Ivy League schools.
00:30:30.000 They got jobs at the top newspapers.
00:30:33.000 When they come to London, they hang out with the same particular, you know, group of people.
00:30:38.000 But I think we're constantly asking ourselves, to what extent is this, you know, an inorganic phenomenon of a censorship industrial complex?
00:30:46.000 And to what extent is it an organic part of cancel culture?
00:30:50.000 I mean, I thought Russell, what you said is really to the point, which is that there's, and also to Matt's comments about Orwell, is that there's a psychological, there's something that's unhealthy psychologically about this.
00:31:04.000 And we've all become obsessed, my colleagues and I, with this totally obscure book by a Polish psychologist who lived under both the Nazis and then the communists.
00:31:15.000 And the book is called Political Ponerology, which is this crazy word I think he invented, which is the study of evil or the study of totalitarianism.
00:31:24.000 And what he says is, he says people that are in totalitarian societies, the people in charge, he says the way that you get to totalitarian societies is through psychopathological people, or what psychologists call cluster B personality disorder type people.
00:31:42.000 These are antisocial personality disorder, which is the new name for psychopaths.
00:31:50.000 Narcissist, borderline personality, and histrionic disorder.
00:31:53.000 These are all cluster B personality.
00:31:55.000 These are the folks who, like when you're around them in your life, you always feel a little bit like you're walking on eggshells, because anything you say might set them off or might offend them.
00:32:05.000 And I'm also struck by that Orwell, where Orwell is saying, in a totalitarian society, you know, it's either, you know, black or white.
00:32:14.000 And that's actually one of the characteristics of cluster B personality disorder people is that I mean, these are people that are marked by grandiosity, self-centeredness, and then this concept that we just learned, which is called splitting.
00:32:30.000 So these are people for whom you're either with me or you're against me.
00:32:34.000 And it's just, that's the way their world is.
00:32:37.000 And so, and when we start to, you know, without naming names, you start to learn some of the characters in the censorship industrial complex, and you look at, you sort of watch them, and you realize you're dealing with people that there's something pathological about the way that they're, you know, the way they look at the world, the way in which it's like you're either with the program or you're dead to me, you know, and there's no sense of play, and, like, humor is impossible in that situation.
00:33:06.000 There's a reason why you feel scared to make a joke around to those kinds of people.
00:33:12.000 And so it's dark, and I think, you know... I mean these two gentlemen are so lovely to work with and there's something so... I went to Russell's office or his studio not far from here and was so well treated. All of his people are really fun and sweet and healthy and there was no
00:33:34.000 weirdness to it, and yet everybody's enthusiastic about it.
00:33:38.000 That's a really different vibe than when you go to some of these more pathological institutions.
00:33:43.000 And so that's why I felt like, you know, getting us together and being with each other in person and being like, wow, there's some other people that we can have fun with and play with was an important part of our fight against totalitarianism.
00:33:55.000 Of course, any architecture of this nature, difficult though it is to envisage and track, must have its origins in the human psyche.
00:34:05.000 Where else could it come from?
00:34:06.000 Therefore, I suppose it's natural that it would have traits recognizable at the level of the individual.
00:34:15.000 I'm interested, Matt, to learn a little more about these 50 NGOs, and in particular, I'm interested in the way that they are frequently framed as philanthropic, and indeed, the entire Telos of the censorship argument is predicated on the idea of there being a moral authority with the integrity to execute that kind of censorship in addition to the misanthropy that I mentioned before that would require it.
00:34:49.000 What is the nature of these organizations?
00:34:52.000 How have they been granted a power that to any rational, ordinary person would require consensus and democracy to achieve it?
00:34:59.000 How have those safeguards been so expertly bypassed?
00:35:05.000 That's a great question.
00:35:06.000 And actually, this is well, of course, yeah.
00:35:11.000 Plus the shirt.
00:35:12.000 Plus the shirt.
00:35:13.000 This also gives me an opportunity to thank Michael and give him credit for something
00:35:23.000 enormous, which is coming up with the term censorship industrial complex, which I think
00:35:28.000 was crucial to naming this whole phenomenon and giving it an identity that people could
00:35:37.000 grasp and rally around.
00:35:40.000 Before that, I think it's similar to what the Occupy Wall Street movement did when they came up with the idea of the 1% and the 99%.
00:35:49.000 Just the nomenclature, I think, is really, really important.
00:35:53.000 And he came up with that name, and the reason it struck a chord with me, and I'll just go through the chronology quickly of What happened?
00:36:02.000 In February, early February, I was looking through the Twitter files, and we started to run into emails about an organization called the Global Engagement Center.
00:36:12.000 How many people here have heard of the Global Engagement Center?
00:36:16.000 Almost nobody, right?
00:36:18.000 Which is so fascinating.
00:36:19.000 So the Global Engagement Center was created in the last year of Barack Obama's presidency.
00:36:27.000 Technically, it's what they call housed in the State Department.
00:36:30.000 It's actually a multi-agency task force whose official remit is combating foreign disinformation.
00:36:40.000 And we found an inspector general's report that said basically that the Global Engagement Center,
00:36:47.000 in its first year, I guess it was FY 2017, had funded roughly $100 million worth of projects,
00:36:56.000 and it listed 36 different organizations, or 39.
00:37:00.000 And of course, 36 of them were redacted.
00:37:04.000 And I got the idea that it might be a good thing try to figure out what those organizations were.
00:37:11.000 We brought in some more people to start looking.
00:37:13.000 And the instant we started trying to figure out how many of these organizations were, the project Completely spiraled out of control.
00:37:22.000 You know, Michael mentioned 50.
00:37:25.000 The real number that we're looking at now is somewhere in the 400s.
00:37:28.000 It's like 400, 450.
00:37:29.000 You know, we've begun keeping sort of an Excel type spreadsheet with all these organizations.
00:37:37.000 And we think even that is only scratching the surface of how many of these quote-unquote anti-disinformation organizations that are out there.
00:37:46.000 A lot of them are receiving public money.
00:37:49.000 What are they doing?
00:37:50.000 What's the genesis of these groups?
00:37:53.000 Well, in the case of the Global Engagement Center, the real origin of this is, as was described to me by somebody who worked there from the very beginning, this started with the counter-proliferation movement in the U.S.
00:38:11.000 military.
00:38:13.000 Essentially, they were having trouble countering the messaging of ISIS, which they were finding difficult to understand.
00:38:22.000 ISIS was somehow reaching, basically, white suburban kids in Britain and in America.
00:38:28.000 And of course, when something happens to white suburban kids anywhere, then it becomes a crisis internationally.
00:38:34.000 They started pouring money into it.
00:38:36.000 But that was the original remit of the Global Engagement Center.
00:38:41.000 And so they went from counterterrorism to basically what they do now is counterpopulism.
00:38:48.000 It's the same people, they're using the same technologies, and they're using the same techniques to try to identify people that they consider problematic and try to find ways to diffuse that messaging, either using counter-messaging or de-amplifying the messaging or removing it from platforms or whatever it is.
00:39:09.000 To them, it's the same thing, and that's what's so frightening.
00:39:12.000 I think that people have to understand is that this all starts in a wing of the government that was looking at what they considered a terrorist threat that, you know, you could use basically any technique against, and it would be legitimate, up to and including droning them.
00:39:30.000 And they turned that entire mechanism inward, and that's really what the censorship industrial complex is.
00:39:36.000 It's just taking the techniques that we were using to try to reduce the impact of sort of foreign terrorist communication and turning it inward on domestic unrest, people complaining about things.
00:39:54.000 People complaining about the electoral process.
00:39:56.000 People not getting vaccinated when they're told.
00:39:59.000 Whatever it is, there's always some emergency.
00:40:02.000 And they've learned that they can continually apply that over and over again.
00:40:06.000 And that, I find that terrifying.
00:40:09.000 I don't know about you, Russell, but I think it's very scary.
00:40:12.000 Yes, the idea of perpetual and never-ending crisis being a precondition for authoritarianism is by its nature terrifying, as is the shift of a mechanism designed to deal with an apparently external threat being inverted to deal with an internal threat, particularly using some of the psychological critiques that have been touched upon; that indicates a kind of implosion that's difficult not to equate with pre-apocalyptic thinking.
00:40:44.000 Furthermore, to see the figures, and I'm referring directly to both of you, of course, that would once have been cherished by identifiable legacy media outlets, even if they're glib, somewhat cultural artifacts like Rolling Stone, or legacy media outlets like the New York Times; it's now unthinkable that Matt Taibbi or Michael Schellenberger could reliably and consistently write for those kinds of outlets.
00:41:12.000 This too is an indication that things are changing, and they're just two examples; of course Chris Hedges has had a comparable trajectory, and most obviously perhaps Glenn Greenwald, who's gone from breaking the seismic stories around WikiLeaks and Snowden and Chelsea Manning to being a literal exile and pariah. These are all, I would say, significantly terrifying alterations and changes.
00:41:37.000 Might I ask, if there is an underlying ideology here, because sometimes to hear the way that Matt has described it, then it just feels like a kind of utilisation of a creation just because it's there, it has to be used.
00:41:49.000 But I wonder, Michael, if the imperatives that undergird this new fast-moving, observably fast-moving trajectory towards authoritarianism, censorship and centralization is primarily motivated by dominion, power, money, or is there something ideological and political taking place, or is it both, and if so, how do they intersect, please?
00:42:19.000 Yeah, I mean, This thing where I think that you guys have both just tapped into this central issue, which is that there is an authoritarian mentality among the leaders of the censorship industrial complex.
00:42:34.000 I always think of this, do you guys know the Bourne movie, not the one with Matt Damon, but the other guy?
00:42:40.000 And they go in to kill Rachel Weisz?
00:42:42.000 You guys know what I'm talking about?
00:42:45.000 Okay, everybody remembers that scene with Jeremy Renner, right?
00:42:47.000 Do you remember the scene where, like, they go into her- I'm gonna spoil the movie for you guys if you haven't seen it.
00:42:52.000 But there's, like, I guess it's, like, CIA or FBI, and they go into the house, and they're, like, trying to be, like, empathic with her, because she's been through this trauma.
00:43:01.000 And then they sit her down and then they just look at each other, like, now, and then, like, try to kill her. And that's kind of the scene that has kept coming up in my mind as I'm, like, reading these documents and listening and, like, reading the things that people are saying. It's like, I feel like there's a lot of... like they're running an operation.
00:43:22.000 You know, you feel like you look at these people.
00:43:23.000 I wrote a piece about one person who's a former CIA fellow.
00:43:27.000 I'm very sensitive about not wanting to personalize this.
00:43:30.000 I'm even hesitant to use this person's name, but I did write a whole piece about her.
00:43:35.000 And, you know, it was like her story was she was just this hobbyist, you know, and I was just concerned about anti-vaxxers.
00:43:42.000 And then, you know, and then the next thing you know, I was advising Obama on fighting ISIS.
00:43:46.000 And I just remember being like, I don't think it works like that.
00:43:48.000 You know, like, I think it's like, these are like really hierarchical military intelligence organizations.
00:43:54.000 And that this is a person who came out of, you know, and I don't know if she, you know, was recruited out of the NSA, which does all the spy satellites and whatnot, but, you know, this particular kind of a peculiar career.
00:44:09.000 And then she's been, I think, one of the most important intellectual architects of the censorship industrial complex.
00:44:16.000 And, you know, you would listen to her and it'd be like talking about like reducing harm in the real world and using very progressive language, like that's like language of compassion.
00:44:26.000 And we have to reduce harm.
00:44:27.000 I mean, that's been a big part of it.
00:44:28.000 We have to reduce hatred in the society.
00:44:30.000 And then it just feels like that moment from, you know, the Bourne movie where it's like, and then we have to, you know, fight the disinformation.
00:44:37.000 And so, you know, I have to say, I think like that seems to be, I mean, that seems to be like the undercurrent is that I think that there's that what brings us all together is a kind of suspicion of authority.
00:44:48.000 I mean, my dad had not only did he have a beetle, VW bug, we had the question authority sticker on
00:44:56.000 the car. I mean that was who we were and so for me that was a part of it and I
00:45:00.000 think one of the delightful parts of it is that there's there's people in
00:45:04.000 this anti-censorship movement, this free speech movement, who you know like
00:45:08.000 there was a guy that had blocked me because he had we had been in this huge fight
00:45:12.000 about nuclear power which is something I support.
00:45:15.000 A lot of the folks that are very anti-authoritarian are anti-nuclear, and I actually asked him recently to stop blocking me on Twitter, and he did, because we're on the same side of this.
00:45:26.000 And so but I do feel like that seems like that's a big part of it is that if there's an ideology, it's just some of it's like, just questioning authority and, you know, being able to have a conversation and ask hard questions of people and not kind of not wanting to be in a situation of just following orders.
00:45:45.000 My concern is that the end point of this is an inability to openly communicate in good faith, in particular with people that you disagree with.
00:45:53.000 It's interesting that Matt Taibbi uses as his framework George Orwell, and you use a lesser known Bourne film.
00:46:01.000 Not even the main ones.
00:46:06.000 Do you think Matt, that this oddly mercurial shape-shifting, probably shouldn't use conspiracy theory type language, should I, in this context.
00:46:18.000 Do you think that this odd pathology might afford the possibility that many of the agencies and key figures that profess concern around misinformation and its linguistic acolytes Mal and Dis, I think, are the other ones you can have, aren't there?
00:46:37.000 Miss Mal, Dis.
00:46:38.000 Miss Mal, yeah.
00:46:39.000 Mal, Miss and Dis, yeah.
00:46:41.000 Like Scrooge's nephews in the DuckTales films, which I reckon Michael will probably use as a paradigm in a minute.
00:46:53.000 Do you reckon that some of them agencies that are like, oh no, we've got to watch all this misinformation, are actually culpable for themselves spreading misinformation?
00:47:01.000 Oh, absolutely.
00:47:03.000 Yeah, but I don't think they...
00:47:06.000 Russell, I don't think they see it as disinformation.
00:47:09.000 I think they see things as politically true, even if they're factually proven untrue later.
00:47:18.000 Michael and I, just in the last week or so, we've gotten in the middle of this story involving the origins of COVID-19.
00:47:28.000 This is really a fascinating episode in world history, because this thing happened, And immediately before we had any answer as to, you know, the cause of the pandemic, a whole
00:47:46.000 universe of possibilities was ruled out.
00:47:48.000 We were basically just told, it can't be this, so let's not look over there.
00:47:53.000 And that is something that maybe you might see somebody in the military think, but a journalist should never think like that.
00:48:01.000 First of all, we shouldn't really care.
00:48:04.000 All the old journalists that I know would start with indifference, you know: I'm happy to report this if this is true, I'm happy to report that if that's true.
00:48:18.000 But we were told, you know, basically, no, this is the new version of how we do information in the world, is that things are right and righteous, and that you have to get behind them emotionally as you're reporting them.
00:48:34.000 Therefore, there's this extraordinary incentive to become a believer.
00:48:41.000 It's much more like religion than journalism, I think.
00:48:44.000 And that, you know, I think that's kind of the same precondition you have to have to fight a war.
00:48:49.000 You know, you have to believe in something.
00:48:51.000 You have to have it.
00:48:52.000 You have to believe it in your gut in order to press a button to drone somebody.
00:48:57.000 I don't know.
00:48:57.000 I mean, I think it's very strange.
00:48:59.000 It's so different from how I was raised to view journalism, but what they've succeeded in doing, and I was going to ask you about this because you're in this very strange position with YouTube and Rumble and everything, where you have to be constantly thinking: what are they going to consider real?
00:49:19.000 And what are they going to consider off limits?
00:49:22.000 And how do you do humor in a situation like that?
00:49:24.000 How do you do reporting in a situation like that?
00:49:28.000 If I mean, look at the New York, the New York Times had to basically write around, you know, stories about Nord Stream, the Wall Street Journal is now having to write around stories about, you know, the COVID leak.
00:49:42.000 There was a story last week about, you know, neo-Nazi emblems in Ukraine, and they had to frame it as this might hurt the war effort by making Russian propaganda look good, as opposed to just reporting it.
00:50:00.000 So they've turned us into believers instead of just sort of passive consumers.
00:50:06.000 Judgers who are interested, and I just wonder, what do you do when you're doing your show?
00:50:12.000 Do you have to give in to that urge to consider it all the time?
00:50:19.000 To return to the broader framing of our conversation, Michael, seeing as how you seem to be in charge, making poor Matt write a whole essay, To the star of everyone in the world's favorite movie, the Shawshank Redemption, Mr. Tim Robbins with a question.
00:50:42.000 Round of applause for the great Tim Robbins!
00:50:46.000 Please ladies and gentlemen, how about a round of applause for Stella Assange?
00:50:49.000 Elon Musk and Mark Zuckerberg are having a cage fight.
00:51:08.000 But is their real battle going to be against the EU, who are introducing new censorship laws that threaten to bring down free speech altogether?
00:51:19.000 Thanks for joining me on this voyage to truth and freedom at a time where we oppose serious forces which in my opinion have observed the trend that independent media can provide voices to people that oppose centralist authoritarian narratives and are beginning to shut it down.
00:51:36.000 Even Elon Musk has capitulated to EU laws to censor Twitter.
00:51:41.000 Let's have a look at Elon Musk and Mark Zuckerberg's fight being spoken about by the mainstream news.
00:51:46.000 As if it's a real fight between actual fighters, which is weird.
00:51:49.000 What kind of circus world is this?
00:51:50.000 What kind of hysterical clown spectacle are we living in when this is seriously discussed?
00:51:55.000 I'll tell you what kind of world we're living in.
00:51:56.000 One where they want to distract you from the fact that vast, giant, unelected, bureaucratic bodies are imposing power like racketeers and gangsters in a way that is completely unprecedented.
00:52:09.000 I believe because they recognize that independent media and independent politics is on the rise and could capsize the establishment.
00:52:16.000 A tech billionaire cage match could become reality.
00:52:20.000 Can you imagine that?
00:52:21.000 Elon Musk, Mark Zuckerberg in the cage and they say they're absolutely serious about trading blows in the octagon.
00:52:28.000 It's funny that they're absolutely serious about it because when they recognize what the EU and the Five Eyes countries are going to do with these new censorship laws, they're going to have different concerns.
00:52:37.000 It's interesting that during the pandemic, Fauci was exchanging letters with Zuckerberg.
00:52:42.000 Zuckerberg was saying I'm available to help you.
00:52:43.000 But now Zuckerberg has, on Lex Fridman's podcast, said they went too far.
00:52:47.000 They censored stuff that they shouldn't have done.
00:52:49.000 Asked for a bunch of things to be censored that in retrospect ended up being more debatable or true.
00:52:54.000 It's amazing that Elon Musk has acquired Twitter and seems committed to free speech.
00:52:59.000 But similarly astonishing is the fact that the EU have been able to get him to agree to this legislation.
00:53:06.000 If the world's richest man cannot oppose these forces, who can?
00:53:11.000 The answer to that rhetorical question is only all of us.
00:53:14.000 Only all of us.
00:53:16.000 It all started with a tweet.
00:53:18.000 We all thought Musk and Zuck were joking about it on social media, but this week Zuckerberg posted on his Instagram story, send me location.
00:53:28.000 Send me location.
00:53:29.000 So there you go, an extraordinary thing to ponder.
00:53:32.000 At this age where we're fascinated again by the Titanic, billionaires threaten to brawl.
00:53:38.000 Surely this is a ridiculous spectacle, and surely this has to be considered in the context where giant bureaucratic bodies are looking to impose never-before-seen measures to regulate and control speech.
00:53:50.000 Now the simple question we have to ask, and let me know in the comments if you agree with this, while you simultaneously subscribe if you don't mind, Let me know this.
00:53:57.000 Do you think that there are any giant legislative bodies that have the moral authority to regulate your life?
00:54:05.000 To regulate your free speech?
00:54:07.000 Who is it that you trust?
00:54:09.000 What government?
00:54:10.000 What NGO?
00:54:11.000 Who do you trust?
00:54:12.000 Do you trust the IMF, the WHO, the World Bank, the WEF, the American government, the Republican Party, the Democrat Party, the British government, the EU, Macron, Trudeau?
00:54:24.000 Who do you trust?
00:54:24.000 Who?
00:54:26.000 Who would you vote for?
00:54:27.000 Who do you want censoring you?
00:54:29.000 Twitter CEO Linda Yaccarino, along with owner Elon Musk, met with the EU on Wednesday.
00:54:34.000 At the top of the agenda for the EU was ensuring that Twitter is going to comply with its upcoming censorship law, which is scheduled to be enforced on August the 25th, 2023.
00:54:45.000 Not far off.
00:54:47.000 Coming up soon.
00:54:48.000 Enforced.
00:54:49.000 The censorship law is entitled the Digital Services Act. It's just the Digital Services Act.
00:54:55.000 The EU has been pushing for Twitter to commit to complying with it.
00:54:58.000 Ever since he took over Twitter, Elon Musk has been under fire from various political and media corners.
00:55:03.000 That's right, because, you know, they care about you so much that they're attacking anyone that's talking about free speech.
00:55:09.000 Remember, I don't have unmitigated, unregulated, unbounded faith in any individual, actually, because I'm an adult.
00:55:16.000 But broadly speaking, I think that Elon Musk has been a positive influence in the world of free speech, discourse and debate.
00:55:22.000 He's not censoring either side of any conversation.
00:55:25.000 And God, in the end, I think that's where we have to get to.
00:55:27.000 Let me know in the comments if you agree.
00:55:30.000 He's teasing the Brussels bureaucrats by promising that his social media company will respect EU laws designed to fight disinformation and hate speech, saying if laws are passed, Twitter will obey the law.
00:55:42.000 And I suppose in a sense that's bloody obvious, isn't it?
00:55:44.000 Because what's the alternative?
00:55:45.000 Twitter will break the law.
00:55:47.000 But the problem is, of course, that hate speech and disinformation are terms that are used to censor.
00:55:53.000 Who in the world, in their right mind, other than a few lunatics, thinks that hate speech is a good thing?
00:56:00.000 What?
00:56:00.000 Hate speech?
00:56:01.000 I like speaking hate to people.
00:56:03.000 I want to malign vulnerable people.
00:56:05.000 I want to speak hatefully about people that are different from me in some way.
00:56:09.000 The number of people that are doing that, you know, I know that there are people doing that.
00:56:12.000 But they are not as significant, I would contest, as the ability of astonishingly powerful bureaucratic agencies who want to censor and control counter-narratives that, in my view, are designed to democratize increasingly authoritarian spaces.
00:56:30.000 Are you beginning to understand that your alliances with people who are culturally different than you, that the cultural differences are, whilst they are important to you, and this is my point in fact, I welcome how important they are to you, if you are not willing to form alliances with and stop criticising people that live differently from you, these giant bureaucratic bodies are just going to steamroll through your freedom, while you're going, I agree with this part of it, they're legislating in my favour, woohoo!
00:56:59.000 And disinformation, of course, is an arbitrary term.
00:57:01.000 It just means information that we don't like.
00:57:04.000 Just look at the last few years.
00:57:05.000 Information that was regarded as disinformation a few years ago proved to be information.
00:57:10.000 Where's the apology?
00:57:11.000 Where's the reversal?
00:57:12.000 Where's the compensation?
00:57:14.000 Instead of any of those things, we're gonna pass massive disinformation laws!
00:57:17.000 This comes after France's digital minister, Jean-Noël Barrot, threatened the social media platform's access to the bloc.
00:57:25.000 Disinformation is one of the gravest threats weighing on our democracies, said Barrot.
00:57:29.000 Our democracies?
00:57:30.000 It's not democratic, is it?
00:57:31.000 It's not democratic if you can't, using the ballot and the electoral mechanisms, alter the trajectory and machinations of power.
00:57:40.000 Twitter, if it repeatedly doesn't follow our rules, will be banned from the EU, the French minister added.
00:57:45.000 So Twitter could face being cut out of, like, that's probably, what, a fifth of the planet's population?
00:57:50.000 Significant and advanced economies, just due to the way that society and civilization rolled out.
00:57:54.000 So obviously, Twitter can't afford to do that.
00:57:56.000 So the point I'm making here is not, oh, Elon Musk, he's a coward.
00:57:59.000 It's no one can oppose force of that magnitude.
00:58:03.000 It's irresistible bureaucratic power.
00:58:06.000 In May, the European Union's Internal Markets Commissioner Thierry Breton warned that the platform cannot hide from obligations to censor content.
00:58:14.000 You can run but you can't hide, Breton threatened in a tweet.
00:58:18.000 When did we get to the point where like bureaucrats from the EU started saying you can run but you can't hide?
00:58:23.000 There's just someone in the office at the EU.
00:58:25.000 You pull a knife, we'll pull a gun.
00:58:27.000 Brings a knife to a gunfight.
00:58:29.000 You'll send one of ours to the hospital, we'll send one of yours to the morgue.
00:58:32.000 Hashtag, that's the EU way.
00:58:35.000 They're not gangsters, but they actually are though, aren't they?
00:58:37.000 They actually are.
00:58:38.000 Their rhetoric is starting to be laced with the true violence that is always just out of view when power is asserted.
00:58:47.000 Beyond voluntary commitments, fighting disinformation will be a legal obligation under DSA as of August the 25th, Breton continued.
00:58:56.000 Our teams will be ready for enforcement.
00:58:58.000 What a terrifying statement!
00:59:00.000 Our teams will be ready for enforcement.
00:59:02.000 You will comply.
00:59:03.000 You have 20 seconds to comply.
00:59:06.000 I don't know.
00:59:06.000 I don't want to be hysterical, as we say in my line.
00:59:09.000 If it's hysterical, it's historical, i.e.
00:59:12.000 if something makes you go crazy, it's probably triggered something from your past.
00:59:15.000 But our teams will be ready for enforcement, and words like, beyond voluntary commitments, fighting disinformation will be a legal obligation.
00:59:23.000 That's saying you are going to do what you're told.
00:59:25.000 No wonder the cage fight stuff's coming into the public conversation.
00:59:29.000 Because we're getting to basic animal freedom.
00:59:32.000 The animal freedom.
00:59:33.000 The freedom of your body.
00:59:34.000 How are you going to make me?
00:59:35.000 We're going to make you with force.
00:59:37.000 Is that how people talk when they're trying to protect you?
00:59:40.000 We are going to protect the public from threatening hate speech.
00:59:43.000 Oh yeah?
00:59:43.000 And how are you going to do that?
00:59:44.000 By threatening hate speech?
00:59:45.000 Let's have a look at Breton, a man who talks a good fight.
00:59:48.000 Let's see what he looks like.
00:59:50.000 My mission is just to make sure that as of August 25th they will... (don't like the old desk slam much) ...they will, they will respect the law, or they will not be able to... Where's this power coming from?
01:00:12.000 They're barely visible, unremarkable bureaucrats saying stuff like, you can run but you can't hide.
01:00:19.000 That's where power is.
01:00:20.000 That's power, to say you won't be able to operate in Europe.
01:00:23.000 Let me know in the comments what you think.
01:00:24.000 Breton was referring to the censorship law, the controversial Digital Services Act, a new set of rules for social media platforms operating in Europe which require them to actively police content or risk fines of up to 6% of global turnover.
01:00:37.000 No one's going to be prepared to risk that.
01:00:40.000 No one.
01:00:40.000 Because when you think of like a Twitter or Alphabet or Meta or Facebook or those companies, 6% of a turnover is meaningful.
01:00:48.000 They won't be able to do that because of their shareholders.
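To get a sense of the scale of that 6% figure, here is a minimal back-of-the-envelope sketch; the revenue numbers below are rough, order-of-magnitude assumptions for illustration only, not the companies' actual audited figures.

```python
# Minimal sketch: rough scale of a DSA fine of "up to 6% of global turnover".
# All revenue figures below are illustrative assumptions, not audited numbers.

DSA_MAX_FINE_RATE = 0.06  # up to 6% of global annual turnover

# Hypothetical, order-of-magnitude annual revenues in US dollars
illustrative_revenues = {
    "a Meta-scale platform": 115_000_000_000,
    "an Alphabet-scale platform": 280_000_000_000,
    "a Twitter/X-scale platform": 4_000_000_000,
}

for name, revenue in illustrative_revenues.items():
    max_fine = revenue * DSA_MAX_FINE_RATE
    print(f"{name}: maximum exposure of roughly ${max_fine / 1e9:.1f} billion")
```

Even on these rough assumptions, the maximum exposure runs into the billions of dollars for the largest platforms, which is why no board or shareholder base would sanction repeatedly risking it.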
01:00:50.000 It's a rigged game.
01:00:51.000 People have looked at the way things are going.
01:00:53.000 Wait a minute.
01:00:54.000 Donald Trump, RFK, five years, ten years, carry the one.
01:00:57.000 Oh no, there's going to be independent media voices, independent political voices.
01:01:01.000 We have to shut this stuff down.
01:01:02.000 They won't be able to do it if we control the social media platforms.
01:01:05.000 We already control mainstream media.
01:01:06.000 Let's exert that kind of control over social media.
01:01:09.000 Otherwise, there's a genuine threat that you'll get political movements that are opposed to this kind of bureaucratic economic and state power and where those three things intermingle and coalesce.
01:01:20.000 They'll be able to attack that, but they won't be able to do that if we hit them where it hurts.
01:01:24.000 This is what these measures are about.
01:01:26.000 Do you seriously believe it's about protecting you?
01:01:29.000 It is a mad, crazy idea.
01:01:30.000 Oh, we're going to propose this thing.
01:01:32.000 What do you think?
01:01:33.000 Would you like to vote for it?
01:01:34.000 Vote yes if you would like it, vote no if you don't.
01:01:36.000 No, don't ask them to vote for stuff.
01:01:38.000 They vote for people we don't like.
01:01:40.000 They vote for stuff we don't want.
01:01:43.000 The current code of practice, which is voluntary, includes obligations for social media to stop the monetization of disinformation, monitor political advertising, and allow third parties to access their algorithms.
01:01:58.000 Access their algorithms means manage which information is promoted and manage which information disappears.
01:02:04.000 So, shadow banning and the ability to control the flow of free information.
01:02:08.000 End of freedom of speech.
01:02:09.000 Wow.
01:02:09.000 So you have to make free speech sound like a bad thing.
01:02:12.000 When I'm talking about free speech, I'm not going, hey, free speech, I want to be able to abuse people that are different colours or genders or whatever.
01:02:19.000 I'm like, no, no, no.
01:02:20.000 If we can't speak freely, we're not gonna be able to attack power.
01:02:22.000 What do you think power cares about?
01:02:24.000 Vulnerable people or powerful people?
01:02:25.000 Tell me in the comments.
01:02:26.000 In February, Twitter did not submit a report on its implementation of the code.
01:02:31.000 It was the only major platform to fail to do so.
01:02:33.000 Props.
01:02:34.000 Unlike the code of practice, the DSA is legally binding and large platforms, Twitter, Facebook, Instagram, YouTube, TikTok, Pinterest, Snapchat and LinkedIn will have to comply if they want to operate in Europe.
01:02:44.000 In April, the Digital Services Act just got enriched by a new law that will allow the bloc to declare a state of emergency on the internet.
01:02:52.000 Oh, like those truckers.
01:02:53.000 Small fringe minority holding unacceptable views. So if you don't agree with the state, you're like, oh, we'll protest.
01:03:00.000 Oh, I'm afraid it's an emergency.
01:03:03.000 Comply.
01:03:04.000 And this is bureaucratic power at the level of the state that's bypassing democracy, now interconnecting with big tech power, which, as you know, is moving towards surveillance and social credit scoring.
01:03:14.000 It's just a matter of time until they could just fade down your economic activity a bit.
01:03:17.000 You are a non-person.
01:03:19.000 You've got no money.
01:03:20.000 You can't vote.
01:03:21.000 You can't travel.
01:03:22.000 Do you not see this is where this is heading?
01:03:25.000 Do you genuinely think they care about rainbows and stuff?
01:03:28.000 Hey, come on, help the key... Well, hold on a minute.
01:03:31.000 Remember we cared about nurses last week?
01:03:32.000 Did we care about the nurses this week?
01:03:34.000 No, we're firing them all.
01:03:35.000 Remember you cared about key workers and lorry drivers and... No, we don't care about truck drivers anymore.
01:03:39.000 Oh, wow, it's almost like you've got no principles other than the principle to maintain power and to control information.
01:03:45.000 Oh, I'm sorry it looks like that.
01:03:46.000 You can't run and you can't hide and you must comply.
01:03:49.000 A state of emergency normally gives governments extraordinary powers and suspends normal laws and regulation in order to preserve lives and property, something that has thus far been used in case of war or natural disaster, i.e.
01:04:00.000 those events affecting a country's physical security, economy, etc.
01:04:04.000 But now the 27 EU countries will be able to do the same in imposing extraordinary control on all key public-facing elements of the web, social platforms, search engines and e-commerce sites.
01:04:14.000 What they're doing is taking measures that they trialled during the pandemic and applying them to the internet.
01:04:21.000 They're mapping the concept of emergency onto management of information.
01:04:26.000 Many people said, hey what if this pandemic and the subsequent measures are a kind of trial run?
01:04:32.000 Yes, it looks like it was, and in ways that are sort of difficult to preempt actually, because they're applying it to a sort of cyber realm now.
01:04:41.000 But it's plainly about control.
01:04:43.000 I'm a person that's always in the past believed that the function of the state was to protect the people potentially
01:04:49.000 from commercial and corporate interests becoming too gargantuan
01:04:52.000 and out of control as well as obviously protecting us, the people, from foreign
01:04:57.000 invasions, war and stuff.
01:05:00.000 But now I think that the state does not do that and cannot do that and will not do that and I don't think of
01:05:06.000 myself as like, well, it should be all private economies and free market.
01:05:09.000 I don't think that because I think that's out of control and mental and greedy.
01:05:12.000 But this stuff, this is the beginning of the end right here in legislative form.
01:05:17.000 Daphne Keller of Stanford's Cyber Policy Center has been quoted as telling Wired,
01:05:22.000 It looks like the war in Ukraine created a political opportunity for advocates of tighter restrictions to push
01:05:27.000 their agenda.
01:05:28.000 That's pretty normal politics, if bad law. People have always said, what?
01:05:32.000 What happens is, there's a crisis, then the crisis is used to present a solution, then the solution permits legislation and regulation, which then enhances the power of centralised authority.
01:05:42.000 They're right!
01:05:43.000 That's happening whether it's a pandemic or this war.
01:05:45.000 They're being used to usher in new legislation and regulation that normalises control.
01:05:50.000 It's of interest to note that big tech has been playing along pretty well so far, making this latest legislative push somewhat unclear.
01:05:56.000 Both during the Covid and Ukraine war events, these large corporations have been heeding political messages and catering to political needs, basically to a fault.
01:06:04.000 Reports suggest now that the bureaucrats in Brussels may just want to make their job simpler.
01:06:09.000 Instead of having to go to the sanctions regime and relying on big tech to obey, like they did when they blocked Russian media outlets like RT and Sputnik, they will now have a whole new law that enforces all this in one fell swoop.
01:06:21.000 So instead of politely asking that RT be shut down, which Rumble refused to do, and they are now banned in France as a consequence (you know Rumble's where most of my content is now, because of free speech), what was once a request is now becoming an order.
01:06:35.000 All of this politeness, all of this kindness, all of this compassion, all this apparent care and concern for vulnerable people appears actually to be a way of crowbarring new legislation in which will serve the powerful.
01:06:46.000 This is happening as big tech, both from the West, like Google, Facebook and Amazon, and from the East, like TikTok, are yet to make any comment.
01:06:52.000 And now it's up to EU member countries to approve the law and allow the crisis mechanism to kick into gear.
01:06:57.000 It gives us an interesting perspective, doesn't it, on the recent TikTok congressional hearings in the United States.
01:07:03.000 Perhaps the real problem with TikTok is it's a company that's outside of the anglophonic axis or the European tradition and therefore potentially beyond the legislative power of bureaucracies like the EU and the United States and North American countries like Canada.
01:07:17.000 Perhaps it's the fact that TikTok exists outside of that remit that makes it a problem.
01:07:22.000 If even Elon Musk, with his buccaneer attitude to rules and regulations, with his appetite for free speech, recognises that what the EU is proposing is serious, then obviously what we're dealing with is a potentially existential threat.
01:07:35.000 I would regard this legislation as the beginning of the kind of globalist, centralising, authoritarian horror that we've been discussing mostly in terms of a somewhat distant threat.
01:07:47.000 This is the legislation that brings a globalist mechanism even closer to fruition.
01:07:53.000 It makes free speech harder.
01:07:55.000 It makes opposing power harder.
01:07:57.000 It makes new alliances more difficult.
01:07:58.000 I don't see the opportunity to oppose it electorally.
01:08:01.000 Anywhere!
01:08:02.000 So, Zuckerberg and Elon Musk, instead of fighting one another, should probably unite.
01:08:07.000 But if neither of those men have the power to oppose these kind of institutions, then it's down to us, individually and collectively, to demand the free speech of those we disagree with.
01:08:19.000 Only then, do we have a chance of confronting this new, racketeering, bureaucratic mob.
01:08:25.000 But that's just what I think!
01:08:26.000 Until next time, stay free!
01:08:28.000 Many switches, switch on, switch off.