Rebel News Podcast - September 03, 2020


Why aren't Commonwealth countries defending civil liberties?


Episode Stats

Length

40 minutes

Words per Minute

160.9

Word Count

6,484

Sentence Count

472

Misogynist Sentences

19

Hate Speech Sentences

8


Summary

Why are the Commonwealth countries, Canada, the UK, Australia and New Zealand, doing so poorly at standing up for our civil liberties in the face of the pandemic lockdown? I'll give you some crazy examples too, including a new video out of Melbourne, Australia that will make you sick.


Transcript

00:00:00.000 Hello, my friends. In today's podcast, I ask a question. Why are the Commonwealth countries,
00:00:05.240 Canada, the UK, Australia, New Zealand, doing so poorly at standing up for our civil liberties in
00:00:12.520 the face of the pandemic lockdown? I don't get it. I'll give you some crazy examples too,
00:00:17.240 including a new video out of Melbourne, Australia that will make you sick. I wish you could see the
00:00:22.840 video so you'll hear it and you'll understand it. But seeing is believing with these home invasions.
00:00:28.780 I mean, I won't tell you what you're in for, but basically cops busting into a private house
00:00:33.620 with a search warrant because someone didn't like the lockdown. I'd like it if you could see the
00:00:39.700 videos here, and you can by becoming a premium subscriber. It's what we call Rebel News Plus.
00:00:45.300 It's the podcast, but in video form with all the video clips and me standing in the studio,
00:00:50.580 waving my arms around. It's eight bucks a month to become a subscriber to Rebel News Plus.
00:00:55.720 And in addition to my daily show, you get a show from Sheila Gunn Reid and David Menzies every week,
00:01:01.880 plus the satisfaction of knowing that your eight dollars a month
00:01:04.940 helps us pay for our independent journalism. Okay, here's today's podcast.
00:01:09.280 Tonight, why are the Commonwealth countries doing so poorly defending civil liberties?
00:01:28.800 It's September 2nd, and this is the Ezra Levant Show.
00:01:31.400 Why should others go to jail when you're the biggest carbon consumer I know?
00:01:37.060 There's 8,500 customers here, and you won't give them an answer.
00:01:41.100 The only thing I have to say to the government about why I publish it is because it's my bloody right to do so.
00:01:46.980 We're not as free as the Americans, a country born in revolution where freedom was the central issue.
00:01:57.360 You really should read the Declaration of Independence.
00:02:00.360 I mean, this stuff.
00:02:02.500 We hold these truths to be self-evident, that all men are created equal,
00:02:06.360 that they are endowed by their Creator with certain unalienable rights,
00:02:10.700 that among these are life, liberty, and the pursuit of happiness,
00:02:15.520 that to secure these rights, governments are instituted among men,
00:02:19.460 deriving their just powers from the consent of the governed,
00:02:22.960 that whenever any form of government becomes destructive of these ends,
00:02:27.100 it is the right of the people to alter or to abolish it.
00:02:31.460 I mean, if that's the recipe for dissolving an old country and building a new country,
00:02:37.460 you know it's going to be good.
00:02:38.880 If you think that's good, well, you've got to read their Bill of Rights, too.
00:02:42.140 First Amendment for free speech, Second Amendment for the right to bear arms.
00:02:45.320 It's a pretty serious place when it comes to freedom.
00:02:47.880 But, you know, here in Canada, we're supposed to be pretty free, too.
00:02:52.280 We're more passive, to be sure.
00:02:53.900 Canada got our independence through negotiation and legislation, not revolution.
00:03:00.160 Same with Australia, New Zealand, pretty much the whole Commonwealth.
00:03:04.160 But you'd think we'd still have kept the best of British freedom, the Magna Carta,
00:03:09.660 the evolution of British liberty as enshrined in both laws passed by Parliament,
00:03:15.020 but also centuries of common law, court rulings.
00:03:18.520 And then there's just the culture, the literature, the arguments.
00:03:22.680 This book, published in 1644 by John Milton, with the Greek name Areopagitica,
00:03:29.220 For the Liberty of Unlicensed Printing.
00:03:32.980 What does that mean?
00:03:34.940 350 years ago, the British Parliament wanted to pass an order
00:03:37.800 that you had to have a license before you could print a book.
00:03:42.480 So it wasn't even just censorship.
00:03:44.640 It was censorship before the fact.
00:03:47.580 You couldn't even publish and then be censored.
00:03:50.120 You had to go to the government first
00:03:51.920 to ask for the right to publish in advance.
00:03:54.820 That's how un-British that was.
00:03:57.920 That's what I mean by the culture of freedom.
00:03:59.920 There's laws, then there's the history, the customs, like John Milton.
00:04:04.020 For centuries, freedom was the thing in the United Kingdom, in the British Empire.
00:04:08.600 And they spread it around the world.
00:04:10.640 Surely we inherited enough of that.
00:04:13.660 But I point out that while there have been some absurd violations of civil liberties in the United States,
00:04:19.300 such as California's satanic rules that ban churches from opening,
00:04:25.140 in general, America has been pretty free.
00:04:28.200 I suppose partly it's because there are 50 different United States,
00:04:31.540 and governors can only be a tyrant in their territory.
00:04:35.500 And while California is unfree in part,
00:04:38.700 because it's so very Democrat-dominated,
00:04:41.560 both the legislatures and the courts,
00:04:44.240 well, most of America is saner and has more checks and balances
00:04:47.580 because it's not so lopsided to one party.
00:04:50.520 The Constitution of the United States applies across the country.
00:04:55.260 America's done pretty well, all things considered.
00:04:57.260 But what about us?
00:04:59.960 In Canada and Australia and New Zealand and the UK itself,
00:05:03.520 how have we done?
00:05:05.540 We've done the worst.
00:05:07.400 Among the worst, not as bad as China.
00:05:09.840 But terrible on the civil liberties side, I mean.
00:05:12.680 New Zealand, they were so proud of themselves
00:05:15.580 for not having any cases of the virus.
00:05:17.900 They celebrated 100 days virus-free
00:05:20.200 because they're a small island,
00:05:22.300 and they're fairly isolated,
00:05:24.120 and then they basically stopped all air travel.
00:05:27.220 And of course, because their prime minister is a woman,
00:05:30.200 a socialist extremist named Jacinda Ardern,
00:05:33.040 well, you had countless stories about how it was because she was a woman
00:05:36.380 that New Zealand was doing so well.
00:05:38.220 I'm serious.
00:05:38.920 New York Times said that.
00:05:40.100 You know, that's obviously a scientific explanation.
00:05:42.180 Yeah, well, New Zealand cut itself off, but not 100%.
00:05:47.240 So someone did come in with the virus,
00:05:50.880 and it spread like wildfire to four people.
00:05:56.340 Four.
00:05:57.720 And they went into full panic,
00:06:00.680 full authoritarian mode.
00:06:02.320 Oh, Jacinda Ardern was just waiting for that.
00:06:04.160 They went full crazy.
00:06:05.500 They actually postponed their parliamentary elections for a month.
00:06:09.020 They put 1,200 New Zealand soldiers on the street
00:06:12.640 to enforce their mask laws.
00:06:15.200 Mandatory tracking of people, too.
00:06:17.280 And listen to Ardern cackle about how she locks people up
00:06:21.800 and keeps them locked up until they take some medical tests.
00:06:25.720 You said I wanted to...
00:06:28.240 I've got a number of questions about people...
00:06:31.080 You know, what do we do if someone refuses to be tested?
00:06:34.940 Well, they can't now.
00:06:36.160 If someone refuses in our facilities to be tested,
00:06:41.100 they have to keep staying.
00:06:42.760 So they won't be able to leave after 14 days.
00:06:45.420 They have to stay on for another 14 days.
00:06:47.660 So it's a pretty good incentive.
00:06:49.460 You either get your tests done and make sure you're cleared,
00:06:52.640 or we will keep you in a facility longer.
00:06:55.140 So I think most people will look at that and say,
00:06:57.220 I'll take the test.
00:06:58.940 Yeah, you can tell she was the youth president
00:07:01.120 of the Socialist International.
00:07:02.420 But she's just one person in a commonwealth democracy.
00:07:06.940 She's not a tyrant by nature.
00:07:09.640 I mean, her position isn't.
00:07:11.280 She is.
00:07:11.880 But her position...
00:07:13.280 She's the prime minister in a parliamentary system
00:07:15.620 that has checks and balances,
00:07:17.960 that has the rule of law,
00:07:18.920 that has courts,
00:07:19.980 that has opposition MPs,
00:07:21.480 that has the press.
00:07:22.680 How did it happen?
00:07:24.580 You can't put it all on her.
00:07:26.840 It was the whole country that succumbed.
00:07:29.880 How?
00:07:31.700 Australia, or at least the state of Victoria in the Southeast,
00:07:34.480 has gone just as nuts.
00:07:38.720 This is from a few weeks ago.
00:07:41.620 Remember this?
00:07:43.060 You're f***ing choking me, dude!
00:07:45.180 What the f***?
00:07:48.140 You're f***ed in the head!
00:07:49.940 You're f***ed in the head!
00:07:50.880 What on earth?
00:07:52.300 And that's in the name of public health?
00:07:53.720 Or did you see this one?
00:07:54.700 I think I showed this to you in my daytime show the other day.
00:08:04.240 Getting distance, please?
00:08:05.580 I'm distanced.
00:08:06.660 Not from me.
00:08:07.840 See you later.
00:08:08.540 Excuse me.
00:08:09.180 Don't touch me.
00:08:09.760 You're quite in my way.
00:08:10.480 Excuse me.
00:08:10.900 Don't touch me.
00:08:12.860 Excuse me.
00:08:14.260 I've got all of that on video.
00:08:24.700 No, no, no.
00:08:25.760 You're not free to go.
00:08:26.620 Please don't touch her.
00:08:27.800 Don't touch me.
00:08:28.800 You're not free to go.
00:08:29.680 Take your hand up.
00:08:30.560 I'm not free to go.
00:08:31.980 I'm recording this.
00:08:33.040 She's asking you to remove her hand.
00:08:34.820 You're now under arrest.
00:08:36.660 You're under arrest.
00:08:37.840 Anyone else?
00:08:40.860 Can you please stop moving closer towards me?
00:08:43.160 What's that?
00:08:43.540 Can you stop moving closer towards me?
00:08:45.360 Mate, I'm not watching the ground.
00:08:59.220 Excuse me.
00:09:00.220 She has her son with her and she's done nothing wrong.
00:09:11.080 What a joke.
00:09:12.260 You're embarrassing.
00:09:13.900 You're embarrassing.
00:09:14.740 You're trying to make a point.
00:09:19.820 She has done nothing illegal.
00:09:21.580 Leave her alone.
00:09:22.320 Stop it.
00:09:23.000 She has her son with her.
00:09:25.000 You're embarrassing.
00:09:27.100 Leave her alone.
00:09:28.000 Can you please not get so close to me?
00:09:29.720 I'm social distancing.
00:09:31.520 You have come within a metre and a half of me.
00:09:34.120 I've been social distancing.
00:09:36.320 Keep away from me.
00:09:38.080 You just don't want me recording her.
00:09:39.480 Leave her alone.
00:09:40.240 Leave her alone.
00:09:41.440 Let her hold her.
00:09:42.380 Let her hold her.
00:09:43.000 She has her child with her.
00:09:45.340 You are disgusting.
00:09:47.740 You are disgusting human beings.
00:09:50.140 She has her child with her.
00:09:53.880 This is appalling.
00:09:55.420 Are you a mother?
00:09:57.160 Are you a mother?
00:09:59.520 What were they doing wrong?
00:10:01.940 That was so important to stop that you traumatised that kid for life.
00:10:05.200 And look at this new video from yesterday in Melbourne.
00:10:09.480 You can show me your search warrant before you go through my house.
00:10:13.900 You're the...
00:10:15.900 You're the...
00:10:16.200 You're the...
00:10:16.240 I own this house.
00:10:19.180 There he is.
00:10:19.560 It's a search warrant.
00:10:21.300 Search warrant for what?
00:10:22.660 Now what I want to explain to you is if you want to listen, you've got your phone going.
00:10:27.680 Yeah, I do.
00:10:28.200 Yeah.
00:10:28.300 Right.
00:10:28.880 Now you're under arrest in relation to incitement.
00:10:31.920 Incitement?
00:10:32.400 You're not obliged to say or do anything, but anything you say or do may be given in evidence.
00:10:37.380 Excuse me, incitement for what?
00:10:39.060 What the...
00:10:39.620 What on earth?
00:10:40.720 Excuse me, what on earth?
00:10:42.680 Just put your phone down.
00:10:43.900 Can you, like, report this?
00:10:45.440 I'm in my pyjamas.
00:10:46.180 What's this?
00:10:46.620 I had an ultrasound in an hour.
00:10:48.060 Yeah, she's pregnant, so...
00:10:50.100 I'll take it easy.
00:10:51.380 What's this about?
00:10:52.260 If I had an ultrasound in an hour.
00:10:54.660 Let me finish and I'll explain.
00:10:56.040 It's in relation to a Facebook post, in relation to a lockdown protest you put on for Saturday.
00:11:02.180 Yeah, and I wasn't breaking any laws by doing that.
00:11:04.720 You are, actually.
00:11:05.600 You are breaking laws.
00:11:06.400 That's why I'm arresting you, in relation to incitement.
00:11:08.420 How can you arrest her?
00:11:09.900 In front of my two children.
00:11:11.360 Can't you just say to her, take the post down?
00:11:13.160 Like, come on.
00:11:13.580 I'm happy to delete the post.
00:11:14.980 This is ridiculous.
00:11:15.940 Yeah.
00:11:16.260 I have to give you your caution and rights.
00:11:18.000 You understand?
00:11:18.760 Yeah, that's fine.
00:11:19.540 Like, I'm happy to delete the post.
00:11:21.420 This is ridiculous.
00:11:22.460 Like, I just...
00:11:23.180 That's fine.
00:11:24.060 I'm getting the evidence.
00:11:24.820 Do you understand that?
00:11:25.620 Yeah, that's fine.
00:11:26.440 But my two kids are here.
00:11:27.720 I have an ultrasound in an hour.
00:11:29.580 Like, I'm happy to delete the post.
00:11:31.820 You also have the right to communicate with, or attempt to communicate with, a legal practitioner.
00:11:36.260 Do you understand those rights?
00:11:38.040 Yeah, this is ridiculous.
00:11:39.300 Yeah, this is a bit unfair.
00:11:40.300 Come on, mate.
00:11:41.260 What about she just doesn't do the event?
00:11:43.020 Like, it's not like she's done it.
00:11:44.620 She made a post.
00:11:45.460 I haven't committed the offence.
00:11:46.680 So I've nothing to argue with it.
00:11:47.200 So that's an offence.
00:11:48.240 A woman in her own house, pregnant, on her way to an ultrasound exam,
00:11:53.660 police barge in, search warrant, arresting her for a Facebook post for incitement.
00:12:00.880 Incitement to what?
00:12:01.840 To riot?
00:12:03.120 That's what incitement usually means.
00:12:04.780 No, in this case, it means disagreeing with the government's politics.
00:12:10.240 Here's the actual offending Facebook post.
00:12:12.560 She was promoting a lawful protest, a protest that actually complies with mask laws,
00:12:18.320 busting into her house, arresting her in front of her family.
00:12:23.160 She's pregnant.
00:12:25.780 How does that happen?
00:12:26.760 How do police go along with this?
00:12:28.320 How do judges go along with this?
00:12:29.760 Who signed the search warrant?
00:12:31.160 Who approved this?
00:12:32.020 How do the jailers take in such a woman on these charges?
00:12:34.860 How does anyone in the entire system, from the justice minister to the judges,
00:12:38.980 all the way on down, from the prime minister on down,
00:12:41.080 where are the civil libertarians?
00:12:42.820 What is happening to Australia?
00:12:44.360 How could it happen?
00:12:45.200 Without a peep, no one objected.
00:12:49.040 And in the United Kingdom, too, Piers Corbyn, as the name suggests,
00:12:52.540 the brother of former Labour leader Jeremy Corbyn,
00:12:55.380 an activist in his own right, more conservative in some ways than his brother.
00:12:58.960 He was a leading global warming skeptic,
00:13:01.360 and it won't surprise you now to learn he's a leading pandemic lockdown skeptic.
00:13:05.920 It's the same sort of mindset.
00:13:07.560 He organized a peaceful demonstration, not a Black Lives Matter riot.
00:13:11.580 And look at this.
00:13:13.200 He was fined £10,000.
00:13:15.700 And when he tried to crowdfund it,
00:13:17.740 the crowdfunding company deleted his campaign immediately.
00:13:20.140 Why?
00:13:20.740 What's happening here?
00:13:22.800 And Australia and the UK don't have the excuse of being run by a socialist politician.
00:13:27.720 They have conservative prime ministers.
00:13:29.400 What's happening?
00:13:29.960 Yesterday, I interviewed Rocco Galati about his lawsuit.
00:13:35.540 By his own self-description, he's a rebel, a dissident, a gadfly.
00:13:38.580 Good.
00:13:39.180 I'm glad he's suing.
00:13:39.960 But really, where's everyone else?
00:13:43.220 Where are the fancy people, the establishment people, even the vultures?
00:13:46.920 You know, the class action lawyers motivated by money alone.
00:13:50.540 Good.
00:13:51.060 We can use them now.
00:13:52.520 Where are the media hounds?
00:13:54.480 Lawyers motivated by publicity alone.
00:13:56.660 I mean, they are part of our legal ecosystem.
00:13:58.920 Why has everyone, the media, the courts, the culture, the professors, everyone been so silent?
00:14:04.280 Where are all the law professors?
00:14:06.580 Well, they're not just silent.
00:14:07.580 They're complicit.
00:14:08.280 They're obedient.
00:14:09.280 Yesterday, Nancy Pelosi was caught getting a private hair salon visit in California.
00:14:15.600 Someone leaked the closed-circuit security footage of it.
00:14:19.440 No mask on, of course.
00:14:21.840 But the bigger point is, this is all while hair salons across California are locked down.
00:14:28.100 For the little people, not for the Speaker of the House, this was a huge scoop.
00:14:32.200 It proves the elite doesn't believe their own spin about the pandemic.
00:14:38.440 It's just a power move.
00:14:40.720 But look at this reaction.
00:14:42.360 Here's what a reporter at Politico wrote in reply.
00:14:45.340 Have to ask, upon seeing this, is it legal in California, a two-party consent state,
00:14:50.340 to videotape someone in a private home or business without their consent?
00:14:53.380 Is that journalism?
00:14:56.760 You don't think it's interesting that the top Democrat in Congress is breaking her own rules?
00:15:01.960 Rather, you'd like to cook up some kooky theory that even reporting on this huge news is illegal?
00:15:06.500 It's not, by the way.
00:15:08.500 But this is a reporter, not just acting as an enforcer for lockdowns and mask laws,
00:15:13.500 but acting as an enforcer for a Democrat caught cheating?
00:15:17.660 That's the state of the media?
00:15:20.880 That's California, one out of 50 U.S. states.
00:15:23.700 It's bad there in California, but most Canadian provinces are worse than your average American state.
00:15:31.140 You can't even travel from Canada to the United States, where you can travel in,
00:15:35.720 but you can't get back without a 14-day quarantine.
00:15:38.440 It's part of Trudeau's attempt to help the Democrats hurt Trump and the American economy in the next four months,
00:15:44.080 to denormalize things, to destabilize Trump, I'm sure of it.
00:15:47.940 There's no medical basis for it.
00:15:49.860 I see that Canada is opening up direct flights with China again,
00:15:52.500 but not with the United States in the same manner.
00:15:56.300 And as bad as Canada is, we're nothing compared to our other Five Eyes allies, New Zealand and Australia.
00:16:02.220 How is it that our freedoms in the British Commonwealth and in the U.K. itself stood for centuries,
00:16:09.720 through world wars, the Great Depression, a hundred other menaces,
00:16:13.080 including genuinely huge pandemics like the Spanish flu a hundred years ago,
00:16:17.680 but that now, all of a sudden, we've lost it all so quickly and so stupidly?
00:16:24.280 Stay with us for more.
00:16:38.080 I know this is probably not the most joyous TGIF we have had.
00:16:43.160 You know, it's been an extraordinarily stressful time, I'm sure, for many of you.
00:16:51.920 You know, the outcome, you know, in a two-party system with a lot of polarization in the country,
00:17:00.560 it's a deeply divided country, and you have a binary outcome, right?
00:17:05.200 There is no easy way through this.
00:17:07.440 It was a shock to all of us, the results of the election.
00:17:12.500 It was a fair and democratic process, and we honor that.
00:17:16.620 That was the first moment I really felt like we were going to lose.
00:17:20.800 And it was this massive, like, kick in the gut that we were going to lose.
00:17:24.660 And it was really painful.
00:17:25.900 That shocking, emotional, personal display of corporate solidarity with the Democratic Party,
00:17:35.720 matched with a juvenile, personal, emotional rejection of the voters' will,
00:17:41.600 that was a Google leadership town hall meeting the week that Donald Trump won his election in 2016.
00:17:51.540 Did you catch the language there?
00:17:53.440 They said, we, us, first person plural.
00:17:56.880 There was no separating line between Google and the Democrat campaign and Hillary Clinton.
00:18:03.540 Their loss was the same loss as Hillary Clinton's.
00:18:07.720 And ever since, for four years, Google, Facebook, Amazon, Netflix, Twitter,
00:18:15.080 all of them have vowed they would never again allow social media or the Internet to be used by their enemies,
00:18:25.280 which is their mindset towards anyone conservative.
00:18:28.160 It's one of the reasons why we here at Rebel News in Canada were demonetized.
00:18:34.280 That was one of the first moves by YouTube to demonetize conservative media.
00:18:40.080 Well, we learned about that town hall video from Allum Bokhari, the senior tech correspondent at Breitbart.com.
00:18:49.360 And over the past four years, you've gotten to know Allum well.
00:18:52.540 I've said it many times.
00:18:54.140 He's the most important journalist covering tech in America these days because he focuses on tech's political goals.
00:19:02.040 And I am so delighted to say that just in the nick of time, Allum has published a new book on the subject called
00:19:10.200 Deleted: Big Tech's Battle to Erase the Trump Movement and Steal the Election.
00:19:18.080 And Allum joins us now.
00:19:20.440 Allum, great to see you again.
00:19:21.740 Thank God this book is being published.
00:19:24.440 Will it, in fact, be shipped and published by Amazon?
00:19:28.000 My own book was censored by Amazon for two months.
00:19:30.300 Do you know if this book will be censored or will it even be allowed to be published on Amazon?
00:19:35.900 Well, this is the question that all political authors have to ask themselves with big tech dominating publishing,
00:19:42.600 whether it's news publishing or book publishing in so many ways.
00:19:46.200 Your book was censored.
00:19:47.680 They also censored Alex Berenson, who wrote a book critiquing the coronavirus panic.
00:19:52.160 So it's something that Amazon is doing more and more often.
00:19:56.860 So I suppose we'll see what happens there.
00:19:59.560 It's also available, I'm happy to say, on Barnes & Noble and various other more traditional book retailers.
00:20:04.640 So if it does go down on Amazon, there will be some alternatives.
00:20:08.820 I guess we'll see what happens if they ding me for hate speech or something like that.
00:20:12.860 Maybe criticizing the tech companies is going to be redefined as hate speech soon.
00:20:16.300 Who knows?
00:20:17.060 Yeah, isn't that the truth?
00:20:18.240 And they don't even engage you in a conversation.
00:20:20.980 When they banned my own book, and I don't want to talk about my own book,
00:20:23.220 but I just know that when Amazon banned my book called China Virus,
00:20:26.660 they refused to give any explanation other than it contradicted, quote, official sources.
00:20:32.040 I can only imagine what they would do with your book, because at least I was criticizing China.
00:20:36.860 You are actually criticizing the tech companies themselves.
00:20:40.860 I'm worried that even your book will be censored on Facebook, on YouTube, and other places.
00:20:48.860 I mean, that certainly wouldn't be the first time that, you know, I have to be fair to them,
00:20:53.180 I haven't seen any signs of this so far,
00:20:55.180 but it wouldn't be the first time that big tech has tried to censor a Breitbart reporter.
00:21:00.360 You know, they've censored Breitbart stories before on Facebook.
00:21:03.260 It wouldn't be the first time that big tech has censored reporting that's critical of them.
00:21:09.000 I remember them censoring videos from James O'Keefe at Project Veritas exposing the tech giants.
00:21:15.560 The thing is, these tech companies, there's no law regulating them.
00:21:19.240 There's no oversight preventing them from interfering with journalism, from interfering with politics.
00:21:24.760 And this is something that's going to be a huge, huge problem in the next election, I think.
00:21:28.860 That's the whole theme of the book,
00:21:30.020 The Four-Year Plot to Undermine the Trump Movement and Steal the Election to Avoid a Repeat of 2016.
00:21:36.700 That's been the absolute imperative for so many people inside Silicon Valley over the past four years.
00:21:43.880 And we're going to see how that's going to impact the next election for the first time, actually,
00:21:47.940 because in 2016, they were complacent.
00:21:50.560 They're not going to be complacent this time.
00:21:52.000 They're going to be using every tool in their arsenal to avoid a similar outcome.
00:21:57.540 And by the way, Ezra, this doesn't just come from me.
00:21:59.540 This comes from all of my sources inside these companies that have informed the book and informed my journalism.
00:22:05.700 People inside Google, inside Facebook, inside Twitter, who have just been watching this hysteria erupt across the industry.
00:22:14.840 And, of course, they're unable to say anything because if you challenge the political narratives,
00:22:18.180 you'll get set upon and fired and hounded out of the company by your colleagues,
00:22:22.320 much like James Damore was when he critiqued the narratives.
00:22:26.260 But all of these sources, they say the same thing, that there was a complete panic inside every single major big tech company
00:22:32.700 after Trump won in 2016.
00:22:35.200 And afterwards, what happened was this outgrowth of all these new words they invented,
00:22:41.860 misinformation, hate speech, fake news, conspiracy theories.
00:22:46.540 All of these words that give them plausible deniability but have basically served as a way to ban Trump supporters from their platforms over the past four years.
00:22:59.100 Not just ban them overtly, which you can see, but also suppress them covertly in the algorithms.
00:23:04.340 Yeah. You know, something I think has changed: in that tearful town hall that Google had, their senior managers and staff felt a little bit naive.
00:23:17.820 They were genuinely stunned.
00:23:19.800 Their reaction was childish, but they were stunned.
00:23:23.280 I don't think they're babes in the woods anymore.
00:23:26.500 I think they're much more aggressive and much more partisan.
00:23:31.300 For example, I note that one of Kamala Harris's senior staffers went directly from her presidential campaign to a senior censorship position at Twitter.
00:23:42.380 We see a revolving door between Justin Trudeau in Canada and tech companies.
00:23:47.780 The Obama administration and the overlap with Twitter, Facebook, YouTube, Netflix,
00:23:52.700 they're really, you don't know where one ends and the other begins.
00:23:55.420 I think that that kind of naivete in 2016 has been replaced with really a professional campaign political class like you would see in a super PAC.
00:24:06.340 These just aren't accidental politicians now.
00:24:09.300 They're professional politicians now.
00:24:11.400 What do you think of that?
00:24:13.700 I agree.
00:24:14.660 And even when Silicon Valley doesn't want to censor, on the rare occasions where it stands up to pressure from, say, the Democrats or from the mainstream media to censor political viewpoints,
00:24:27.340 they're immediately faced with, you know, advertiser boycotts, threats from Democrats to regulate them.
00:24:34.840 And all of this just adds to the sense that they have to do something.
00:24:41.020 They have to crack down on these opinions the establishment hates.
00:24:44.620 I think recently, actually, Facebook introduced a new rule saying that if there's a regulatory—I can't remember the exact phrasing,
00:24:51.520 but the general gist is if there's a potential regulatory threat to Facebook from content on its platform,
00:24:57.660 it gives itself the right to remove that content.
00:25:00.460 So, essentially, this means that as Democrats and other politicians ramp up the threat of regulation of these companies,
00:25:07.940 they'll be forced to censor even more.
00:25:12.740 And you mentioned Kamala Harris's former employee working for Twitter.
00:25:18.060 We also did a story recently at Breitbart exposing a Facebook team member who works in their misinformation department
00:25:24.520 who, first of all, used to work for the CIA under John Brennan and also had a Black Lives Matter banner picture on her LinkedIn profile.
00:25:33.680 So, these companies are just full of left-wing Democrat partisans, especially Google, which was so deeply tied to the Obama administration.
00:25:43.600 I mean, there was essentially a revolving door between Google and the Obama White House employees going from one to the other
00:25:49.580 and back again during that whole administration.
00:25:51.920 We know Eric Schmidt worked pretty much directly for the Clinton campaign, was at Clinton's election night party,
00:25:59.600 was very open about campaigning for her and trying to help her win.
00:26:03.060 While he was still at Google, by the way, he was like one of the senior people there.
00:26:07.280 So, they're not even trying to conceal their sympathies at this point.
00:26:11.240 The only question is, how is the vast amount of power that they have over the control of, over the flow of information,
00:26:19.560 over everything we see online, how is that going to impact the 2020 election?
00:26:25.120 And can the Trump movement somehow win despite this vast mobilization of technological power?
00:26:33.080 We're talking with Allum Bokhari, the author of the new book, Deleted: Big Tech's Battle to Erase the Trump Movement and Steal the Election.
00:26:42.400 Allum, we recently interviewed Ryan Hartwig, who for more than a year was a contractor working for a company called Cognizant,
00:26:51.280 based in Phoenix, Arizona.
00:26:53.440 I couldn't believe the statistics he told me.
00:26:55.940 He said there were 1,500 employees at the Phoenix office alone.
00:26:59.540 They were working three shifts a day, and they were censoring up to 200 posts per day each.
00:27:06.040 My math tells me that's 300,000 little acts of censorship per day just from that office.
00:27:12.200 I found it interesting that they had a specific Canadian election censorship manual,
00:27:18.760 and that's what I focused on with my discussions with him.
00:27:21.180 But if they were so brazen at censoring Canadian elections,
00:27:24.680 which I don't know if that's that important a deal for Facebook.
00:27:30.240 I can only imagine how many contractors have been employed and tasked with censoring the American election,
00:27:39.200 which is what Facebook really cares about.
00:27:43.020 Yes, you've hit on something truly terrifying,
00:27:45.820 which is that these tech companies aren't simply going to interfere in the election in America,
00:27:50.800 but they can now interfere in elections all around the world.
00:27:54.620 These are global companies.
00:27:56.400 They also interfered in the Brazilian election,
00:27:58.580 targeting WhatsApp groups that supported Brazil's President Bolsonaro.
00:28:04.760 Obviously, Facebook owns WhatsApp.
00:28:06.780 Facebook also banned thousands of Le Pen supporters before the French elections.
00:28:12.240 I believe they've targeted populists in Italy as well.
00:28:15.180 This is all around the world that this is happening.
00:28:17.360 And by the way, one of their senior people at Facebook is Nick Clegg,
00:28:22.340 who was pretty open about wanting to overturn the result of the Brexit referendum in Britain.
00:28:28.460 He was the leader of the most anti-Brexit party in the country.
00:28:33.100 Facebook's also advised by the Atlantic Council,
00:28:35.380 which is essentially a neocon globalist think tank
00:28:38.720 filled with people who spent their entire careers interfering in other countries.
00:28:42.840 And there's, again, there's no oversight stopping them from interfering in elections.
00:28:47.780 Yeah.
00:28:48.060 I think the scariest thing I heard from that censor contractor
00:28:51.920 was that these 300,000 censorship actions per day
00:28:55.580 were training the AI, the artificial intelligence.
00:29:00.460 So what they did for a few years,
00:29:03.360 those millions, maybe billions, of acts of censorship,
00:29:08.020 were teaching the computer how to censor automatically.
00:29:10.720 I find that the scariest part.
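The idea described above, human moderation decisions becoming labeled training examples that teach a machine to repeat them, can be sketched in a few lines. Everything here is invented for illustration: the sample posts, the labels, and the crude word-counting "model". A real platform's system would be vastly larger and more sophisticated.

```python
# Toy sketch: every human moderation decision becomes a labeled training
# example, and the machine learns to repeat those decisions on new posts.
# The data and the word-counting "model" are invented for illustration.
from collections import Counter

# (post, human_decision) pairs, i.e. the moderators' past actions
training = [
    ("buy cheap pills now", "censor"),
    ("cheap pills for sale", "censor"),
    ("lovely weather today", "allow"),
    ("see you at dinner", "allow"),
]

censored_words = Counter()
allowed_words = Counter()
for post, label in training:
    target = censored_words if label == "censor" else allowed_words
    target.update(post.split())

def predict(post: str) -> str:
    """Label a new post by which class its words appeared in more often."""
    words = post.split()
    censor_score = sum(censored_words[w] for w in words)
    allow_score = sum(allowed_words[w] for w in words)
    return "censor" if censor_score > allow_score else "allow"

print(predict("cheap pills again"))   # censor
print(predict("dinner tonight"))      # allow
```

The key point the sketch captures is that no human reviews the new posts: the machine simply extrapolates from whatever patterns the past human decisions contained, biases included.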
00:29:12.780 Tell me a few things.
00:29:14.060 I mean, you and I have just been bantering about this general issue.
00:29:17.180 Tell me some of the themes you get into the book,
00:29:20.580 because I know you and I talk about current events in this subject,
00:29:24.020 but are there any things in the book that might surprise people,
00:29:27.720 that might come as news to people?
00:29:29.540 Even for me, you and I talk quite frequently,
00:29:31.760 but tell me why we should go out and get the book right now,
00:29:35.380 which I intend to do, by the way.
00:29:36.720 But give me some teasers about what's actually in the book.
00:29:41.380 Well, you've just hit on one of the most important things,
00:29:43.820 which is artificial intelligence.
00:29:45.720 That's the next stage of censorship, really,
00:29:48.220 because we're all aware of the prominent conservatives
00:29:50.600 who get kicked off these platforms.
00:29:52.600 Often there's a human decision behind that.
00:29:55.000 But the future of censorship is going to be these intelligent machines
00:29:58.400 that scan all of our posts
00:30:00.720 and essentially censor us in advance,
00:30:02.780 decide whether our posts will be seen in people's feeds
00:30:05.640 or just get completely buried in the algorithm.
00:30:08.880 And by the way, the point about algorithms,
00:30:10.420 people think they're complicated.
00:30:11.940 They're not complicated.
00:30:13.040 All an algorithm is, essentially, is something that
00:30:17.240 takes in a set of inputs and analyzes them
00:30:20.300 and makes a decision based on those inputs
00:30:23.160 and how it's been programmed.
00:30:24.960 So to give you an example,
00:30:26.120 if an algorithm sees your post,
00:30:29.200 it will scan your post.
00:30:30.660 That's the input.
00:30:31.960 And it will then analyze it
00:30:33.420 based on its own criteria of hate speech,
00:30:35.900 which is, of course, determined by leftists in Silicon Valley.
00:30:39.220 And then it will decide
00:30:40.280 whether your post is going to be suppressed or not.
00:30:43.020 And again, this is going to happen
00:30:45.060 across these platforms to everything automatically.
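The input-analyze-decide loop just described can be made concrete with a toy sketch. The flagged-word list and threshold below are purely hypothetical stand-ins for the "criteria" mentioned above; real platforms use trained models and far more signals, not a word list.

```python
# Toy illustration of the input -> analyze -> decide loop described above.
# The "criteria" (flagged terms and threshold) are invented for illustration;
# a real platform would use a trained machine-learning model, not a word list.

FLAGGED_TERMS = {"spamword", "slur_example"}  # hypothetical criteria
THRESHOLD = 1  # hypothetical: suppress if at least one flagged term appears

def moderate(post: str) -> str:
    """Return 'suppress' or 'show' for a post (the decision step)."""
    words = post.lower().split()                     # input: the post text
    score = sum(w in FLAGGED_TERMS for w in words)   # analyze against criteria
    return "suppress" if score >= THRESHOLD else "show"  # decide

print(moderate("hello world"))        # show
print(moderate("buy spamword now"))   # suppress
```

Whoever writes the criteria, here the contents of `FLAGGED_TERMS`, controls the outcome, which is the speaker's point about who programs these systems.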
00:30:48.600 And my sources have explained how this is going to happen
00:30:51.540 and how Silicon Valley has essentially created a system
00:30:55.420 where these algorithms are just going to get
00:30:58.240 more and more biased over time
00:31:00.300 because the people programming
00:31:02.140 are just moving further and further to the left
00:31:04.640 because of the echo chamber
00:31:05.780 and the people who disagree with them
00:31:07.780 can't voice their disagreement openly.
00:31:10.860 So we're essentially looking at
00:31:13.120 a kind of dystopian future internet
00:31:15.040 that's controlled and censored by the left.
00:31:18.760 Another key thing in the book
00:31:21.160 that I'd recommend looking into
00:31:23.840 is the future of anonymity.
00:31:26.820 Anonymity is, I think,
00:31:28.160 one of the most important tools
00:31:29.760 free speech has ever had,
00:31:32.160 even before the internet.
00:31:33.280 Take the Federalist Papers
00:31:34.400 in the US, for example;
00:31:36.360 that was when the founders debated
00:31:38.300 the emerging constitution.
00:31:41.040 They did that anonymously
00:31:42.240 through anonymous letters.
00:31:44.240 Voltaire in France used a pen name
00:31:46.340 to write about controversial political issues.
00:31:48.100 So there's a long history of anonymity
00:31:49.680 being used to discuss taboo issues,
00:31:52.900 being used to challenge existing norms.
00:31:55.920 And that's especially important today
00:31:57.600 in the age of cancel culture.
00:31:59.440 But unfortunately,
00:32:00.560 some of my sources have told me that
00:32:02.080 even if we think we have anonymity,
00:32:05.720 there's technology being worked on right now
00:32:08.760 that will essentially render anonymity
00:32:11.020 completely useless.
00:32:12.360 So even if you think you're anonymous,
00:32:13.920 there will be programs
00:32:14.860 that will track your writing style,
00:32:16.820 even track your mouse movements
00:32:18.020 to determine your identity.
00:32:20.280 So I'd certainly recommend
00:32:21.560 reading the chapter on that
00:32:22.660 if you're concerned about the future
00:32:24.020 of free speech
00:32:25.200 and the ability to talk about
00:32:27.740 sort of forbidden ideas
00:32:28.900 and have forbidden debates
00:32:29.880 that you can't have in public.
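The writing-style tracking mentioned above is generally known as stylometry. Here is a minimal sketch of the idea using invented texts and only two crude style features, average word length and common-word frequencies; real deanonymization tools use far richer features, including, as the speaker notes, even mouse movements.

```python
# Toy sketch of stylometry: matching an anonymous text to a known author
# by comparing simple writing-style features. The texts and features here
# are invented for illustration; real systems are far more sophisticated.
from collections import Counter

FUNCTION_WORDS = ["the", "of", "and", "to", "a"]  # common stylometric markers

def style_profile(text: str) -> list[float]:
    """Build a crude style fingerprint: average word length + word frequencies."""
    words = text.lower().split()
    counts = Counter(words)
    avg_len = sum(len(w) for w in words) / len(words)
    freqs = [counts[w] / len(words) for w in FUNCTION_WORDS]
    return [avg_len] + freqs

def distance(a, b):
    """Euclidean distance between two style fingerprints."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

known_author = style_profile("the quick brown fox jumps over the lazy dog")
anonymous = style_profile("the slow brown dog walks under the tall tree")
other = style_profile("supercalifragilistic expialidocious onomatopoeia")

# The anonymous text's style profile is closer to the known author's
print(distance(known_author, anonymous) < distance(known_author, other))  # True
```

Even this crude two-feature fingerprint separates the writing samples, which suggests why, at scale and with many more features, a pen name alone offers little protection.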
00:32:32.040 I find that absolutely terrifying,
00:32:33.960 especially in the age of the pandemic
00:32:35.260 when we're forced online.
00:32:37.580 You can't have a big meeting.
00:32:39.960 You can have a public gathering
00:32:41.160 in some jurisdictions,
00:32:42.440 but in many you can't.
00:32:43.600 So we're forced onto this handful of platforms,
00:32:49.140 you know, an oligopoly
00:32:50.680 of these huge companies.
00:32:52.940 It's no coincidence that the value of Amazon
00:32:56.380 has doubled in the stock market
00:32:58.780 during the pandemic
00:32:59.520 because lockdowns shut down mom-and-pop shops
00:33:02.220 and shut down conferences and conventions.
00:33:05.080 So we're forced into Skype and Zoom.
00:33:07.160 I hate the fact that we are being turned
00:33:09.420 into little mice
00:33:10.420 on the little hamster wheel.
00:33:12.960 And let me ask you,
00:33:14.600 why has Donald Trump let four years go
00:33:16.800 without doing anything of substance here?
00:33:20.420 Well, this is another key theme of the book.
00:33:22.660 You know, as these companies
00:33:23.640 grow more wealthy and powerful,
00:33:25.800 their political influence increases as well,
00:33:28.300 much as it did for the big railroad
00:33:30.440 and oil monopolies
00:33:31.680 of the late 19th century.
00:33:32.920 These tech companies are, if anything,
00:33:35.400 even more powerful than that.
00:33:36.840 And their political influence matches it.
00:33:39.180 So one of my chapters focuses on the connections
00:33:42.420 that Silicon Valley has made to Washington, D.C.
00:33:45.680 The vast amounts of money
00:33:47.340 they pump into think tanks
00:33:50.040 and advocacy organizations
00:33:51.560 and lobbying groups in the capital.
00:33:54.080 And by the way,
00:33:54.480 that's both conservatives and liberals.
00:33:56.520 If you ever see a conservative think tank saying,
00:33:58.680 well, we have to leave the tech companies alone
00:34:00.180 because it's just the free market
00:34:01.420 and we can't interfere,
00:34:03.320 I will bet you any amount of money
00:34:06.260 that person is being funded by Google or Facebook.
00:34:10.680 And by the way,
00:34:11.660 that free market argument,
00:34:12.780 we've discussed it before,
00:34:13.820 but it's completely bogus
00:34:14.860 because these companies owe
00:34:17.180 their dominant position
00:34:18.360 to special government perks
00:34:20.580 they got in the 1990s
00:34:22.520 through, you know,
00:34:23.860 legal immunities that Congress passed for them.
00:34:26.080 Yeah.
00:34:27.000 Well, I find this subject very terrifying.
00:34:29.620 And it's hard to not come to the conclusion
00:34:35.020 that Trump's election is essential to stop this.
00:34:38.700 It's not necessarily sufficient.
00:34:41.880 Trump being elected won't necessarily stop it.
00:34:44.960 But I put it to you,
00:34:45.780 if Trump is not reelected,
00:34:48.040 there will be no chance to stop it.
00:34:50.540 So it's essential but not sufficient,
00:34:52.760 if that makes any sense.
00:34:53.860 Last word to you, Alan.
00:34:54.900 I guess I'm going to ask you the question,
00:34:58.500 do you think Trump can win?
00:35:04.700 Well, it'll be a big test
00:35:06.220 of how powerful these tech companies actually are.
00:35:09.720 He certainly has quite a bit of momentum at the moment,
00:35:12.060 still trailing in the polls a little bit,
00:35:14.380 but the gap is narrowing.
00:35:16.220 And obviously there's always going to be
00:35:17.240 those shy Trump voters
00:35:18.500 who don't tell pollsters
00:35:20.320 who they're actually going to vote for.
00:35:22.060 So I think it's going to be a lot tougher this time.
00:35:27.840 I'm not going to make a prediction either way
00:35:31.320 because anything could happen.
00:35:32.800 It's an election.
00:35:33.900 But I think the thing working against Trump
00:35:36.360 and the reason why Americans need to be informed
00:35:38.720 about the power of the tech companies
00:35:40.060 is that in this election,
00:35:42.620 much more than the last election,
00:35:45.420 the elites,
00:35:46.580 whether that's the deep state,
00:35:48.700 the media,
00:35:49.320 or the tech companies,
00:35:51.280 or indeed the people
00:35:52.420 who are going to be cheating with mail-in ballots.
00:35:55.760 There was an article in the New York Post recently
00:35:57.460 about a whistleblower exposing that.
00:36:00.440 To these people, cheating is now going to be seen as virtuous.
00:36:02.420 They're not complacent.
00:36:03.840 And many of these people
00:36:04.820 will feel a moral imperative
00:36:06.240 to cheat in the election
00:36:07.820 because they feel that, you know,
00:36:09.620 they've been radicalized by the media
00:36:11.800 into thinking Trump is, you know,
00:36:13.580 the second coming of Hitler.
00:36:16.000 So the tech companies will be doing everything they can
00:36:18.960 to stop him.
00:36:19.540 And that's something he didn't face
00:36:21.020 in the last election.
00:36:22.820 The final thing I'll say,
00:36:24.140 and I think this is important
00:36:24.920 because this goes actually beyond Trump,
00:36:27.280 is just think of what these tech companies
00:36:29.700 have destroyed in their efforts
00:36:32.220 to kill the Trump movement.
00:36:33.620 They've destroyed internet freedom.
00:36:35.860 And, you know,
00:36:36.700 internet freedom was this amazing achievement
00:36:38.880 of human technology,
00:36:41.160 unprecedented in human history.
00:36:43.060 You know,
00:36:45.140 for a short period,
00:36:46.540 say 2005 to 2015,
00:36:49.860 you could go,
00:36:50.980 if you had a laptop,
00:36:52.100 you could go on the internet
00:36:52.940 and immediately access a global audience.
00:36:55.820 This was an unprecedented level of freedom
00:36:57.800 that was extremely disruptive
00:36:59.620 to the old controlled media gatekeepers.
00:37:03.980 And that was fantastic.
00:37:05.320 But these tech companies
00:37:06.200 have basically destroyed that
00:37:08.200 in an effort to defeat politicians
00:37:11.120 they don't like.
00:37:11.960 It's an outrage.
00:37:13.440 People should be very angry about it.
00:37:15.100 Yeah.
00:37:15.560 Angry and terrified.
00:37:17.180 Well, Alan,
00:37:17.580 it's a pleasure to talk with you.
00:37:19.000 I wish you good luck with this book.
00:37:21.980 I hope it is not censored
00:37:24.020 or shadow censored.
00:37:25.980 We're going to put this video on YouTube
00:37:28.400 and we'll email it
00:37:29.420 to our Canadian viewers.
00:37:31.340 Hopefully we can get them to read this
00:37:33.160 because every single thing you said
00:37:34.900 applies in Canada too.
00:37:37.320 We wish you good luck.
00:37:38.120 Great to see you again.
00:37:40.180 Thanks, Ezra.
00:37:40.840 Great to be on.
00:37:41.320 All right.
00:37:41.800 There you have it.
00:37:42.240 Allum Bokhari,
00:37:43.380 senior tech correspondent at Breitbart.com
00:37:45.740 and the author of the new book,
00:37:47.140 #Deleted: Big Tech's Battle
00:37:49.520 to Erase the Trump Movement
00:37:50.760 and Steal the Election.
00:37:53.120 Stay with us.
00:37:53.840 More ahead.
00:37:54.180 Hey, welcome back. On my show
00:38:06.780 last night with Rocco Galati,
00:38:08.420 Irene writes:
00:38:09.580 Excellent program.
00:38:10.400 I pray
00:38:11.200 and wish for success
00:38:12.380 in this lawsuit.
00:38:14.120 Yeah,
00:38:14.560 I enjoyed going through it.
00:38:15.620 I mean,
00:38:16.480 I think the lawsuit
00:38:17.840 technically
00:38:18.800 risks having some parts
00:38:22.040 struck out.
00:38:23.000 I'm sure that
00:38:23.560 a lot of the defendants
00:38:24.820 will say,
00:38:25.300 well,
00:38:25.400 this is pleading evidence
00:38:27.020 and this is off point,
00:38:28.560 but there's a lot
00:38:29.560 of core arguments there
00:38:30.940 that no one is making.
00:38:32.920 And I ask again,
00:38:33.900 as I did in my monologue,
00:38:35.200 where are the class-action lawyers?
00:38:36.660 Where are the media's favorite
00:38:39.360 go-to publicity-style lawyers?
00:38:41.480 Why has it fallen
00:38:42.580 to a self-described
00:38:44.020 gadfly to sue?
00:38:45.760 Where's everybody else?
00:38:47.460 Where are the law professors
00:38:48.540 who love
00:38:49.380 to challenge a prime minister
00:38:50.940 when it's Stephen Harper?
00:38:52.360 Where are all the law professors
00:38:53.620 who were there
00:38:54.120 for Omar Khadr?
00:38:55.300 Where are they
00:38:56.040 for ordinary Canadians?
00:38:57.100 Where's everybody?
00:38:59.280 Susan writes:
00:39:00.000 Love this guy.
00:39:00.740 You go, Rocco.
00:39:02.140 I enjoyed talking with him.
00:39:03.840 I don't agree
00:39:04.980 with everything
00:39:05.400 he's done legally,
00:39:06.300 obviously.
00:39:06.700 He even represented
00:39:07.560 one of the Cotter brothers.
00:39:09.620 He sued Stephen Harper a lot.
00:39:11.380 I didn't agree
00:39:12.080 with all of his cases,
00:39:13.340 but you've got to
00:39:14.020 hand it to the guy.
00:39:15.220 He's non-partisan.
00:39:16.760 And he loves
00:39:17.500 to take on the state.
00:39:18.740 You know,
00:39:18.980 these days,
00:39:19.820 I'll call that guy an ally.
00:39:22.640 Michelle writes:
00:39:23.780 Thanks, Mr. Galati
00:39:24.580 and the Rebel.
00:39:25.080 Excellent interview.
00:39:25.860 Everyone in Canada
00:39:26.500 should see this.
00:39:27.540 Maybe this should be
00:39:28.080 made available for everyone.
00:39:29.400 Well, thanks for the idea.
00:39:30.420 In fact, today,
00:39:31.400 we put the whole interview
00:39:33.260 on YouTube for free
00:39:34.840 for the public.
00:39:36.060 I like to have special stuff
00:39:37.920 behind the paywall
00:39:39.200 just for you,
00:39:39.960 our paying subscribers.
00:39:41.860 We've got to give you
00:39:42.520 something special
00:39:43.380 or people won't pay
00:39:45.480 the eight bucks a month
00:39:46.220 that we rely on here.
00:39:47.500 But every once in a while,
00:39:49.000 maybe once a month,
00:39:50.080 we take a special show
00:39:51.200 and put it out there
00:39:51.920 for the world to see.
00:39:52.680 And I got a lot of requests
00:39:54.000 for this one.
00:39:55.100 All right,
00:39:55.360 that's today's show.
00:39:56.240 Until tomorrow,
00:39:56.900 on behalf of all of us here
00:39:58.020 in Rebel World Headquarters,
00:39:59.420 to you at home,
00:40:00.000 good night.
00:40:00.820 Keep fighting for freedom.