Rebel News Podcast - October 11, 2018


UK celebrity gets police visit for using transgender woman’s male name on Twitter


Episode Stats

Length

47 minutes

Words per Minute

163.2

Word Count

7,786

Sentence Count

638

Misogynist Sentences

13

Hate Speech Sentences

22


Summary

A celebrity is visited by police for calling a transgender woman by her original male name. Why should others go to jail when you won't give them an answer? You come here once a year with a sign, and you feel morally superior? The only thing I have to say to the government about why I publish it is because it's my bloody right to do so.


Transcript

00:00:00.000 Tonight, a celebrity is visited by police for calling a transgender woman by his original male name.
00:00:07.500 It's October 11th, and this is The Ezra Levant Show.
00:00:15.500 Why should others go to jail when you're the biggest carbon consumer I know?
00:00:19.300 There's 8,500 customers here, and you won't give them an answer.
00:00:23.020 You come here once a year with a sign, and you feel morally superior.
00:00:26.000 The only thing I have to say to the government about why I publish it is because it's my bloody right to do so.
00:00:36.580 I am not actually obsessed by the United Kingdom. I know sometimes it looks like it.
00:00:40.480 I love Canada. I admire the United States. I enjoy traveling to other countries, too.
00:00:44.640 It's just that since I've been dragged into the UK's dysfunctional politics and media and judiciary,
00:00:50.060 through my involvement with Tommy Robinson, I've come to realize that they are about five years further down the path
00:00:55.080 of political correctness than we are. Five years further, five years worse, five years more politically correct,
00:01:00.540 five years worse in terms of media bias, in terms of Islamification of the establishment, in terms of censorship.
00:01:06.600 Five years worse in terms of politicized policing and courts.
00:01:11.520 Five years further down the road of civilizational unraveling. That's my view.
00:01:16.360 I think they're a cautionary tale, a canary in a coal mine for us here in Canada,
00:01:21.620 the same way that Canada has served that role for the Americans.
00:01:25.620 Let me give you an example today. You might not know the celebrity I mentioned.
00:01:28.380 Maybe he's not even really a celebrity, but I think maybe the analogy here in Canada would be someone like Rick Mercer,
00:01:33.600 though Rick is much nicer than the British guy I'm going to talk about.
00:01:37.400 His name is Graham Linehan.
00:01:38.720 And he's a creative type, co-writer, co-creator of a show called Father Ted.
00:01:43.880 I actually hadn't heard of it, but I watched a bit of it online. It's a comedy.
00:01:48.060 Can I show you a two-minute clip? I know that's a long time, but it's a funny little clip, just so you understand.
00:01:52.880 It's sort of funny, and it's a tiny bit controversial, but not really.
00:01:57.300 I just want you to get a two-minute flavor of the show,
00:02:00.320 so you know who and what I'm talking about for the rest of this monologue.
00:02:03.560 Just watch this quick clip.
00:02:04.600 I am Chinese, if you'll please.
00:02:21.820 Come on, Dougal, lighten up.
00:02:23.760 Ha, ha, ha, ha, ha, ha.
00:02:34.600 Ha, ha, ha, ha, ha, ha, ha.
00:02:43.220 Dougal, there were Chinese people there.
00:02:45.660 All right, yeah.
00:02:49.640 I mean, what, what is, I mean...
00:02:51.760 That's the Yin family. They're living over there in that old Chinatown area.
00:02:56.240 Chinatown there? There's a Chinatown on Craggy Island?
00:02:59.540 Dougal, I wouldn't have done a Chinaman impression
00:03:01.520 if I'd known there was going to be a Chinaman there to see me doing a Chinaman impression.
00:03:05.260 Why not, Ted?
00:03:08.480 Because, because it's racist.
00:03:11.160 They'll think I'm a racist.
00:03:12.880 I'm going to have to catch up with them and explain I'm not a racist.
00:03:20.680 Hello there, Father.
00:03:22.860 Hello, Colm.
00:03:24.700 Out and about?
00:03:26.060 I am.
00:03:26.980 Same as yourself.
00:03:28.820 Good, good.
00:03:29.400 I hear you're a racist now, Father.
00:03:34.300 What, what?
00:03:35.820 How did you get interested in that type of thing?
00:03:38.920 Who said I'm a racist?
00:03:40.340 Everyone's saying it, Father.
00:03:42.220 Should we all be racist now?
00:03:44.500 What's the official line the church is taking on this?
00:03:47.560 Oh, no.
00:03:48.780 Only the farm takes up most of the day
00:03:51.220 and at night I just like a cup of tea.
00:03:54.080 I might not be able to devote myself full time to the old racism.
00:03:58.180 Good for you, Father.
00:03:59.560 What?
00:04:00.460 Oh, Mrs. Comrie.
00:04:02.180 Good for you, Father.
00:04:04.020 Well, someone has the guts to stand up to them at last.
00:04:07.380 Coming over here, taking our jobs and our women.
00:04:10.280 I thought it was sort of funny.
00:04:13.900 I mean, a little bit.
00:04:15.140 Reminds me a little bit of that show Curb Your Enthusiasm with Larry David,
00:04:18.740 you know, the Seinfeld spinoff.
00:04:20.440 Larry David's always the guy saying the wrong things and it just spirals.
00:04:23.800 It's pretty gentle.
00:04:25.200 It's still safe to make a joke about joking about Chinese people.
00:04:31.480 They'd never mock Muslims, of course.
00:04:33.060 I mean, in the U.K. that gets you fired within seconds.
00:04:35.400 But still, it was a teeny bit edgy.
00:04:37.620 This is sort of edgy, probably edgier than the boring anti-Trump pablum served up every week on Saturday Night Live out of New York.
00:04:44.760 Anyway, I say again, the co-creator of this show is named Graham Linehan.
00:04:49.960 Linehan, I don't know how to pronounce it.
00:04:52.320 And as with so many dramatic types, he gets into fights on Twitter.
00:04:56.080 He's a leftist, of course.
00:04:58.120 A social justice warrior, of course.
00:05:00.560 And Twitter can be dangerous for guys like that since it's so instant.
00:05:03.960 Even quicker than an email.
00:05:05.280 And unlike an email, the whole world sees it, not just the person you're emailing to.
00:05:09.180 So Graham Linehan started getting into a quarrel with a transgender activist, someone who was born a man but now says they're a woman.
00:05:20.000 He goes by the name Stephanie Hayden now, which is a girl's name, of course.
00:05:23.900 So they're both leftists, but one's transgender and they're fighting on Twitter, which isn't really fighting, it's words.
00:05:30.800 And Linehan calls the activist by the name he was given by his parents, a male name.
00:05:35.560 I'm not going to get into the quarrel too deeply.
00:05:37.240 And I don't even think I'm going to take sides between them because my point today isn't a gossipy quarrel between two minor celebrities in a faraway place.
00:05:44.860 It's about the police reaction.
00:05:47.140 So I'll skip over the details and I'll get to the end of the story.
00:05:50.580 So Linehan tweeted,
00:05:52.740 My run-in with Stephanie Hayden made the Times.
00:05:56.620 Favorite bit is where Stephanie, Tony, Stephen, explains how it's perfectly legal and normal to have multiple identities,
00:06:03.860 but if you don't call him the female one, you're doing a hate crime.
00:06:07.240 All right, I find that sympathetic.
00:06:09.320 Here's how Hayden responded.
00:06:11.600 What Graham Linehan did was to republish photographs of people associated with me,
00:06:16.280 details of my former male identity,
00:06:18.420 and then continue to defame deadname and misgender me.
00:06:24.000 Transphobia, transgender harassment.
00:06:25.600 Now, deadname, in case you're wondering, is a new and wonderful phrase that means calling a transgender activist by the name that they used to have but have now renounced.
00:06:37.840 It would be like calling Muhammad Ali, Cassius Clay.
00:06:41.760 I don't know.
00:06:42.160 Or calling the singer Bono by the name Paul Hewson.
00:06:45.760 But apparently it's a grave insult.
00:06:47.260 A dead name.
00:06:48.500 Don't you deadname me.
00:06:50.040 Even though it's the legal name carried by that person for decades, the name lovingly chosen for them as a baby by their loving parents.
00:06:59.360 Dad, don't you deadname me.
00:07:02.420 Okay, but still, so what, right?
00:07:04.500 As with so many online quarrels, the real loser is anyone who wastes their time on it.
00:07:08.640 So why am I wasting your time on it?
00:07:11.080 Well, Hayden complained that Linehan was doxing people, that is, revealing private information about them.
00:07:17.660 But he denies it, saying, oh, one thing on this.
00:07:20.720 Tony Stephen Stephanie is claiming I doxed him.
00:07:23.060 And, of course, I did no such thing.
00:07:24.620 Everything I retweeted was already available online.
00:07:27.280 Tony Stephen Stephanie retaliated by going after my wife.
00:07:30.780 So I know you're thinking, why are you boring us with this celebrity quarrel about celebrities who aren't even celebrated over here?
00:07:37.640 We've never heard of them.
00:07:38.340 It's a country far away.
00:07:39.660 Who cares?
00:07:42.580 Well, if that were it, that would be it.
00:07:45.020 I mean, we have quarrels like that in Canada, right?
00:07:47.660 Remember when Professor Jordan Peterson first refused to call someone ze or zir?
00:07:54.240 That caused the big fuss when transgender activists tried to get him fired from U of T over it.
00:08:00.020 I don't think I had heard the word deadname back then, but that's sort of what this was about.
00:08:04.500 And Jordan Peterson was worried about federal legislation exposing him to prosecution for refusing to call someone by a made-up word, ze or zir, instead of the ordinary pronouns, he and she.
00:08:16.100 So we have that same issue here in Canada.
00:08:18.800 But here's what's new or different.
00:08:22.300 In the UK, the police got involved.
00:08:26.220 Real, actual, gun-toting police.
00:08:30.140 Well, scratch that.
00:08:31.200 The police in the UK still are disarmed, so they're not gun-toting, are they?
00:08:35.020 But they have badges and cars, Priuses usually.
00:08:39.840 But still, Father Ted writer Graham Linehan was given a harassment warning by police.
00:08:47.400 This is a story in the BBC, the state broadcaster.
00:08:50.620 And look at how deadpan they are here.
00:08:52.800 The co-creator of Father Ted has been given a verbal warning by police for alleged harassment following a row on Twitter with a transgender woman.
00:09:03.880 Graham Linehan, I don't know if it's Linehan or Linehan, and frankly, I'm glad if I'm getting it wrong.
00:09:09.040 I don't care enough to get it right.
00:09:11.020 Graham Linehan was told by West Yorkshire police not to contact Stephanie Hayden.
00:09:15.940 She reported him for referring to her as a he.
00:09:21.020 Let me just read that one more time in case you didn't hear it.
00:09:24.960 She reported him for referring to her as a he.
00:09:30.320 Just stop right there.
00:09:32.900 911, fire, ambulance, or crime.
00:09:37.660 Who should we dispatch, ma'am?
00:09:39.420 Well, someone called me a he instead of a her.
00:09:47.060 Fire department, ambulance, or police, ma'am.
00:09:49.700 Which one?
00:09:50.440 This is 911, ma'am.
00:09:53.220 All right, put the quote back up.
00:09:54.240 Sorry for the tangent.
00:09:56.220 She reported him for referring to her as a he and for tweeting the names she used before transitioning.
00:10:01.080 Miss Hayden, 45, is now suing the writer.
00:10:07.160 Mr. Linehan, 50, told the BBC he is also considering taking action against her.
00:10:12.440 So the lawyers are at least doing well here.
00:10:14.820 Okay, again, I don't really care about people filing lawsuits against each other.
00:10:18.280 They're really just PR stunts.
00:10:19.680 But the police.
00:10:22.600 He called me a he instead of her officer.
00:10:24.780 Let me read some more.
00:10:25.460 The row comes amid a continuing debate about gender recognition in the UK.
00:10:30.140 Currently, if someone wishes to have their gender identity legally recognized,
00:10:35.600 they have to apply for a gender recognition certificate.
00:10:39.900 Oi, bruv, where's your gender recognition license, eh?
00:10:44.860 Which requires a medical diagnosis.
00:10:47.020 Some argue the medical element should be removed and say there should be a system of gender recognition based on self-identification.
00:10:54.900 But not all agree, with some gender-critical people believing the system could be abused by predatory men.
00:11:01.940 Ha, so you're gender-critical, eh?
00:11:03.840 You're one of them, eh?
00:11:04.780 You bigot.
00:11:05.860 You gender-critical bigot.
00:11:07.960 I got my license here.
00:11:10.300 That's the UK, people.
00:11:11.620 You see how weird it is?
00:11:12.820 It ain't the land of Thatcher or Churchill or Shakespeare or the Queen anymore.
00:11:18.480 So, that last point there about being gender-critical, that's actually a real issue.
00:11:23.760 I mean, we've seen cases of sexual predators, males, simply declaring themselves to be women
00:11:29.500 and then being put into women's prison and obviously continuing to commit sexual assaults in there against the women prisoners.
00:11:36.160 It's crazy.
00:11:37.120 That's come to Canada, too, by the way.
00:11:38.460 We've shown you this before.
00:11:39.340 Remember this story earlier this year?
00:11:40.880 Again, it's from the Globe and Mail.
00:11:43.940 The federal prison system is changing the way it treats transgender inmates who will now be placed in a men's or women's facility
00:11:52.440 based on how they self-identify.
00:11:55.800 That's the Canadian Globe and Mail I just quoted from there.
00:11:58.600 So, self-identify means just saying so.
00:12:01.500 No surgery, no hormones, no makeup or wigs.
00:12:03.860 Don't even have to shave.
00:12:04.780 You've got a big bushy beard, no problem.
00:12:06.500 Just say you're a gal.
00:12:08.020 Poof, I'm a woman.
00:12:08.900 Is that any crazier, or any less crazy, than this?
00:12:12.460 Poof, I'm the king of Spain today.
00:12:14.000 Call me your highness.
00:12:15.120 Poof, I'm a robot.
00:12:16.780 Because I self-identify as one.
00:12:18.400 But one of those three things actually gets you transferred from a male prison, which is not a very nice place,
00:12:24.340 to a female prison, which is a lot nicer.
00:12:27.580 Plus, you're surrounded by girls.
00:12:30.340 So there is a debate there, even for the gender critical.
00:12:33.340 That's for sure.
00:12:33.840 But again, that's not what we are talking about here today.
00:12:36.740 We are talking about the police.
00:12:39.280 Back to the BBC story.
00:12:41.180 Not about free speech.
00:12:42.580 Miss Hayden, who is pursuing civil proceedings accusing him of harassment, defamation, and misuse of private information,
00:12:48.980 told the BBC,
00:12:49.440 See, I spoke to the police for 45 minutes about how I wanted to go forward.
00:12:54.220 I didn't think he was a physical threat.
00:12:56.540 But thought if the police spoke to him and advised him with a warning,
00:13:00.600 he would possibly realize the hurt he had caused.
00:13:04.280 The point I want to get across is this isn't about free speech.
00:13:07.580 This is about harassment.
00:13:08.540 Is that how it works?
00:13:13.380 You're an exquisitely left-wing activist.
00:13:16.600 So you can get the police to give you attention for a Twitter spat.
00:13:22.880 Attention that they wouldn't give to, say, a stabbing victim or an acid attack victim.
00:13:29.520 And they'll sit there and listen to you blah, blah, blah for 45 minutes.
00:13:36.420 Because you're famous.
00:13:39.600 Sort of.
00:13:41.000 So the police come and they listen to you.
00:13:43.660 And instead of saying,
00:13:45.360 There, there now.
00:13:46.660 It's okay now.
00:13:49.320 They actually go to your Twitter sparring partner and tell him to shut up.
00:13:53.380 To warn him.
00:13:55.000 And all you have to say is,
00:13:56.380 This isn't about free speech.
00:13:58.000 And then, poof, it's not about free speech.
00:13:59.680 Just like you're the king of Spain.
00:14:01.760 Here's some more from the Daily Mirror that also covered it.
00:14:04.160 Here's some new facts.
00:14:04.980 He has been handed a verbal harassment warning, reports ITV.
00:14:10.900 Mr. Linehan, 50, said on Saturday,
00:14:13.640 The police asked me to stop contacting someone I had no intention of contacting.
00:14:18.660 It was a bit like asking me to never contact Charlie Sheen.
00:14:22.120 And this little detail.
00:14:24.760 Well, she said,
00:14:26.920 I don't take kindly to a public figure tweeting about me, referring to me as a man and putting my legal name in quotation marks to suggest it's not valid.
00:14:35.580 And the police warning issued to Mr. Linehan is not a conviction or caution, but a warning used to deter individuals from further behavior.
00:14:47.580 Oh, so that's how it works.
00:14:48.940 So Linehan hasn't done anything wrong.
00:14:51.420 Nothing to be charged with, let alone convicted.
00:14:53.900 He hasn't done anything wrong other than to call him a him instead of a her.
00:14:57.920 You'll note the Mirror and the BBC are not foolish enough to commit that gender critical thought crime.
00:15:03.300 They both absolutely uniformly call her a her.
00:15:07.380 But the police are happy to go warn him.
00:15:11.740 He hasn't done anything.
00:15:12.700 He says he won't do anything other than maybe say some mean things.
00:15:16.740 But the cops are going to send him a message, rattle his cage a bit.
00:15:23.840 Oh, and it does go on his police file.
00:15:25.320 I didn't mention that part.
00:15:27.580 They do, however, appear on enhanced criminal record checks.
00:15:32.180 Oh, so that really is a form of a criminal record.
00:15:38.880 It's a record, and it's on there.
00:15:43.360 Not as a crime committed, but as a pre-crime.
00:15:46.300 Then maybe you're thinking of doing something, man.
00:15:48.900 Where's your license to be gender critical, mate?
00:15:51.220 Bruv, where's your license?
00:15:54.820 That's the UK in 2018.
00:15:57.300 Makes sense, though, because look at what the UK was like in 2017.
00:16:00.940 Look at this tweet from the Metropolitan Police.
00:16:05.260 We have 900-plus specialist officers across London dedicated to investigating all hate crime.
00:16:15.560 That's London's police.
00:16:17.580 900-plus police.
00:16:21.240 Just on the hate crimes beat.
00:16:23.220 900.
00:16:25.680 That's bigger than some countries' armies, I think.
00:16:28.820 900 police officers.
00:16:34.760 And who knows how many support staff.
00:16:36.480 There must be thousands.
00:16:39.040 Not to attack the crime wave of knife stabbings or acid attacks.
00:16:43.020 That's a thing in London.
00:16:44.700 People drive around on mopeds and hurl acid in people's faces.
00:16:47.480 That's a thing in London.
00:16:49.360 But you've got 900 officers going after hurt feelings.
00:16:52.760 Here's a picture of women in London wearing a niqab, standing behind a cop, trying to hurt her feelings.
00:17:01.420 I bet those niqabed women have strong views on transgenderism, too.
00:17:06.980 Just a hunch I have.
00:17:08.820 But police don't have time to look into that.
00:17:11.440 But they spend time with Linehan, Linehan.
00:17:16.160 They have to keep busy, these hate officers.
00:17:19.500 They can't spend all day in front of a computer screen, creeping through other people's Facebook photos, looking for bad jokes.
00:17:27.540 Got to get out and stretch your legs.
00:17:29.320 Police visiting you to tell you not to be mean on Twitter.
00:17:34.620 And mean is calling a transgender man to woman a man.
00:17:38.160 And that goes on your record forever.
00:17:41.940 Warning you not to say things that hurt feelings.
00:17:44.780 And the national media, I've read to you from the Mirror and the BBC, it's the same with all the papers.
00:17:49.880 They're just fine with this.
00:17:51.120 They're just reporting it, straight face.
00:17:52.860 It's no big deal to them.
00:17:54.060 Totally fine.
00:17:56.780 That's the UK in 2018.
00:17:58.140 I said Canada's five years behind the UK.
00:18:00.440 That implies we'll be that bad in the year 2023.
00:18:03.860 Nah.
00:18:04.560 I think it's more likely we'll be like that in six months.
00:18:10.760 What do you think?
00:18:12.820 Stay with us for more.
00:18:28.140 Welcome back.
00:18:31.020 Well, there have been incredible internal documents in recent months leaked from the Silicon Valley Titans.
00:18:39.600 For example, that Google staff meeting right after the 2016 presidential election, where it was basically a group therapy session.
00:18:49.180 Different executives practically crying about the results of Donald Trump's win and swearing to use all their power to stop it.
00:18:56.540 Well, now comes another leak, a huge scoop delivered to the leading media outlet that is concerned about the political bias and censorship of Silicon Valley.
00:19:09.280 I'm talking about our friend, Allum Bokhari at Breitbart.com.
00:19:13.340 And he has received a massive internal memo about what Google calls good censorship.
00:19:23.700 Here's the cover story for Breitbart, the good censor.
00:19:27.780 Google admits concerns about political neutrality are now mainstreaming.
00:19:32.400 And joining us now via Skype from Washington, D.C., is Allum Bokhari.
00:19:37.020 Allum, first of all, congratulations.
00:19:38.420 You keep on breaking these huge stories.
00:19:40.600 I think it's because you're really the journalist who most consistently reports on bias and censorship in Silicon Valley.
00:19:48.520 So congratulations to you.
00:19:50.540 Thank you, Ezra.
00:19:51.140 Yeah, we've been covering this topic for over a year now, and the news stories just keep on coming.
00:19:58.780 I mean, it's clear that Silicon Valley as a whole has just completely abandoned their early commitment to free speech.
00:20:07.680 And that's actually something that they admit quite clearly in this new document that was leaked to us.
00:20:12.520 They say that initially these companies promised free speech to their users, and then they gradually shifted towards censorship.
00:20:18.460 That's the first time we've seen them admitting it.
00:20:21.020 It's obviously something they've never admitted publicly, but they do admit it privately.
00:20:25.080 Yeah, I think they're self-aware, Allum.
00:20:27.700 They're aware that they're no longer what they once were.
00:20:31.100 I mean, I remember when Google was new, they had this trite motto, don't be evil, which people would say, well, of course, don't be evil.
00:20:38.120 But here we are in 2018, and this same Google is saying it will not work with the U.S. Pentagon for ethical reasons,
00:20:46.960 but it is deeply working with China on this kind of social censorship where your Internet account is tied to your phone,
00:20:56.520 is tied to your personal info.
00:20:58.280 And it's not just deep censorship.
00:21:00.460 It's really real-time tracking of every single human in China.
00:21:04.640 So that's cool by Google.
00:21:06.560 Working for the Pentagon is not.
00:21:08.720 And I think some of these Chinese tactics that they're learning are being imported to America and around the world.
00:21:13.480 What do you make of that?
00:21:15.400 Well, one of the striking parts of the document was where they talk about the censorship requests they receive from foreign governments.
00:21:22.160 And they show a massive, massive, according to their own internal research, they show a massive spike after 2016.
00:21:30.160 So certainly there's a lot more pressure on these tech companies now from state governments to censor their platforms.
00:21:36.980 But Google doesn't say it's a bad thing.
00:21:39.540 In fact, later on in the document, they say that if Google and other tech companies, if they want to expand globally, continue global expansion,
00:21:48.760 they do have to shift towards censorship.
00:21:51.420 And, you know, Google told us when we asked them for comment, they told us that this document doesn't reflect company policy.
00:21:58.600 It's just research.
00:21:59.840 But in the case of China, they're developing this new censored search engine, Dragonfly, they call it,
00:22:04.600 that's going to have blacklisted search terms that the Chinese communists don't like.
00:22:09.220 It's going to tie users' searches to their phone numbers.
00:22:13.900 So clearly they are following the recommendations of this briefing in that particular instance.
00:22:19.420 Yeah.
00:22:22.240 It's incredibly tempting for Google, for Facebook, for Twitter to comply with the Chinese government,
00:22:29.260 because it's not just getting in to the world's most populous market.
00:22:34.600 It's getting in ahead of or instead of your competitors.
00:22:39.460 I mean, Google and Facebook are doing enormously well in Europe, in America, around the world.
00:22:44.220 But it's not just about getting into China.
00:22:46.060 It's about getting into China and keeping out your rivals.
00:22:49.100 So it's almost a race to see who will be the most submissive.
00:22:53.340 And if you, you know, if the Chinese Communist Party comes with 20 demands and Facebook will meet 10 of them,
00:23:01.480 but you'll meet 15, well, it's not just that you're both getting in.
00:23:05.320 You'll be the only one in that green field.
00:23:08.940 It'll be like, it's like a gold rush.
00:23:11.540 And you'll be the first to be able to stake it out.
00:23:13.540 I think that's what the motivation is here.
00:23:16.400 It's money and power.
00:23:18.460 And anyone who thinks that these tech billionaires aren't motivated by those, I think, is deluded.
00:23:24.260 What do you think?
00:23:26.080 I agree.
00:23:27.400 And you've certainly got their incentives down there.
00:23:31.120 But it's quite terrifying to imagine the merging of state power with the power of these tech companies.
00:23:37.260 These companies know everything about you, everything you search for, everything you email.
00:23:41.580 They monitor your emails, even.
00:23:44.340 They have unprecedented control of the information we see.
00:23:48.640 And for them to work with authoritarian regimes like this, it should be very, very concerning to everyone.
00:23:54.260 Because this is a power that, you know, authoritarian regimes of the past, like the Soviet Union, couldn't even have imagined.
00:24:00.360 So it would be an entirely new kind of all-seeing, all-knowing totalitarianism if Google were to give in to too many requests by a government like China's.
00:24:12.740 You know, for the Chinese government itself to try to spy on all of its 1.3 billion people would require an enormous technological and resource effort that's probably outside their power.
00:24:25.620 And, I mean, they would certainly try, but to have full, real-time surveillance of every single Chinese person, knowing everything they say, everything they look at, everywhere they go.
00:24:35.820 I mean, the GPS in a phone, for example.
00:24:38.240 The locations.
00:24:39.160 On your own cell phone, there's something called system services or location services that tracks where you go.
00:24:45.680 And there's a useful benefit to that.
00:24:47.740 It can tell you the traffic on your way to work or the closest, you know, gas station or whatever.
00:24:52.760 But for the communists of China to try and plant that on you would be impossible.
00:25:00.480 But if they can just get Google, Facebook, Apple, whatever, to agree to let them have access to the GPS on your phone, to let them have access to what you write on Gmail,
00:25:13.660 China itself doesn't have to be the highest tech company in the world because it just gets Sergey Brin or Mark Zuckerberg to do that for them.
00:25:23.600 So it's, what was it, was it Lenin who said that the capitalists would sell the Soviet Union the rope by which they would be hanged?
00:25:32.720 I mean, is that, I'm trying to grok this.
00:25:35.780 I'm trying to grasp this.
00:25:37.360 I think it's Facebook and Google saying, yeah, we will be your secret police for you and you don't even have to pay for it.
00:25:46.480 Yeah, I'm not sure.
00:25:47.580 I think Facebook is still banned from China.
00:25:49.120 I'll have to check that.
00:25:50.040 But certainly Google is playing ball.
00:25:52.460 And, you know, it's not just China.
00:25:53.600 It's also Europe.
00:25:54.700 Europe, you know, they've had these hate speech.
00:25:56.860 And Canada, actually.
00:25:57.580 Europe and Canada have had these hate speech laws on the books for decades now.
00:26:02.400 And they're totally ridiculous.
00:26:03.900 And we've seen people arrested for doing journalism in the case of Tommy Robinson.
00:26:07.600 We've seen people arrested for jokes in the case of the YouTube account Dankula.
00:26:13.240 And at least in the past, you know, these governments wouldn't be able to see everything you say.
00:26:18.260 But as we move into the digital era, that is exactly what they can do.
00:26:22.200 And for companies like Facebook and Google to be working with them is very troubling to me.
00:26:26.240 Yeah.
00:26:26.900 Well, I mean, remember that IBM did computer work for the Nazis, of course.
00:26:32.700 And there's no shortage of companies willing to do business in the theocracy called Saudi Arabia or who want to get back into Iran or happy to sell things to Venezuela.
00:26:45.540 So there's never been a shortage of people willing to deal with dictatorships.
00:26:48.700 I want to go into this Google document.
00:26:51.580 We've been talking around it.
00:26:52.660 And thank you for letting me bounce some of my own thinking off you.
00:26:56.240 But I'd like to focus on three pages.
00:26:58.700 It's a 41-page document, isn't it?
00:27:01.540 Do I have that right?
00:27:04.140 Pardon me?
00:27:05.540 85, in fact.
00:27:06.440 85.
00:27:06.900 I'm sorry.
00:27:07.380 I did look through the whole thing on your website.
00:27:10.200 And I found it was very interesting.
00:27:11.800 I recommend everyone go to Breitbart.com and look through the primary document itself.
00:27:16.860 But there were three slides that I thought were interesting from this internal Google document.
00:27:21.720 And maybe I can ask you to expand on them.
00:27:24.080 The first one is about a section of a U.S. law called Section 230.
00:27:29.580 I'm going to read a little bit of this slide.
00:27:31.260 And then, Allum, can you explain to our viewers what it means?
00:27:33.840 This slide from this internal Google document says,
00:27:36.400 An important U.S. federal statute from 1996 supports this position of neutrality.
00:27:42.720 Under Section 230 of the Communications Decency Act, tech firms have legal immunity from the
00:27:50.080 majority of the content posted on their platforms, unlike traditional media publications.
00:27:55.460 This protection has empowered YouTube, Facebook, Twitter, and Reddit to create spaces for free
00:28:00.920 speech without the fear of legal action or its financial consequences.
00:28:05.660 And then they quote a journalist saying,
00:28:08.860 It's hard to say what the global internet would look like if Section 230 had never become the law of the land.
00:28:14.860 Would YouTube have even been possible?
00:28:17.620 Can you speak to the importance of Section 230 of the Communications Decency Act
00:28:21.480 and how these companies might be putting its protection at risk by becoming censors?
00:28:27.120 So Section 230 is, as the briefing admits, totally critical to the business model of Google and Facebook
00:28:34.720 and basically every social media platform.
00:28:37.220 Because it says that these platforms can't be held legally responsible for content posted by their users.
00:28:44.600 Because if they were held legally responsible, then they'd be liable, they'd be facing lawsuits for every piece of defamation
00:28:51.540 posted on Twitter, you know, millions and millions of posts every single day, that would be impossible to deal with.
00:28:57.260 You know, every time Google puts a result at the top of its search results that
00:29:05.180 accuses you or me of peddling hate speech, they'd be liable for that comment.
00:29:09.140 They'd have to defend it in court.
00:29:10.660 So if these tech companies were ever to lose the protection of that law, they'd be in a lot of trouble.
00:29:17.900 Their business model would be in serious jeopardy.
00:29:20.780 And it's contingent on them behaving as platforms rather than publishers.
00:29:27.620 Because publishers, as we know, they are liable for defamation.
00:29:30.500 They are liable for libel.
00:29:32.020 So as these companies move towards becoming censors, making editorial decisions about what should go up in their algorithm,
00:29:42.800 what should go down, editorial decisions about who should be allowed on, who should not be allowed on, who gets priority,
00:29:48.980 then they become more like publishers.
00:29:50.420 And actually later on in the document, they admit that, that their new role as censors risks categorizing them as publishers,
00:29:57.080 which legally is a huge problem for them.
00:29:58.980 Yeah, I mean, an analogy, I mean, I've used the analogy before of someone who just owns a stage
00:30:04.580 and allows any actor to come and say anything on the stage without discrimination.
00:30:09.920 That's a platform.
00:30:11.400 A publisher would be the director of the play who chooses what's said.
00:30:14.800 Or think about it another way, a bulletin board on the sidewalk of a street
00:30:19.600 that anyone can pin a poster or a note to.
00:30:24.280 You're not going to sue the bulletin board owner for something that's tacked to it,
00:30:31.920 unless they start saying, well, we will now decide what is or isn't on it.
00:30:35.940 And that changes the...
00:30:36.940 Yeah, because then they become responsible.
00:30:38.240 Pardon me?
00:30:39.380 Because then they become responsible.
00:30:40.940 If the bulletin board owner starts saying, I'm making decisions about what goes on here and what doesn't go on here,
00:30:46.300 then suddenly he's responsible.
00:30:47.800 He's taking an active role in it.
00:30:50.120 It's my view that social media companies are breaking that rule.
00:30:53.680 Let's move on.
00:30:54.180 There's another graphic I want to show you, which is perhaps the most terrifying.
00:30:57.980 It's called, Why the Shift Towards Censorship?
00:31:02.540 And again, I encourage everyone to go to the actual Breitbart.com website to see this 80-plus page slide deck from Google.
00:31:09.900 They acknowledge that the Internet used to be freer, but they chart the shift towards censorship.
00:31:15.900 And you can see on the right, they call it creating well-ordered spaces for safety and civility.
00:31:25.840 Well, that doesn't sound like the Internet I'm familiar with and love.
00:31:28.800 And here's some of the four values they talk about, or the demands.
00:31:33.700 Appease users, maintain platform loyalty, respond to regulatory demands, maintain global expansion, monetize content through its organization, increase revenues, protect advertisers from controversial content, increase revenues.
00:31:48.980 So you can see it's about money and global expansion.
00:31:53.640 This is their own document.
00:31:55.200 This is why they're shifting towards censorship.
00:31:58.840 And let me just read a few from the left.
00:32:00.180 But governments were unhappy to cede power to corporations.
00:32:05.380 It's impossible to neutrally promote content and info.
00:32:09.000 Those were some of the old ideas.
00:32:11.260 The new ideas are censor.
00:32:14.380 That's the way the arrows are pointing, isn't it, Allum?
00:32:17.000 Yeah, and it's very cynical and hard-headed how they describe it.
00:32:20.380 This is all about getting access to new markets.
00:32:22.520 It's all about increasing revenues by satisfying advertisers who are eager to avoid controversy.
00:32:27.940 So there's no clear principle behind this.
00:32:30.860 You know, don't be evil has gone out the window, clearly.
00:32:33.020 It's all about the dollar and maximizing profits and getting access to those new markets.
00:32:38.860 That doesn't mean a political bias isn't involved in this document as well.
00:32:41.900 You can see it, you know, elsewhere.
00:32:43.860 But this lays out why they're doing it, really.
00:32:46.740 Yeah, there's one more slide, and I think it's quite a compliment to you personally, which
00:32:51.820 I thought was interesting, and I'm sure you had a quiet glow of pride.
00:32:55.780 I refer to the slide about where free speech is now championed.
00:33:01.460 And let me just read this slide.
00:33:02.620 It says,
00:33:02.820 Being critical of big tech censorship powers was once a niche stance coming mostly from
00:33:08.580 those on the right, but now concern about big tech's abandonment of neutrality has gone
00:33:13.020 mainstream.
00:33:14.260 And on another slide, they show that the number one critic and journalist looking
00:33:21.600 at these censorship issues is Breitbart.com.
00:33:26.600 And you, of course, are their senior technology correspondent who's been on this file.
00:33:30.460 In fact, on this page, you can see they have one of your headlines, Google doubles down on
00:33:34.260 purging conservative speech.
00:33:36.100 They also refer to the Wall Street Journal and the Spectator and frontpage.com.
00:33:41.800 But they have another chart on the next page, actually, that shows that Breitbart, by far,
00:33:46.580 is the champion here.
00:33:47.740 I think you, and I've said this before, that you're the leading journalist in the world that
00:33:51.940 I know of on the beat of Silicon Valley's bias and censorship.
00:33:55.560 We've shown some of the most worrisome things here.
00:34:01.180 I have to be candid and say there was some self-awareness in this document that they're
00:34:07.240 not the do-no-evil, you know, Boy Scout of the past.
00:34:13.880 There were some comments in this document that showed there was a little flicker of self-awareness
00:34:20.320 and concern.
00:34:22.360 But I don't think that's the direction of this whole thing, where it's going, is it?
00:34:27.760 No, and certainly not.
00:34:29.020 I do believe that whoever wrote it is being exceptionally honest in this document.
00:34:35.620 You almost feel for the author.
00:34:38.440 That's one of the reasons why I didn't include his name in it.
00:34:40.760 But this is a document that's intended to be read by Google internally, possibly by Google
00:34:46.860 executives.
00:34:47.980 So it's not going to pull any punches or beat around the bush.
00:34:51.720 It's going to be very direct and forward about what's happening.
00:34:56.240 So, yeah, there's no spin in this document.
00:34:59.060 It just lays out the facts of where things are headed.
00:35:01.700 And it shows, really, that there's a massive gap now between what Google says publicly and
00:35:06.020 what they admit privately.
00:35:07.160 This document wasn't meant to go public.
00:35:09.920 It did.
00:35:10.940 And it shows us what the company is thinking internally.
00:35:14.440 And it's not good for consumers.
00:35:16.540 Yeah.
00:35:17.120 Well, it's very concerning.
00:35:19.040 And I say, now that I'm older, much older than you, Allum, I'm probably double your age.
00:35:24.880 When you get to be my age, Allum, you will learn the meaning of the phrase, I told you so.
00:35:29.720 When you're young, you love saying, I told you so, because it proves you were right.
00:35:33.100 When you're my age, Allum, you hate saying, I told you so, because it means you failed
00:35:38.220 in your warning.
00:35:39.640 You're a Cassandra from mythology.
00:35:42.060 You are shouting a warning and no one listened and your pessimistic predictions came true.
00:35:47.860 That's what I told you so means when you're my age, Allum.
00:35:50.360 And I see this document as a giant, I told you so.
00:35:54.220 I mean, we've been railing on about internet censorship for about two years here.
00:35:57.780 You've dominated the beat.
00:36:00.660 And it is not fun to say we were right.
00:36:03.140 The censors are coming.
00:36:04.380 But I think it is incontrovertible that every foul prediction that we've made has unfortunately
00:36:11.040 come true.
00:36:12.300 Would you agree with that?
00:36:13.420 Or is there any reason for optimism?
00:36:15.400 I would agree with that.
00:36:18.000 I guess the optimistic thing about this document is that they do acknowledge that they at least
00:36:22.180 have to value free speech a little bit, even if they're moving away from it.
00:36:27.060 But, you know, anyone who says social media censorship is a conspiracy theory,
00:36:32.360 they can't say that anymore.
00:36:34.220 You know, it's out in the open now.
00:36:35.560 You've got Silicon Valley itself admitting to it.
00:36:38.660 So it can't be denied any longer.
00:36:40.480 The question is what's going to happen and who's going to stand up for consumers here,
00:36:43.520 for ordinary users whose trust has been abused.
00:36:46.760 Yeah.
00:36:47.220 Well, very interesting.
00:36:48.540 And I thank you so much for the time you spent with us today.
00:36:51.340 And over the years, frankly, you have been my personal sherpa, my guide, as we get to
00:36:57.900 know these issues.
00:36:59.020 And I hope we continue to stay in touch as this story evolves.
00:37:02.940 Allum, thanks so much for your time.
00:37:04.840 Thanks, Ezra.
00:37:05.520 All right.
00:37:05.860 There you have it.
00:37:06.340 Allum Bokhari, the senior tech writer at Breitbart.com, who is not mentioned by name in
00:37:11.460 the document, but his work is certainly cited.
00:37:14.860 So that in itself is hopeful, isn't it?
00:37:17.220 That Allum is not just writing, not just shouting into the wind, as sometimes we feel
00:37:22.080 like we are, but inside Google, they are aware of his criticisms.
00:37:26.700 And hopefully, hopefully, it will make a difference on the inside.
00:37:29.900 Stay with us.
00:37:30.740 More ahead on the way.
00:37:31.500 Hey, welcome back on my monologue yesterday about a British soldier being discharged from
00:37:46.700 the British Army for taking a picture with Tommy Robinson.
00:37:50.180 Tammy writes, this is definitely an example of the military submitting to Islamists.
00:37:55.020 Submission is demanded by Islam.
00:37:56.420 England should stand up to this nonsense.
00:37:58.060 I was watching one of Tommy's videos.
00:38:02.840 He said, is there a list of people you can't take a photo with?
00:38:06.840 I mean, maybe we should circulate the list.
00:38:08.840 Listen, I understand you don't want people in uniform to be partisan.
00:38:14.440 I think that's a good rule.
00:38:15.760 You don't want people to say, oh, that's a liberal police officer or that's a conservative
00:38:21.020 police officer.
00:38:21.980 A police officer has to be uniform.
00:38:23.720 That's what the word uniform means, right?
00:38:25.660 It's one form.
00:38:26.680 It's not a liberal cop and a Tory cop.
00:38:29.440 You're just cops.
00:38:30.640 Same thing with the military.
00:38:31.800 I agree with that.
00:38:33.000 You shouldn't be door knocking for a candidate in your kit.
00:38:37.600 But I'm sorry, you're at a roadside truck stop and you see an internet celebrity, which
00:38:42.000 is what Tommy is to these kids. The kid who's being discharged, he's 17.
00:38:45.720 He's 17.
00:38:46.420 He's not a deeply scholarly, historical, political guy.
00:38:50.040 To him, Tommy Robinson is just a pro-military guy who's on YouTube.
00:38:53.380 Everyone's excited.
00:38:54.100 I'll take a selfie with him.
00:38:54.920 That's it.
00:38:55.640 He's a 17-year-old kid.
00:38:57.280 His career is being destroyed.
00:38:58.820 He's being drummed out of the army because he took a selfie with Tommy.
00:39:01.120 I'm sorry, taking a selfie at a truck stop and posting it on your private Facebook page
00:39:05.700 is not corrupting the non-partisanship of the British army.
00:39:10.440 I'm sorry, it's not.
00:39:11.200 Linda from Britain writes, every mainstream media outlet describes Tommy Robinson as being far
00:39:16.740 right and holding extremist views.
00:39:18.820 However, they can point to no actual evidence to back these claims up.
00:39:21.920 He holds the same views as millions of people around the world.
00:39:24.420 And we are sick of being called these idiotic names by race baiters simply because we have
00:39:27.960 common sense.
00:39:28.580 Well, that's the thing.
00:39:30.660 And, I mean, he is critical of the religious and political doctrine called Islam.
00:39:38.440 I don't think that makes him far right.
00:39:40.300 In fact, I told you I've been to the UK too many times, mainly in relation to Tommy.
00:39:44.720 And I've been to a lot of Labour towns, northern towns, spent a lot of time in a smallish city
00:39:51.920 called Sunderland, which is near Newcastle upon Tyne.
00:39:55.620 I never in my life thought I'd be going to these places, by the way.
00:39:59.760 The one thing they have in common, Manchester, London, Luton,
00:40:08.200 is that they're run by Labour politicians.
00:40:11.400 Sunderland is completely Labour.
00:40:13.880 So how is it that thousands of people in Sunderland, Labour down the line,
00:40:21.920 are supporting Tommy Robinson?
00:40:25.120 Are they far right Jeremy Corbyn voters?
00:40:28.940 No, I don't think you can get away with just saying far right.
00:40:32.360 These are people worried about their lives.
00:40:37.060 The 1,400 girls who were systematically raped in Rotherham, hard left-wing city.
00:40:45.180 It's not a right-wing thing to be upset that your daughter, your sister was raped.
00:40:50.160 That's not right-wing.
00:40:51.960 That's not far right.
00:40:54.160 In fact, most of the victims are working class.
00:40:56.440 I think you would traditionally call them Labour.
00:40:58.660 It's just a name.
00:41:00.500 The problem with throwing that word around is that when you actually want to label someone
00:41:06.280 far right, far right, the word will be meaningless.
00:41:09.300 It's like a sharp knife dulled by profane use.
00:41:13.220 And when you need it to be sharp, it ain't sharp anymore.
00:41:17.080 And that's why we try to keep certain words the opposite of profane.
00:41:21.820 We don't want to dull them with overuse.
00:41:24.320 To call someone a Nazi who is not a Nazi takes away the special meaning of that word.
00:41:30.160 I don't call people communists a lot.
00:41:33.740 Do you notice that?
00:41:35.520 I don't say he's a commie, she's a commie, because to me, communism has a special meaning.
00:41:39.900 It's a deep sort of evil.
00:41:41.800 And I don't say it unless someone self-identifies or they truly are communists, because I don't
00:41:46.100 want to profane the meaning of that term by overuse.
00:41:50.520 The promiscuity with which the left calls people far right and racist is taking away
00:41:57.500 the meaning of those words.
00:42:00.440 And I think that's a shame, because we actually do need to call out true racists.
00:42:05.020 And if you're calling Tommy Robinson, whose best friends are black and Sikh, you're calling
00:42:08.760 him racist, you're really destroying the meaning of the word.
00:42:11.960 Another letter, Steve writes, I wonder when a Muslim will be stationed in Westminster to
00:42:16.680 keep an eye on Queen Elizabeth.
00:42:18.120 Well, I was at the Sun News Network and I interviewed Anjem Choudary, who's now in prison for supporting
00:42:24.540 ISIS, and he wants the Queen to wear a hijab.
00:42:27.720 He wants her to submit, and he wants, of course, Westminster Palace to be turned into a mosque.
00:42:35.560 Keith writes, caution, Ezra, that petition could get you barred from the UK.
00:42:43.080 They barred Geert Wilders and Robert Spencer for less.
00:42:43.080 You're talking about the petition in support of these British soldiers.
00:42:48.720 Yeah, you know, it could be.
00:42:50.340 Every time I go over to the UK, I cringe for a moment.
00:42:52.980 I have what's called a registered traveler or trusted traveler status.
00:42:57.380 You know, in Canada, it's called Nexus.
00:42:58.940 You get fingerprinted in Canada.
00:43:00.740 In the UK, it's a similar status.
00:43:02.420 So they whisk me through pretty quickly.
00:43:04.680 It's amazing.
00:43:05.820 But every time I sort of tense up, I think, oh, is this the time they're going to say,
00:43:11.320 oi, mate, where's your gender critical license?
00:43:13.880 Mate, where's your license to, oh, you like Tommy Robinson, where's your license, mate?
00:43:19.560 On my interview with Lee Humphrey on the ISIS terrorist who really wants to return to Canada
00:43:24.240 because he's tired, Lance writes, I've been saying for years these guys should simply be held prisoner.
00:43:30.420 There is a command structure, they carry their weapons openly, and they have a uniform,
00:43:33.640 a beard, close enough.
00:43:35.820 No, I think you might have misunderstood my point about the Geneva Conventions.
00:43:39.660 And let me take a minute on this.
00:43:42.700 The Geneva Conventions were a great civilizational step forward.
00:43:47.960 In the past, cruel and barbaric armies, when they captured soldiers in war, would just kill them.
00:43:55.080 Just kill them all.
00:44:01.700 Other armies, for pragmatic reasons as much as moral reasons,
00:44:05.540 held captured soldiers for ransom.
00:44:12.580 If you came and paid gold or whatever wealth, you could redeem your captive soldiers.
00:44:16.260 So there was actually an economic incentive not to be barbaric.
00:44:19.900 You could sell the prisoners back to the other side.
00:44:26.080 So the Geneva Conventions were a great moral step forward in the laws of war.
00:44:26.080 And you might say, well, laws of war don't mean anything.
00:44:28.600 War is the absence of law, isn't it?
00:44:31.060 It's the replacement of law by brute force, the failure of law.
00:44:35.840 And there's truth to that.
00:44:37.140 But civilized countries can make war in a lawful way.
00:44:40.640 And the Geneva Conventions were an attempt to do that.
00:44:42.640 Now, of course, in the Second World War, I mean, Japan was barbaric in its treatment of captured soldiers,
00:44:49.880 including British soldiers in Hong Kong and Burma and places like that.
00:44:53.980 But not us.
00:44:55.420 I mean, in Canada, even where I come from, Alberta, there were thousands of Wehrmacht soldiers
00:45:01.120 sent by ship from Europe to Canada to camps. I'm not going to say they were nice,
00:45:06.160 but they were not penitentiaries.
00:45:07.280 They were not punishments.
00:45:08.720 They were just held.
00:45:09.860 These thousands of German soldiers, conscripts, they were not put on trial for the crime of being a soldier
00:45:16.060 because it's not a crime to be a foreign soldier.
00:45:18.900 That's not a crime.
00:45:20.500 We just kept them off the battlefield until the Nazis surrendered.
00:45:23.820 And then they were let free.
00:45:24.820 And I think I might have told you the story one day about one young Hitlerjugend who was pressed into service.
00:45:31.120 I think he was 16.
00:45:33.060 In the final moments of the war, they were just literally sending kids into battle.
00:45:37.400 Well, he told me he surrendered the first moment he had contact with the Allies.
00:45:41.380 They shipped him to Alberta and he liked it so much he stayed.
00:45:44.620 I won't tell you his name, but he tells me the story.
00:45:47.320 So he told me the story about his denazification exam.
00:45:50.080 They gave these people exams before they were allowed to stay in Canada.
00:45:53.800 Isn't that interesting?
00:45:54.620 It's a tangent.
00:45:55.100 I don't think that terrorists deserve the treatment of the Geneva Convention.
00:46:04.740 And I know in law that they don't.
00:46:06.900 They don't meet the three tests, bear their weapons openly, be part of a chain of command,
00:46:10.980 wear a uniform, and generally follow the laws and customs of war that I just described that the West follows,
00:46:17.440 but barbaric cultures don't.
00:46:19.200 So terrorists are a species in law, hostis humani generis, going from memory.
00:46:26.420 They're hostile to all humanity.
00:46:28.220 And you can actually kill them on sight with a drumhead trial.
00:46:32.600 They're called outlaws.
00:46:34.120 You can treat them like pirates.
00:46:36.380 That is their status in law.
00:46:38.120 They don't have the benefit of the Geneva Convention like my old buddy from the Wehrmacht had,
00:46:44.160 because they're not civilized themselves.
00:46:46.260 They're barbaric enemies of all mankind.
00:46:47.860 So in law, you can have a drumhead trial and execute these people.
00:46:55.140 But we're too gentle, aren't we?
00:46:57.420 We're too liberal, aren't we?
00:46:59.160 And Trudeau not only doesn't prosecute, let alone execute,
00:47:03.140 he gives them ten and a half million bucks each.
00:47:04.880 I'm sorry I'm going on too long about this,
00:47:06.720 but I want to let you know that these terrorists do not have the rights
00:47:11.240 that our courts ascribe to them.
00:47:13.740 For centuries, the law allows drumhead trials,
00:47:18.840 as in quick summary trials,
00:47:21.140 for terrorists, for pirates,
00:47:23.640 for spies in war.
00:47:25.640 They do not legally have the rights,
00:47:28.560 historically, in the West or the Geneva Conventions,
00:47:30.800 that we give them.
00:47:32.340 Maybe I'll do a show on that another day.
00:47:34.540 Until next time,
00:47:35.520 on behalf of all of us here at Rebel World Headquarters,
00:47:37.380 to you at home,
00:47:37.800 goodnight,
00:47:39.720 and keep fighting for freedom.