Rebel News Podcast - January 03, 2019


SPECIAL! Allum Bokhari predicts “marches on Silicon Valley”


Episode Stats

Length

43 minutes

Words per Minute

165.6

Word Count

7,233

Sentence Count

448

Misogynist Sentences

5

Hate Speech Sentences

6
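
Note on the stats above: the words-per-minute figure is just the word count divided by the episode's running time in minutes. A minimal sketch of that arithmetic, assuming the duration is read from the transcript's final timestamp (the published figure will differ slightly, since it presumably uses the exact audio length):

```python
# Minimal sketch: reproducing the words-per-minute stat above.
# Assumption: the duration is taken from the transcript's last timestamp
# (00:43:38.580); the listed figure likely uses the exact audio length,
# so this only approximates it.

def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Average speaking rate: words divided by duration in minutes."""
    return word_count / (duration_seconds / 60.0)

duration = 43 * 60 + 38.58  # 00:43:38.580 -> 2618.58 seconds
print(round(words_per_minute(7233, duration), 2))  # ~165.73, vs. ~165.6 listed
```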


Summary

Breitbart's Allum Bokhari joins Ezra to discuss the dangers of Big Tech censorship, and why it is Ezra's greatest fear for 2019. The two discuss why Big Tech is a threat to our freedom and why we should be worried about it.


Transcript

00:00:00.000 Tonight, my greatest fear for 2019, big tech censorship.
00:00:05.760 We'll have a feature interview with Breitbart.com's Allum Bokhari.
00:00:09.740 It's January 2nd, and this is the Ezra Levant Show.
00:00:18.020 Why should others go to jail when you're the biggest carbon consumer I know?
00:00:21.880 There's 8,500 customers here, and you won't give them an answer.
00:00:25.600 You come here once a year with a sign, and you feel morally superior.
00:00:28.580 The only thing I have to say to the government about why I publish it is because it's my
00:00:33.600 bloody right to do so.
00:00:39.340 Welcome back, and Happy New Year.
00:00:41.480 This is the first show of 2019, and we're going to look forward to what will happen in the
00:00:47.520 world of tech.
00:00:48.860 Now, there was a time when tech was some obscure thing that only either the really cool or
00:00:53.740 really nerdy kids were interested in, but now tech, well, that's just a code for every
00:00:58.560 single thing we do.
00:00:59.900 There's not one of us who's not active on Facebook or Pinterest or, if you're in business,
00:01:05.900 on LinkedIn.
00:01:06.800 Our whole company, The Rebel, lives on the internet.
00:01:10.220 YouTube, Facebook, everything.
00:01:12.300 And the thing is, that was wonderful when it was this great, undiscovered, frontier-style
00:01:18.640 Wild West of freedom.
00:01:21.860 In fact, that's why we put The Rebel on the internet when the Sun News Network shut down
00:01:25.400 three and a half years ago, almost four years ago.
00:01:28.940 But now we're vulnerable to the colonization of big tech by social justice warriors and
00:01:36.480 grievance mongers who are bringing censorship to big tech in a far more serious way than
00:01:43.740 was ever done through the democratic institutions, even the Human Rights Commission.
00:01:48.800 So joining us now for the course of today's show is our friend Allum Bokhari, who's the senior
00:01:54.520 tech correspondent for Breitbart.com, who is the leading journalist in the world covering
00:02:00.120 the political colonization of big tech.
00:02:02.640 Allum, great to see you again.
00:02:04.600 Hello, Ezra.
00:02:05.360 Would you agree with me that, I mean, maybe, of course, you would agree because it's your
00:02:10.140 beat.
00:02:10.960 But I think that the most important political beat in the world right now is not the Trump
00:02:16.120 collusion narrative or the daily shenanigans in the White House or not even the EU fights,
00:02:22.800 Brexit.
00:02:24.180 I think it is who will control the politics of the internet because that has the power
00:02:30.640 to snuff out and deplatform political candidates of the highest order.
00:02:35.880 It has the ability to shape what we perceive the world to be.
00:02:40.000 I think that the battle for the political soul of the internet is the largest story of our
00:02:45.360 time.
00:02:45.800 That's my view.
00:02:46.380 What do you think?
00:02:46.980 I agree because, as you said in your intro, tech now refers to everything, every aspect
00:02:55.560 of our lives, everything we do, whether it's business, political speech, or just talking
00:03:01.140 to our friends.
00:03:01.720 We now do it online via some sort of online app.
00:03:05.340 So these companies have grown to such an extent and have such a level of importance in our
00:03:16.160 lives and also know so much about us that they've, in a sense, become more relevant, more
00:03:22.260 important to our lives than national governments.
00:03:25.040 But the flip side of that is they aren't subject to the rules and regulations of national governments.
00:03:30.560 They are not currently subject to the First Amendment or any protections on privacy or
00:03:36.980 any part of the Constitution, really, because they're a private company.
00:03:42.220 And until they're subject to some level of official oversight and some level of regulation,
00:03:47.700 they're just going to continue to have a totally arbitrary, tyrannical level of power
00:03:52.560 over our lives.
00:03:53.320 And, of course, when you think of the number of authoritarians we have coming out of Western
00:03:58.840 universities at the moment, social justice warriors, as you pointed out, this is extremely tempting
00:04:03.180 to them.
00:04:03.560 And they've been putting pressure for the past year.
00:04:06.060 We've been seeing the results of their efforts in far-left Silicon Valley in the increasing
00:04:11.660 level of censorship we've seen from these companies.
00:04:14.220 And unless President Trump or Congress takes action, that's the future that we'll see.
00:04:24.100 We'll see more and more takeovers of these incredibly influential companies by the far-left
00:04:30.660 who want to control these companies, who recognize the amount of power they have and want to use
00:04:35.140 it to get their way, to enforce their will and their values.
00:04:38.640 And that's the power and the utility of the Internet is how everything is so seamlessly
00:04:45.920 connected.
00:04:47.140 But that's also its grave threat to us.
00:04:49.320 I mean, it wasn't long ago when many people were afraid to use their credit card on the
00:04:55.220 Internet.
00:04:56.160 They just didn't.
00:04:57.180 I mean, PayPal was exotic or obscure.
00:05:01.420 It just wasn't normal.
00:05:03.660 Well, now almost everyone uses Facebook, 2 billion users.
00:05:08.880 We all bank online.
00:05:10.360 Now you can deposit a check by taking a picture of it on your phone.
00:05:13.760 We forget how ubiquitous it is and things we don't even think of as the Internet.
00:05:20.880 But what's terrifying is that it's now connected and it's tagged, whether it's a cookie or a
00:05:28.080 database file, to everything else we do or say.
00:05:31.220 It's our whole life.
00:05:32.480 You just you can't unplug.
00:05:34.440 You can't unless you're going to go full Unabomber, live in the forest in a hut.
00:05:39.740 You cannot unplug, can you?
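
A concrete aside on the cookie tagging Ezra describes here: a hypothetical sketch of how a single third-party cookie ID links activity across unrelated sites. The tracker class and site names are invented for illustration; only the mechanism, one identifier set on first contact and resent with every later visit, is the standard real-world pattern.

```python
# Hypothetical sketch of cross-site "tagging" via a shared cookie ID.
# The tracker and the site names are invented; the mechanism is generic.

import uuid
from collections import defaultdict

class ThirdPartyTracker:
    """Stands in for an ad network or analytics script embedded on many sites."""
    def __init__(self):
        self.profiles = defaultdict(list)   # cookie ID -> pages visited

    def serve(self, cookie_id, page):
        if cookie_id is None:
            cookie_id = str(uuid.uuid4())   # first contact: issue a Set-Cookie
        self.profiles[cookie_id].append(page)
        return cookie_id                    # browser stores it and resends it

tracker = ThirdPartyTracker()
cookie = None
for page in ["news.example/politics", "shop.example/cart", "forum.example/thread"]:
    cookie = tracker.serve(cookie, page)    # same browser, three unrelated sites

print(tracker.profiles[cookie])             # one ID now ties all of it together
```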
00:05:42.960 No, you can't.
00:05:43.960 And interesting that you mentioned the Unabomber and people living in forests.
00:05:46.840 He's actually enjoying a bit of a renaissance on social media.
00:05:50.520 There's this whole movement, very sort of obscure,
00:05:54.280 I don't know much about it myself, called Pine Tree Twitter.
00:05:56.720 And it's this whole idea that you've got to unplug, you've got to go live off the grid.
00:06:02.060 You've got to go disconnect from technology.
00:06:06.540 I think movements like that, sort of fairly fringe movements like that,
00:06:12.060 that is something we're going to see in the future in response to the growing power of
00:06:16.400 these companies.
00:06:16.980 And honestly, I think, you know, we used to see marches in Washington because of Washington's
00:06:24.480 political power.
00:06:25.540 I think we're going to see marches in Silicon Valley because the nexus of power has shifted.
00:06:31.000 It's no longer politicians who seem to control our lives.
00:06:33.660 It's a bunch of unaccountable Silicon Valley CEOs.
00:06:36.180 Yeah, and what a tragic coincidence that the most left-wing city in America, in the San
00:06:45.500 Francisco area, is the tech city.
00:06:47.900 It couldn't have been Dallas or Houston or, you know, Salt Lake City or Boise, Idaho.
00:06:54.560 It had to be the leftist city.
00:06:56.260 Well, listen, I promised that we would look at a few of the highlights of 2018 and then
00:07:01.520 look forward to 2019, and you helped us to select three video clips that you think are
00:07:08.720 emblematic of some of the conversations in tech and politics, and I'd like to play those
00:07:14.560 for our viewers and have your comments on each.
00:07:17.480 The first is the shortest and the simplest.
00:07:20.600 It's from a December congressional hearing where Sundar Pichai, the senior executive of Google,
00:07:27.060 was asked by a very sympathetic Democrat, Jerrold Nadler, who actually receives tens of thousands
00:07:32.940 of dollars in donations from Facebook.
00:07:35.080 He was given a softball question.
00:07:36.900 I was really surprised by it because the answer, if he would have known the answer in advance,
00:07:40.640 I don't think he would have asked the question.
00:07:42.500 It's a Democrat asking Google, how much money did Russia spend on Google, which owns YouTube
00:07:51.440 and others, in the 2016 election?
00:07:54.860 Was it hundreds of millions?
00:07:56.620 Was it millions?
00:07:58.860 Well, here's the clip with the question and the answer.
00:08:02.320 Now, according to media reports, Google found evidence that Russian agents spent thousands
00:08:06.380 of dollars to purchase ads on its advertising platforms that span multiple Google products
00:08:11.880 as part of the agents, the Russian agents' campaign to interfere in the election two years
00:08:16.360 ago.
00:08:16.600 Additionally, Juniper Downs, head of global policy for YouTube, testified in July that
00:08:22.720 YouTube had identified and shut down multiple channels containing thousands of videos associated
00:08:29.500 with the Russian misinformation campaign.
00:08:32.220 Does Google now know the full extent to which its online platforms were exploited by Russian
00:08:36.920 actors in the election two years ago?
00:08:38.980 We have, you know, we undertook a very thorough investigation, and in 2016, we now know that
00:08:50.280 there were two main ad accounts linked to Russia, which, you know, advertised on Google for about
00:08:58.760 $4,700 in advertising.
00:09:01.340 We also found other limited...
00:09:02.980 A total of $4,700?
00:09:04.840 That's right.
00:09:05.440 Well, Allum, the answer was $4,700.
00:09:10.700 I mean, us here at The Little Rebel, we've spent that much on Google Ads in our short life, just
00:09:20.760 on some little campaign we did once.
00:09:22.600 $4,700.
00:09:24.120 For two years, we've heard this theme that Russia bought the election through fake news
00:09:28.980 and Twitter ads and Facebook ads.
00:09:31.280 $4,700, that's not a lot of money.
00:09:33.420 That's sort of a joke, but a joke that's been taken seriously for two years by the mainstream
00:09:38.660 media.
00:09:40.360 Yeah, well, the whole point about this idea that Russian disinformation swung the election
00:09:45.740 is twofold.
00:09:47.940 One, it's to delegitimize the election of Donald Trump.
00:09:50.100 What we've seen in the past two years is this elite backlash against any form of, any way
00:09:59.500 in which ordinary people express their opinion.
00:10:01.360 And that includes elections, that includes the Brexit referendum, that includes comment
00:10:04.620 sections on the internet.
00:10:06.060 And the whole idea is, you know, the election was illegitimate.
00:10:09.220 You know, the referendum was illegitimate.
00:10:11.680 It was all influenced and tainted by Russia.
00:10:13.740 So they have to push this idea that Russia had this huge influence on social media.
00:10:18.260 Obviously, they didn't.
00:10:19.040 It's total nonsense.
00:10:21.380 You know, even anti-Trump researchers like the political scientist Brendan Nyhan have
00:10:27.680 said the effect of fake news and propaganda on the election on voters was absolutely nil,
00:10:33.340 because what little Russian ads there were on Facebook and other platforms, they were targeted
00:10:38.800 at political partisans who had already made up their mind.
00:10:42.100 So it didn't really swing any votes.
00:10:45.420 Now, I'm not, you know, Russia did play some role in social media, but their goal has never
00:10:51.540 really been to swing elections one way or the other.
00:10:54.400 Their goal is to manipulate both sides and deepen divides between both sides.
00:11:00.200 It's very different.
00:11:01.900 And, you know, sow discord and doubt.
00:11:03.800 So it's very different from actually trying to influence an outcome.
00:11:06.280 And actually, you know, this panic over Russia sort of helps them, because it
00:11:12.680 helps them achieve that goal, because, you know, it deepens divides and makes
00:11:16.460 us mistrust each other even more.
00:11:17.820 That's Russia's goal, not winning elections.
00:11:20.620 And this whole idea that, you know, they spent $4,700 or whatever on Google ads,
00:11:26.580 it just shows you the depth of this conspiracy theory that Democrats are caught
00:11:31.540 up in.
00:11:32.520 And, you know, the Democrats were saying social media censorship was a conspiracy theory.
00:11:35.900 So there's a lot more evidence for that than there is for Russia trying to influence
00:11:40.480 people through Google.
00:11:41.240 Yeah.
00:11:42.080 Well, when Facebook shut down, uh, 30 or 40,000 Marine Le Pen sympathetic Facebook pages in
00:11:48.780 France, that single act of foreign interference from Mark Zuckerberg in the French elections
00:11:53.880 was far more effective and terrifying and under the radar and unreported and malicious, I would
00:12:01.460 say, than anything Russia would dream of doing.
00:12:03.580 Let's move on to the next, uh, clip.
00:12:06.140 This was a big, big scoop that you guys got.
00:12:08.920 And I got to say, Allum, not only are you covering the important beat, but you're just getting,
00:12:13.540 because of that, you're known in Silicon Valley as the guy to go to if you've got a leak, because
00:12:19.620 the other people, the other mainstream media would bury it because they just love talking
00:12:24.200 about, well, my new iPhone has this app on it.
00:12:27.080 And maybe I can, you know, get some free demo. Like, I remember when I used to work
00:12:33.100 at the National Post, these tech companies would give out
00:12:39.960 devices to journalists to try them out, to product-test them, and they would keep them
00:12:45.240 afterwards.
00:12:45.840 What's that but a bribe?
00:12:48.240 That's a thousand dollar bribe.
00:12:49.620 I remember when Jonathan Kay reviewed some new phone and was in love with it, of course
00:12:54.780 he was.
00:12:55.560 And yeah, you're immune to that kind of colonization, to use that word again.
00:13:02.040 Uh, and so you get all the scoops because the dissidents in Silicon Valley know they can
00:13:05.560 trust you.
00:13:05.960 And so you got a full one hour video of a weekly town hall that Google has where they
00:13:13.400 basically put their staff in a big theater and other people can join by remote.
00:13:17.720 This was never supposed to be seen outside of the inner sanctum of Google.
00:13:23.740 Just incredible.
00:13:25.580 I want to play for you an excerpt from a senior Google executive talking about partisan politics,
00:13:33.960 the hate for Donald Trump and the sorrow that they lost.
00:13:37.060 Take a look.
00:13:37.780 8.30 PM on Tuesday night, I was at home with friends and family watching the election returns.
00:13:44.280 And, uh, as we started to see the direction of the voting, I reached out to someone close
00:13:51.200 to me who was at the Javits Center where the big celebration was supposed to occur in New
00:13:56.640 York City.
00:13:57.620 Somebody had been working on the campaign and, um, I just sent him a note and said, you know,
00:14:02.260 are you okay?
00:14:02.860 It looks like it's going the wrong way.
00:14:06.220 And I got back a very sad short text, um, that read, people are leaving, staff is crying,
00:14:14.240 we're going to lose.
00:14:17.360 Uh, that was the first moment I really felt like we were going to lose.
00:14:24.800 And it was this massive, like, kick in the gut that we were going to lose.
00:14:28.620 And it was really painful.
00:14:30.480 It did feel like a ton of bricks dropped on my chest.
00:14:33.580 And I've had a chance to talk to a lot of fellow Googlers and people have said different
00:14:38.100 words, similar concept.
00:14:39.540 This, how painful it is,
00:14:41.260 how painful this is.
00:14:42.340 I think that's an incredible video.
00:14:43.900 And that's a scoop that you got.
00:14:46.120 Um, the fact that they were having this big staff cry is amazing.
00:14:50.780 But to me, the most powerful thing, and she said it three times, she didn't say the Democrats
00:14:56.520 were going to lose.
00:14:57.300 Hillary Clinton was going to lose.
00:14:59.100 They were going to lose.
00:15:00.920 She said, we are going to lose.
00:15:03.900 She said it three times.
00:15:04.920 First quoting her friend embedded in the Clinton campaign.
00:15:08.980 I bet she was talking about Eric Schmidt, the senior Google executive who was a Hillary
00:15:12.580 Clinton campaign volunteer.
00:15:14.380 Then she repeated it two more times.
00:15:16.360 We were going to lose because there is no division in the mind of a senior Google executive
00:15:21.660 between the company and Hillary Clinton's Democratic campaign.
00:15:26.100 I don't even think she realized she was saying it that way.
00:15:29.520 And no one in that town hall meeting objected because they all think the same way.
00:15:35.960 Yeah, that meeting featured all the heads of Google. It had the co-founders, Larry
00:15:41.980 and Sergey, it had the CEO, Sundar Pichai, and it had the chief legal officer, Kent
00:15:47.940 Walker.
00:15:48.720 It had, you know, everyone you can think of.
00:15:51.600 And they were all distraught.
00:15:53.340 They were all holding this sort of funeral for the country just one day after Donald Trump
00:15:57.980 won the election.
00:15:59.980 So there was, what you saw was a total uniformity of values at the top of Google.
00:16:04.320 Yeah.
00:16:05.020 And, you know, Google maintained that their political biases don't filter down into their
00:16:09.260 product, but how can you know? The whole idea of a bias is that it's unconscious.
00:16:13.700 You don't know the effect it's having.
00:16:15.620 And the other thing we don't know is what internal safeguards Google has to prevent bias
00:16:21.820 from affecting their products.
00:16:24.120 We saw the Google hearing last month, when Sundar Pichai went to testify before Congress.
00:16:30.480 And one of the best questions, I thought, came from Representative Matt Gaetz of Florida.
00:16:34.300 He repeatedly asked Google, you know, have you launched an investigation into political
00:16:39.560 bias?
00:16:41.140 Because, you know, we've seen stories, we've seen leaked emails that we've published at
00:16:44.180 Breitbart showing that there are groups of Googlers who do work inside the company to
00:16:49.120 influence policy against conservatives and conservative media.
00:16:52.120 There was a concerted effort by Googlers, with the assistance of the director of monetization,
00:16:57.100 who's now the director of trust and safety, to get Breitbart demonetized from Google's
00:17:03.460 ads by cataloging examples of so-called hate speech, not from articles, but from the comments
00:17:09.420 section.
00:17:10.480 So we do see these anti-conservative efforts inside Google.
00:17:14.700 And the question is, what internal safeguards does the company have to stop the biases of
00:17:20.160 their workforce getting out of hand?
00:17:23.120 What sort of review process do they have when they introduce new products?
00:17:28.480 We know they have a review process.
00:17:30.300 You know, every Google product is looked at by multiple people and reviewed by multiple
00:17:34.320 people.
00:17:34.900 But we don't know if all the people reviewing them are all liberals or all left-wing, all
00:17:39.420 have a particular leaning.
00:17:41.520 And we don't know if they, you know, I don't think they do.
00:17:44.960 They never told me.
00:17:46.120 I don't know if they explicitly control for politics in that review process.
00:17:49.900 It seems like it would be a good thing to do if they were trying to avoid political bias.
00:17:54.120 But I don't think they're very serious about doing that.
00:17:57.700 All right.
00:17:57.880 Let's take a quick look at that question and answer from that congressional hearing in December.
00:18:05.320 Have you ever launched an investigation into whether political bias is impacting the consumer
00:18:08.940 experience?
00:18:11.520 Congressman, we do.
00:18:13.860 To the extent there are concerns, we look into them and, you know, it's...
00:18:17.340 Have you expressly launched an investigation into political bias of your employees?
00:18:23.380 On our employees, you said?
00:18:24.900 Yes.
00:18:25.740 You know, to the extent, you know, we always take...
00:18:28.360 We take any allegations around code of conduct across every issue seriously and we look into
00:18:32.620 them.
00:18:34.780 You said to me yesterday that, as it relates to political bias, you haven't launched those
00:18:39.560 investigations because there are so many redundancies and there is so much peer review that that
00:18:44.620 would not be possible.
00:18:45.540 Is that still your testimony today?
00:18:47.440 Congressman, it's the way our processes work.
00:18:50.540 If you need to make a change in our algorithms, there are several steps in the process, including
00:18:54.800 launch committees and user testing and rater guideline evaluations.
00:18:59.020 But at your company, your employees can get together and chat in groups, right?
00:19:02.520 Google groups?
00:19:04.160 Yes, they can.
00:19:04.960 One of those groups is the civil rights group, right?
00:19:07.580 We have many employee resource groups on which they can participate in conversations, yes.
00:19:12.000 Have you ever looked into the conversation in the resist group?
00:19:15.540 Congressman, no.
00:19:18.380 Is it...
00:19:18.860 Does that strike...
00:19:19.900 Is that a surprise to you that there's a resist group?
00:19:23.040 I'm not aware whether such a group exists or not.
00:19:25.700 If there was a resist group, would that be the type of thing that you would want to look
00:19:29.260 into?
00:19:29.520 You know, we have clear policies around how our products are built and...
00:19:35.520 No, but if there's a resist, you know that the resist movement is a movement built to resist
00:19:39.220 the agenda of President Trump.
00:19:40.360 If there's a resist group within your company where groups of employees, not one, are getting
00:19:44.920 together within that group to engage in discourse on company time with company infrastructure,
00:19:49.560 does that strike you as the type of thing you would want to investigate?
00:19:52.220 Congressman, I'm not aware of any such group.
00:19:55.940 None like that has been brought to my attention and, you know, happy to follow up to, you know,
00:20:00.860 and understand the context better.
00:20:03.320 Yeah.
00:20:04.180 Mr. Chairman, I seek unanimous consent to enter into the record a document from what purports
00:20:08.980 to be Google employee Miles Borens, which was posted to the Google group resist.
00:20:13.960 Well, Allum, it raises a good question, because Google is not just neutral.
00:20:19.980 YouTube, Facebook, all these companies are not neutral.
00:20:22.660 They are the largest hirer of lobbyists in Washington.
00:20:27.040 They spend, you know, they've displaced the banks and the oil companies and the arms dealers
00:20:31.780 as the biggest lobbyists and the biggest political donors.
00:20:35.780 And so you can imagine if a congressman asks a friendly question of Google, he'll get
00:20:43.060 more donations.
00:20:44.680 And if a congressman asks a critical question of Facebook, Google, YouTube, Twitter, whatever,
00:20:49.820 he'll get fewer donations.
00:20:51.380 Now, that's fine.
00:20:53.280 But what's terrifying, what we can't track like we can track donations, is: is there some
00:20:59.020 internal squad within Google, YouTube, Facebook, Twitter that says, ah, that guy asked
00:21:05.560 us tough questions.
00:21:06.900 Let's make sure that when anyone Googles that congressman, all the bad stories come up high and
00:21:12.860 the good stories are down low.
00:21:15.860 And maybe let's demonetize his YouTube page.
00:21:19.480 And like what I'm saying is the secret efforts to marginalize Breitbart that you disclosed,
00:21:26.120 why wouldn't they have similar efforts to marginalize political opponents?
00:21:32.080 Because they can. An oil company couldn't do that.
00:21:35.000 An arms dealer couldn't do that.
00:21:36.600 But people who control the Internet could do that.
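
To make Ezra's worry concrete, here is a purely hypothetical toy model of that kind of quiet ranking demotion. It is not Google's actual system; the scores, the demotion list, and the penalty factor are all invented for illustration.

```python
# Toy model of a quiet ranking demotion, invented purely for illustration.
# Not Google's actual system: scores, list, and penalty are all made up.

def rank(results, demoted_topics, penalty=0.5):
    """Order results by relevance score, quietly scaling down listed topics."""
    def adjusted(result):
        title, score = result
        if any(topic in title.lower() for topic in demoted_topics):
            score *= penalty            # the invisible thumb on the scale
        return score
    return sorted(results, key=adjusted, reverse=True)

results = [
    ("Congressman praised for new bill", 0.70),
    ("Congressman grills tech CEO",      0.90),  # normally the top result
    ("Congressman profile",              0.60),
]
print(rank(results, demoted_topics=["grills tech"]))
# The strongest story now sorts last, and nothing on the page says why.
```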
00:21:41.220 And here's why Google and these other companies need that process in place,
00:21:46.360 to make sure that anything they do is reviewed to ensure that there's no political bias,
00:21:52.240 because, you know, the things we've leaked are things that Googlers have been discussing
00:21:56.680 in private on their email lists or in these recorded meetings.
00:22:00.600 But, you know, not everything that happens at Google HQ is recorded or put down in writing.
00:22:06.040 And especially after these leaks, I think they'll be putting a lot less down in writing
00:22:09.200 going forward.
00:22:11.020 So we don't know if there's a group of Googlers, you know, sitting around a cafeteria table discussing
00:22:15.720 how to demonetize conservative media right now.
00:22:18.280 And, you know, there's very little way we'd be able to know that unless there's someone
00:22:23.120 in the group who wants to tell us about it.
00:22:24.660 So it's going to become harder and harder for the media to track these companies, especially
00:22:29.820 as they become more paranoid and more safety conscious about leaks.
00:22:34.120 So we need to be asking ourselves, what safeguards does Google have?
00:22:38.600 What safeguards should they have?
00:22:40.460 And how effective are they?
00:22:42.500 Yeah.
00:22:42.780 You know, it wouldn't surprise me.
00:22:44.240 We were talking earlier about people disconnecting from the Internet.
00:22:49.700 It wouldn't surprise me if senior executives at these tech companies had all their official
00:22:54.880 online communications and then they carried around an old fashioned pad of paper and a
00:22:59.680 pencil or a pen, something ultra low tech that couldn't be forwarded, couldn't be disclosed
00:23:09.220 in a search.
00:23:09.880 I absolutely believe that there is a disconnected Google where they
00:23:19.100 have frank talks about people like you, about Breitbart, about people like us to a smaller
00:23:23.880 degree, and that you wouldn't find it in a search.
00:23:26.900 And if there was litigation, they'd say, oh, we searched all our files and there's nothing,
00:23:30.200 because it's in a little notepad. That's a little conspiracy theory on my part.
00:23:32.880 But it would not surprise me one bit.
00:23:34.360 I want to show one more video again.
00:23:36.920 Again, this was from late in 2018.
00:23:39.860 It's from Tim Cook, the CEO who took over from Steve Jobs at Apple.
00:23:47.320 And I've shown this, I did a show about this last month.
00:23:51.500 Compare this to the great 1984 TV ad that Apple made when they were introducing their computer
00:23:59.280 against the conformity and linear thinking of IBM.
00:24:04.440 IBM's motto for really a century has been think, which is a great motto.
00:24:10.000 So Apple came along with think different.
00:24:13.260 And they had, I don't know if you remember, that great ad that was showing 1984, everyone
00:24:18.120 was like a zombie.
00:24:19.320 And this one woman came in and threw a hammer and smashed the screen of Big Brother.
00:24:23.560 And it was think different in 1984.
00:24:25.880 It doesn't have to happen.
00:24:26.900 It was this great, great, great ad about thinking differently.
00:24:29.880 Well, that was then.
00:24:32.640 And here's Tim Cook of Apple saying what he thinks about anyone who thinks differently.
00:24:37.900 He calls them a divider now.
00:24:40.620 Watch this terrifying clip of Tim Cook at an anti-hate rally, which is right out of Orwell.
00:24:47.640 Take a look at this.
00:24:48.220 Perhaps most importantly, it drives us not to be bystanders as hate tries to make its headquarters
00:24:57.360 in the digital world.
00:25:00.200 At Apple, we believe that technology needs to have a clear point of view on this challenge.
00:25:07.660 There is no time to get tied up in knots.
00:25:10.040 That's why we only have one message for those who seek to push hate, division, and violence.
00:25:22.320 You have no place on our platforms.
00:25:26.120 You have no home here.
00:25:38.520 From the earliest days of iTunes to Apple Music today, we have always prohibited music with
00:25:45.240 a message of white supremacy.
00:25:47.080 Because it's the right thing to do.
00:25:57.580 And as we showed this year, we won't give a platform to violent conspiracy theorists on
00:26:03.280 the App Store.
00:26:10.020 Why?
00:26:11.100 Because it's the right thing to do.
00:26:13.060 My friends, if we can't be clear on moral questions like these, then we've got big problems.
00:26:21.680 At Apple, we are not afraid to say that our values drive our curation decisions.
00:26:29.380 And why should we be?
00:26:32.460 Doing what's right, creating experiences free from violence and hate, experiences that empower
00:26:40.660 our creativity and new ideas is what our customers want us to do.
00:26:46.880 I believe the most sacred thing that each of us is given is our judgment, our morality, our
00:26:55.780 own innate desire to separate right from wrong.
00:27:00.280 Alan, I think that Tim Cook is doing something very tricky there.
00:27:08.140 He's conflating division, which means a difference of opinion.
00:27:12.700 He's just complaining with hate, which is a negative emotion, which all humans have.
00:27:19.180 With violence, which is a crime.
00:27:20.960 And I don't even know how you do something violent on the internet.
00:27:24.180 I guess you could talk about something violent.
00:27:27.040 But it's hard to do something.
00:27:28.360 So he's throwing in the word violence there to denormalize division, which is another way
00:27:33.340 of saying think different.
00:27:35.000 And hate, none of us like to show hate.
00:27:37.660 But if we take our hate and sublimate it into something constructive and positive, it can
00:27:42.020 be a great fuel for constructive change.
00:27:44.560 I'm sure Martin Luther King Jr. hated Jim Crow laws and hated racism, but he turned it into
00:27:50.140 something positive.
00:27:51.800 Hatred, division, and violence.
00:27:54.320 I'll tell you one thing.
00:27:55.140 Apple isn't a safe place for thinking differently anymore, is it?
00:27:59.440 No.
00:28:00.120 It's a very brave new world at the moment.
00:28:03.360 They've become the dystopia they used to make ads against.
00:28:07.600 And that whole thing at the end there, it was very, very sly, how he was talking
00:28:16.600 about how we have a responsibility.
00:28:17.420 We all want to choose between right and wrong.
00:28:21.240 We all have a responsibility to do that.
00:28:22.900 OK, well, apparently Apple's consumers don't have that responsibility anymore because Tim Cook
00:28:27.020 took it away from them.
00:28:28.120 They don't get to choose what's right and wrong.
00:28:30.080 They don't get to choose which podcast is fake news and conspiracy theories and which
00:28:34.120 one isn't because Tim Cook's already made that decision for them.
00:28:36.940 And that's the real question here.
00:28:39.080 Yeah.
00:28:39.300 I mean, who gets to make these decisions?
00:28:41.660 Does Tim Cook get to make them on behalf of the millions of people who use Apple products?
00:28:46.460 And that essentially infantilizes those users.
00:28:49.540 It says that if you leave it up to them, they'll make the wrong decision.
00:28:54.700 That's the implication there.
00:28:56.080 And that's the question we should be asking: whatever we think of
00:29:01.680 all the, you know, so-called conspiracy theorists or so-called hate mongers, who gets
00:29:08.020 to decide who falls into those categories?
00:29:11.420 Is it a small number of unaccountable heads of corporations?
00:29:14.460 Or is it the millions and millions of people who use the internet and consider themselves
00:29:19.380 intelligent enough to make those decisions?
00:29:21.820 You know, Tim Cook said on stage this is what his users want.
00:29:25.340 Has anyone asked them?
00:29:26.600 Has there been a survey?
00:29:27.960 You know, when there's been surveys about hate speech, most people disagree with the
00:29:30.860 concept.
00:29:31.280 There's been surveys about political correctness.
00:29:32.900 Most people disagree with the concept.
00:29:34.200 This is something believed by elites.
00:29:36.520 It's not believed by ordinary people.
00:29:38.000 And, you know, even ordinary Apple consumers, many of whom are left wing, I think
00:29:41.860 they'd want to make their own decisions, make up their own minds about what they want
00:29:46.460 to subscribe to and who they want to hear from.
00:29:48.480 And they wouldn't want some random CEO making that decision on their behalf.
00:29:52.800 Yeah.
00:29:53.340 And, you know, even his point about right and wrong, and he uses the word sin, which is quite
00:29:58.280 something coming from Silicon Valley that profits off every sin you can imagine.
00:30:05.480 That's division.
00:30:06.320 Right and wrong is the essential divide in all of humanity.
00:30:10.240 What separates us from animals is that we believe in right and wrong.
00:30:14.720 That's division.
00:30:16.080 And he seems to hate intolerance, you could say.
00:30:19.780 He would phrase it in a more positive way.
00:30:22.260 It's incredibly Orwellian.
00:30:24.120 That is a different Apple than the Think Different Apple of that 1984 ad.
00:30:28.860 I find that very troubling.
00:30:30.540 And there is no sign that these companies are going to slow down at all here.
00:30:36.320 I think 2019 is the year they go in for the kill.
00:30:40.940 And let's end with a couple minutes on that, Alan.
00:30:43.120 I appreciate your review of these videos from 2018 and our discussions around them.
00:30:48.540 I got to tell you, I'm a pessimist, and it serves me well being a pessimist, because that means
00:30:54.580 I'm never disappointed when things go bad.
00:30:57.000 I hate saying I told you so because I'm always sad that I was correct.
00:31:02.260 I think 2019 is going to be a terrible year of deplatforming, unpersoning, censoring, the absolute merger between big tech and big government.
00:31:12.220 And I think that between you at Breitbart and us here at The Rebel, it would not surprise me if one of us was not left standing at the end of the year,
00:31:21.520 not because of any business decision or any market failure, but rather because we were deplatformed by the Tim Cooks and Sundar Pichais of the world.
00:31:30.540 That's my extremely pessimistic prediction for 2019.
00:31:33.840 What do you think?
00:31:34.300 I think that would certainly follow the trend we saw in 2018, where we saw all sorts of figures being kicked off various platforms.
00:31:42.520 There was Patreon, the crowdfunding platform, and there was Twitter, there was Facebook, there was Google.
00:31:48.020 We even saw them target politicians.
00:31:50.400 So Marsha Blackburn wasn't allowed to run ads on Facebook or Google just a few days before the midterm elections.
00:31:59.000 We saw Google shutting down your former contributor, Faith Goldie, from having ads just 48 hours before polls opened in Toronto.
00:32:09.480 So they've displayed this brazen willingness to interfere in politics and interfere in elections, and there have been no consequences.
00:32:22.220 Republicans controlled Congress, and they called them into hearings, but they didn't actually pass any legislation saying you can't do this.
00:32:29.440 So they've learned now.
00:32:30.580 These companies have learned they can censor, they can interfere in elections, and the worst that will happen to them is a few negative headlines, maybe some leaks, and maybe an embarrassing hearing in front of Congress.
00:32:41.860 But there's no actual legislation that's been passed.
00:32:45.020 So I think as long as that continues to be the case, we're going to see more of it.
00:32:48.600 We're going to see more censorship.
00:32:49.480 We're going to see a tighter and tighter control over the Internet, which was supposed to liberate everyone and is now tyrannizing everyone.
00:32:55.900 Yeah.
00:32:56.000 You know, you mentioned our former contributor, Faith.
00:32:58.620 I sometimes look at our rebel alumni, and some of them moved on, some were hired away, some, frankly, we fired.
00:33:04.600 But they all were rebels, and when they were with us, they were sort of safe from the censorship for some funny reason.
00:33:11.180 But whether it's Faith Goldie, Laura Loomer, who was kicked off Twitter, Gavin McInnes kicked off Twitter and other sites, Tommy Robinson kicked off Twitter, kicked off PayPal, Lauren Southern banned from the United Kingdom.
00:33:27.100 All those people are alumni of the rebel, and I keep thinking, well, why aren't they coming for the rebel?
00:33:33.840 Now, they did try and come for us in August of 2017.
00:33:37.140 But I think that because they're sort of everyone I just listed is sort of on their own in a way, although Gavin was with CRTV.
00:33:44.140 But this is how the censors build up a track record, build up precedents, build up courage, build up sort of a body count.
00:33:53.060 And they start with the really, really, really fringe, you know, the neo-Nazis.
00:33:58.700 Then they go for the slightly less fringe, Alex Jones.
00:34:02.620 Then they come for the slightly less fringe, our alumni.
00:34:06.640 And you see they're moving closer and closer to the center, and they're building on the precedent they established with the most extreme cases that no one wanted to fight for.
00:34:15.620 No one wanted to fight for the neo-Nazi websites that were kicked off of GoDaddy or whatever, or whatever their host was, because who wants to defend a neo-Nazi?
00:34:23.860 Okay, so they set the precedent.
00:34:25.900 And here we are a year or two later, and a year from now, that's what I'm worried about, is that they're going to swallow up more.
00:34:35.140 Well, I mean, we see PragerU, which is very mainstream conservative, having their videos censored.
00:34:40.420 Jordan Peterson, so many, they're really coming close to the center.
00:34:46.760 Will the left ever care, Allum, or do they just know that the bullies in charge here are their allies, so they don't care about the principle of free speech, because they know they won't be swallowed up by it either?
00:34:58.200 Well, the Democrats certainly won't care either, because, you know, Representative Jerrold Nadler, who's now going to chair the same committee that questioned Sundar Pichai last month,
00:35:06.960 is essentially in the pay of Google. Google is his highest donor. You won't hear much from him, either because of the donations or because they're ideologically aligned.
00:35:15.160 And, you know, multiple Democrats, when they've got an opportunity to question Silicon Valley CEOs, haven't asked them about censorship, they've called it a conspiracy theory.
00:35:23.260 They've instead asked them what they're doing to shut down hate speech, so they want more censorship, not less.
00:35:29.040 We're not going to see anything from them.
00:35:30.360 On the left, the people who actually have a handle on the issue and are honest about it are the anti-establishment, anti-war left people like Glenn Greenwald,
00:35:40.820 because they know that they're also in the firing line because the elites don't like them either.
00:35:47.280 And, you know, Facebook, for example, is advised by the Atlantic Council.
00:35:50.400 They're a neoconservative think tank.
00:35:51.900 And we see sort of neoconservative Bush-type think tanks, you know, in the U.K. and the U.S. talking about extremism, and not just, you know,
00:36:01.900 legitimate extremism, real extremism like Islamic terrorists and ISIS,
00:36:07.860 but they're now talking about so-called far-right extremists as well,
00:36:11.880 and they're putting, you know, some of the people, some of your alumni, in that category.
00:36:15.200 So, you know, it's the establishment right and the establishment left that want censorship on social media,
00:36:23.640 and it's the anti-establishment of both sides, I think, that need to unite to fight against it.
00:36:29.480 I want to leave with one last question.
00:36:31.580 It's a little bit about tech.
00:36:34.580 I think it's more about journalism.
00:36:35.900 And thank you so much for your time.
00:36:37.740 I always want to keep you longer than I can, and I'm really grateful for your time.
00:36:42.720 Two things in Canada, and I don't know if you would have followed them because you're based,
00:36:48.800 you used to be based in the U.K., now you're in the United States.
00:36:53.680 A few months back, the Toronto Star had what I thought was a shocking story.
00:36:58.920 It was a report of when Justin Trudeau met with Sheryl Sandberg of Facebook.
00:37:05.680 And this was leaked to the Star by the Prime Minister's office.
00:37:10.560 It was meant as sort of a threat to Facebook.
00:37:14.360 And as you can see on the screen, the headline is basically Trudeau's telling Facebook,
00:37:19.340 if you don't fix your, quote, fake news problem, we will fix it for you through the law.
00:37:26.000 So it was a personal, private conversation.
00:37:28.580 Trudeau said to Sheryl Sandberg, fix it.
00:37:30.680 Oh, and he was very specific.
00:37:31.800 Fix it in advance of my 2019 re-election campaign.
00:37:36.000 So he was very precise.
00:37:38.780 He wanted it fixed before he was up for re-election.
00:37:41.720 If she didn't do his censorship for him, he would do it to her.
00:37:46.480 And obviously she wasn't moving quickly enough because they leaked the story to the liberal-friendly
00:37:51.420 Toronto Star, which ran it as a threat to Sandberg.
00:37:54.600 I thought it was a shocking revelation of Trudeau's thin skin and his plans.
00:38:00.740 It hasn't been followed anywhere, so that's Exhibit A.
00:38:04.560 And Exhibit B, Alan, so that's the stick.
00:38:08.340 Exhibit B is in Canada.
00:38:11.340 Justin Trudeau just announced a $595 million fund for journalists, for media companies who
00:38:18.780 are having a tough time.
00:38:20.020 The equivalent in the United States, because we're one-tenth the population for the states,
00:38:24.560 that would be like a $6 billion slush fund.
00:38:29.280 But Trudeau said it's only for journalists that he could, quote, trust.
00:38:34.520 That's how the Toronto Star again reported.
00:38:37.600 So you got the carrot.
00:38:40.200 If you're trustworthy, we got $595 million for you.
00:38:44.940 And the stick is, we're going to ban you as fake news.
00:38:48.420 I think 2019 could be the year the lights go out for independent journalists.
00:38:55.080 I just, I know that sounds so pessimistic, but I just don't know what else to make of
00:39:00.000 these two exhibits I've just mentioned to you.
00:39:01.820 What's your take?
00:39:03.620 Well, that is essentially state-run media you now have in Canada, all of it.
00:39:07.340 Because if they're all taking money from the government, they're no longer independent
00:39:10.260 of the government.
00:39:10.860 So what Trudeau is doing there is essentially trying to make all of the Canadian media his
00:39:21.140 personal Pravda.
00:39:21.140 You know, in a sense they are already, I assume, because from what I hear, most media is pro-Trudeau
00:39:28.900 in the same way that most media was pro-Obama when he was in office in the U.S.
00:39:33.500 So in a sense it's already happening, but this will make it even more the case.
00:39:36.820 You know, the government paying off every single journalistic outlet.
00:39:41.280 That's incredible.
00:39:42.520 That's making the entirety of media essentially no longer independent of the government.
00:39:48.860 That is amazing to me.
00:39:50.240 It's even more incredible than pressuring Sheryl Sandberg because, you know, that's happened
00:39:55.360 in other countries as well.
00:39:57.060 It's happened in the U.S.
00:39:58.220 We've seen Silicon Valley CEOs being pressured, saying, you know, what are you doing about fake
00:40:02.040 news?
00:40:02.280 What are you doing about hate speech?
00:40:03.320 Angela Merkel, as early as 2015, was pressuring Mark Zuckerberg to crack
00:40:09.740 down on hate speech.
00:40:10.800 So all of these European politicians, all of these, and, you know, Canadian politicians
00:40:14.560 now also, and Democrats in the U.S. are pressuring these Silicon Valley companies to become their
00:40:20.400 own personal censorship wing.
00:40:22.520 And the only way to counter it, I think, is for the grassroots who do believe in Internet
00:40:28.800 freedom to rise up and demand their own legislation, their own protection from these tech companies
00:40:33.620 and their own politicians who want to use them as agents of oppression.
00:40:39.940 Yeah.
00:40:40.580 Well, I wonder if that will happen.
00:40:42.620 And sometimes I think in the luxurious, comfortable, postmodern West, we just enjoy ourselves too
00:40:50.180 much.
00:40:50.480 We don't like to get revved up about things.
00:40:52.940 We'll just watch another movie on Netflix or play another video game or just go out and
00:40:57.160 enjoy the sunny day.
00:40:58.600 I'm worried that we are too complacent.
00:41:01.160 And we used to be good at fighting for freedom.
00:41:04.240 Now we're just good at enjoying freedom.
00:41:06.700 And we don't know how to fight for it anymore.
00:41:09.860 And I feel like it's slipping away.
00:41:11.680 I hope, Allum, that when we talk again for our year in review next year, first of all,
00:41:16.340 I hope that we will talk again a year from now, that both of us will still be around in
00:41:21.400 the Internet.
00:41:22.260 And second of all, I hope that my prophecies here are wrong.
00:41:25.780 They're not conspiracy theories.
00:41:26.880 I have not posited anything I could not prove.
00:41:29.880 I've just made predictions that I think are dire.
00:41:32.980 And I hope and hope and hope that my predictions are false.
00:41:36.720 I have yet to see evidence to make me change my mind, though.
00:41:40.680 Last word to you, my friend.
00:41:44.020 Yeah, certainly what I'd like to see over the next year is more of that.
00:41:49.560 I'd certainly like to report on more grassroots activity, because I think,
00:41:55.020 especially with the Democrats in charge of Congress, that's the only way to push back
00:41:59.580 against it and ensure there'll be an independent media on the Internet,
00:42:02.440 that there'll be a space for independent voices.
00:42:06.120 Because what we've seen over the past four or five years is essentially the robbery of
00:42:10.860 a generation.
00:42:11.760 You know, my generation especially was given this, this, you know, this great gift of the
00:42:16.900 open platform, the Internet, where you could say whatever you want and challenge mainstream
00:42:21.200 media.
00:42:21.880 And that's just been suddenly and rapidly taken away over the past three years.
00:42:25.400 And I think there are a lot of people in the country, in the U.S. and elsewhere who are
00:42:28.960 pretty, pretty angry about that.
00:42:31.820 Yeah.
00:42:32.860 Well, I am angry about it.
00:42:34.380 And it is an existential threat to us here at The Rebel, which, I think, is the point.
00:42:39.560 We have 1.1 million YouTube subscribers.
00:42:42.620 They all voluntarily want to hear what we have to say.
00:42:45.320 We have, I'll tell you, it's not much of a secret, 3.4 million people who are affiliated
00:42:50.540 with The Rebel in some way or another, whether following us on social media or engaged
00:42:55.820 with our petitions.
00:42:57.040 3.4 million people, which is quite a lot for a little country like Canada, but a lot
00:43:02.780 of our viewers are foreign.
00:43:05.800 But all it takes is one person, if they're the president of Google, YouTube, Facebook,
00:43:10.600 whatever, to shut us down.
00:43:13.260 I hope we'll be here in a year, my friend.
00:43:14.720 Thanks for your time.
00:43:16.240 Thank you.
00:43:16.720 All right.
00:43:17.120 There you have it.
00:43:17.620 Allum Bokhari, the senior tech correspondent with Breitbart.com.
00:43:21.200 I've said it before.
00:43:21.760 I'll say it again.
00:43:23.460 It's the most important political beat in the world.
00:43:27.040 And I think 2019 will be the year of reckoning.
00:43:31.480 That's it for today.
00:43:33.200 On behalf of all of us here at Rebel World Headquarters, to you at home, good night, and
00:43:38.580 keep fighting for freedom while you still can.