Rebel News Podcast - October 23, 2024


EZRA LEVANT | Media's Trump Derangement Syndrome triggered Big Tech censorship


Episode Stats

Length: 48 minutes
Words per Minute: 164
Word Count: 8,019
Sentence Count: 592
Misogynist Sentences: 5
Hate Speech Sentences: 3


Summary

Big tech companies helped sway the election for Joe Biden in 2020. Will they help free the vote for Trump in 2024? We talk to Allum Bokhari about this and other things on this episode of The Ezra Levant Show.


Transcript

00:00:00.000 Hello, my friends. What a great conversation today with really one of the smartest guys in tech,
00:00:04.760 Allum Bokhari. He used to be a journalist at Breitbart.com. Now he's the head of an online
00:00:11.940 freedom activist group. We're going to talk about everything from Elon Musk to Mark Zuckerberg and
00:00:16.940 the election. You don't want to miss it, but I'm going to show you a bunch of clips and I want you
00:00:21.320 to see them with your eyes, not just hear them with your ears. To get the video version of this
00:00:26.300 podcast, you got to sign up at rebelnewsplus.com. Just click rebelnewsplus.com, hit subscribe. It's
00:00:33.080 eight bucks a month. That might not sound like a lot of money to you, but boy, it sure adds up with us
00:00:36.920 because as we'll talk about with Allum, we have been demonetized by big tech, so we rely on you.
00:00:43.280 All right, here's today's podcast.
00:00:56.300 Tonight, big tech companies helped sway the election for Joe Biden in 2020. Will they help
00:01:07.180 free the election for Trump in 2024? We talked to Allum Bokhari about this and other things.
00:01:12.640 It's October 22nd, and this is The Ezra Levant Show.
00:01:18.680 Shame on you, you censorious bug.
00:01:26.300 You know, I saw this amazing graph the other day, and I almost didn't believe it, but
00:01:34.400 in my bones, I think it's true. It shows the donations by senior tech executives over the
00:01:41.680 last few election cycles. Now, you would think that people in tech are engineers and doers and
00:01:47.740 builders and entrepreneurs and problem solvers. You would think that by nature, those folks are
00:01:53.680 right of center or conservative. I remember when I was a student at University of Calgary,
00:01:58.540 whenever I was with the Young Reform Party handing out flyers, our best departments were
00:02:03.780 engineering and management. We stayed away from the soft social sciences. But over the last
00:02:11.180 generation, you can see how tech became colonized by leftists, and that's because the tech bros,
00:02:19.300 the engineers were soon replaced by HR specialists and censors and other more, I would call them more
00:02:28.500 feminine activities. And I think this was crystallized by a series of those day in the life
00:02:35.460 at Twitter videos that caused Elon Musk to purge 80 percent of the staff when he arrived.
00:02:41.280 If you don't know the videos I'm talking about, here's one that just showed who started to work for
00:02:46.620 tech with massive six-figure salaries over the last decade. Take a look.
00:02:51.580 Hey guys, come to work with me at Twitter in Atlanta. This was my first time going into the
00:02:55.740 office in such a long time, but it was nice to have a change of scenery from my apartment.
00:03:00.140 Look at my co-worker Bree. She's so cute. For lunch, we decided to go downstairs to Bar Vegan. If you
00:03:04.720 haven't tried it before, it's a black-owned restaurant inside of Ponce Market. We ordered the
00:03:09.000 quesadillas with tots and then also got a fancy pants cocktail, and they were all really good,
00:03:13.400 so I'll definitely be back. After lunch, we came back upstairs to an extremely empty office,
00:03:19.180 but honestly, we were just so proud of our productivity. After work, we went back downstairs
00:03:23.980 to Monero's to reward ourselves with some after-work margaritas. We stayed at happy hour
00:03:29.340 until around 7 p.m., and then I finally headed home to enjoy a well-deserved bubble bath. Bye, guys.
00:03:34.500 Yeah, when Elon Musk came aboard, he fired 80% of the staff, and I would put it to you that the
00:03:39.980 product is stronger than ever, but my point is tech was colonized, and a lot of it was on purpose.
00:03:46.160 For example, George Soros himself bankrolled various NGOs designed to change the terms of
00:03:52.220 service for using tech platforms. He actually created an NGO called Change the Terms, which was
00:03:58.580 to get tech companies themselves to ban and censor political activists on the right. Well,
00:04:06.100 things are snapping back. You can see it in big tech investors, including that all-in podcast. These
00:04:13.740 are billionaires who have made their money in Silicon Valley and have historically voted Democrat,
00:04:19.180 but they are swinging back to the right, partly appalled by the state of San Francisco, which has
00:04:25.200 become a drug-infested, woke meltdown, and partly by Donald Trump. And I don't know, I think that
00:04:33.600 there's just something going on out there, and the personification is Elon Musk, who himself has gone
00:04:39.720 all in, not just campaigning grassroots level with Donald Trump, but setting up his own PAC called
00:04:47.440 America PAC and giving away a million dollars a day to people who sign up to his
00:04:54.660 petition to support the First and Second Amendments. A lot going on. And when I want to think about the
00:05:00.440 state of tech and politics, there's really only one name that comes to mind. We met him when he was
00:05:06.500 a senior tech writer at Breitbart.com, and we've kept in touch with him as he's moved on to a different
00:05:11.700 project. He's now the managing director of the Foundation for Freedom Online. You know who I mean,
00:05:18.500 our friend Allum Bokhari, who joins us from Austin, Texas. Allum, it's great to see you again.
00:05:23.660 Don't mind me rambling on there, but I really think there's movement. For the first time in memory,
00:05:29.580 big tech is, if not leaning to the right, certainly showing its skepticism towards the woke left. That's
00:05:35.880 new, isn't it? That is. It's new in a sense, because, you know, tech used to be somewhat balanced
00:05:43.400 between Democrats and Republicans. But there was sort of a, I would say, a counter-revolution
00:05:51.540 between 2016 and 2020, when tech, along with, you know, every other major sector of the economy
00:06:00.820 and society and institution, just became completely radicalized. They bought into the media hype,
00:06:06.600 which painted Donald Trump, who won in 2016, as, you know, the second coming of Hitler.
00:06:11.840 And they just shifted massively to the left. And that's when you saw the peak
00:06:15.000 of censorship in Silicon Valley. You saw the peak of left-wing attitudes in Silicon Valley.
00:06:20.700 And I think that was partly because of the media scaremongering around Donald Trump. It was also,
00:06:27.220 as you said in your opening remarks, you know, the fact that Silicon Valley had become bloated
00:06:31.580 with all of these non-technical employees who leaned to the left.
00:06:37.000 Yeah. Now, there still are major tech players who are obviously hard left-wing. I think of Benioff
00:06:45.800 from Salesforce.com, even Mark Zuckerberg, who poured hundreds of millions into get-out-the-vote
00:06:52.760 strategies last time, which he claimed were neutral, but were so obviously designed to boost
00:06:59.520 the Democrats. He has shown a little bit of remorse, I think, or embarrassment. I think he's tried
00:07:06.600 to rebrand himself as a regular guy. And he wrote a letter a few weeks ago where he basically said,
00:07:12.300 I'm not going to meddle this time. People took it the wrong way, and I'm going to be super neutral.
00:07:17.240 I didn't believe a word of it. And I see in your old stomping grounds, Breitbart.com,
00:07:21.440 a story basically highlighting James O'Keefe's latest sting, where a senior engineer at Facebook
00:07:29.340 Meta admits that Facebook demotes anti-Kamala Harris posts and shadow bans conservatives.
00:07:36.200 So Mark Zuckerberg might be telling the truth about his personal ideology, but his company is so large
00:07:43.080 and it's so infested with leftists, they seem like they're censoring like it's 2020 again.
00:07:49.240 Yeah. And that goes to the point you made earlier about Elon Musk coming into Twitter,
00:07:55.820 formerly Twitter, now X, and cutting 80% of staff. Mark Zuckerberg, while he personally may not be
00:08:03.280 planning to interfere in the 2024 election on the side of Kamala Harris, maybe his personal
00:08:08.340 political viewpoints have changed. Maybe people around him at the top of Facebook, it's the same
00:08:12.660 story. But I think censorship became so deeply ingrained in every major tech company between 2016
00:08:19.580 and 2020, you know, spread out across their content moderation departments, their trust and safety
00:08:25.580 departments, as well as all the NGOs and government agencies that they relied on for censorship advice,
00:08:33.260 which are all pro-censorship. We cover them a lot at the Foundation for Freedom Online.
00:08:37.340 That's so deeply ingrained that unless you do the kind of cutting we saw with Elon Musk and X,
00:08:43.920 those systems are still going to live on. Those people in the content moderation department are
00:08:49.040 still going to be there and they're still going to be doing the same things. And I think that's why
00:08:52.480 you see stories like the one you mentioned at Breitbart about the Facebook employee.
00:08:57.160 Yeah, that Breitbart report was basically unpacking a video by James O'Keefe. Let's play a little bit of
00:09:02.560 that video. This is Jeevan Gyalwali, a senior software engineer at Meta. Let me show you the
00:09:07.960 primary source. I was crediting Breitbart because that's where I saw the video, but it's actually
00:09:11.940 James O'Keefe, the renowned undercover investigator. Here, take a look at the primary video with your own
00:09:18.660 eyes.
00:09:19.140 Paul in like Ohio said something about like Kamala Harris is unfit to be a president because he doesn't
00:09:27.100 have a child. That kind of shit is automatically demoted.
00:09:30.500 Are they doing a good job protecting our democracy?
00:09:35.360 Because I can see these like right-wing groups like setting up Instagram accounts or Facebook
00:09:39.780 accounts for that matter, right? And just start posting this information to be like, oh, like
00:09:43.780 Harris is like blah, blah, blah.
00:09:45.680 That's all going to be demoted 100%. The civic classifier is very strong.
00:09:49.200 Would the person who posted that be made, would he be notified that...
00:09:54.500 The person would not be notified, but there's these things that we, what you collect is like,
00:09:59.960 if a bunch of items that, like if at least a couple of items that a person has created,
00:10:06.460 has violated civic classifier, then they're also red-listed.
00:10:10.600 Called shadow banning?
00:10:11.800 Shadow banning, yeah.
00:10:12.500 Okay, so they would be shadow banning.
00:10:13.700 Correct.
00:10:14.180 They would never be shown.
00:10:15.720 But so they will see a dip in like impressions and engagement, but they would not be like
00:10:20.900 officially warned of the reason why.
00:10:23.780 Correct.
00:10:24.300 Well, Allum, back to what you said there.
00:10:25.580 I think it was so ubiquitous, this censorship and the phrase disinformation and misinformation
00:10:31.220 and the community standards, this whole censorship industry and its related fact-checking industry,
00:10:40.560 which was, of course, more opinion-checking.
00:10:43.080 If you had the wrong opinions, you would be checked.
00:10:45.380 I think it became so ubiquitous that people just accepted that's how it is.
00:10:49.780 It's sort of like, if I may refer to another debate, global warming.
00:10:53.900 It's so pervasive.
00:10:55.160 It's so ubiquitous that unless you're actively trying to dissent, it's like a fish that doesn't
00:11:03.160 even know it's in water.
00:11:04.860 It's so completely infused in everything, you have to be a real willful skeptic to oppose
00:11:15.400 the global warming thesis.
00:11:17.980 And I think maybe to defend some tech leaders, it's just everywhere they went, everyone was
00:11:23.920 talking about it.
00:11:24.520 It's all government asked them about when they would go to Congress or Parliament.
00:11:29.340 The politicians were demanding it.
00:11:31.320 And these engineering nerds who think they're sophisticated masters of the universe, I think
00:11:38.420 they got totally manhandled by the politicians.
00:11:41.400 And I think they were scared that if they didn't comply, they would be absolutely destroyed
00:11:46.520 through regulation.
00:11:47.440 So I think they entered into a sort of partnership.
00:11:48.980 And by the way, you can see some of the emails that are being disclosed through litigation
00:11:53.660 that even big tech companies like Facebook were appalled by the censorship demands put
00:12:01.260 on them.
00:12:01.580 But they were scared.
00:12:02.840 They were scared that if they didn't do what the FBI, CIA, Congress were demanding of them,
00:12:08.620 that they would be in trouble.
00:12:09.840 So to cut them a little bit of slack, I think it was absolute peer pressure.
00:12:15.520 There was a, it was a velvet glove, but there was a steel fist underneath it.
00:12:19.360 I think that it takes genuinely dissenting thinkers like Elon Musk and some of the other names
00:12:26.780 I mentioned, David Sacks, Marc Andreessen, Bill Ackman, Peter Thiel, that unless you are
00:12:32.420 by nature a disrupting contrarian, it's, it would be hard to fight against the censorship
00:12:40.260 vibe.
00:12:42.340 Extremely difficult.
00:12:43.540 And, you know, as you said, the pressure between 2016 and 2020 on tech companies was
00:12:49.040 extraordinary.
00:12:50.040 It was coming from all directions.
00:12:52.120 It was coming from activist organizations and NGOs.
00:12:57.060 It was coming from big advertising companies that would repeatedly threaten to boycott the
00:13:02.560 tech platforms if they didn't come down hard on censoring political opinions.
00:13:08.160 You had media companies, big media companies that encouraged those advertiser boycotts.
00:13:13.120 They did it again and again.
00:13:14.140 They did it against YouTube.
00:13:15.920 They did that against Facebook.
00:13:17.800 And then they did it again against, against Twitter just recently when Elon Musk took over.
00:13:22.440 So you had that, that, that, you know, really powerful force of advertiser pressure on the
00:13:28.860 tech companies, all of which rely primarily on ad revenue to, to make a profit.
00:13:34.840 On top of that, you had every, as you said, every single government around the world telling
00:13:40.500 the tech companies that they need to do something on disinformation.
00:13:43.840 They need to censor disinformation.
00:13:45.320 They need to censor hate speech.
00:13:47.880 And that included, by the way, many U.S.
00:13:50.380 government agencies, because 2016 to 2020 was this very strange period where Donald Trump
00:13:57.480 was in office.
00:13:58.380 But many of the government agencies were a law unto themselves, and they were kind of
00:14:03.800 working to undermine him, including by encouraging tech platforms to censor his supporters.
00:14:09.920 CISA, the Cybersecurity and Infrastructure Security Agency at DHS, was a big culprit involved
00:14:20.440 in that in 2020.
00:14:21.920 We've covered it a lot at the Foundation for Freedom online.
00:14:24.660 So all of these, all of these sources of pressure, advertisers, activists, NGOs, universities,
00:14:33.500 the media, and government agencies, not just the U.S., but around the world, all of these
00:14:38.820 like combined, create this extraordinary pressure on the tech companies between 2016 and 2020.
00:14:44.600 It was hard to see how any tech company could avoid caving in.
00:14:49.200 And, you know, that remained the case until just recently, where we had the simultaneous
00:14:56.140 factors of Elon Musk taking over X, and the House Judiciary Committee and other congressional
00:15:03.340 committees investigating the censorship apparatus that the U.S.
00:15:07.440 government supported.
00:15:08.200 You know, while you were saying things, I just checked, I just looked something up very
00:15:14.720 quickly there.
00:15:15.460 In Canada, there was an edgy online journalist, sort of like in your country.
00:15:22.840 What's her name again?
00:15:22.840 Taylor.
00:15:24.760 I forget the journalist for The Washington Post who was their tech correspondent.
00:15:30.180 Oh, Taylor Lorenz.
00:15:31.760 Taylor Lorenz.
00:15:32.400 Thanks very much.
00:15:32.960 I'm glad I forgot her name.
00:15:34.240 That wasn't just a Joe Biden moment for me there, Allum.
00:15:36.880 Taylor Lorenz, who's, by the way, well into middle age, positioned herself as sort of the
00:15:42.680 young, hip online reporter.
00:15:44.740 And I'll tell all you folks at The Washington Post what the kids are up to.
00:15:48.040 It was sort of like that Steve Buscemi meme, hello, fellow youth.
00:15:52.580 And anyway, Taylor Lorenz offered to be the guide for boomers through the world of the internet.
00:15:58.640 But what she really was, she engaged in sort of gotcha style, terrorizing activism journalism,
00:16:04.480 where she would call up a big tech company or call up an advertiser and say, hey, you
00:16:10.420 are showing an ad on this hate site.
00:16:13.380 I'm doing a story in two hours.
00:16:16.120 Let me know if you're going to continue carrying this hate message or I'm going to get you.
00:16:22.140 Like it wasn't really journalism.
00:16:23.640 It was extortion journalism that would serve to terrify, like it was, everyone was scared
00:16:30.400 of everyone.
00:16:31.400 And these bottom feeding fake journalists were part of it.
00:16:35.080 In Canada, our version of Taylor Lorenz was Rachel Gilmore, who was fired from Global.
00:16:40.420 And then she went to work as a journalist, but not for any news outlet, for some US based
00:16:46.320 NGO whose entire purpose was to terrorize advertisers on right wing sites.
00:16:52.200 I mean, Rebel News lived through this in 2017.
00:16:54.560 We really have no advertising to speak of.
00:16:56.700 We managed to survive through crowdfunding.
00:16:58.440 But if you're a massive platform like Facebook, Twitter, TikTok, Instagram, et cetera, you can't
00:17:03.740 survive without advertising.
00:17:05.340 Advertising is your entire model.
00:17:07.180 So between government regulation on the one hand, ideological entryism and colonization,
00:17:13.940 especially of your HR department, and then pressure from advertisers, it's a miracle that
00:17:19.320 any sites resisted censorship.
00:17:21.680 And Elon Musk is the guy.
00:17:24.700 Hey, let me ask you a question about this.
00:17:26.400 I haven't heard your view on this.
00:17:28.180 A couple of months ago, Elon Musk, through Twitter X, launched a lawsuit against some advertising
00:17:34.840 sort of, I'm not going to use the word cartel, but it was like a group of advertisers who
00:17:40.300 were concerned about hate speech online.
00:17:43.200 And he sued them, alleging bad faith that they were basically trying to put Twitter out
00:17:48.940 of business.
00:17:50.080 After Elon Musk's lawsuit, this coalition was disbanded.
00:17:53.300 Do you know anything about that lawsuit or its status?
00:17:55.700 Do you have anything to say about that?
00:17:56.700 Actually, yes.
00:17:59.420 We did a major report on this at the Foundation for Freedom Online.
00:18:02.960 The cartel was called, as it was in the past tense, the Global Alliance for Responsible
00:18:08.140 Media, and they actually shut down.
00:18:11.020 They ceased operations shortly after that lawsuit was filed.
00:18:14.860 Also, shortly after, we published a report on the government contracts that many of their
00:18:20.160 members have in the US, totaling over $1 billion to many of the big advertising agencies.
00:18:26.340 That were a part of this cartel that boycotted X or encouraged boycotts of X and went to
00:18:33.460 tech platforms, you know, with, you know, helpful advice on their content moderation policy and
00:18:38.740 how they could censor more people.
00:18:40.120 That's shut down now.
00:18:41.160 And that happened shortly after the lawsuit from X and also after investigations by the
00:18:46.620 House Judiciary Committee.
00:18:48.580 But what that organization was, it represented, I believe, because it was a part of the World
00:18:52.920 Federation of Advertisers, which represents over 90 percent of all advertising spending
00:18:58.720 worldwide.
00:18:59.540 So it's basically the entire advertising industry that was involved in this group.
00:19:04.040 And it had one purpose, which was to create uniform standards of censorship on social media
00:19:11.060 and on online platforms.
00:19:12.560 And that's exactly what they did.
00:19:13.860 They would go to platforms like Spotify and say, hey, did you know that Joe Rogan, your
00:19:18.960 top podcaster, had a vaccine skeptic on?
00:19:23.400 You know, were you aware of this?
00:19:24.560 This could be a problem.
00:19:25.720 We don't like this.
00:19:26.620 That's essentially what they were doing.
00:19:29.600 And it's kind of extraordinary because the platforms and the personalities that they complained
00:19:37.440 about were extraordinarily popular.
00:19:39.340 Joe Rogan, the number one podcaster in the world.
00:19:42.060 And they were complaining that, you know, about one episode with a vaccine skeptic.
00:19:48.060 They also, in communications that were obtained through congressional subpoenas, you can see
00:19:56.500 them admitting to targeting conservative media, targeting media they didn't like by, you know,
00:20:03.120 watching them like hawks for any transgression of their principles.
00:20:08.860 So, I think they specifically mentioned the Daily Wire and Breitbart News, how they specifically
00:20:16.860 watched those sites for any violation so that they could go to their clients and say, you shouldn't
00:20:22.780 advertise on these websites.
00:20:25.000 So, they admitted to, you know, unevenly applying their own standards to websites that they didn't
00:20:31.000 like.
00:20:31.400 But I'll say another thing about the advertising boycotts related to what I just said about
00:20:37.580 Joe Rogan.
00:20:38.560 It's kind of a trick.
00:20:41.400 And you can see how ideological these people are when you look at their internal communications,
00:20:45.060 because many of the platforms and personalities that received the most advertiser boycotts were,
00:20:53.680 you know, extremely popular platforms, Tucker Carlson, almost all his advertisers left when
00:20:59.640 he was at Fox News.
00:21:01.140 Joe Rogan.
00:21:02.640 You see this massive advertising conglomerate complaining about his guests.
00:21:10.680 Twitter, X, over 100 million users.
00:21:14.500 So, these are huge audiences that advertisers should want to be in front of.
00:21:20.840 And yet, these advertising agencies and these advertising cartels are telling them, well,
00:21:26.420 the risk to your brand is so great that you shouldn't be on there.
00:21:30.180 There'll be a massive consumer backlash.
00:21:32.000 But there's never been any sign of a consumer backlash.
00:21:34.200 The only backlash against advertisers for advertising in these places has come from the media.
00:21:39.720 It's come from activists.
00:21:40.500 It's come from a minority, a vocal minority of, you know, journalists and activists.
00:21:47.680 It hasn't come from the public.
00:21:49.160 No, there's never been a mass consumer boycott of any brand because their advertisers appeared
00:21:54.960 on conservative media or appeared next to an offensive post on X.
00:21:58.760 It's all been from journalists and the activists.
00:22:01.300 Plus, the only time, actually, in the last five years where you do see massive consumer backlashes
00:22:07.000 against brands come from the brands being too woke, like in the case of Bud Light or in
00:22:11.840 the case of Gillette in 2019 when they did their toxic masculinity ad.
00:22:17.620 That's where you see brands actually losing customers and losing market share over a consumer
00:22:22.800 backlash, not because they advertise on platforms that are too free speech friendly.
00:22:29.180 That's exactly right.
00:22:30.080 You know, that's what happened to us at Rebel News in early 2017.
00:22:34.860 I've seen internal documents at Google, YouTube, where they say, oh, yikes, Rebel News is pretty spicy.
00:22:41.140 Better warn the advertisers.
00:22:43.060 Really?
00:22:43.720 So they were activists within YouTube, Google that were trying to block us.
00:22:48.700 And we managed to find other ways to survive.
00:22:51.480 But before we were shut down, we were on track to make a million dollars a year from YouTube ads.
00:22:58.120 And that's been reduced to zero.
00:22:59.620 We're still on YouTube because they haven't deleted our account.
00:23:02.580 So we still want to talk to people there.
00:23:04.620 But that has cost our company coming up on $10 million, which is a lot of crowdfunding we've
00:23:10.540 had to do to survive.
00:23:12.080 It's absolutely atrocious.
00:23:13.340 But what do you do?
00:23:13.920 There was nowhere to go until Rumble came along.
00:23:17.240 And it's still only about 1% the size of YouTube.
00:23:21.680 But they've had a real dedication to free speech.
00:23:24.500 And, of course, Elon Musk buying Twitter was the biggest thing of all.
00:23:27.540 I see Guy Verhofstadt, who's a big European Union bureaucrat.
00:23:32.960 He's tweeting just today about how delighted he is that some investment firm estimates the
00:23:42.060 value of Twitter is down by 75 or 80%.
00:23:45.760 I mean, just the glee with which the European Union bureaucrats want to destroy Elon Musk.
00:23:52.520 And it's purely ideological.
00:23:53.880 It's not just—I mean, I saw the other day that Elon Musk was saying he's worried about
00:23:57.360 going to the United Kingdom physically.
00:23:59.860 Like, he's worried about traveling there because you have politicians there calling for
00:24:03.700 his arrest over Twitter, speaking candidly about some of the mass immigration and the
00:24:09.980 race riots there.
00:24:11.220 Here's a clip of Elon Musk saying that.
00:24:12.800 I mean, I don't know if this is just—if he's just being dramatic, but the world's
00:24:17.960 most industrious man, the most prolific entrepreneur, the space adventurer, the electric car builder,
00:24:26.800 the essential, indispensable man, perhaps the most creative man, certainly the largest wealth
00:24:33.840 creator in history, he just drops the comment that he is afraid to travel to the United
00:24:41.080 Kingdom for fear of being arrested, not for any crime, but for his opinion.
00:24:46.000 Here, take a look at this.
00:24:47.040 There's a lot of—I mean, although we've got quite a lot of bureaucracy here, but in Europe,
00:24:52.060 they've got country-level bureaucracy, and then they've got EU bureaucracy on top of that.
00:24:56.720 You know, I mean, the EU headquarters in Brussels is a monument to bureaucracy.
00:25:01.880 It's really next level.
00:25:03.360 You know, and they don't—unlike America, they don't have a First Amendment.
00:25:11.040 You don't actually have freedom of speech in Europe.
00:25:15.000 So, you know, but we're kind of like a pretty rare situation having freedom of speech.
00:25:24.120 So, you know, like there's crazy stuff happening in the U.K. where, you know, people are getting
00:25:30.800 like two, three-year prison sentences for Facebook posts.
00:25:34.600 Yeah, like I'm like, I don't think I should go to, you know, visit Britain, because I'm like, they're going to, like, drag out some, you know, tweet and say two years in prison for this tweet or something bullshit like that, you know?
00:25:53.200 So, anyway, I think that's, you know, it's Trump elected.
00:25:56.680 We can put a stop to that stuff and say, like, ah, no way.
00:26:00.480 No, no.
00:26:02.480 Nope.
00:26:04.600 No throwing people in prison for random social media posts.
00:26:10.360 That's crazy.
00:26:12.160 Yeah, it's happened a lot.
00:26:14.340 No, I mean, it's so crazy in Britain that they were—they actually have released convicted pedophiles in order to imprison people for making social media posts.
00:26:23.400 That is a real thing that happened.
00:26:25.800 Insane.
00:26:26.440 I just got to throw one more thing at you, Allum, because I know you're originally from the U.K.
00:26:29.120 We can hear it in your delightful accent.
00:26:30.840 But there was this conference that the new labor government of the U.K. had trying to attract foreign investment.
00:26:39.020 Good luck with that, by the way.
00:26:40.160 But they specifically did not invite a man who's building factories everywhere, who's ramping up.
00:26:47.440 They deliberately did not invite Elon Musk.
00:26:50.940 And Trevor Phillips, who's a very interesting British commentator, was grilling the labor cabinet minister about that.
00:26:58.220 Three times he went back to him, why not Elon Musk, why not Elon Musk, and the labor government just refused to say why.
00:27:05.800 Take a look at this outstanding video.
00:27:08.280 Big up to Trevor Phillips.
00:27:09.380 Take a look.
00:27:10.000 Why didn't you invite Elon Musk?
00:27:13.040 You're desperate to get a company which sacks its employees by Zoom.
00:27:18.520 But you're sniffy about the biggest carmaker in the world because he put something on social media he didn't like.
00:27:25.080 Look, I'm not going to comment on particular invitations for a particular person.
00:27:29.180 Come on.
00:27:29.800 Elon Musk is not some odd invitation.
00:27:32.940 It is Elon Musk, biggest carmaker in the world, richest man in the world.
00:27:35.880 Why didn't you invite him?
00:27:37.480 Look, I'm not going to comment on the reasons for any specific person.
00:27:40.300 But I can tell you we have 300 of the most significant investors, business figures, people who can bring significant amounts of capital to the UK.
00:27:50.020 Big names, things that will make a big difference to working people's lives.
00:27:53.460 And that's the criteria.
00:27:54.000 You're happy to talk to me about DP World, who sacks their workers.
00:27:58.020 You're happy to invite the Saudis, who authorised the murder of Jamal Khashoggi.
00:28:03.960 And they get the red carpet.
00:28:05.460 Why isn't Musk being invited?
00:28:07.720 I'm not going to comment on particular.
00:28:08.700 This is you in opposition, isn't it?
00:28:10.300 Again.
00:28:10.680 No, not at all.
00:28:11.500 This is about who can bring the kind of investments that will make the biggest difference to the UK, to working people's lives.
00:28:17.920 The fact that we have, I think, in 100 days...
00:28:20.440 Oh, Musk hasn't got a bob or two that he could put into Britain.
00:28:23.820 Well, look, the criteria and the selection.
00:28:26.360 This is a summit.
00:28:26.960 I know that everyone wants to come.
00:28:28.300 I do understand that.
00:28:29.140 Not everyone can come.
00:28:30.560 And it's not going to be right to go through the individual decisions for individual people.
00:28:34.820 But this is about what will make the biggest difference.
00:28:37.080 Look, look.
00:28:37.360 You understand how weird this sounds?
00:28:41.660 You want people to come and invest in Britain.
00:28:44.420 You want people to bring their money.
00:28:47.040 Yet the one person who probably has got more money to burn and would probably like to invest in Britain.
00:28:54.160 In fact, he says so publicly when he didn't get the invitation.
00:28:58.360 You're deciding he's not good enough.
00:29:00.700 For what reason?
00:29:02.040 No, look.
00:29:02.520 If people have an investment proposition for the UK, of course, I will talk to them about it.
00:29:08.240 I'm not going to go through...
00:29:09.340 Look, thousands of people, Trevor, wanted to come to this summit.
00:29:12.200 Of course they did.
00:29:13.280 Elon Musk is not that.
00:29:14.720 Thousands of people.
00:29:15.740 Well, look.
00:29:16.020 I'll simply say that the people coming, you'll see the scale of the investments.
00:29:19.020 You'll see the opportunities for the UK.
00:29:21.400 You'll see the kind of quality and depth of the engagement this new government has.
00:29:25.740 And I think to individually talk about a certain person here or there, look, that's not fair on them,
00:29:30.300 Let alone the government.
00:29:31.120 I love that.
00:29:32.400 And we all know what it is.
00:29:34.140 It's his politics.
00:29:35.340 And it's not just the UK.
00:29:36.340 I won't pick on them.
00:29:37.300 The California Coastal Commission recently cited his tweets for a reason why they won't let more spaceships take off from Vandenberg in California.
00:29:46.020 They literally said we don't like his tweets, so we're not going to let him fire spaceships.
00:29:50.540 Even Gavin Newsom, the governor of California who is at odds with Elon Musk, said that's nuts.
00:29:55.400 He...
00:29:56.080 They hate him so much.
00:29:57.840 And when I was at Davos, Switzerland, last year for the World Economic Forum, the two names that people hated were Trump and Musk.
00:30:05.120 Those were the two names on everyone's list, Trump because of the threat he poses to the global order, and Musk because people see that he will allow Trump to succeed if Trump is able to succeed.
00:30:16.280 I think...
00:30:17.260 I hate to say it.
00:30:19.420 You know, they say that graveyards are full of indispensable men, as in there's no such thing.
00:30:24.760 But I think Elon Musk is as close as it gets to an indispensable man in this moment.
00:30:29.440 That's right, and he's similar to Trump in that, you know, the media loved him, celebrities loved him, you know, left-wing politicians loved him because of all the...
00:30:42.440 Because of, you know, the green car revolution, the electric car revolution.
00:30:45.640 And simply by changing his political opinions, all of these people, the same people turned against him and completely forgot about, you know, all these other things.
00:30:55.640 The California Coastal Commission being like the craziest, the fact that they say it openly.
00:31:00.220 Normally, there's a sense of plausible deniability, you know, as that Labour MP was very unsuccessfully trying to do, trying to pretend that the reason he's being ostracized is for some neutral, non-political reason.
00:31:12.740 And what's funny about the California Coastal Commission is they came out and said it openly.
00:31:16.960 But we all know that's the case with the European Union and its investigations of Musk's company.
00:31:21.780 We all know that all of these reasons they come up with for investigating Musk's companies or, you know, denying some permit for a space launch, it's all just every supposedly neutral reason they come up with is just a fig leaf for a political retaliation, political retribution.
00:31:41.560 That's what's going on.
00:31:43.460 You know, I was in the UK and I was in Ireland over the weekend, Alan, and I bought a bottle of water and I discovered this new regulation.
00:31:54.500 I don't know if it's in the UK or just the EU, but when you unscrew a bottle from the top of the water, it doesn't come off completely.
00:32:01.280 There's this amazing innovation that the bottle cap, the cap stays on when you're drinking.
00:32:09.000 So it sort of squishes against your face or your nose and it's got that, it's a little bit prickly, right, where you broke it.
00:32:15.180 It's the most annoying thing in the world.
00:32:17.680 I'm sure it's done in the name of environmentalism because you can't drop the cap separately from the bottle.
00:32:24.000 I guess you could litter the whole thing.
00:32:26.060 And you know what I thought, Alan?
00:32:28.080 I thought in America you've got Elon Musk who's sending 90% of the Earth's payload into space, 10 times more than every other country in the world combined.
00:32:38.180 He's landing rocket ships vertically, being clasped by those big, huge robot arms.
00:32:46.440 He's doing stunning things.
00:32:48.860 He's giving internet to the world through Starlink.
00:32:52.060 He's got Tesla.
00:32:54.200 He's got this Neuralink program, which I don't know much about.
00:32:57.180 He's got, like he is, that's what he's doing.
00:33:01.160 But the European Union, what's their great innovation?
00:33:05.120 What's their competition to Elon Musk?
00:33:07.020 They've got that irritating little bottle cap that scratches your nose when you drink water.
00:33:12.360 And I thought, those are the two paths.
00:33:15.180 You know, choose wisely, young man.
00:33:17.240 On one hand, you have Elon Musk going to the stars.
00:33:20.140 On the other hand, the European Union with their regulations about bottle caps.
00:33:24.400 I thought that was, I just, I thought that is the state of the world today.
00:33:28.300 What do you think of that?
00:33:30.320 That's completely nuts.
00:33:31.680 That's a classic European Union regulation.
00:33:31.680 Now, you're right, that probably is because of regulation that the caps don't come off the bottles.
00:33:37.280 I always wondered why that was the case, because it's the case in the UK, it's the case in Europe.
00:33:41.860 It must be a regulation, surely.
00:33:44.000 The other thing to remember is, I think, there is a sort of insidious relationship between the EU, Brazil and other foreign governments that are going after Musk and Musk companies and the United States.
00:34:00.800 Because a lot of United States soft power organizations and NGOs and companies have worked with the U.S. government to promote anti-disinformation research or to monitor disinformation online.
00:34:15.320 And they're very active in places like the UK and the European Union and Brazil, pushing for more regulation of social media platforms.
00:34:28.080 And the State Department, which is supposed to stand up for American companies when a foreign government is threatening heavy handed regulation, didn't do much to stop the Digital Services Act or even to water down the Digital Services Act, which is the European Union regulation that's being used to go after Musk companies.
00:34:47.620 The State Department should have done that, and they didn't.
00:34:50.740 And I think the reason they didn't is because the U.S. government probably quite likes it when the European Union forces American tech companies to go after disinformation and hate speech.
00:35:03.520 It's something they can't do directly themselves because of the First Amendment, even though they'd like to.
00:35:08.980 But a massive jurisdiction like the European Union doing it is the next best thing.
00:35:13.840 Yeah, we were down in Sao Paulo, Brazil about a month ago for the massive rally after their authoritarian regime banned Twitter.
00:35:23.780 They were banning individual Twitter accounts.
00:35:25.880 Elon Musk revealed that publicly, so they banned the whole app.
00:35:29.740 I was astonished, Alan.
00:35:31.300 I went down there because I wanted to see with my own eyes, is it really a rally about free speech?
00:35:35.560 Because that would be very unusual for people to rally for something so abstract.
00:35:40.820 You know, it just seemed like I've never seen a rally for freedom of speech in my life, actually.
00:35:48.140 And it's true.
00:35:49.260 Their signs were about Twitter and free speech and Elon Musk.
00:35:52.860 Just for fun, here's some of the streeters we did with Brazilians.
00:35:56.900 And by the way, it was a wonderful feeling.
00:35:58.680 Every age, every race.
00:36:00.340 Brazil is a very mixed race society, different classes.
00:36:04.160 There was an amazing unity.
00:36:05.840 And everyone was using the vocabulary of freedom.
00:36:09.320 It was really one of the most astonishing rallies I've ever been to.
00:36:11.820 And the fact that 200,000 people were there was just the punctuation mark at the end.
00:36:16.160 But here, just a little flashback.
00:36:17.640 Here's some of the streeters I did in a massive pro Twitter rally in Brazil.
00:36:23.060 Take a look.
00:36:23.840 Hello, Musk.
00:36:24.860 Thank you for fighting for our freedom.
00:36:27.240 We are with you.
00:36:28.340 Musk, stay firm.
00:36:30.680 Don't get excited.
00:36:31.380 We help you because Brazil needs you and the world needs you.
00:36:34.860 Because we need freedom.
00:36:36.380 And you're a warrior, a fighter.
00:36:38.060 The ordinary Brazilian people, how do they feel about Elon Musk?
00:36:42.240 Do they even know who he is?
00:36:44.000 You know, most of the people are with Elon Musk.
00:36:47.420 Because we see what's going on.
00:36:49.240 We see, we know what is right.
00:36:51.120 And that's why we think that we, and we support him.
00:36:55.120 But it's easy to be at Lula's side.
00:36:57.260 I love Elon Musk.
00:36:59.100 Oh, I love Elon Musk.
00:37:01.460 He's helping us so much.
00:37:04.040 I hope so because he has a lot of power, a lot of money, a lot of everything.
00:37:09.660 I hope he can help us.
00:37:12.340 You trust him a lot.
00:37:15.040 If you had one message for the owner of X, Elon Musk, what would it be?
00:37:20.980 I didn't hear a peep from the State Department about a massive American company being blocked
00:37:32.360 like that.
00:37:33.120 And it wasn't just Twitter being blocked.
00:37:34.940 They went after Elon Musk's other unrelated holdings.
00:37:37.980 They went after Starlink, which is completely unrelated to Twitter.
00:37:41.820 And you would think that the U.S. State Department, by the way, here in Canada, the most terrifying
00:37:46.360 American is not any senator.
00:37:48.580 It's not even the president.
00:37:49.900 It's the U.S. trade representative.
00:37:52.120 If the U.S. trade representative thinks that Canada is engaging in monkey business in
00:37:57.460 softwood lumber, in mining, in environment, holy moly is that terror because there's trade
00:38:04.800 sanctions, there's tariffs, there's litigation in these free trade agreements.
00:38:10.980 I mean, they stand up for American rights in the economy.
00:38:16.280 Absolute silence when it was Elon Musk that was being basically expropriated by these other
00:38:22.280 countries.
00:38:23.060 You've been very generous with your time.
00:38:24.540 I just wanted to show two more things to you.
00:38:27.460 Actually, I'll just show one more thing because we've been interested in the case of Tommy
00:38:31.860 Robinson, and not everyone shares my taste for Tommy Robinson.
00:38:36.960 He's a little bit rambunctious and rough around the edges, but I believe that he is a warning.
00:38:43.140 He is the canary in the coal mine, just like Alex Jones is in America.
00:38:46.400 You don't have to like Alex Jones.
00:38:47.920 You don't have to agree with everything about Alex Jones.
00:38:49.760 But when he was canceled in a 24-hour period by 14 different social media companies on the
00:38:57.460 same day, including LinkedIn and, I think, Pinterest.
00:39:02.920 It was just unreal, you know, Spotify, YouTube, all his accounts, 14 different accounts suspended
00:39:09.580 on one day.
00:39:10.380 Don't tell me that's not political collusion.
00:39:13.060 People said, oh, it's just Alex Jones.
00:39:15.600 He's crazy.
00:39:16.520 He's out of control.
00:39:17.460 He's rude.
00:39:18.720 He swears.
00:39:20.420 Who cares?
00:39:21.300 Well, you know what?
00:39:22.280 That was a test drive to see what they could get away with.
00:39:25.540 And I think Tommy Robinson is similar.
00:39:29.200 Let me show you, Alan.
00:39:30.080 I know you know this, but I want to show our viewers again.
00:39:32.140 This is from a parliamentary hearing in the United Kingdom a few years ago where Google
00:39:38.300 YouTube's senior global executive in charge of counterterrorism was meeting in London's
00:39:46.100 parliament in front of a committee.
00:39:47.380 So this guy's job was tracking how real terrorist groups like ISIS or Al Qaeda would use YouTube
00:39:55.020 for propaganda and for recruitment.
00:39:58.260 He had an extremely serious and grave job where he was dealing with life or death matters.
00:40:06.600 And he goes to parliament.
00:40:09.060 And I've got to think he was spending a week in advance preparing for this extremely important
00:40:14.400 meeting.
00:40:14.940 You're in America.
00:40:16.080 You're going to the UK.
00:40:17.480 You're there to answer questions about terrorism and YouTube.
00:40:20.700 Wow.
00:40:21.240 That's a big job.
00:40:22.680 But take a look at this.
00:40:23.940 I'm going to play a couple minutes of it.
00:40:25.760 The questions weren't about terrorism, Alan.
00:40:28.260 They were MPs complaining that they had to see Tommy Robinson videos and that the algorithm,
00:40:36.880 when they typed in Tommy Robinson, kept stirring up more Tommy Robinson.
00:40:40.600 Take a look at this.
00:40:42.060 We are working to make sure that videos that promote hate or promote violence, if they violate
00:40:47.100 our policies, are removed from the platform.
00:40:49.080 If they walk right up to the line, we have also, at the encouragement of this committee, developed
00:40:54.780 a new enforcement mechanism to limit the features that these have.
00:40:58.600 They should not be appearing in our recommendation engine.
00:41:01.580 If they are, I will take this back to our team and see what the problem is.
00:41:05.640 Okay, but they are.
00:41:06.980 I mean, they are appearing.
00:41:08.160 They are in my recommended timeline at the moment.
00:41:11.040 So because I've been searching on my iPad for National Action videos, I, as a result,
00:41:17.480 have the first two videos recommended to me by YouTube.
00:41:22.360 When I just click on, as I've just done with this afternoon, I click onto YouTube, the first
00:41:27.800 two recommendations are Tommy Robinson videos.
00:41:31.500 So the Tommy Robinson that was identified as part of the Finsbury Park online radicalization
00:41:36.880 process, that's what YouTube's recommended.
00:41:39.060 I've not searched for it.
00:41:40.640 YouTube has recommended that to me.
00:41:43.440 Doesn't that cause you some serious alarm?
00:41:45.320 I can't speak to these particular videos personally.
00:41:48.880 It causes me a lot of alarm, but I will take this back to our team and see why this has
00:41:53.400 happened.
00:41:53.720 It's not even just about the individual videos.
00:41:55.340 It's actually a recommended channel.
00:41:56.840 I have got up here, it is coming up as my recommended channels, that one of the recommended channels
00:42:02.780 for me is Tommy Robinson recommended channel.
00:42:07.560 I've also got British Warrior.
00:42:09.260 I've got a series of other, you know, quite sort of extreme things that are coming
00:42:15.320 up, but I have specifically Tommy Robinson recommended channel.
00:42:18.920 I can pass you my iPad.
00:42:21.980 It's important for the company and for our bottom line, for the recommendation engine
00:42:26.480 to work as it is intended, to make sure that people can find quality content that they are
00:42:31.940 looking for.
00:42:32.480 It should not be serving up videos that incite or inspire hate.
00:42:37.280 If it is, there is a problem and I will take it back to the team and see that it's addressed.
00:42:42.680 But lots of people have raised this with you.
00:42:44.720 This is not just us.
00:42:45.840 This is not the first time.
00:42:46.900 I do not believe this is the first time you have heard this.
00:42:50.200 Allegations and concerns that your algorithms are promoting more and more extreme content
00:42:55.880 at people.
00:42:56.500 Whatever they search for, what they get back is a whole load more extreme recommendations
00:43:01.480 coming through the algorithms.
00:43:03.380 You are the king of the search engine and yet your search engines are promoting things that
00:43:10.300 further and further radicalize people.
00:43:12.380 Whatever they search for, they get something more back.
00:43:14.780 And that woman who was leading the charge, Yvette Cooper, is now the Home Secretary, basically
00:43:21.860 the Minister of the Interior for the UK in charge of censorship.
00:43:26.680 By the way, shortly after that hearing, they banned Tommy Robinson from basically all social
00:43:32.620 media too.
00:43:33.160 They did an Alex Jones to him.
00:43:35.280 Elon Musk brought him back to life online and same with Alex Jones.
00:43:38.420 I don't know.
00:43:39.800 I think, Alan, that Rebel News would probably be dead meat by now if things had continued
00:43:45.900 on that trend.
00:43:47.320 And if it weren't for Rumble and Elon Musk and Twitter and this revelation by independent
00:43:54.380 thought leaders like Peter Thiel and others in Silicon Valley that, whoa, we're headed in
00:43:58.620 the wrong direction.
00:43:59.240 I think that we were going in a very bad way and I really think this election, as Elon Musk
00:44:07.360 said, is a do or die moment for America because if Kamala Harris, the most left-wing candidate
00:44:13.060 in American history on a major ticket, if she wins, she'll do what she says she'll do.
00:44:19.980 Tim Walz himself has talked about censoring misinformation.
00:44:23.860 I just got to throw one more clip at you, Alan.
00:44:25.240 Here's Tim Walz, the vice presidential candidate, saying misinformation should be illegal.
00:44:32.080 Take a look at this.
00:44:32.820 I think we need to push back on this.
00:44:34.520 There's no guarantee to free speech on misinformation or hate speech and especially around our democracy.
00:44:40.260 Well, Alan, I mean, I don't mean to be dramatic, but it's not just the fate of America in the
00:44:45.740 balance in a couple of weeks.
00:44:47.200 It's the fate of so many other countries that take their lead from America.
00:44:50.080 And frankly, it's the fate of our company, Rebel News, I'm sure of it.
00:44:56.160 And I think a lot of it.
00:44:57.160 I mean, I just can't.
00:44:58.500 Maybe I could overconsume social media and so I've been radicalized.
00:45:03.020 But I really think that everything is at stake this election.
00:45:06.560 What do you think?
00:45:08.960 Certainly when it comes to free speech, I think things might start getting worse if the Democrats
00:45:13.580 win, because, you know, as I was saying, so much of this was encouraged by the U.S.
00:45:19.180 government and U.S.
00:45:19.960 government agencies.
00:45:20.860 And it only stopped because of congressional investigations.
00:45:24.520 The same people who encouraged big tech companies to come down so hard on disinformation are still
00:45:31.420 in the U.S.
00:45:32.160 government and they're going to stay there.
00:45:33.900 They're going to be promoted, I think, if there isn't a more pro-free-speech
00:45:41.920 administration next year.
00:45:45.140 And, you know, I would like the Democrats to become more pro-free-speech.
00:45:48.920 I would like it not to matter if a Republican or a Democrat is in office.
00:45:52.820 I would like for, you know, both major parties in the U.S.
00:45:57.040 to be similarly supportive of free speech.
00:45:59.000 That doesn't seem like it's happening at the moment.
00:46:00.880 There are a few Democrats who aren't as bad as the others, but mostly they've
00:46:05.980 been supportive of the disinformation panic.
00:46:08.740 There are some, you know, moderate establishment Republicans who supported the disinformation
00:46:12.440 panic as well.
00:46:15.260 But certainly this election will change a lot when it comes to U.S.
00:46:19.120 government policy.
00:46:22.120 As I said, U.S. government policy has, you know, quietly encouraged censorship in foreign countries.
00:46:29.380 You know, the U.S.
00:46:31.160 embassy in Brazil itself put out a very mild statement saying Brazil should
00:46:35.980 respect free speech back when the judiciary was going after X for having too much free
00:46:41.960 speech on the platform.
00:46:43.440 But that same U.S.
00:46:44.400 embassy had up till then been hosting events on how to combat disinformation online and how
00:46:51.740 to deal with disinformation and fake news.
00:46:53.920 So it's been the policy of the U.S.
00:46:56.600 government to support social media companies taking measures against disinformation.
00:47:02.600 And, you know, that could be very much affected by the election.
00:47:08.700 Well, and we're saying this in 2024.
00:47:11.800 Imagine how bad things would go three or four years into it.
00:47:15.140 I mean, really, three or four years of judicial appointments, three or four years of regulatory
00:47:22.440 squeeze.
00:47:23.380 I saw a graphic in The New York Times the other day of all the lawfare against Elon Musk and
00:47:28.420 his various companies.
00:47:29.880 I believe four years from now, if there was a Harris win, Elon Musk would be on trial
00:47:36.640 as much as Donald Trump has been, on trumped-up charges.
00:47:40.200 And if you could take on Trump, you can take on anyone.
00:47:44.640 And I think they would.
00:47:46.260 Well, listen, Alan, it's great to catch up with you.
00:47:47.960 I'm delighted that you are the managing director of the Foundation for Freedom Online, foundationforfreedomonline.com.
00:47:53.300 It's always nice to chat with you about these tech things.
00:47:57.040 And I sign off my show every day with the same slogan.
00:48:01.760 I've been using it nine years here at Rebel News and at Sun News Network beforehand.
00:48:06.900 And I keep saying, keep fighting for freedom.
00:48:09.780 That's the tagline I've used for really almost 15 years.
00:48:13.480 And you have to, because these freedoms were hard fought over centuries, even over millennia.
00:48:21.880 And they're slipping away.
00:48:25.080 And it's easier to keep them than to lose them and fight to get them back.
00:48:30.740 And I really think that we're at that key inflection in history.
00:48:34.320 Alan Bokhari, great to see you, my friend.
00:48:35.920 Stay free.
00:48:37.780 Good to see you guys, right?
00:48:38.880 Right on.
00:48:39.360 There you have it, Alan Bokhari, managing director of the Foundation for Freedom Online.
00:48:44.460 Well, that's our show for today.
00:48:46.280 Until next time, on behalf of all of us here at Rebel World Headquarters, to you at home,
00:48:51.120 good night.
00:48:52.100 And you know it.
00:48:53.520 Keep fighting for freedom.
00:48:54.500 Keep fighting for freedom.