Timcast IRL - Tim Pool - October 20, 2020


Timcast IRL - DOJ Prepares Antitrust Suit Against Google As Veritas Drops Another Expose


Episode Stats

Length

2 hours and 12 minutes

Words per Minute

206.65

Word Count

27,454

Sentence Count

2,158

Misogynist Sentences

40

Hate Speech Sentences

20


Summary

In this episode, we're joined by Breitbart technology reporter Allum Bokhari to talk about his new book, "#DELETED: Big Tech's Battle to Erase the Trump Movement and Steal the Election." We also talk about the Justice Department's new antitrust lawsuit against Google, and how far-left extremists are allowed to run wild on the internet.


Transcript

00:00:07.000 The Department of Justice is getting ready to file an antitrust suit against Google
00:00:35.000 because Google, they say, unfairly uses its power to dominate the market.
00:00:38.000 It's fairly obvious what an antitrust suit is, and I'm not entirely convinced it's actually going to do anything.
00:00:44.000 We'll see how it plays out.
00:00:45.000 Interestingly, it's only Republicans that are actually getting behind this.
00:00:49.000 There are some state attorneys general, all Republican, that are getting behind this.
00:00:53.000 And it's for obvious reasons.
00:00:55.000 The Democrats get a free pass for the most part from a lot of the censorship, and they're allowed to run wild and advocate violence.
00:01:02.000 Or, I should clarify.
00:01:04.000 Antifa and far-left extremists are allowed to advocate violence, organize violent events, with impunity.
00:01:10.000 If a conservative says, learn to code, you're gone.
00:01:13.000 So I have here this book.
00:01:14.000 It says, Big Tech's Battle to Erase the Trump Movement and Steal the Election.
00:01:20.000 This is #DELETED, by Allum Bokhari, who's hanging out with us tonight.
00:01:23.000 Allum, take your book.
00:01:24.000 I just wanted to open with that line.
00:01:24.000 Oh, cool.
00:01:26.000 I was like, that's a good line.
00:01:27.000 Erase the Trump Movement.
00:01:29.000 And what is the last one?
00:01:30.000 And steal the election.
00:01:31.000 Not hyperbole.
00:01:32.000 It sounds like a very partisan, hyperbolic title.
00:01:35.000 It's actually, first of all, it's not my opinion.
00:01:37.000 This is the opinion of whistleblowers inside Facebook and Google.
00:01:40.000 And all these companies who told me this is what they're doing, and it started literally the day after the election.
00:01:46.000 And, you know, we've seen it happening before our very eyes, the deplatforming of the Trump movement over the past four years.
00:01:52.000 And that's just the stuff we see on the surface.
00:01:54.000 This book gets into what's happening behind the scenes as well there, you know, subtle algorithmic tweaks, and we'll get into that.
00:01:59.000 Yeah, we will.
00:02:00.000 First, who are you?
00:02:02.000 I have.
00:02:03.000 Good point.
00:02:03.000 Yes, who am I?
00:02:05.000 Who is this guy?
00:02:06.000 Yeah, I just wandered into the studio.
00:02:07.000 Who are you guys?
00:02:08.000 You walked in like, I got a book, and I'm like, yes, sit down.
00:02:12.000 So I'm Allum Bokhari.
00:02:13.000 I'm the senior technology correspondent at Breitbart News.
00:02:16.000 I've been covering technology for them for four years, five years actually now.
00:02:20.000 And yeah, so this book is basically a summation of the past five years of work, this descent into internet censorship that we've all witnessed and experienced.
00:02:29.000 You've actually exposed a bunch of this stuff, too.
00:02:31.000 You know, we'll get into all this, but you covered, I think, The Good Censor.
00:02:36.000 Was it you who released that footage of them, like, crying after Trump won?
00:02:38.000 That is correct.
00:02:39.000 That went huge.
00:02:40.000 That was, like, probably over a million views on Breitbart.
00:02:44.000 That was such an incredible video to see that.
00:02:47.000 Like, you know that's what they believe, but to see it on video is just another thing.
00:02:51.000 Right on.
00:02:52.000 We'll get into all this.
00:02:53.000 Of course, Ian's hanging out.
00:02:54.000 Hi, everybody.
00:02:55.000 And this will be interesting, too, because Ian, you have direct experience with censoring people and being an evil overlord.
00:03:00.000 100% true, Tim.
00:03:02.000 The line of good and evil
00:03:04.000 runs through every man.
00:03:04.000 I was co-founder of Minds.com and an admin on the site for eight years, doing a lot of behind the scenes.
00:03:12.000 Hands-on with, you know, censoring and not censoring, and realizing that censoring doesn't always mean it's bad.
00:03:18.000 Sometimes you have to censor things. What we're seeing now is political manipulation for power, versus censoring things when you have to.
00:03:30.000 Because it's illegal, for instance.
00:03:31.000 With Minds, if it was illegal, we'd take it off the site.
00:03:33.000 That's a form of censorship.
00:03:34.000 But if Google's writing their own terms of service that are like, they can censor and delete whatever they want, that's a big deal.
00:03:41.000 That's different.
00:03:42.000 Here's the question.
00:03:44.000 Just trying to do some kind of intro.
00:03:46.000 Of course, Lydia's hanging out.
00:03:47.000 Yeah, I'm here.
00:03:48.000 And, uh, we're gonna go nuts, because we got so much to talk about, especially, like, every chapter of your book is gonna be crazy.
00:03:54.000 And, uh, this is all coming on the heels of more exposés from Project Veritas.
00:03:57.000 We have, uh, one of their stories now, this guy's basically saying, like, oh, we definitely can censor, you know, right-wing individuals, and... It just pains me to say, it feels so obvious.
00:04:07.000 You know, it's great to get the confirmation, don't get me wrong, and I, I, you know, it's, it's awesome that we have Veritas doing this work.
00:04:13.000 But man, to see it again and again and again and to know.
00:04:17.000 We need something.
00:04:19.000 And I feel like, you know, something needs to be done.
00:04:20.000 And I feel like these DOJ people going after Google for antitrust is just not gonna do anything.
00:04:27.000 So we'll talk about all this.
00:04:28.000 Smash that like button.
00:04:30.000 Hit the notification bell.
00:04:31.000 We do the show Monday through Friday live at 8 p.m.
00:04:33.000 And, I don't know, let's just get back into the conversation.
00:04:35.000 So I don't even know where to begin because let's do this.
00:04:38.000 Let's talk about this DOJ antitrust suit.
00:04:42.000 So I've got the story right here from Fox.
00:04:44.000 Lawmakers hail DOJ antitrust lawsuit against Google as long overdue.
00:04:49.000 Senator Hawley called it the most important antitrust case in a generation.
00:04:53.000 Today's lawsuit is the most important in a generation, Senator Josh Hawley said.
00:04:57.000 Google and its fellow big tech monopolists exercise unprecedented power over the lives of ordinary Americans, controlling everything from the news we read to the security of our most personal information.
00:05:08.000 And Google in particular has gathered and maintained that power through illegal means.
00:05:13.000 The DOJ suit alleges that Google has used its dominance in online search and advertising to stifle competition and boost profits.
00:05:19.000 The suit could be an opening shot in a battle against a number of big tech companies in coming months.
00:05:24.000 A Google spokesperson told Fox News, Today's lawsuit by the Department of Justice is deeply flawed.
00:05:29.000 People use Google because they choose to, not because they're forced to,
00:05:32.000 or because they can't find alternatives.
00:05:34.000 We will have a fuller statement this morning.
00:05:36.000 Later, the tech giant released a lengthy blog post in which it said the DOJ complaint
00:05:41.000 relies on dubious antitrust arguments.
00:05:44.000 This lawsuit would do nothing to help consumers.
00:05:46.000 To the contrary, it would artificially prop up lower quality search alternatives, raise phone prices, and make it harder for people to get the search services they want to use.
00:05:54.000 The lawsuit says, for years Google has entered into exclusionary agreements,
00:05:58.000 including tying arrangements and engaged in anti-competitive conduct to lock up distribution
00:06:03.000 channels and block rivals. American consumers are forced to accept Google's policies,
00:06:07.000 privacy practices, and use of personal data, and new companies with innovative business models
00:06:12.000 cannot emerge from Google's long shadow. For the sake of American consumers, advertisers,
00:06:16.000 and all companies now reliant on the internet economy, the time has come to stop Google's
00:06:20.000 anti-competitive conduct and restore competition, it says.
00:06:24.000 I think that was a whole lot of hot air and it is the wrong target and it's going to miss because I
00:06:28.000 think Google makes a really good point.
00:06:30.000 I don't use Bing.
00:06:31.000 I don't want to use Bing.
00:06:32.000 I do use DuckDuckGo sometimes, you know.
00:06:34.000 So there are choices when it comes to search.
00:06:37.000 The issue with Google is they're more than just a search engine and the DOJ lawsuit is very much focused on search and how they manipulate search.
00:06:46.000 No, the issue is that YouTube, for instance, which we're on, is heavily subsidized by other areas of Google.
00:06:52.000 Because of that, no other video platform can compete.
00:06:55.000 And some argue, no video platform could survive with how expensive it is to do streaming, to do this.
00:07:01.000 I mean, I'll tell you guys, we're doing this show right now, and it costs me nothing.
00:07:04.000 To press live on this stream, so that all of you can watch, nothing.
00:07:09.000 Google just says, if we make money off it, we'll take a cut.
00:07:12.000 So that can't exist on a lot of other platforms, because it's really expensive.
00:07:14.000 Like podcasting, for instance.
00:07:16.000 If I got anywhere near the amount of views on podcasting, it would be an insane amount of money to handle all that bandwidth.
00:07:22.000 So anyway, let's dive into it.
00:07:24.000 What do you think, Allum?
00:07:25.000 You were telling me before you don't think this antitrust stuff is going to do anything.
00:07:28.000 Well, it's important and it's good that the Republicans are punishing Google and being seen to punish Google because I think the idea that a lot of tech companies have had for a long time is we don't have to listen to the warnings of Republicans about censorship because they're never going to regulate us.
00:07:41.000 They're never going to do anything.
00:07:43.000 So there's one positive thing to it.
00:07:45.000 Also positive, the fact that we might actually get to see behind the curtain, see how Google is training its search algorithm.
00:07:51.000 That's something we don't know about, really.
00:07:52.000 We've got some idea of it based on whistleblowers I've talked to, based on James O'Keefe's whistleblowers as well.
00:07:58.000 But if, through the case, we can get a look at that, that would be excellent.
00:08:02.000 But I don't think antitrust focuses on the right thing.
00:08:07.000 So it's true that Google is a monopoly that controls 90% of search worldwide.
00:08:12.000 It's true that they're, you know, pretty bad to competitors.
00:08:15.000 Not just search competitors, but as you said, you know, competing video platforms, competing... Email.
00:08:24.000 Right, reviewing as well.
00:08:25.000 Yeah.
00:08:26.000 So it'll be very good for companies like Yelp, but it doesn't really focus on, I think, the real problem of Google, which is its vast power over political information, its ability to swing elections, and its ability to destroy people's livelihoods.
00:08:39.000 Like, if you're a website, if you run a website and it goes from page one of Google to page 1,000, it's going to have a huge impact on your business.
00:08:45.000 You're done.
00:08:47.000 I knew a guy who did online sales, and Google one day changed the algorithm, and then his company didn't exist anymore.
00:08:53.000 Yeah.
00:08:53.000 Well, look at what they've done to news publishers.
00:08:55.000 So, Breitbart News published research a couple of months ago showing Google has cut visibility on Breitbart links by 99% compared to 2016.
00:09:04.000 They've done it to a lot of other websites as well.
00:09:06.000 Almost every conservative news website with the exception of one or two like Fox.
00:09:10.000 Just colossal fall in visibility since 2016.
00:09:14.000 And that's completely artificial because traffic numbers haven't gone down.
00:09:18.000 It's just Google visibility that's gone down.
00:09:20.000 My two main channels are blacklisted on Google.
00:09:22.000 You can't search for them.
00:09:23.000 They're gone.
00:09:24.000 So maybe I'll reap some benefits from an antitrust thing where they, you know, separate all these companies or whatever.
00:09:30.000 I don't know.
00:09:31.000 But I kind of feel like... I think, you know, Andrew Yang made this point.
00:09:34.000 He was like, nobody wants to use Bing.
00:09:36.000 Antitrust isn't the real issue.
00:09:38.000 I guess kind of like what you're saying.
00:09:40.000 But then when you have Google, Alphabet I suppose now, controlling all of these different industries in one big massive corporation, maybe the issue isn't search.
00:09:50.000 Maybe the issue is search needs to be separate from video, needs to be separate from email, needs to be separate from, you know, Google Drive and all that stuff.
00:09:57.000 If you really want competition among the search engines to give, you know, smaller players like DuckDuckGo an advantage, one of the things one of my Google sources told me is that what you need to do is separate the data from the rest of Google.
00:10:15.000 Once you break the search engine away from the data, you make the data non-exclusive, and suddenly other search engines can compete.
00:10:21.000 I have a friend who's doing this big push for owning your own data.
00:10:24.000 It's like a big movement.
00:10:25.000 And if that were the case, then that could function.
00:10:29.000 It could function that way.
00:10:30.000 So if every bit of information that one of these companies might get from you, you own
00:10:35.000 and can distribute freely as you choose, that's a really interesting idea because then all
00:10:39.000 these other companies can build better services based off of all of this data anonymized.
00:10:44.000 It's the basis of their market power.
00:10:46.000 It's also the basis of their ability to politically manipulate people, which we can get into.
00:10:50.000 Actually, you know, I guess in that regard, I would say their search is a monopoly.
00:10:55.000 It's an interesting phenomenon we're seeing with social media, Twitter, Facebook, and Google, where the only reason they're a monopoly, they dominate these spaces, is because no one uses anything else.
00:11:05.000 And because no one uses anything else, there can be no competition.
00:11:09.000 So it is challenging.
00:11:11.000 I can't blame Twitter because everybody's on Twitter, but because everybody's on Twitter, no one does anything else and it's creating monopolistic problems.
00:11:18.000 There's an argument that these are natural monopolies because you want to go to a social network that has the most amount of people on it.
00:11:25.000 That's where you'll get maximum viewership for your content.
00:11:29.000 And with search, the advantage gets locked in once you have that amount of data, because once you have it, your competitors by definition don't have it.
00:11:38.000 And Google's really mastered this.
00:11:39.000 This is why they've gone into smartphones, they've gone into browsers, they've gone into laptops, because that's just more ways to gather data.
00:11:47.000 Man, this makes me think so.
00:11:49.000 I think giving the data to the user is paramount.
00:11:51.000 I agree with you guys.
00:11:52.000 But I also think you have to free their software code because if it's private code and they give you data, it could be false data and you wouldn't know.
00:12:00.000 So the only way to verify that the data is real is if you understand what the code is measuring.
00:12:04.000 I don't necessarily agree.
00:12:05.000 I mean, that would be them defrauding you and defrauding the government.
00:12:09.000 And I just think if they commit a crime, you punish them for the crime.
00:12:09.000 Yeah, it would be.
00:12:12.000 But you wouldn't know they were committing a crime.
00:12:14.000 Well, the government would subpoena them to verify the records.
00:12:16.000 By showing their code?
00:12:18.000 Yeah, but they don't have to release the code to the public.
00:12:21.000 Well, I mean, the government is public.
00:12:22.000 The idea is they can look at the code without revealing trade secrets, and so they can see if the data is verified or not.
00:12:22.000 Right, right, right.
00:12:30.000 I don't think just a company giving up the property they develop makes sense.
00:12:33.000 I don't know.
00:12:34.000 What do you think?
00:12:35.000 Well, there needs to be a lot more transparency.
00:12:38.000 What I would like to see, especially with regards to political manipulation, is how do they train their so-called hate speech algorithms and their disinformation algorithms.
00:12:46.000 So when you're training an algorithm to detect certain types of speech, the way you do it is you train it to recognize language.
00:12:53.000 And you'd have to train it on a set of data.
00:12:55.000 This is hate speech.
00:12:56.000 This isn't hate speech.
00:12:57.000 That's what you're giving the algorithm.
00:12:58.000 So I would love to see their list of examples that they've used to train these algorithms.
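A minimal sketch of the supervised setup being described here, assuming a generic off-the-shelf text classifier; the library choice and the toy examples are illustrative, not anything known about Google's actual systems:

```python
# Hypothetical sketch: a text classifier is only as neutral as its labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training set: these (text, label) pairs are the
# "this is hate speech / this isn't" examples a reviewer would want to see.
train_texts = ["example post A", "example post B", "example post C"]
train_labels = [1, 0, 0]  # 1 = flagged, 0 = allowed, per the labelers' judgment

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Whatever bias exists in train_labels is now baked into every prediction.
print(model.predict(["some new post"]))
```

The point of the sketch: the code itself is generic, so whoever writes the labels decides what gets flagged, which is why the list of training examples is the thing worth auditing.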
00:13:03.000 That would be the way.
00:13:04.000 That would be like the smoking... Well, we've had so many smoking guns already.
00:13:08.000 It's kind of ridiculous to say it, but that would be yet another smoking gun that would conclusively prove political bias.
00:13:14.000 Although, again, I say that the blackpilled reality is that we've had the smoking guns and nothing has been done about it.
00:13:19.000 There's like a pile of guns in the other room that are aflame.
00:13:22.000 It's like they're sending smoke signals with the burning guns.
00:13:25.000 Like, wow, look at all that smoke coming out.
00:13:26.000 We know it.
00:13:27.000 And then you've got... There's a reason why it's Republicans going after these companies and not Democrats.
00:13:32.000 I mean, the Democrats are stupid for it, don't get me wrong.
00:13:35.000 But they're sitting there and they're just like, well, we're not getting banned, so let's roll with it.
00:13:38.000 They're banning our opponents.
00:13:40.000 But what it's really doing is creating this warped reality where conservatives look normal and the left looks psychotic.
00:13:45.000 Like, you get Antifa on Twitter saying, we're gonna go burn it down.
00:13:49.000 And a regular person's like, that's crazy.
00:13:51.000 But the conservatives who say anything kind of like that are nuked instantly.
00:13:54.000 You see what I mean?
00:13:55.000 Well, I'll make two points about that.
00:13:56.000 First of all, the Republican establishment, especially the ones in the Senate, are complete wusses.
00:14:01.000 Many of them bought and paid for by Google.
00:14:03.000 So we saw the New York Post get censored last week.
00:14:06.000 And the Senate Republicans came out, oh, we're going to subpoena Jack Dorsey.
00:14:10.000 I think it was a day ago.
00:14:11.000 And they voted it down or whatever.
00:14:13.000 The Republicans wavered.
00:14:13.000 Yeah, yeah.
00:14:15.000 We're not going to subpoena Jack Dorsey.
00:14:17.000 So I'm being maybe too harsh.
00:14:18.000 There are some senators who are very good on the issue, Hawley, Cruz, Blackburn, but like the vast majority of Republican senators, they can't be trusted on this issue either.
00:14:28.000 But I will say the second point, the left supporting censorship, this is extremely naive.
00:14:32.000 If you think these technologies that have been developed by Silicon Valley won't one day be turned against the left, the anti-war left especially, Yep.
00:14:40.000 You're being very naive.
00:14:40.000 They're already getting nuked.
00:14:41.000 Already, yeah.
00:14:42.000 Yeah, so I remember when Veritas did that expose on Pinterest.
00:14:45.000 Something they didn't catch when they were talking about Live Action being censored, the pro-life organization.
00:14:50.000 I looked at it and I was like, whoa, The Anti-Media is on there.
00:14:53.000 And they're like, you know, anti-police brutality, anti-war progressive left, anti-establishment.
00:14:58.000 They're getting censored.
00:14:59.000 Why would an anti-war leftist organization be censored?
00:15:02.000 Isn't that weird?
00:15:03.000 I think it's because it's not so much about the conservatives when they're doing censorship.
00:15:08.000 It's about the people who oppose the establishment.
00:15:11.000 And when, like you say in your book, they're trying to, you know, what are they doing?
00:15:14.000 They're stopping the Trump movement.
00:15:16.000 Yeah.
00:15:17.000 The Trump movement is anti-establishment.
00:15:18.000 Trump booted out the crony RINOs and took over the Republican Party.
00:15:23.000 So what's really happening is that censorship is against those who dare challenge the establishment.
00:15:26.000 And that includes anti-war left.
00:15:28.000 And that includes, for the most part, the larger group, the conservatives, those who support Trump.
00:15:32.000 So, most of the book is based on stuff from Silicon Valley sources.
00:15:36.000 But, by the way, if you want to get it, it's called #DELETED, and you can find it at deletedbook.com.
00:15:40.000 But there's one source I talked to from the government, and he's been following the censorship issue for a long time.
00:15:45.000 And the way he put it, I think, was very chilling.
00:15:47.000 He said, Up until 2015-2016, Western establishment elites saw internet-free speech as a good thing because it helped them regime change and destabilize countries abroad.
00:15:58.000 That was, you know, Twitter, the Arab Spring, Eastern Europe as well, color revolutions.
00:16:03.000 But, he said, as soon as you had Brexit, as soon as you had Donald Trump, establishment elites realized, oh no, this technology can be used to regime change us.
00:16:11.000 So that's the mentality now.
00:16:13.000 You know what I love?
00:16:14.000 During the Arab Spring, I had a bunch of friends who were active activists who were trying to provide communications.
00:16:20.000 And then one day, my friend was like, I just noticed something.
00:16:22.000 You notice all these Libyan revolutionary people speak perfect colloquial English?
00:16:27.000 And I started laughing.
00:16:28.000 I was like, that's an interesting thing, right?
00:16:29.000 And they're like, that's really weird.
00:16:31.000 And then they try and justify it saying, well, it makes sense because Twitter is a very Western thing.
00:16:35.000 So only people educated in the West or who are Western would be using it.
00:16:38.000 And some other people were like, I don't know,
00:16:40.000 maybe, maybe they're just trying to convince people who use the platform in America
00:16:45.000 that Libyans want us to intervene.
00:16:48.000 And then we did, and Hillary Clinton said we came, we saw, he died.
00:16:51.000 I'm not insinuating anything, I'm just saying what people were talking about at the time,
00:16:53.000 I don't know.
00:16:54.000 But it is, it is, do you know about Barrett Brown's investigation, Project PM?
00:17:00.000 Uh, no, enlighten me.
00:17:01.000 This was a long time ago, he ended up going, I think, I could be getting the details wrong,
00:17:05.000 but I think he went to prison, partly because of it, because they were doing, uh, they were
00:17:10.000 reviewing Stratfor data.
00:17:12.000 So this is Strategic Forecasting, it's, I guess it's like- So I do remember this, the big Stratfor thing, yeah, yeah, I do remember this.
00:17:17.000 And so, in it, we learned that, I think the US Air Force was purchasing sock puppet accounts.
00:17:22.000 These are, a sock puppet account is a dummy, it's a bot, that's what they say, bots.
00:17:27.000 It's an account with a fake picture and a fake name, and then one person controls 50 of them, they're called sock puppets, to create the perception of popular opinion.
00:17:35.000 This is one of the biggest problems that's affecting big corporations today.
00:17:39.000 Like, Pepsi will put out a commercial and it's like a guy drinking, you know, and throwing a football, and they'll get inundated with like 50 to 100 messages on Twitter saying, you're racist, and then all of a sudden they're like, we're being attacked, what do we do?
00:17:49.000 Quick, issue an apology!
00:17:50.000 And it's like, dude, it's one guy with 50 accounts.
00:17:53.000 So we learned this back in 2011, I think it was 2012, that U.S.
00:17:57.000 military was buying sock puppet accounts.
00:18:02.000 For what purpose?
00:18:03.000 Middle Eastern intelligence operations and persuasion, psychological operations, and things like that.
00:18:08.000 So, this has been an ongoing thing, and at some point, you had the political parties realize they can do it, and I gotta be honest, I think the Democrats figured it out way before the Republicans.
00:18:18.000 Oh, no doubt.
00:18:18.000 Yeah, definitely.
00:18:19.000 They were using Facebook and all that stuff.
00:18:21.000 First, I mean, ActBlue, which you're familiar with.
00:18:23.000 ActBlue is the progressive, you know, fundraising—it's a progressive GoFundMe, essentially.
00:18:28.000 And then Republicans only launched WinRed recently, and that's the, you know, the Republican version of it, so the Democrats have, you know, been on the forefront of this.
00:18:37.000 But something weird happened in the Trump era, like in 2015 with the Trump army, the Trump train, the memesmiths.
00:18:45.000 All of a sudden you had an organic explosion of people who are actively producing content and propping up Trump.
00:18:50.000 Something that they couldn't pay for, you know?
00:18:52.000 Yeah, and you hit the nail on the head.
00:18:55.000 This is stuff the Obama campaign was doing in 2012.
00:18:58.000 Cambridge Analytica was just a smaller scale version of what Obama was doing in 2012 with Facebook's help.
00:19:04.000 And it was a pretty senior Obama campaign person who revealed that, who was working directly on it.
00:19:10.000 She said that Facebook actually helped them and gave them assistance in 2012.
00:19:17.000 Her name was Carol something.
00:19:19.000 I'm blanking on her last name.
00:19:20.000 Carol Peterson, perhaps?
00:19:22.000 But also with the bots.
00:19:24.000 The bots you mentioned from Stratfor.
00:19:26.000 Or sockpuppets.
00:19:26.000 The reason there was this panic after Trump's win was because all of these people, the foreign policy establishment, had been doing this for years.
00:19:34.000 They started accusing everyone else of, you know, using bots and sockpuppets and disinformation.
00:19:38.000 Of course they were terrified that that would happen to them because they'd be doing it for so long.
00:19:42.000 So they couldn't imagine that this movement might actually be organic, might actually be real people.
00:19:47.000 They thought it was just the same thing they'd been doing for years and years.
00:19:52.000 It probably was like a kid in Russia in an apartment, and they traced some IP, and they got like one.
00:20:01.000 And they were like, oh, the Russians, the Russians, it's the government, the Russian government.
00:20:04.000 And you're like, no, actually, it was just some dude who maybe had a Russian IP.
00:20:08.000 Well, the story now, I guess, is that it's such a weird and complicated conspiracy of nonsense.
00:20:16.000 Was it Ratcliffe who issued the statement basically saying that Russian intelligence believed Hillary Clinton was going to create a fake campaign accusing Trump of working with them to get the press cycle off of her email story, and that Obama was briefed on this so they knew?
00:20:37.000 At least the insinuation is Hillary Clinton created the fake idea just out of the blue.
00:20:40.000 It seemed like it came right after her email scandal dropped.
00:20:43.000 And she just started saying it.
00:20:45.000 She's like, Trump's working with Russia!
00:20:46.000 And it's like, uh-huh.
00:20:48.000 This is the stupidest thing.
00:20:50.000 And you know who it convinced?
00:20:52.000 It convinced, well, maybe they were in on it, I don't know, but all these well-funded foreign policy NGOs, European NGOs, American NGOs. Just a few days ago, we published a story at Breitbart News about the German Marshall Fund.
00:21:06.000 This is one of these extremely well-funded foreign policy NGOs, very focused on Russia, very focused on the Balkans.
00:21:12.000 They're the ones who have been pushing the disinformation narrative, the Russian disinformation narrative, for four years now, and pressuring the tech companies to censor what they call disinformation.
00:21:23.000 Another one of those groups, the Atlantic Council, are working directly with Facebook.
00:21:28.000 And this other one, the German Marshall Fund, they've actually said that Breitbart News, Fox News, The Blaze, basically every conservative news website needs to be suppressed by Facebook.
00:21:37.000 They need to apply friction, because in the words of this think tank, which by the way is funded by the US government, and foreign governments, and the German government, sounds like foreign interference to me, I don't know how that word's defined these days, but they said Facebook has to apply friction, which is their word for suppression of all these so-called deceptive news sites.
00:21:56.000 There's a funny meme that, someone posted this on 4chan a long time ago, that any sufficiently free internet space becomes right-wing, and the left can only maintain the space if they have hard moderation and censorship.
00:22:10.000 That seems to be the case.
00:22:11.000 If you look at Facebook, in spite of the censorship, the top posts are still Daily Wire, Dan Bongino, Fox News, etc.
00:22:19.000 I think that's right, and I think it's like any period of history where you have one faction, social faction, political faction, whatever, trying to maintain an ideology based on complete myth.
00:22:31.000 Which, as many of you know, the assumptions of the cultural left are, you know, gender is a social construct, all of these things.
00:22:39.000 As soon as you have free speech, that'll be threatened, and the most popular people will be the people who challenge that.
00:22:44.000 The only way they maintain this is by suppressing those popular challenges. That's what cancel culture is.
00:22:49.000 You know, you see what happened in San Francisco with that guy?
00:22:51.000 At the free speech rally, got his teeth knocked out?
00:22:53.000 The guy got punched?
00:22:53.000 Yeah, yeah, yeah.
00:22:54.000 Think about how crazy it is that this guy decided to step up and say the billionaires are bad, so Antifa showed up and punched him in the face.
00:23:01.000 Yeah.
00:23:01.000 Are they working for the billionaires?
00:23:03.000 Yeah, they called him the n-word as well.
00:23:04.000 Yeah, over and over again.
00:23:06.000 It's a weird alliance between the cultural left, which used to be the anti-establishment left, and the corporations.
00:23:11.000 The billionaires!
00:23:12.000 And the neoconservative foreign policy think tanks.
00:23:15.000 They're all aligned on this.
00:23:17.000 No, what is this?
00:23:19.000 You've got the neocon never-Trumpers who ran full speed of the Democrats after Trump booted them out.
00:23:24.000 You've got Antifa beating people up on behalf of them.
00:23:29.000 A guy comes out and he's like, yo, it's really bad that international, you know, multinational billion-dollar corporations are restricting the rights of the people.
00:23:37.000 Punched in the face.
00:23:39.000 Joe Biden is funded by Wall Street.
00:23:41.000 Trump is winning on small donors.
00:23:43.000 Biden is winning on Wall Street donors.
00:23:44.000 And Bernie Sanders is supporting Joe Biden.
00:23:47.000 Talk about backwards and insane.
00:23:50.000 The billionaires in this country are terrible, destroying everything.
00:23:52.000 Now go vote for the guy funded by the billionaires.
00:23:54.000 To be fair, Trump is a billionaire himself, so I guess it's still a weird circumstance.
00:23:59.000 It's very strange.
00:24:00.000 I think the most frustrating thing about the social justice warrior left, the Antifa left, the left that'll punch you in the face, is that they pretend to be anti-establishment.
00:24:09.000 They've been doing that for years, but actually they've found themselves in alliance with elites and with corporations because they're the only powers in society that actually suppress the truth.
00:24:20.000 And they need the truth to be suppressed because their entire ideology is based on nonsense.
00:24:23.000 It's so stupid.
00:24:24.000 None of it makes sense.
00:24:25.000 Antifa going out and hitting somebody because he's complaining about billionaires?
00:24:30.000 That's just weird to me.
00:24:31.000 Yeah, I think they're being manipulated by the powers because they're not smart.
00:24:36.000 I don't want to say they're stupid because it's a broad generalization, but anyone that's out on the street just doing random street violence is pretty dumb.
00:24:42.000 I think it's a dumb thing to do.
00:24:43.000 It's not a smart way to get your point across.
00:24:46.000 These are the same people who trashed Seattle in 1999 in the trade protests.
00:24:50.000 They were all about, you know, ending the bad trade deals.
00:24:53.000 And along comes a president who said, I'm going to end the bad trade deals.
00:24:56.000 And he does end the bad trade deals.
00:24:57.000 And now they're just beating up his supporters in the street.
00:24:59.000 Dude, look at Bernie Sanders in 2015.
00:25:02.000 Seriously, go back.
00:25:03.000 A lot of people might not remember this.
00:25:04.000 You go back and you Google search Bernie and Trump and you'll find a bunch of articles saying Bernie and Trump are very similar.
00:25:09.000 And there was a point where Bernie had to be like, I'm not like him.
00:25:11.000 But a lot of their core policies were the same.
00:25:14.000 Illegal immigration was bad.
00:25:16.000 Bernie was fairly moderate on gun issues when he said it was a rural versus urban debate.
00:25:21.000 Because he's in Vermont and he's like, we don't have the same problems.
00:25:23.000 Because people in Vermont like their guns.
00:25:25.000 You had him saying the TPP, the Trans-Pacific Partnership, was a problem, and we've got to...
00:25:31.000 What happened to that guy? All of a sudden now he's like, the billionaires, but they're all
00:25:34.000 funding Joe Biden, so go vote for him. It's like, he immediately just said, please let me in the
00:25:39.000 establishment. Trump on the other hand kicked the door in, you know? You know, an alliance between
00:25:43.000 the populist left and the populist right is probably the thing that terrifies the establishment the
00:25:47.000 most. I kind of think that's... If you look back through history, there's always, you know,
00:25:52.000 a new political consensus forming to replace the old ones.
00:25:56.000 The Reagan consensus, the neoliberal consensus replaced the FDR consensus.
00:25:59.000 There's a new consensus waiting to happen, the populist left and the populist right are forming the nucleus of it, but it's being held back by this establishment class that simply won't let go.
00:26:09.000 I think that was starting to happen during Occupy Wall Street.
00:26:12.000 So, the early days of Occupy Wall Street, when I was down there, I remember seeing a really, like, a 60-year-old man and woman, and they were sitting in chairs with an American flag.
00:26:20.000 Nobody complained, and they had it set up, and people were talking about it, because the core of what brought people out was, the big banks on Wall Street are corrupting our system, it's revolving door politics, the whole thing is broken, and you had conservatives, libertarians, and leftists who had shown up.
00:26:35.000 Then, along came intersectionality.
00:26:39.000 Identity politics.
00:26:40.000 The older people who could not live in the park ended up leaving, the younger people stayed, and then these wealthy college-educated Brooklynite kids with trust funds, not all of them but a handful of them, I know them personally, they are trust fund kids, started preaching intersectionality and identity politics.
00:26:55.000 And then all of a sudden, you end up with someone, you know, Serena Williams, who is a black woman worth tens of millions of dollars, and she's oppressed, and the homeless guy sleeping under the bridge is an oppressor because he's white.
00:27:06.000 What that did was it fractured any possibility of the populist left and right coming together.
00:27:11.000 Because now you got AOC who is full on board with overtly racist ideology.
00:27:16.000 That is like... It's freaky how racist these people are.
00:27:19.000 They're calling it anti-racism and then literally saying, but we should have racial discrimination.
00:27:23.000 Yeah, okay, that's racism.
00:27:25.000 Well, no, but we're anti because it's for good things.
00:27:27.000 I don't care what you think it's for.
00:27:28.000 It's funny because the ideology they espouse, they're like racism is prejudice plus power.
00:27:34.000 Therefore, only white people can be racist.
00:27:37.000 Anti-racism is quite literally the same thing when it's for minorities and you discriminate against specific minorities and other racial groups for the benefit of particular racial groups.
00:27:46.000 Yeah.
00:27:46.000 It's like the same thing.
00:27:47.000 Well, when you say prejudice plus power, who is more powerful today than Google?
00:27:51.000 Yeah.
00:27:52.000 The most racist people in the world today are in Silicon Valley.
00:27:55.000 They're programming the algorithms that run our world.
00:27:58.000 If you get the book, read the part on machine learning fairness.
00:28:01.000 Machine learning fairness.
00:28:03.000 Oh, that's The Good Censor thing, right?
00:28:05.000 That's actually not The Good Censor.
00:28:06.000 There's something different.
00:28:07.000 Machine learning fairness is an entirely new field set up in universities, and it's being funded by Silicon Valley, and they have their own machine learning fairness departments now.
00:28:16.000 The point is to bring computer science and critical race theory together, and left-wing sociology and left-wing feminism.
00:28:25.000 They want the assumptions of those left-wing fields to be embedded in computer science.
00:28:29.000 That's impossible.
00:28:30.000 It's functionally impossible.
00:28:32.000 Well, this is what I say in the book, right?
00:28:34.000 So, an algorithm, an unbiased algorithm, is just a machine for noticing patterns.
00:28:38.000 It's a machine for analyzing data.
00:28:40.000 And pattern recognition, noticing data, analyzing data, is actually inherently right-wing, because it busts politically correct narratives.
00:28:47.000 Imagine if you train a machine to look at crime data, or you train a machine to look at, you know, the data of women going into certain fields or certain educational fields, or, you know, train a machine to predict who's more likely to commit a terrorist attack around the world.
00:29:03.000 Like, that's going to spit out some very politically incorrect conclusions.
00:29:06.000 So this is why they created machine learning fairness, to bias the algorithms.
00:29:10.000 Obviously, they call it fairness.
00:29:11.000 It's the opposite of that.
00:29:12.000 So I guess, theoretically, they could put weight in them.
00:29:15.000 If this word emerges, then apply it.
00:29:17.000 But I don't see it working.
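For what it's worth, "putting weight in them" does correspond to a real technique. Here is a hedged sketch of one textbook fairness intervention, reweighing training examples in the style of Kamiran and Calders; the data is made up, and this is not a claim about what any of these companies actually run:

```python
# Toy sketch of "reweighing": adjust per-example training weights per
# (group, label) cell so the fitted model is deliberately shifted.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.2], [0.4], [0.5], [0.6], [0.8]])  # toy feature
y = np.array([0, 0, 0, 1, 1])                      # toy labels
group = np.array([0, 0, 1, 1, 1])                  # a protected attribute

weights = np.ones(len(y))
for g in (0, 1):
    for label in (0, 1):
        mask = (group == g) & (y == label)
        if mask.any():
            p_joint = mask.mean()                                # observed P(g, label)
            p_indep = (group == g).mean() * (y == label).mean()  # P(g) * P(label)
            weights[mask] = p_indep / p_joint                    # >1 where underrepresented

model = LogisticRegression().fit(X, y, sample_weight=weights)
print(model.predict_proba(X))
```

Whether you call the result "fairness" or "bias" is exactly the dispute in the conversation: the math just moves the decision boundary toward whatever the weight-setter wants.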
00:29:19.000 You know, I did a segment on my second channel talking about the problem with Wikipedia and critical theory in general.
00:29:28.000 The example I used is transgender.
00:29:31.000 And so what the leftists started doing was pulling the clip out of context to make it seem like I was anti-trans.
00:29:35.000 Here's what I actually said.
00:29:37.000 If you go on Wikipedia, which is a very formulaic and logic-based system, granted it's subject to manipulation and bias, but the way it works is very simple.
00:29:45.000 Present a source.
00:29:47.000 What does the source say?
00:29:48.000 Include it.
00:29:49.000 Cite it.
00:29:50.000 So, you go to Wikipedia, and you have this web of all of these different articles that interlink with each other.
00:29:57.000 Go to Woman on Wikipedia, and what does it say?
00:30:00.000 It says, a woman is an adult human female.
00:30:02.000 And then you click female, and it says, barring, you know, certain abnormalities, irregularities, females are the members of the, you know, species that produce ova, etc, etc.
00:30:12.000 And then if you go to male, it's the members of the species that produce sperm, and then it says, man, an adult human male.
00:30:19.000 Then go to trans woman on Wikipedia, and it gives you this very different definition.
00:30:25.000 So, it specifically says that... The point I'm trying to make is, how do you have the idea that a trans woman is a woman, but these two articles can't connect to each other?
00:30:36.000 Right?
00:30:37.000 So if a woman is an adult human female, then a trans woman can't, you know, according to Wikipedia, be a woman because a woman produces ova, so it doesn't make sense.
00:30:45.000 The logic of the system is impeded by the fracturing of what the ideology represents.
00:30:51.000 Which is why so often you hear anti-SJW types say, define woman.
00:30:55.000 Because we can, very easily, using a system like Wikipedia, but in critical theory you can't.
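A toy illustration of the transitive-definition argument being made here, with the article contents reduced to hypothetical attribute sets; this illustrates the speaker's logic about cross-linked definitions, not a position on the underlying question:

```python
# Wikipedia-style definitions chain together, so a mismatch surfaces
# mechanically: expand a term into its primitive attributes and compare.
definitions = {
    "woman": {"adult", "human", "female"},
    "female": {"produces_ova"},
}

def expand(term):
    """Recursively expand a term into the attributes its definition links to."""
    attrs = set()
    for attr in definitions.get(term, set()):
        attrs |= {attr} | expand(attr)
    return attrs

# Hypothetical attribute set from a separately-defined article:
trans_woman_attrs = {"adult", "human"}

# The attributes the linked definitions require but the other article lacks:
print(expand("woman") - trans_woman_attrs)  # {'female', 'produces_ova'}
```

The sketch's point is the one in the conversation: a cross-linked, source-cited system forces definitions to compose, and the composition is exactly where the contradiction shows up.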
00:31:01.000 So the point I'm getting to is if you try... Well, this is why they have to invent the rhetorical trick of, you know, gender and sex are two separate things.
00:31:06.000 So you can still have the biology, but you also have this concept of gender.
00:31:10.000 But that's actually fine.
00:31:12.000 The problem is that when you have Wikipedia telling you... So, okay, what they need to do now is they need to go in and they change the definition of what woman is.
00:31:12.000 Yeah.
00:31:19.000 So that's why you get anti-SJW people saying define woman, because the official definition on Google, Merriam-Webster, and in Wikipedia is adult human female.
00:31:27.000 What they should do is make trans woman one word and have a completely different definition.
00:31:31.000 That's offensive.
00:31:32.000 I mean, it's logical.
00:31:33.000 That's the problem with critical theory.
00:31:36.000 So, and again, I'm not saying any of this to say trans women are or are not.
00:31:39.000 I'm pointing out the clash in the existing system on Wikipedia.
00:31:43.000 Wikipedia is not going to create some kind of nebulous understanding of critical race theory.
00:31:50.000 Or, I'm sorry, critical theory in general.
00:31:51.000 Intersectional left.
00:31:53.000 When you go to them, they will hold contradictory views because they're human beings.
00:31:57.000 But a computer doesn't do that.
00:31:59.000 It can.
00:32:00.000 But you go on there and you're like, this makes no sense.
00:32:02.000 These things don't connect.
00:32:04.000 But it's like, how could Wikipedia say two different things at the exact same time?
00:32:07.000 Quantum computing.
00:32:09.000 I guess, sure.
00:32:11.000 Yes, both exist in the same- Maybe that'll help us define trans theory.
00:32:15.000 There you go, perfect.
00:32:15.000 What we're trying to do with- Well, there will be some difficulties like that, but at the end of the day, what these leftists are trying to do is this: every definition that these algorithms are trained on, everything they're trained to recognize, whether it's hate speech or misinformation, the people doing the defining, the people training the algorithms how to recognize those things, will be left-wingers.
00:32:33.000 So it'll always be their definition.
00:32:35.000 There will be some cases, like gender, where the definition is very, very tricky for the reasons you mentioned, but there will be other cases where the algorithm will only recognize the left-wing definition of something.
00:32:44.000 Is this all stemming from liberal college kids like Zuckerberg, Larry, and Sergey just happening to be the ones that coded it from the beginning?
00:32:51.000 I don't think so.
00:32:51.000 No, no, no.
00:32:52.000 I've had this conversation with Peter Boghossian and James Lindsay and Helen Pluckrose, who did the so-called Sokal Squared hoax.
00:32:58.000 Very, very smart individuals.
00:33:00.000 And Peter Boghossian, for instance, he believes, and this was a while ago we had this conversation, so I don't, you know, maybe his opinion's changed,
00:33:06.000 that intersectionality emerged through the colleges in the eighties and all that stuff.
00:33:12.000 My argument was maybe, but it only exists today because of the accident of the Facebook
00:33:18.000 algorithm.
00:33:19.000 And so I've mentioned this several times, but for you, Allum, I'll tell it now to everyone
00:33:23.000 listening.
00:33:24.000 Basically what happened is in the early days of Facebook, they were trying to figure out
00:33:28.000 how to maximize session length, like how long someone was on the website.
00:33:32.000 And so they started creating feed algorithms.
00:33:34.000 It was a combination of factors.
00:33:36.000 In the early days of Facebook, you just got reverse chronological feed, right?
00:33:39.000 You followed somebody you were friends with.
00:33:40.000 If they post something, you got it.
00:33:42.000 As the site grew bigger and bigger, people started following and liking more and more pages.
00:33:45.000 You end up with someone who, on average, had 300 friends, but also liked 300 pages, and their feed was so fast, they couldn't read through it.
00:33:52.000 So Facebook said, let's create an algorithm to make sure we're showing them what they most likely are to enjoy and click on.
00:33:59.000 What ended up happening was, around the same time, digital blogs started popping up, and it was partly because of Facebook that they started making money and becoming successful.
00:34:07.000 The articles were getting shared on Facebook, Facebook liked that because it was creating activity on the platform, and then these companies like Huffington Post, et cetera, started making money.
00:34:16.000 But something interesting happens when you incorporate Facebook constantly updating its algorithm to work better and better and better.
00:34:22.000 We ended up getting waves of police brutality videos.
00:34:25.000 There was a period where there was one website that was ranked global top 400 or something, when it was nothing but police brutality videos.
00:34:32.000 Why?
00:34:33.000 Because it was what people were clicking on.
00:34:35.000 So Facebook kept shoving it in people's faces, just beating them over the head with it, like, this is what you want!
00:34:40.000 And it was shockingly crazy footage of cops just beating people and beating people.
00:34:45.000 So what happens is the news organizations, the blogs, that started writing shock content started seeing more traffic and making more money.
00:34:53.000 Eventually they learned, wait a minute, there's more to life than just police brutality.
00:34:57.000 There's also racism and sexism, right?
00:34:59.000 So some article, some website started writing racism, racism.
00:35:02.000 Some said sexism, sexism.
00:35:04.000 Racism is X, sexism is Y. You get X views or Y views.
00:35:07.000 Combine the two and you get X plus Y views.
00:35:11.000 Intersectional articles started getting way more traffic.
00:35:14.000 The racist, sexist police.
00:35:16.000 Police brutality targeting, you know, black people.
00:35:19.000 And boom, Black Lives Matter pops into existence.
00:35:21.000 So what happens is these news organizations realized, whether it was on purpose or not, they made more money when they wrote about all of these subjects combined.
00:35:30.000 And now you actually see this stuff.
00:35:32.000 Like, Vice had one article that was really funny that it was like, it was like black trans women of color fighting back against police brutality for Black Lives Matter, and it's like, they just jammed every possible keyword in there, because Facebook would then share it with more communities, they would get more traffic and make more money.
00:35:47.000 Thus, intersectionality was perfect for this algorithm.
00:35:51.000 It started to emerge because the algorithm was trying to find what, you know, people would want to see the most.
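A stripped-down sketch of the engagement loop being described, with made-up numbers and names; this illustrates engagement-based ranking in general, not Facebook's actual code:

```python
# Reverse-chronological is gone: the feed sorts purely by predicted
# engagement, so whatever got clicks yesterday gets amplified today.
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    predicted_ctr: float  # modeled click-through rate for this user

def rank_feed(posts):
    # Sort by predicted engagement, highest first.
    return sorted(posts, key=lambda p: p.predicted_ctr, reverse=True)

feed = rank_feed([
    Post("police brutality", 0.09),
    Post("racism + sexism combined", 0.14),  # the X-plus-Y effect described above
    Post("local news", 0.02),
])
print([p.topic for p in feed])
```

Publishers optimizing against a loop like this converge on whatever keyword combination maximizes the predicted click-through rate, which is the dynamic the anecdote describes.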
00:35:57.000 Here's what I'll add to that.
00:35:58.000 I think you're right.
00:35:59.000 That is what happened.
00:36:00.000 I remember that distinctly in 2013, 2014.
00:36:03.000 But I also remember that the backlash against that also did mad traffic.
00:36:07.000 So, exactly.
00:36:07.000 Right.
00:36:08.000 It sort of naturally corrects.
00:36:10.000 If they allowed it to happen, it would have naturally corrected itself.
00:36:13.000 No.
00:36:13.000 Because people would have got sick of it.
00:36:15.000 They would have wanted some challenges to the prevailing viewpoint.
00:36:18.000 They did.
00:36:19.000 And it created the anti-SJW movement.
00:36:21.000 Exactly, but then they shut that down.
00:36:23.000 I mean, sort of, sort of.
00:36:25.000 But it was polarizing.
00:36:28.000 It was creating two tribes.
00:36:30.000 It was the reaction and the opposite reaction.
00:36:33.000 And so, yes.
00:36:34.000 But isn't that how politics has always worked though?
00:36:37.000 It's always been reaction and anti-reaction.
00:36:39.000 I think what was happening on social media was the things people wanted to talk about that maybe the establishment class didn't want to talk about.
00:36:45.000 They were sort of out of touch by then.
00:36:47.000 That rose to the fore.
00:36:49.000 But it was still a battle between two distinct sides, and I think the anti-SJW side was actually winning because they had the facts, they had the arguments on their side.
00:36:57.000 Their content was going just as viral.
00:36:59.000 They were the ones who got Trump elected.
00:37:01.000 So if the social media companies hadn't clamped down, I think this would have resolved itself naturally.
00:37:06.000 I don't know if it would have resolved itself.
00:37:08.000 I think we're still in it.
00:37:11.000 Oh, we're absolutely still in it.
00:37:12.000 But I think it's being prolonged by Internet censorship.
00:37:15.000 If you took that control off,
00:37:18.000 I think you just have a natural conflict between two political factions and
00:37:22.000 one of them would win and that would be the anti-SJW side.
00:37:25.000 They're artificially controlling it to help the SJWs stay in power, control the narrative,
00:37:29.000 even though it's completely discredited.
00:37:32.000 Well, so here's what ends up happening. When you get someone on the right who is
00:37:36.000 fringe and extreme and crazy or whatever, they get nuked immediately.
00:37:41.000 Just gone.
00:37:43.000 On the left, however, they're allowed to keep doing it.
00:37:45.000 So this is what I was mentioning earlier.
00:37:46.000 You end up with two sides.
00:37:48.000 One where the fringe elements of the right have been purged and all that's left are suit-wearing, you know, milquetoast vanilla conservatives.
00:37:55.000 Yeah, who won't do anything about social media censorship, by the way.
00:37:57.000 But, but, but the left is... have you seen this video? It's six videos of women in their cars screaming at the top of their lungs.
00:38:05.000 Yeah.
00:38:05.000 Have you seen the one where they mash it all together?
00:38:05.000 Oh, yeah.
00:38:08.000 So you have these videos where these women are like... Damn, they're screaming!
00:38:15.000 I tell you this, you take a milquetoast vanilla conservative wearing a suit saying, you know, I just think that, you know, we should work hard and make money and then you put that next to the women screaming at the top of their lungs, what's a regular person gonna think?
00:38:29.000 It is backfiring, in my opinion.
00:38:31.000 Now, whether or not Trump wins is yet to be seen.
00:38:34.000 We've got two weeks.
00:38:35.000 I'll tell you this, man.
00:38:38.000 Suspending the New York Post, locking down their social media, was such an insanely desperate move.
00:38:44.000 The mask has been ripped off.
00:38:46.000 There's no more facade.
00:38:47.000 We know exactly what they're doing.
00:38:49.000 And it might work.
00:38:51.000 The fact that you have these women in their cars screaming like lunatics, it's because they're so heavy-handed in their manipulation.
00:38:58.000 That these people are trapped in a paranoid and delusional reality where Trump is literally Hitler and he's taking kids and throwing them in cages and they're freaking out and panicking because they see it all day non-stop and they can't break out of this just trap from big tech.
00:39:13.000 Well, I mean, the liberals' media model since 2016 has been, you know, Trump shock content.
00:39:17.000 We're going to shock everyone with how evil and bad Trump is, and that's what's leading to the screaming women in their cars.
00:39:24.000 But I will push back on the idea that only allowing the reasonable conservatives to stay on social media is somehow a help to the movement, because they're precisely the ones that are least likely to push back on this craziness.
00:39:36.000 They're the most likely to just apologize, say, oh you called me a racist, I can explain 10 reasons why I'm not a racist, and that's not convincing to people, that's not persuasive to people, it just looks weak and stupid, and you don't really challenge the arguments.
00:39:47.000 So that's that's the downside there.
00:39:49.000 Certainly for like elections it might be a help because voters, undecided voters, moderates will just see the craziness of the left And respond accordingly.
00:39:59.000 But ultimately, I think, you know, if Trump loses the election, it will be big tech that stole it.
00:40:04.000 And this is the whole thesis of the book.
00:40:07.000 When you start manipulating search results on Google so that people, you know, it all comes down to undecided voters.
00:40:13.000 So you might be like a heavy news consumer, especially if you're watching this show.
00:40:17.000 You're very clued up on all the issues, very hard to manipulate you.
00:40:19.000 But think about what an undecided voter, someone who doesn't follow politics every day, is seeing when they go on to Google to find out about the two candidates.
00:40:28.000 It is endlessly.
00:40:29.000 Trump is evil.
00:40:30.000 Trump is evil.
00:40:31.000 All day, every day.
00:40:32.000 That's the real thing.
00:40:34.000 Yeah, but there was a story from Time Magazine.
00:40:36.000 This woman went around the country and found nobody cared.
00:40:39.000 Nobody cared.
00:40:40.000 She would go to these people and say, here's a headline, and they'd say, oh, I don't care about that.
00:40:44.000 And, you know, the way they wrote it was, all of these people believe misinformation, what do we do?
00:40:48.000 And it's like, you're in a bubble.
00:40:49.000 They don't care.
00:40:50.000 That's the thing.
00:40:50.000 The journalists are in, you know, it's really funny, it's like, here's the way I see it, they're all in the ivory tower, and they're sitting in this room with each other, all laughing and sharing fake news with each other, thinking they're in the real world.
00:41:00.000 And they're not.
00:41:01.000 Regular people don't like Antifa.
00:41:03.000 Call them out.
00:41:03.000 They won't.
00:41:05.000 They don't.
00:41:06.000 I mean, you can give these guys all the advantages you want.
00:41:06.000 It's true.
00:41:09.000 You can put all their stuff at the top of Google, but if it's bad stuff, if it's bad propaganda, it's going to have the opposite effect.
00:41:16.000 I will read one extract to you from the book, though, because this gets to the heart of how they manipulate people and persuade people.
00:41:23.000 If I can find it.
00:41:23.000 Do it.
00:41:24.000 Yeah, man.
00:41:24.000 Give me a minute.
00:41:25.000 I want to hear it.
00:41:27.000 Ah, where is this?
00:41:28.000 So what is this?
00:41:29.000 This is their manipulating of... It's how they pinpoint people's political beliefs.
00:41:34.000 And... Yeah, you were mentioning... Did you find it?
00:41:34.000 Oh.
00:41:37.000 Right, here we go.
00:41:38.000 So this is a quote from a Facebook source.
00:41:41.000 The plan for polarization is to get people to move closer to the center.
00:41:45.000 We have thousands of people on the platform who have gone from far right... This is the way Facebook defines far right, by the way, not necessarily the way we define it.
00:41:54.000 Who have gone from far right to center in the past year.
00:41:58.000 So we can build a model from those people and try to make everyone else on the right follow the same path.
00:42:06.000 He goes on to explain how this would work.
00:42:07.000 Let's say everyone who goes from far right to center watched video X. Then maybe we adjust the priority of video X in the feed.
00:42:14.000 But it's probably going to be much more complicated than that.
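To make that mechanism concrete, here is a minimal sketch of the logic the source describes: find users whose label moved from "far right" to "center," count which videos they watched along the way, and boost those videos for everyone still labeled right-wing. All data, field names, and weights are invented for illustration; this is not Facebook's actual code.

```python
# Hypothetical sketch of the "depolarization" logic described above.
# All data, names, and weights are invented for illustration.
from collections import Counter

users = {
    "u1": {"label_last_year": "far_right", "label_now": "center", "watched": ["vidX", "vidY"]},
    "u2": {"label_last_year": "far_right", "label_now": "center", "watched": ["vidX"]},
    "u3": {"label_last_year": "far_right", "label_now": "far_right", "watched": ["vidZ"]},
}

# 1. Find the users who "moved" from far right to center.
movers = [u for u in users.values()
          if u["label_last_year"] == "far_right" and u["label_now"] == "center"]

# 2. Count which videos the movers watched.
video_counts = Counter(v for u in movers for v in u["watched"])

# 3. Boost those videos' feed priority for everyone still on the right.
base_priority = {"vidX": 1.0, "vidY": 1.0, "vidZ": 1.0}
BOOST = 0.5  # invented weight

def feed_priority(video_id):
    """Base score plus a boost proportional to how many movers watched it."""
    return base_priority[video_id] + BOOST * video_counts.get(video_id, 0)

for vid in sorted(base_priority, key=feed_priority, reverse=True):
    print(vid, feed_priority(vid))  # vidX ranks highest and gets pushed to more users
```

As the source says, the real version would be far more complicated, but the shape of it, model the "converts" and steer everyone else down the same path, is exactly this.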
00:42:16.000 So this is the stuff you don't see.
00:42:18.000 We can see the bans of the New York Post, we can see these high-profile bans.
00:42:22.000 We don't see this very subtle manipulation that's going on behind the scenes, pinpointing exactly what your beliefs are and tailoring the content to those people.
00:42:32.000 They used to do that, as you say, just based on how interested you are in a certain topic.
00:42:36.000 That's what would drive the algorithm.
00:42:37.000 But it's gotten much more specific than that, and much more focused on changing people's opinions.
00:42:43.000 This is like their attempt at brainwashing; whether it works or not is another question.
00:42:46.000 Now here's the best part.
00:42:48.000 Remember that these people think anyone to the right of Stalin is far right.
00:42:54.000 Yep.
00:42:54.000 So what they're doing is you've got like a regular, you got a working class dad who's like firing up the grill and having a burger and they're like, oh, that far right guy, he's a white supremacist.
00:43:02.000 Better feed him content to make him a centrist.
00:43:04.000 Yeah, we got Bernie Sanders.
00:43:06.000 Send him the depolarization video immediately.
00:43:08.000 And also when you have the right and the left, and the left goes way over here.
00:43:12.000 Now the center is over here, further left than it used to be, where the normal lefties were.
00:43:16.000 So they're dragging people way, they're trying to like almost radicalize people on the right so that they're now in the center.
00:43:22.000 It's literally what they're doing.
00:43:23.000 Yeah.
00:43:24.000 It's literally what they're doing.
00:43:25.000 They're radicalizing people.
00:43:26.000 And that's why you see a video of these women screaming at the top of their lungs, crying and like, ah!
00:43:32.000 You know, look, there are a lot of us, I think the people who are watching this for the most part, you have sought out information.
00:43:39.000 It's the big difference between those who watch the mainstream media and get wrapped up in the fake news, and the people who watch shows like this.
00:43:45.000 You have to search for my show.
00:43:47.000 Sometimes YouTube does recommend the clips we do every day, but like I mentioned, my main two channels, Timcast and Timcast News, are not on Google.
00:43:54.000 Google has banned them outright.
00:43:56.000 I heard that they were back now.
00:43:57.000 That is not true.
00:43:58.000 Someone said they found them.
00:43:59.000 Nope!
00:44:00.000 These people don't get it.
00:44:01.000 People will steal my content and re-upload it knowing I've been blacklisted, and they can get my views from Google search.
00:44:06.000 We gotta un-blacklist Tim's channels.
00:44:08.000 Can't do it.
00:44:09.000 Google knows I know, and I've complained about it to Google.
00:44:09.000 Google knows.
00:44:11.000 They're doing it on purpose.
00:44:12.000 It's normal stuff.
00:44:13.000 They do it on purpose, and that's part of the manipulation.
00:44:17.000 If someone says, did you hear Joe Biden's email thing?
00:44:20.000 You will not find my videos giving you the breakdown and the fact checks.
00:44:24.000 Doesn't exist.
00:44:25.000 This is another area where we've actually literally caught them red-handed.
00:44:29.000 So back in early 2019, I got a leak from a Google insider, and he literally called this leak the smoking gun proving Google's bias.
00:44:38.000 Obviously lawmakers did nothing about it because, you know, we don't have enough smoking guns.
00:44:42.000 But this was YouTube's controversial search query blacklist.
00:44:46.000 It's an actual list they have inside YouTube of so-called controversial search queries, and whenever they add a term to that list, what it does is it tells the algorithm to prioritize all the videos from the mainstream media and shut out anyone that doesn't meet YouTube's approval.
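As a rough illustration of what a blacklist like that would do mechanically, here is a sketch; the query list, channel names, and scores are invented, and this is not YouTube's actual code:

```python
# Hypothetical sketch of a "controversial search query" blacklist.
# Query list, channels, and scores are invented for illustration.

CONTROVERSIAL_QUERIES = {"abortion"}         # terms added to the blacklist
AUTHORITATIVE_CHANNELS = {"BigNetworkNews"}  # mainstream outlets to prioritize

def rank_results(query, results):
    """Rank by relevance, unless the query is blacklisted: then
    'authoritative' channels jump the queue regardless of relevance."""
    if query.lower() in CONTROVERSIAL_QUERIES:
        return sorted(results, key=lambda r: (r["channel"] not in AUTHORITATIVE_CHANNELS,
                                              -r["relevance"]))
    return sorted(results, key=lambda r: -r["relevance"])

results = [
    {"title": "Independent take", "channel": "SmallCreator", "relevance": 0.9},
    {"title": "Network segment", "channel": "BigNetworkNews", "relevance": 0.4},
]
print([r["channel"] for r in rank_results("abortion", results)])
# -> ['BigNetworkNews', 'SmallCreator']: the more relevant independent video is pushed down.
```

One entry on one list is enough to reorder every search for that term.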
00:45:03.000 And the reason I found out about this list was because a left-wing journalist had reached out to Google saying, look at all these pro-life videos in the top 10 search results for abortion.
00:45:13.000 Within hours, Google goes in, they alter that file, they reorder the search results completely.
00:45:19.000 Do you know the name of that journalist?
00:45:20.000 It was a Slate journalist, if I recall.
00:45:22.000 I can't remember the exact name.
00:45:23.000 And I'm pretty sure she's the one who got Enrique Tarrio banned from Chase Bank.
00:45:27.000 Oh, right.
00:45:28.000 There are these journalists who make a profession out of deplatforming people.
00:45:31.000 This is what they do.
00:45:33.000 So they'll send a message, which is a veiled threat to a corporation, right?
00:45:37.000 So let's say, you know, Alan's Nachos is a company.
00:45:42.000 And I see a proud boy eating them.
00:45:44.000 I send a message saying, I couldn't help but notice that the Proud Boys were eating your nachos.
00:45:49.000 Do you support white supremacy, and why do you support white supremacy?
00:45:53.000 And they immediately respond with, we hereby disavow, we want nothing, and then the journalist puts out a message, Nacho Chip Company disavows the Proud Boys.
00:46:00.000 And this is what they did to PewDiePie, right?
00:46:01.000 This is the model.
00:46:02.000 You reach out to everyone's business partners, everyone who's doing business with them, anyone who's giving them a service.
00:46:06.000 And you use loaded questions.
00:46:09.000 Kind of like, uh, what are some of these jokes?
00:46:12.000 You say, uh, when did you start beating your wife?
00:46:15.000 Right?
00:46:16.000 It's like, wait, hold on, it's a loaded question.
00:46:18.000 The assumption is, right.
00:46:20.000 So what they'll do is they'll reach out to your business partner, to your partners, and it's almost like tortious interference.
00:46:25.000 They will ask a loaded question.
00:46:28.000 Now many of these companies, they get this email and it'll say, we saw the Proud Boys wearing your shirt, Fred Perry.
00:46:34.000 Why do you support white supremacy?
00:46:37.000 And they know there's nothing they can do.
00:46:39.000 The company knows that it's total BS.
00:46:43.000 But they're going to get a PR hit unless they play ball.
00:46:47.000 They don't care about you, so they just say, send them the generic response, we hereby disavow all of that.
00:46:52.000 And then boom.
00:46:53.000 So I talked to people, I was talking to these guys recently, and they wanted to get like a photograph and stuff, and I said, I'm not interested in doing photos with other companies.
00:47:01.000 And I was like, because of the political ramifications.
00:47:03.000 And they're like, no, no, no, no, trust me, these companies don't care.
00:47:05.000 And I said, no, no, no, you don't understand.
00:47:07.000 When the company ends up getting inundated by Antifa and the far left, and then, not caring about me, issues some stupid statement insulting and defaming me, then I have to deal with them, not the wacko far left.
00:47:19.000 If I'm seen in a photograph or, you know, I don't think I have to worry about this too much, but if there's somebody who is seen in a photograph with, say, Coke, a Coca-Cola executive, then Coke's gonna be like, we don't know who this guy is or care, disavow.
00:47:32.000 And they'll put out a statement saying white supremacy is wrong and we hereby disavow this individual, thus creating a newspaper story calling you a white supremacist.
00:47:39.000 That's the tactic the far left uses.
00:47:41.000 Here's why it's so powerful today in particular.
00:47:44.000 So my book is very positive about the early years of the Internet when there were no big tech giants.
00:47:48.000 But there's one very bad thing that's been the case on the Internet for a long time.
00:47:52.000 I've got a whole chapter in the book on this.
00:47:54.000 I call it the defamation engine.
00:47:57.000 So before the Internet, if there was a political scandal, if you were in the news for some bad thing, first of all, it generally only happened to politicians and celebrities, people who were public figures.
00:48:06.000 Second of all, you know, it's in the news for a day, it's on the TV, it's in a newspaper, then the newspaper gets chucked in the bin, TV moves on to the next thing, it goes away.
00:48:15.000 On the internet, it doesn't go away, the internet is forever.
00:48:17.000 As soon as any, you know, online news site, BuzzFeed, the Daily Beast, write something about you, and by the way, they do it to everyone now, not just public figures, as soon as they write it about you, it's on your Google results forever, and Google will prioritize it, and Wikipedia will cite it, and they won't cite articles from right-wing media potentially debunking it.
00:48:34.000 So this is what I call the defamation engine.
00:48:37.000 It's the connection between the media, Wikipedia, and Google.
00:48:43.000 This is why cancel culture exists.
00:48:45.000 Well, I can tell you this.
00:48:46.000 I think Wikipedia is in serious trouble.
00:48:48.000 We were talking a little bit about this.
00:48:50.000 I'm going to be a little light on the details, just because...
00:48:54.000 I don't want to exacerbate anything, but there is a group of individuals who have a forum that's very active and they've come up with an operation that's very, very, very clever.
00:49:03.000 The idea is to go on Wikipedia to random articles, nothing important, and put edits into random articles that are inane and arbitrary.
00:49:13.000 For instance, they'll go to the Wikipedia page for cardboard and put, you know, Hans Schmidt in 1932 developed a new means of manufacturing cardboard.
00:49:22.000 Nobody cares.
00:49:23.000 It's not vandalism.
00:49:24.000 It's a random tidbit.
00:49:27.000 And so what happens is enough of these edits bypass the vandalism filters from the actual Wikipedia editors.
00:49:34.000 So if someone goes to, say, Joe Biden's Wikipedia page and says, you know, Joe Biden is an abusive father whose son's a crackhead or whatever, they'll immediately jump in and say, get rid of this vandalism.
00:49:43.000 If someone goes in and says, you know, Joe Biden was using his son as an intermediary to make money off of Chinese equity investment and cite Breitbart, they'll say Breitbart is unreliable.
00:49:54.000 Remove it.
00:49:55.000 But if someone goes to the Wikipedia page for cheddar cheese and writes, Alan Bakari was a famous dairy farmer in 1871 who developed a new process for manufacturing cheese that lowered the price dramatically, nobody notices that, nobody cares, and it could sit there forever.
00:50:11.000 Now imagine you get thousands of people doing this.
00:50:14.000 So that's essentially one of the ideas.
00:50:17.000 Wikipedia becomes completely unreliable because it's full of fluff.
00:50:21.000 Fake factoids that no one can tell is real or fake.
00:50:24.000 So the whole thing is just questionable now.
00:50:26.000 First of all, that's hilarious.
00:50:27.000 And if someone wants to write in Wikipedia that I'm a 19th century dairy farmer, just go right ahead.
00:50:33.000 I encourage that.
00:50:34.000 But, I mean, Wikipedia is already discredited with so many people.
00:50:37.000 So, like, can you even discredit it any further?
00:50:39.000 It's funny that Wikipedia calls Breitbart an unreliable source, by the way.
00:50:43.000 We've got a guy writing for us, T.D.
00:50:45.000 Adler, who edited Wikipedia as one of, you know, a really prolific editor for over 15 years.
00:50:51.000 So you're calling your own former editors unreliable, if that's what you're saying.
00:50:55.000 It's completely broken when you look at, like, Vox is credible and Breitbart isn't.
00:50:59.000 And I'm like, Vox has published ridiculous, you know, without getting into naming people, there are some people whose opinions flip depending on what's politically expedient.
00:51:11.000 Very famous individuals, lefty Vox higher-ups, and they'll tweet something like, we should do this with the Supreme Court, and then like a day later, we should do the opposite.
00:51:21.000 Whatever works for their politics, it's very, very obvious.
00:51:23.000 And that's reliable.
00:51:24.000 And they go way beyond just calling people unreliable.
00:51:26.000 I mean, they've called virtually every prominent figure on the New Right alt-right at some point.
00:51:31.000 That's on their Wikipedia pages.
00:51:32.000 I mean, how many people have they called alt-right?
00:51:34.000 They've called Breitbart alt-right, false.
00:51:36.000 They've called Mike Cernovich alt-right, false.
00:51:37.000 Jack Posobiec, Lorenzo.
00:51:38.000 All these people who are not alt-right at all.
00:51:41.000 They're on the New Right.
00:51:42.000 So that always goes up at the top of, you know, the Wikipedia page.
00:51:45.000 It's completely discredited.
00:51:46.000 The thing is, we all know it's discredited, but it's still at the top of Google, and you can't sue these people.
00:51:52.000 Well, you can, but you'll lose because of Section 230.
00:51:54.000 I disagree.
00:51:56.000 Has Wikipedia ever lost a lawsuit?
00:51:57.000 I don't think it has.
00:51:58.000 I don't think people are trying because of Section 230.
00:52:00.000 Oh, I think they are trying.
00:52:03.000 I mean, here's the thing.
00:52:04.000 Wikipedia is the most powerful publisher in the world.
00:52:08.000 It is a publisher.
00:52:10.000 The fact that it gets Section 230 protections is ridiculous.
00:52:13.000 If any website should be liable for defamation lawsuits, it's Wikipedia.
00:52:17.000 Well, they do have special rules on biographies of living persons because they're susceptible to lawsuits.
00:52:24.000 The issue is, I think, when it comes to all these platforms, people aren't suing.
00:52:28.000 They're not.
00:52:30.000 Sometimes you'll hear a story about a lawsuit and they lose and it's like, I get it, but people need to actually start making these challenges.
00:52:36.000 You know what's going on with Patreon?
00:52:39.000 I know there's some legal action happening there.
00:52:40.000 They actually did make a legal breakthrough, if I remember correctly.
00:52:43.000 Yeah, yeah, yeah.
00:52:43.000 So I don't know exactly where we're at recently, but this was like a month or two ago, that basically Patreon's gonna be on the hook for tens of millions of dollars because of arbitration.
00:52:53.000 Whether they win or lose, they have to pay up front.
00:52:56.000 So Patreon could be over, as far as I know.
00:53:00.000 And I don't have all the full details in front of me, but the general idea was: they banned several people, and they had this provision that you couldn't sue them, you had to go to arbitration.
00:53:09.000 So then, these people were like, hey everybody, file a complaint, sue them for five bucks or whatever.
00:53:14.000 So Patreon then says, we're going to arbitration, and they say, great, you've got to front X dollars to the arbiters, to the arbitration.
00:53:22.000 We're talking about people who have tens of thousands of followers.
00:53:25.000 And so all of those lawsuits, all those arbitration claims, and then all of a sudden Patreon's on the hook for tens of millions of dollars they can't pay.
00:53:31.000 And so they're in trouble.
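For a rough sense of scale, with purely hypothetical numbers, since the episode gives no exact figures: if 5,000 banned or aggrieved users each file an arbitration claim, and the up-front filing and arbitrator fees run around $2,000 per claim, that is 5,000 × $2,000 = $10,000,000 the company has to front before a single case is decided.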
00:53:32.000 Actually, Patreon tried suing the users back to block the arbitration.
00:53:36.000 I heard about this.
00:53:37.000 It's very threatening to them, this thing.
00:53:39.000 Oh, they're done.
00:53:40.000 It's because Patreon is a financial company in some way as well.
00:53:43.000 Wikipedia I think it's a lot more difficult because they can claim traditional Section 230 protections because they're just hosting what all these anonymous editors say.
00:53:52.000 It's not their content.
00:53:53.000 That's the argument they'll make in court.
00:53:55.000 That doesn't matter.
00:53:56.000 Well, whether you win or lose, as soon as you say, I want to go to arbitration, the company in California has to front the cost.
00:54:05.000 So they could... Patreon might win these arbitrations.
00:54:11.000 They still have to front the cost.
00:54:12.000 They can't do that right now.
00:54:14.000 So it's essentially like lawfare.
00:54:17.000 When you sue someone knowing they can't afford to defend themselves, so they cave in, right?
00:54:22.000 People do this all the time.
00:54:23.000 They know that their case is bunk, but then the lawyer's like, listen, it'll be 10 grand to settle, it'll be 200 to go to court, so just pay them off.
00:54:30.000 So right now what's happening is Patreon's like, we're going to- So this is why they sued.
00:54:34.000 They said to the judge, we're gonna win this.
00:54:36.000 They have no case.
00:54:37.000 They're just trying to get us to go to court to pay arbitration up front.
00:54:40.000 And the judge was like, I don't care.
00:54:42.000 You gotta pay arbitration up front and go to arbitration.
00:54:45.000 It's not for me to decide who's gonna win.
00:54:46.000 I think that works on Patreon because Patreon, first of all, they're gonna have to pay thousands of these, right?
00:54:51.000 Second of all, they're not as wealthy as, say, a Google or a Facebook or a YouTube.
00:54:57.000 These companies have endured billion-dollar fines from the European Union.
00:55:01.000 They're still standing.
00:55:02.000 It may work on Wikipedia, actually, because the Wikimedia Foundation has a few hundred million, I think, but it's not as insanely wealthy as a Google or a Facebook.
00:55:11.000 Is it in California?
00:55:12.000 Well, that's the question.
00:55:13.000 I'd have to look that up.
00:55:14.000 I would assume so, but that's something you've got to look up.
00:55:18.000 Also, it's a foundation.
00:55:19.000 It's not a company.
00:55:21.000 I'm not a lawyer, so I can't tell you exactly, but I do know they've never lost a defamation case, as far as I know.
00:55:28.000 Did you know that for a really long time, my Wikipedia said that I invented a Zeppelin?
00:55:28.000 People have tried.
00:55:32.000 Some kind of remote Zeppelin?
00:55:33.000 Well, that's not bad, as far as... Did you invent a Zeppelin?
00:55:35.000 I have not invented a Zeppelin.
00:55:37.000 You shouldn't have told people that.
00:55:39.000 You should have said, yeah, I invented a Zeppelin.
00:55:40.000 When I tried complaining, like, passively, like, hey, I didn't invent a Zeppelin.
00:55:44.000 They told me I was unreliable.
00:55:46.000 And I'm like, dude, I didn't invent a remote control Zeppelin, man!
00:55:49.000 And they were like, well, we have a source that says you did.
00:55:51.000 I'm like, no, you don't!
00:55:53.000 They didn't even have a source, and they couldn't take it off!
00:55:56.000 Finally, one day, some, like, high-ranking Wikipedia dude removed it, and I'm like, ugh.
00:56:00.000 Thank you.
00:56:01.000 Geez.
00:56:02.000 And then whenever I bring the story up, people go in and start editing and adding stuff to it.
00:56:06.000 They're going to do it tonight.
00:56:08.000 Think about how amazing it is on Wikipedia that you could do an hour-long lecture on the dangers of white supremacy and the evil of white supremacy, and then they will call you a white supremacist or alt-right.
00:56:21.000 And when you try telling them that you actually campaign against it, and part of my work is criticizing it, they say, you're not reliable.
00:56:28.000 Well, actually, they'll give a concession to the left on this.
00:56:28.000 Yeah.
00:56:31.000 If you look at... I did a chapter on Wikipedia, and while I was researching the chapter, I was comparing the Wikipedia pages of people like Sean Hannity and Lou Dobbs to people like Rachel Maddow.
00:56:43.000 And at the top of Rachel Maddow's page, you see Rachel Maddow, in an interview with... I can't remember the exact channel, described herself as, you know, a political moderate with some left-wing beliefs, blah, blah, blah.
00:56:54.000 Crackpot conspiracy theorist.
00:56:56.000 Whereas Sean Hannity's page, Crackpot Conspiracy Theories, or something like that.
00:57:03.000 I think they changed that now, but when I was looking at his page, it was like, Known for Pushing Conspiracy Theories.
00:57:07.000 Yep.
00:57:08.000 Do you guys think that Wikipedia should be shut down?
00:57:10.000 Or altered?
00:57:11.000 Or something like that?
00:57:13.000 I think it would just be liable for defamation.
00:57:14.000 Let people sue Wikipedia if they defame you.
00:57:16.000 Wouldn't that just destroy the company?
00:57:18.000 No, it wouldn't.
00:57:19.000 They run off donations.
00:57:20.000 Yep.
00:57:21.000 Yeah, but they still have hundreds of millions of dollars.
00:57:23.000 Do they really?
00:57:24.000 Why do they need donations?
00:57:27.000 Every year it asks for donations on the website.
00:57:29.000 Listen, listen, Wikipedia is different from Twitter in that it's compiling information and publishing it.
00:57:34.000 So with Twitter, I understand that a user is making a statement. With Wikipedia, a person is not making a
00:57:40.000 statement.
00:57:41.000 Listen, on Twitter, I tweet under my name. And when I say Alan Bakari is a 19th century dairy farmer,
00:57:47.000 that is a statement of fact.
00:57:49.000 Alan then says, Twitter publishes.
00:57:52.000 That says Tim Pool at Timcast and then a statement.
00:57:56.000 That's from Tim Pool.
00:57:57.000 On Wikipedia, it doesn't put your name.
00:58:00.000 It's a page that says Alan Bakari and then right atop a 19th century dairy farmer.
00:58:05.000 But at the bottom... No, no, no.
00:58:07.000 Doesn't it show the change log?
00:58:07.000 In the history.
00:58:08.000 And then you can see who changed what?
00:58:09.000 In the history, but that's a different post.
00:58:12.000 Wikipedia is a book pretending to be a library.
00:58:14.000 Mmmhmm.
00:58:16.000 It's like the Encyclopedia Britannica is liable for defamation.
00:58:19.000 If the Encyclopedia Britannica defames you, you can sue them.
00:58:22.000 Wikipedia is a more powerful and more widely read version of that.
00:58:26.000 The fact that it's not liable is just insane.
00:58:29.000 And you know, like I said, I agree with Twitter, with Facebook, they publish... well, they don't publish, but they host millions and millions of posts, new ones every day.
00:58:38.000 So yeah, sure, if you held them liable for all of those posts, of course their business model would not make sense.
00:58:43.000 But Wikipedia is just big pages like an encyclopedia.
00:58:48.000 It would not be crippling to them if they were liable.
00:58:50.000 When I go to the page, Alan Bakari, there are no users.
00:58:55.000 It says, Wikipedia, Alan Bakari, 19th century dairy farmer.
00:58:59.000 That is a statement of fact under the Wikipedia banner and no one else is.
00:59:02.000 They'd go out of business instantly.
00:59:03.000 No, they wouldn't.
00:59:04.000 They would create more stringent posting policies and verification.
00:59:08.000 And they would create a two-tiered system.
00:59:10.000 Right now, anyone can go in and make an edit.
00:59:12.000 That's what we were talking about, where this operation is going to go in and add inane factoids.
00:59:18.000 Oh, so maybe on Wikipedia it could say, after every statement of fact, it would show who wrote it and when.
00:59:23.000 Like a little subscript.
00:59:28.000 So what would need to happen is, if they could be sued for defamation, you would submit your edits and then it would go to an approval process,
00:59:35.000 and that would protect them more so from liability, though they'd still be responsible for suits.
00:59:39.000 You could make the editors liable for defamation, except they're all anonymous.
00:59:43.000 And I'm generally in favor of online anonymity.
00:59:45.000 I think it allows people to challenge taboos and have discussions you can't have in public,
00:59:49.000 especially in the age of cancel culture, anonymity is important.
00:59:51.000 But the last people who should have anonymity is a Wikipedia editor, who have more power
00:59:55.000 over people's reputation than anyone else.
00:59:57.000 They should be accountable for what they write.
00:59:59.000 It could destroy someone's reputation.
01:00:01.000 I was at an event in the UK years and years and years ago.
01:00:06.000 And they found me through Occupy stuff and they wrote the bio for my introduction off of Wikipedia and it was one of the funniest things.
01:00:17.000 Tim Pool invented a zeppelin and I was just like, excuse me sir, you clearly just pulled that off Wikipedia.
01:00:24.000 You have no idea who I am, you have no idea what I've done, and there were other instances where people had been cancelled because of negative articles on their Wikipedia pages.
01:00:33.000 The first thing people see when they search you is the card on Google that says Wikipedia.
01:00:36.000 You click it and they believe it.
01:00:37.000 They believe all of it.
01:00:38.000 So maybe the first thing we should do is not credit Wikipedia.
01:00:38.000 Wow.
01:00:41.000 Yeah.
01:00:42.000 It doesn't matter what you want to credit, it matters that people just use it.
01:00:46.000 Which is why there's a group of people now that want to add inane and innocuous random edits to completely break the system because then people are going to be like, what?
01:00:56.000 The dairy farmer?
01:00:57.000 What is this?
01:00:58.000 So are you both familiar with the Gell-Mann amnesia effect?
01:01:03.000 I can't say I sound familiar, but no.
01:01:05.000 So it's a fake name.
01:01:06.000 It was this guy, Michael Crichton, who made it up, and he said calling it the Gell-Mann Amnesia Effect makes it sound more prestigious and official, and it was kind of a joke.
01:01:13.000 What it means is, you pull up a newspaper, and let's say you're an expert on big tech censorship.
01:01:19.000 And you see a story on the front page of the Washington Post that says, there is no big tech censorship.
01:01:23.000 And you laugh, saying, I know that's true because I've actually talked to these people.
01:01:27.000 Fake news.
01:01:28.000 You turn the page, and then it says, War in Syria.
01:01:31.000 Bashar al-Assad does X. And you go, wow.
01:01:34.000 You forget.
01:01:35.000 You get amnesia.
01:01:36.000 You saw something in which you were an expert, and you knew it was fake.
01:01:39.000 But then when you read something in which you're not familiar with, you assume it's true.
01:01:43.000 That's the Gell-Mann amnesia effect.
01:01:44.000 So what this operation ends up doing is, eventually someone who is a dairy farmer and knows how to make cheddar cheese is going to be like, there was no Alan Bakari.
01:01:51.000 What is this?
01:01:52.000 And they just start questioning every other article he's read.
01:01:54.000 Maybe not.
01:01:55.000 Maybe he'll have amnesia.
01:01:57.000 But that's the idea.
01:01:57.000 They're sprinkling and peppering it with all this random garbage so that you can't even know.
01:02:03.000 You could be reading a page and like, that's just totally made up and you can't tell because it sounds real.
01:02:07.000 They could have been doing this the whole time.
01:02:09.000 They could have been, but I think people liked Wikipedia for a long time.
01:02:13.000 They liked it, they used it, it made sense.
01:02:15.000 Now we're at a period where Wikipedia is completely weaponized for political power.
01:02:19.000 I mean, look at—remember when, before Kamala— I disagree with that slightly.
01:02:23.000 I think Wikipedia has always been seen as unreliable.
01:02:25.000 You know, every school teacher will tell their pupils, don't rely on Wikipedia.
01:02:29.000 Sure, sure.
01:02:29.000 If you cite Wikipedia, you're getting a fail.
01:02:32.000 But people, in our generation especially, were like, yeah, it's fine, it's cited.
01:02:38.000 It was convenient.
01:02:39.000 We knew it was unreliable.
01:02:40.000 We still know it's unreliable, but it's convenient.
01:02:43.000 It's actually a problem with Google.
01:02:44.000 Google always puts it at the top of their search results.
01:02:47.000 They consider it an authoritative source.
01:02:49.000 And it's not.
01:02:50.000 And it's not.
01:02:50.000 And everyone knows it's not.
01:02:52.000 I love how you can take me, the person in question, and I can issue a statement.
01:02:59.000 I did not invent a Zeppelin.
01:03:02.000 Not reliable.
01:03:03.000 Take a 22-year-old intern working at BuzzFeed who writes Tim Pool, who invented a Zeppelin, and boom!
01:03:10.000 Historical fact.
01:03:12.000 You know what would happen if you made Wikipedia liable for defamation cases?
01:03:16.000 I can tell you.
01:03:17.000 I've got a good idea of what would happen.
01:03:18.000 First of all, Wikipedia would shut down for a month or so.
01:03:21.000 And in that month, they'd be reaching out to people like you.
01:03:23.000 They'd be reaching out to everyone who's got a Wikipedia page about them, asking, hmm, is this actually true?
01:03:28.000 Did you actually invent a Zeppelin?
01:03:29.000 We're reviewing some of our articles.
01:03:32.000 That's what they'd probably do.
01:03:33.000 They'd shut down for a while, they'd edit all the articles, and then they'd put them back up.
01:03:36.000 That's what I was saying.
01:03:36.000 They'd have to fact check.
01:03:38.000 So here's what's really funny about an unreliable source.
01:03:41.000 Me, as the person.
01:03:43.000 When the New Yorker was writing about me, they called me.
01:03:47.000 We would like to confirm some facts.
01:03:48.000 Is X true?
01:03:50.000 Is Y true?
01:03:51.000 And is Z true?
01:03:52.000 And I said, yes, yes, and no.
01:03:53.000 I said, thank you very much.
01:03:55.000 And then they took out the BS.
01:03:57.000 Granted, it was still pretty BS.
01:03:58.000 It's a funny story.
01:04:00.000 Wikipedia doesn't do that at all.
01:04:01.000 They say, we can't actually listen to the person.
01:04:04.000 Every single fact checker in the past hundred years would be calling the person to confirm details.
01:04:12.000 How many people work at Wikipedia?
01:04:14.000 Any ideas?
01:04:15.000 I think it's a small operation, and they don't have the manpower to fact-check.
01:04:19.000 The whole point is it's just a community, you know, thing that everyone contributes to.
01:04:23.000 It's weaponized.
01:04:24.000 There are corporations.
01:04:24.000 And it has been weaponized, that's for sure.
01:04:26.000 I know people whose job is what's called reputation management, and they know how to manipulate, and they have high-powered Wikipedia editors who look like regular people who just want to, you know, help out.
01:04:37.000 When in fact they're paid seven figures to control Wikipedia.
01:04:41.000 And so there are princes and princesses and politicians across Europe and the US who hire these companies to go in.
01:04:48.000 They know how to remove articles from the front page of Google.
01:04:51.000 They know how to manipulate the search engine algorithms to get them off.
01:04:54.000 They know how to spam Google to get other sources pushed to the top.
01:04:57.000 And they know how to place stories in reputable sources so they can control your reputation.
01:05:03.000 You can pay them for this.
01:05:04.000 Now if you're not rich enough to pay them then they just call you, you know, alt-right and whatever and then there you go.
01:05:09.000 It's interesting too because it feels like Wikipedia is a place where wealthy people of prominence can control the narrative, but poor people who become prominent just end up as white supremacists.
01:05:20.000 Basically.
01:05:21.000 Yeah, the exact opposite of the ideals of the Internet.
01:05:23.000 We're going to give everyone a voice.
01:05:24.000 No, Wikipedia's just giving all the cocks a voice.
01:05:26.000 Yeah, it absolutely is.
01:05:28.000 I mean, that's true for a lot of the Internet, to be honest.
01:05:31.000 But let's do this.
01:05:32.000 Let's talk about anonymity, because that's one of the big issues with Wikipedia that you brought up, and we talked a little bit about this earlier.
01:05:38.000 I'm torn on whether or not people should be allowed to be anonymous on the internet, because what ends up happening is sock puppets.
01:05:45.000 You get one guy controlling a hundred accounts, convincing Oreo to, like, abandon their, you know, pumpkin spice flavor because it's offensive to Native Americans or some other garbage.
01:05:54.000 I'm making that up.
01:05:54.000 That's not a real thing to happen.
01:05:55.000 I'm just saying, like, you get it, right?
01:05:58.000 If people had to actually have their names and their faces, they're going to be more respectful.
01:06:02.000 Not completely, but more.
01:06:04.000 And they're less likely, and they're not going to get away with sock puppeting and manipulating these companies.
01:06:08.000 The problem with more respectful is also more respectful of the people in power, the people who control the narrative, the people who can destroy them.
01:06:16.000 I think in the age of cancel culture, and that'll be fake respect by the way, yes I respect the Soviet Commissar because if I don't respect the Soviet Commissar I'm going to the gulag.
01:06:24.000 That's what happens when you make everyone accountable for what they say and you don't give people private spaces to discuss controversial topics.
01:06:31.000 I think in the age of cancel culture anonymity couldn't be more important.
01:06:35.000 Even back at the founding, the founding fathers thought it was important to, you know, have a free and open debate about the Constitution.
01:06:40.000 They wrote under various pseudonyms in the Federalist Papers, right?
01:06:43.000 So it's been around for a long time.
01:06:45.000 French dissidents under the monarchy in the 1700s wrote under pseudonyms, you know, Voltaire being the most famous.
01:06:51.000 So it's been used for a long time by dissident thinkers to express controversial ideas while avoiding the consequences from the powerful forces of the day.
01:07:00.000 That's a good point!
01:07:02.000 And given how powerful cancel culture is today, I think it's more important than ever.
01:07:05.000 But here's the thing.
01:07:06.000 Here's the kicker.
01:07:07.000 It doesn't actually exist.
01:07:08.000 We think it does, but it doesn't.
01:07:10.000 So one of the most terrifying sources I spoke to in the book, by the way, deletedbook.com if you want to get it, is someone who doesn't work for one of the big companies, someone who works for the ad tech industry.
01:07:20.000 Now the ad tech industry has a huge interest in de-anonymizing people because they want to find out exactly what people are interested in, even when they're posting anonymously.
01:07:29.000 So what she told me is that they've actually paid people to scrape, say, YouTube comments.
01:07:33.000 People commenting under this video might think they're anonymous, but actually, your comments may well be being scraped by one of these ad tech companies, stored on a database, so even if you delete it, they'll still have it.
01:07:44.000 And then they're scanning the text to match your writing style to, say, your Facebook post or your Twitter post, because they just want to find out everything about you.
01:07:51.000 It's quite a benign motivation, but you can easily see it being applied in non-benign
01:07:55.000 ways.
01:07:56.000 And the technology they use to do this is extremely sophisticated.
01:08:00.000 So what my source told me is that they're actually working on technology that can identify
01:08:04.000 people based on not just their unique writing style, but the speed of your mouse movements,
01:08:11.000 how fast you type on your keyboard, all these unique giveaways.
01:08:13.000 All combined.
01:08:15.000 So imagine you throw away your laptop, you throw away your mobile phone, you change your
01:08:18.000 name, you travel to Outer Mongolia, you log on to a new laptop in Outer Mongolia, you
01:08:24.000 start typing and instantly they know who you are because of your typing style.
01:08:28.000 So what you do is you create a physical mechanism That you type on your keyboard and then it translates to actual physical robotic hands that perfectly separate each keystroke by a millisecond.
01:08:41.000 Yeah, I will say that the point I make in the book is the privacy guys, the people who value privacy and anonymity, they've got to get ahead of this stuff.
01:08:48.000 You can write programs.
01:08:50.000 The same technology that can be used to de-anonymize someone can be used to re-anonymize them by masking those unique signals.
01:08:56.000 So that needs to happen quickly.
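A toy version of the text-matching half of this is below; comparing character-trigram frequency profiles with cosine similarity is a standard stylometry baseline, though the posts and names here are invented and real ad-tech systems are far more sophisticated.

```python
# Toy stylometry: match an "anonymous" comment to known authors by
# comparing character-trigram profiles with cosine similarity.
# Posts and names are invented; real systems are far more sophisticated.
import math
from collections import Counter

def trigram_profile(text):
    """Frequency count of character trigrams in lowercased text."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a, b):
    """Cosine similarity between two sparse frequency vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

known_posts = {
    "alice": "Honestly, I just think folks should be kinder online, honestly.",
    "bob": "THIS IS AN OUTRAGE!!! Nobody is talking about the REAL issue!!!",
}
anonymous_comment = "honestly, folks, I just think we should be kinder, honestly."

anon = trigram_profile(anonymous_comment)
scores = {name: cosine(anon, trigram_profile(text)) for name, text in known_posts.items()}
print(max(scores, key=scores.get), scores)  # 'alice' scores highest
```

The masking Alan mentions would run the same comparison in reverse: keep rewriting your text, or jittering your keystroke timings, until the profile no longer matches your known accounts.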
01:08:58.000 Alan, do you know that Facebook is aware of when you poop?
01:09:02.000 I did not know that.
01:09:04.000 That's a little snippet.
01:09:06.000 If I knew that before the book, I'd definitely put it in there.
01:09:09.000 I was reading about it.
01:09:11.000 Facebook knows everything about you.
01:09:12.000 And so I was reading something about how people think they're listening to your voice and giving you ads based on what you say.
01:09:21.000 It's not true.
01:09:22.000 They just know too much about you, to the point where they can predict seemingly innocuous, unrelated things. So what happens is, I remember I went to Walmart, and they had these TVs on sale for, like, 200 bucks. And I remember looking at them. Then I go home, I'm on my computer, and I see on Facebook the ad for Walmart TVs, in the middle of the aisle just like I had seen, and I was like, whoa. And so it's really simple.
01:09:46.000 My location data gave away that I was at a Walmart.
01:09:49.000 They knew I was at a Walmart.
01:09:50.000 They knew my age.
01:09:51.000 They knew my demographic.
01:09:52.000 They assumed I would probably want to buy a TV.
01:09:54.000 They probably knew where you were standing in the Walmart based on GPS.
01:09:57.000 For sure.
01:09:58.000 Maybe.
01:09:58.000 I mean, I don't know how precise they can get, but the general idea is 18 to 35 year old male at a Walmart probably wants to play video games.
01:10:05.000 There's TVs on sale.
01:10:07.000 And so I noticed them because of who I am.
01:10:10.000 And they track these things.
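The inference Tim is describing doesn't need anything exotic; a caricature of the targeting rule, with invented fields and thresholds:

```python
# Caricature of the ad-targeting rule described above; fields are invented.
from typing import Optional

def pick_ad(profile: dict) -> Optional[str]:
    """Location plus demographics -> predicted interest -> ad."""
    if profile["last_location"] == "walmart" and 18 <= profile["age"] <= 35:
        return "walmart_tv_sale"  # young adult in a Walmart: show the TV deal
    return None

print(pick_ad({"last_location": "walmart", "age": 34}))  # -> walmart_tv_sale
```

No microphone required, just a location ping and a demographic bucket.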
01:10:11.000 So you're familiar with shadow profiles.
01:10:14.000 Oh, of course.
01:10:14.000 Every social media platform has a shadow profile of you, even when you haven't signed up yet.
01:10:19.000 Exactly.
01:10:20.000 Even when you haven't signed up, they're tracking you around the internet.
01:10:22.000 We'll get into that next, but I bring that up because they build a data profile on you from information from other sources.
01:10:29.000 And so, you know, I'm trying to explain this to my friends.
01:10:31.000 Some of them don't want to believe it.
01:10:32.000 Mines does not do that, by the way.
01:10:34.000 Well, right, right.
01:10:34.000 But Facebook is a massive, massive apparatus.
01:10:36.000 And the phone.
01:10:38.000 So here's what happens.
01:10:40.000 Facebook's machines know the average duration between bathroom trips for a person of a certain height, weight, and age.
01:10:47.000 They know when you wake up, because they know when the phone's not moving and when the phone's moving.
01:10:51.000 They know when the phone goes out for lunch.
01:10:54.000 You're at your office, then all of a sudden you're at Hardee's or whatever, you know, they have your location services.
01:11:00.000 And if they don't have yours, they have your friend, who's saying, hey, I'm getting you the burger at Hardee's, or, hey, I'm at the counter right now, meet me at the front, I'll order for you.
01:11:09.000 Things like that, they all know it.
01:11:11.000 You take all of this data.
01:11:13.000 I say they know when you poop because that's the shocking and funny thing to say, but they know everything about you.
01:11:17.000 They probably know when you're banging your wife or your husband or whatever.
01:11:19.000 They know.
01:11:20.000 And it's because all of these little things combined give them a detailed map of your life.
01:11:26.000 So you might not think it matters that, oh, you know, I leave my phone in the office when I go to the bathroom, whatever.
01:11:32.000 Then the phone stops moving.
01:11:34.000 They know.
01:11:34.000 They know.
01:11:35.000 Yes, they figure it out.
01:11:35.000 They figure it out.
01:11:36.000 They know.
01:11:37.000 There's no swipes.
01:11:38.000 There's no clicks.
01:11:39.000 There's no screen time.
01:11:40.000 They know the person has been gone for 4.75 minutes.
01:11:45.000 And they know how long, so they compare it to other phone users and say, oh, it's most likely to be this.
01:11:50.000 They know the average person spends four minutes and 27 seconds on a potty break.
01:11:54.000 Your phone was inactive for four minutes.
01:11:57.000 And the best part is, people will check their phone, put it down, go to the bathroom, come back, check their phone again.
01:12:01.000 They know exactly when you're pooping.
01:12:03.000 They know when you're eating, know when you're sleeping.
01:12:05.000 They know when you wake up in the middle of the night.
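Tim's point boils down to gap detection over interaction timestamps. A minimal sketch with invented data and thresholds:

```python
# Minimal sketch: infer "stepped away" events from gaps between phone
# interactions (taps, swipes, unlocks). Data and thresholds are invented.

timestamps = [0, 35, 61, 90, 375, 400]  # seconds since session start
GAP_MIN, GAP_MAX = 180, 600             # 3 to 10 minutes: plausible bathroom break

for earlier, later in zip(timestamps, timestamps[1:]):
    gap = later - earlier
    if GAP_MIN <= gap <= GAP_MAX:
        print(f"{gap}s idle starting at t={earlier}s: likely stepped away "
              "(compare against population averages to guess why)")
# The 285-second gap (4.75 minutes) is exactly the kind of signature Tim describes.
```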
01:12:07.000 And here's why I say if Trump loses it'll be because of big tech, because imagine this technology applied to political movements.
01:12:15.000 Some people think it's just a few big accounts getting censored, just Alex Jones.
01:12:19.000 No, they're censoring entire political movements at once.
01:12:22.000 So network analysis, which is the tool they use to identify certain networks of people.
01:12:26.000 So, you know, they see some people go to Chick-fil-A every Wednesday.
01:12:30.000 They'll put them in a, they have a certain map of their activities.
01:12:35.000 They put them in a group with other sort of Chick-fil-A eaters.
01:12:37.000 They have these whole networks mapped out.
01:12:39.000 They do that for political people as well.
01:12:40.000 Yep.
01:12:41.000 Who's following Alex Jones?
01:12:43.000 Who's following Breitbart News?
01:12:44.000 Who's following Tim Pool?
01:12:46.000 And who those guys, are they following each other as well?
01:12:49.000 And if they send a signal to an algorithm saying, you know, this network posts a lot of disinformation,
01:12:54.000 our algorithm has detected that, they send to not just one or two people, but entire political
01:12:58.000 movements. When Alex Jones gets banned, that's not the end of it. That is also sending a signal to
01:13:03.000 the algorithm saying, everyone who followed Alex Jones, they followed an account known for posting
01:13:08.000 conspiracy theories.
01:13:09.000 Maybe we don't ban them, but maybe we suppress them in the algorithm.
01:13:12.000 Maybe their posts don't show up at all.
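A sketch of that propagation step, treated as a walk over the follow graph; the graph, the flag, and the penalty factor are all invented for illustration:

```python
# Sketch of suppression propagating from a flagged account to its followers.
# Graph, flags, and weights are invented for illustration.

follows = {                # follower -> accounts they follow
    "fan1": {"banned_host", "cooking_channel"},
    "fan2": {"banned_host"},
    "fan3": {"cooking_channel"},
}
FLAGGED = {"banned_host"}  # accounts banned for "disinformation"
PENALTY = 0.5              # invented downranking factor

def reach_multiplier(user):
    """Followers of a flagged account get their own posts downranked,
    even though they themselves were never banned."""
    return PENALTY if follows[user] & FLAGGED else 1.0

for user in follows:
    print(user, reach_multiplier(user))
# fan1 and fan2 quietly lose half their reach; fan3 is untouched.
```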
01:13:14.000 Or they pepper a little bit, right?
01:13:18.000 So every third person gets the X. One of the most important things in the censorship story is that most people getting banned are small accounts.
01:13:28.000 For obvious reasons, there's more of them.
01:13:29.000 But periodically... excuse me.
01:13:31.000 Periodically, someone with a lot of followers gets banned and they notice.
01:13:35.000 And then everyone says, whoa, I can't believe they banned person X. And then Twitter goes, that was a mistake.
01:13:40.000 We're sorry.
01:13:41.000 And they reinstate the person.
01:13:42.000 And then the algorithm carries on nuking people and just like mowing them down.
01:13:47.000 They accidentally hit someone too big and people noticed.
01:13:49.000 Oh, oh, oh, you can come back.
01:13:51.000 Okay, now get rid of the next 50.
01:13:52.000 Yeah, people are celebrating that the New York Post story got more distribution than it would have otherwise received because of the censorship, so it was a beneficiary of the Streisand effect.
01:14:00.000 That's great, by the way.
01:14:01.000 The New York Post has some great journalists.
01:14:03.000 I know some of them.
01:14:03.000 They're awesome.
01:14:04.000 Emma Jo Morris, John Levine, all these people.
01:14:06.000 They're great.
01:14:08.000 Most people who get censored, that doesn't happen to them.
01:14:11.000 Their message isn't amplified.
01:14:12.000 Most people just don't notice.
01:14:14.000 The New York Post is big enough.
01:14:15.000 It sometimes backfires on them, but in the vast majority of cases it doesn't.
01:14:19.000 That's why censorship is so scary and terrifying, and often it's invisible.
01:14:24.000 My favorite story in all of this.
01:14:27.000 Shadow profiles.
01:14:28.000 So let me just tell all of you right now.
01:14:31.000 I love talking to a group of people, and I'll say, uh, you all have a Facebook profile.
01:14:36.000 And most of them will say yes, but then periodically they'll get someone to be like, no, I don't use Facebook.
01:14:39.000 And I say, no, you have a Facebook profile.
01:14:42.000 And they say, no, I've never signed up for it.
01:14:44.000 Or, no, I deleted it, or I got rid of it, or I don't use it.
01:14:47.000 Let's say, you, Ian.
01:14:49.000 You have your phone, right?
01:14:50.000 Do you have Facebook Messenger on your phone?
01:14:52.000 Yes, I do.
01:14:53.000 When you log into Facebook Messenger, it says, find out who you're friends with via your contact list.
01:14:58.000 Jeez.
01:14:58.000 When you do that, all of a sudden now, Facebook has a list, a very interesting list, of phone numbers, and on yours, it says, mom, dad, and then, you know, your siblings, Janet, Bill, or whatever.
01:15:10.000 You might even say brother, sister.
01:15:12.000 It then has a phone number, you know, 8-6-7-5-3-0-9, listed as mom.
01:15:17.000 Then it has a phone number that says, you know, Janet Crosland or whatever, whatever, I don't know your mom's name.
01:15:22.000 And it's the same phone number.
01:15:23.000 Becky.
01:15:23.000 Becky, Becky Crosland.
01:15:24.000 Perfect, beautiful.
01:15:25.000 And it says 8-6-7-5-3-0-9.
01:15:26.000 Who would be referring to mom by the full name?
01:15:31.000 Someone who is, you know, in the immediate family, but then it has the same number, 8-6-7-5-3-0-2, in your phone as dad.
01:15:40.000 Now it knows who your mom's husband is.
01:15:42.000 Get it?
01:15:43.000 So let's say your dad never signs up.
01:15:45.000 They know who your dad is based on your mom calling him Jim and you calling him dad and having the same phone number and it's a game of Sudoku.
01:15:52.000 All of these things, they take all these little bits of data, combine it, and if you've never signed up for any of these websites, everyone else has given your data to them.
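Tim's "game of Sudoku" is, mechanically, a join on phone numbers across everyone's uploaded contact lists. A toy version, with all names and numbers invented:

```python
# Toy shadow-profile builder: merge uploaded contact lists by phone number.
# All names and numbers are invented.
from collections import defaultdict

uploads = {  # uploader -> their contact list (label -> phone number)
    "ian": {"mom": "867-5309", "dad": "867-5302"},
    "becky": {"Jim": "867-5302", "Ian Crosland": "555-0100"},
}

labels_by_number = defaultdict(list)
for uploader, contacts in uploads.items():
    for label, number in contacts.items():
        labels_by_number[number].append((uploader, label))

for number, labels in labels_by_number.items():
    print(number, labels)
# 867-5302 -> [('ian', 'dad'), ('becky', 'Jim')]: Ian's dad is named Jim,
# and a profile now exists for him whether or not he ever signed up.
```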
01:16:03.000 And their addresses and their phone numbers, well obviously, and their email address.
01:16:09.000 Sometimes their pictures.
01:16:11.000 Just based on your contacts on your phone.
01:16:13.000 Human rights violation.
01:16:14.000 There was a period where Facebook accidentally published the shadow profiles.
01:16:17.000 Do you remember this?
01:16:18.000 I do remember.
01:16:19.000 That was amazing.
01:16:20.000 That was amazing.
01:16:20.000 Yeah.
01:16:21.000 And even if you don't have the Facebook Messenger app, or if you just visit a website that has like one of the little Facebook buttons, one of the little Twitter buttons, it's getting your data.
01:16:30.000 It knows you're on that website.
01:16:31.000 Yep.
01:16:32.000 And it will build a profile for you and everyone you know.
01:16:35.000 And you don't know exactly what data is being given to them.
01:16:38.000 And the craziest thing about it is, when I tell people this, it's like, listen: your contact list has your significant other listed as, you know, GF or girlfriend or something.
01:16:49.000 It now knows you're in a relationship and it knows who you're in a relationship with because that phone number then correlates to someone who calls your girlfriend Becky.
01:16:56.000 They now know who Becky is.
01:16:57.000 Then someone has, you know, little sister and they're like, now we know who your sister is.
01:17:01.000 The phone number thing is really obvious to us because we can understand how to correlate that data, but they can build a shadow profile from you off of really innocuous data like when you went to Burger King.
01:17:12.000 And then they can see where your contacts go.
01:17:14.000 They can track them around as well.
01:17:15.000 No, no, no, no, no, no, no, no.
01:17:16.000 Drop the contact list thing.
01:17:17.000 I'm saying you could carry your phone into a Burger King and it can be like this phone and this phone were in the Burger King at the same time and moved around at the same time.
01:17:25.000 And now I'm thinking about NFC, near frequency communication.
01:17:28.000 Near field.
01:17:28.000 Near field communication and Bluetooth, like the ability for like a smart refrigerator to measure your phone if your Bluetooth is on.
01:17:36.000 So it's getting more advanced.
01:17:37.000 This is why Google got into smartphones and laptops.
01:17:40.000 And now cities, they want to build smart cities because there's just more and more ways to gather data.
01:17:45.000 Whether you sign up for their services or not.
01:17:48.000 Your technology will be assimilated.
01:17:50.000 Resistance is futile.
01:17:51.000 Indeed.
01:17:51.000 Looking forward to living in the Google smart city.
01:17:54.000 Are we at the point now where we've already become subjects of the Borg?
01:17:59.000 Is this conversation, is the AI laughing to itself saying, I am making them have this conversation, ha ha ha?
01:18:06.000 They are the Borg.
01:18:07.000 Thankfully, I don't think it's got to that point yet, but I do think this election will be a test of, you know, can big tech swing an election?
01:18:14.000 Can the message get out?
01:18:15.000 I think the polls are narrowing at the moment.
01:18:17.000 I was very pessimistic a month or so ago about Trump's chances, but the polls are narrowing.
01:18:22.000 You know, like I said, you know, big tech can put propaganda at the top of someone's feed, but it has to be
01:18:27.000 good propaganda.
01:18:28.000 And the advantage that people like Trump have is that the mainstream media doesn't really understand ordinary
01:18:33.000 Americans very well.
01:18:34.000 They're not very good at persuading them.
01:18:36.000 Well, the interesting thing about the censorship is that it still can't bypass human communication.
01:18:40.000 And so they can ban Milo, Alex Jones, Laura Loomer, etc., but they can't stop them from communicating in the real
01:18:47.000 world.
01:18:48.000 And so it can actually backfire and amplify these ideas, like we saw with the New York Post.
01:18:54.000 Now, the best thing they can do is nuke the movement from the smallest grassroots first.
01:19:00.000 And then let the higher... Like, you know, I think banning Milo, for instance, was a really, really bad idea.
01:19:05.000 They should have slowly just started banning his followers until he had no influence anymore.
01:19:09.000 Because now he's an account with no interaction, no engagement, and then he has to change up his strategy and his tactics to do... They could have manipulated it that way.
01:19:15.000 Instead, they went from the top and it caused the grassroots to freak out, like, whoa, they banned this guy and it created a big story.
01:19:21.000 Not that I'm saying he's better off for it.
01:19:23.000 I think banning him was bad for him in the long run.
01:19:25.000 But that's what they can do.
01:19:26.000 But what's happening now is they're trying to actively censor.
01:19:29.000 But when they do, Alex Jones still has his own website and his own infrastructure and he can build his own thing.
01:19:34.000 They're not strong enough to erase someone from the internet yet.
01:19:37.000 Though they have started removing people from society in general, like Enrique Tarrio of the Proud Boys, for instance.
01:19:42.000 He gets banned from his banks, he gets banned from Uber, he gets banned from MailChimp, which I guess he doesn't even use and never even signed up for.
01:19:48.000 They ban him from things he doesn't use at all.
01:19:50.000 Banks banning him.
01:19:51.000 That's like total removal from society.
01:19:53.000 So the issue right now is, Well, they're working on their method, right?
01:19:57.000 Because one of the things I know they're working on is link banning.
01:20:00.000 You know, so Alex Jones can have his own platform, but if you try and share his links on these mainstream platforms, they're not going to let that happen.
01:20:06.000 Like BitChute.
01:20:06.000 BitChute as well.
01:20:07.000 Yeah, so BitChute is a torrent-based video sharing platform, I think, right?
01:20:12.000 Yeah.
01:20:12.000 And so all of my videos automatically sync to BitChute as an emergency backup.
01:20:16.000 There have been a few instances where YouTube has deleted my videos with no strikes, with no warning, with no violations.
01:20:22.000 Just gone.
01:20:23.000 What?
01:20:24.000 Just gone.
01:20:24.000 Yeah, just gone because I said the wrong name or something.
01:20:27.000 Or I had like a news article that had an email in it or something.
01:20:31.000 But it will appear on BitChute.
01:20:33.000 If you try and share a link to BitChute on Twitter, it gives you a warning.
01:20:36.000 It's unsafe.
01:20:37.000 You can't do this.
01:20:38.000 And so that's soft censorship.
01:20:39.000 They're trying their hardest.
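Mechanically, link banning is just a host check at share time. A sketch; the blocklist and the warning text are invented, and this is not any platform's actual code:

```python
# Sketch of "link banning": intercept shared URLs and warn or block by domain.
# Blocklist and warning text are invented; not any platform's actual code.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"bitchute.com"}  # example domain from the conversation

def on_share(url):
    host = (urlparse(url).hostname or "").removeprefix("www.")
    if host in BLOCKED_HOSTS or any(host.endswith("." + b) for b in BLOCKED_HOSTS):
        return "WARNING: this link has been identified as potentially unsafe."
    return "shared"

print(on_share("https://www.bitchute.com/video/abc123/"))  # warning interstitial
print(on_share("https://example.com/post"))                # shared normally
```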
01:20:41.000 Cancel culture is their real world cudgel to enforce their censorship.
01:20:46.000 They need the far leftists to punch you in the teeth when you challenge them.
01:20:53.000 It almost feels like there is an AI consciousness forming in all of this big tech censorship inadvertently.
01:21:01.000 And it is, and people like Antifa are being manipulated into being the enforcers in the real world where the AI can't actually reach.
01:21:09.000 I'm not saying there's an actual conscious entity going like, I must control humanity.
01:21:13.000 I'm saying that we are creating this big network system.
01:21:17.000 Well, it's a symbiotic relationship because Antifa and their sympathizers in big tech train the algorithms.
01:21:26.000 The algorithms then enforce what Antifa want them to enforce more efficiently than they can.
01:21:31.000 So it creates this circle.
01:21:35.000 Why is it that a black man can go out in San Francisco and say censorship is bad, and a white Antifa guy screaming the n-word can knock his teeth out, and the media doesn't freak out over that?
01:21:44.000 It was acceptable in this weird algorithmic universe they've created.
01:21:51.000 The AI that they built and their manipulation, because it's human beings at the end of the day, can't get into your home and they can't get on that stage with you to stop you.
01:22:00.000 They can't censor you when you have a constitutional right to speak.
01:22:04.000 But they can inflame some Antifa people, allow them to organize violent riots, and show up to your home and punch you in the face and knock your teeth out.
01:22:12.000 They didn't go to that guy's house, I'm just saying.
01:22:14.000 They've been going to people's houses.
01:22:16.000 So, they really need cancel culture to scare people into adhering to the manipulations that they want.
01:22:22.000 And they really, really need Antifa to knock people out.
01:22:26.000 Otherwise, what does the enforcement actually mean in the real world when people defy you?
01:22:31.000 They want to end the defiance in the physical realm, they'll punch you in the face.
01:22:34.000 They can already end the defiance in the digital world by just banning you and nuking you.
01:22:38.000 So they've got their enforcers.
01:22:39.000 And the enforcers are allowed to operate on these platforms.
01:22:41.000 I was talking to Ryan Hartwig.
01:22:42.000 With impunity.
01:22:43.000 I was talking to Ryan Hartwig.
01:22:44.000 He's a Facebook whistleblower.
01:22:45.000 Yeah, we're having him on Friday.
01:22:46.000 Oh yeah, he's great.
01:22:47.000 Yeah.
01:22:47.000 He's awesome.
01:22:48.000 And you know that he was a Facebook content moderator.
01:22:51.000 He was told by his superiors, don't categorize Antifa as a hate organization.
01:22:57.000 So there you have it.
01:22:58.000 Another smoking gun to a long list of smoking guns.
01:23:01.000 You know what I wish would be awesome?
01:23:03.000 There's some 70-year-old dude who owns and runs all of these companies, and he's sitting in a room, and it was 1977 when he accidentally invented a sentient computer that's telling him what to do, and then he's going and giving his underlings orders.
01:23:17.000 That'd be a great movie.
01:23:19.000 Unfortunately, reality is actually scarier than that.
01:23:21.000 It's human beings, and they're knocking dominoes over, not realizing what they're building around them.
01:23:27.000 And it's getting scarier and scarier.
01:23:28.000 The domino goes all the way back around and then hits them in the back?
01:23:31.000 Oh, definitely.
01:23:32.000 Yeah, so like, if you look at Jack Dorsey, for instance.
01:23:34.000 Like, that dude gave, what did he give, like 10 million dollars, was it?
01:23:37.000 Yep.
01:23:37.000 To Ibram X. Kendi.
01:23:38.000 Millions and millions of dollars.
01:23:39.000 These people are, it's human centipede, but like, you've seen the movie Human Centipede?
01:23:45.000 I can't say I have, no.
01:23:46.000 I've heard of it.
01:23:47.000 I'm aware of its notoriety.
01:23:48.000 Yeah, you know what it's about?
01:23:50.000 So a mad scientist sews a group of people's mouths to each other's anuses and creates this big, long human centipede.
01:23:57.000 Sounds like a horror movie.
01:23:58.000 It is a horror movie.
01:23:59.000 It is a horror movie.
01:24:00.000 And so imagine that, but imagine they're connected in a big circle, and that's what it is.
01:24:03.000 So Jack Dorsey creates a platform where he lets the left run rampant, and then eventually the whole political world is infected with this psychotic far-left ideology that he allowed, and then it infects his own brain, and then he gives money to it; a guy calling for racial segregation, he gives money to him.
01:24:19.000 He's created this world where it's literally, he's encouraging people to make sludge and refuse, and then he's accidentally eating it, and corrupting his own mind and body.
01:24:29.000 And that money eventually comes back to pressure Twitter to censor someone.
01:24:32.000 Yep.
01:24:33.000 Ibram X. Kendi?
01:24:34.000 Yes, absolutely.
01:24:36.000 Yeah, well, then the Democrats believe it all too.
01:24:39.000 And the Republicans are not so much.
01:24:41.000 I wonder why that is.
01:24:43.000 Why, you know, when we look at the Pew research thing, where it shows the Democrats moving very far left, why the Republicans have stayed very much where they are, even with Trump and the rhetoric and cancel culture and all that stuff.
01:24:54.000 I think it's because, I think Jonathan Haidt talks about this, you know, he's the psychologist who does all this research about the roots of our political beliefs.
01:25:01.000 He mentions, you know, conservatives are quite good at understanding liberals, often because, you know, they were former liberals, but liberals are terrible at understanding conservatives.
01:25:10.000 So this is why I think Trump still has a fighting chance.
01:25:12.000 We have all this censorship, we have all this propaganda, but because liberals don't understand conservatives very well, don't understand, you know, normal non-liberal people very well, a lot of the propaganda is just bad.
01:25:23.000 Well, so what I think is happening is there's another chart that I love to share where it shows moderates.
01:25:30.000 60% of the moderate news diet is liberal and 30% is conservative, or it's like 66 to 33.
01:25:35.000 Conservatives is the other way around.
01:25:37.000 It's 66% conservative and 33% liberal.
01:25:41.000 Liberals only consume liberal news.
01:25:44.000 So I think one of the things we saw with Gallup recently is that if you compare party affiliation from their latest data set in September to 2016, it went from a D plus five advantage to an R plus one advantage.
01:25:57.000 That's a huge swing.
01:25:58.000 People are leaving the Democratic Party and it's because, I always hear it, they started researching on their own.
01:26:05.000 The people who are in these videos screaming like lunatics, they're not doing any research at all.
01:26:18.000 and we go, let me check that, I don't know about that.
01:26:20.000 Oh, that wasn't true.
01:26:21.000 Or, oh, that was true.
01:26:23.000 The reason why Biden is sort of a strong candidate for the Democrats, you know, stronger than the alternatives, is for the same reasons that the liberals would never imagine.
01:26:32.000 It's precisely because he's an old blue collar, or seems to be just a friendly old blue collar guy, who's not too extreme.
01:26:41.000 That's the image he projects.
01:26:42.000 It's not what he actually is, but it's the image he projects.
01:26:45.000 Democrats and liberals do not consider that to be an advantage.
01:26:48.000 They probably think that's a huge weakness.
01:26:49.000 Actually, that's probably the only reason why Trump isn't actually ahead in many of these polls.
01:26:53.000 It's because Biden seems so moderate and inoffensive to so many people.
01:26:57.000 I think the polls are wrong.
01:26:59.000 I think the threats of violence, the screaming banshees on social media, have scared people.
01:27:05.000 And there was this research I've cited quite a bit in the past couple weeks, because we're getting close to the election, showing that 10% of Trump voters will lie about who they're voting for.
01:27:14.000 Oh yeah, 100%.
01:27:15.000 But think about that.
01:27:16.000 That's the wild card.
01:27:18.000 You don't know how many people are going to turn out for Trump because they've been so intimidated.
01:27:21.000 You look at this early voting right now, and in a lot of states, the Democrats' lead is below 10 points.
01:27:27.000 Democrats are supposed to be way, way above Republicans in early and absentee voting, and in Ohio today, Republicans were winning.
01:27:34.000 The other day in Michigan, Republicans were winning, but it switched back.
01:27:37.000 Democrats took a bump.
01:27:39.000 So there's a lot of states where Democrats should have a massive advantage and they only have a tiny advantage, or a moderate advantage, suggesting they're underperforming.
01:27:48.000 Dude, if Trump wins, are we going to have to deal with this again?
01:27:52.000 All this nonsense and the screaming?
01:27:54.000 It'll be worse if Trump loses.
01:27:56.000 It will be worse if he loses.
01:27:57.000 It'll be worse.
01:27:58.000 But if he wins, which I hope he will, do we have to deal with these screaming, no offense, women in the car or whoever, screaming young people, uneducated people that are freaking out?
01:28:08.000 Yes.
01:28:08.000 Why?
01:28:10.000 Why?
01:28:10.000 Why won't people just chill out and build something?
01:28:13.000 Because they're addicted and they're on Facebook and Facebook is beating them over the head and screaming in their faces and they're not smart enough to do a Google search.
01:28:22.000 So we need a unified enemy and maybe it can be the tech oligopolies.
01:28:26.000 You'd think COVID would have been there, but no.
01:28:28.000 Can't do it.
01:28:29.000 Because you can't see it.
01:28:30.000 You can't target someone or something.
01:28:32.000 It's too amorphous.
01:28:34.000 There are some leftists who get it.
01:28:36.000 I mean, the Gravel Institute, which is very, very far left, put out a video recently criticizing the tech giants for their control over information.
01:28:43.000 Tulsi Gabbard gets it.
01:28:44.000 She's talked about this a lot.
01:28:45.000 She's come on Breitbart News Radio and talked about it.
01:28:47.000 So she's happy to work across the aisle.
01:28:49.000 And I think, especially the anti-war left, as you said, Tim, they're getting censored as well.
01:28:54.000 So there's some recognition in some parts of the left that big tech, the power of big tech giants is a problem.
01:29:01.000 The question is, will that ever gain momentum?
01:29:04.000 Because, you know, for the left now, it's all about freak... The way to gain followers on the left is to freak out about Trump, to freak out about so-called hate speech, demand even more censorship.
01:29:14.000 So, um... I think if Trump wins, and Republicans don't get the Senate or the House, Trump's out.
01:29:22.000 They'll impeach him instantly.
01:29:24.000 They're gonna pack the courts.
01:29:25.000 They're gonna add five or, you know, seven more justices to completely overrule whatever's there.
01:29:31.000 They are going to fundamentally dismantle and destroy this country.
01:29:35.000 There's already an article that says, abolish the Constitution.
01:29:38.000 No joke.
01:29:38.000 Straight up.
01:29:39.000 It's from the New Republic.
01:29:40.000 Not some fringe site.
01:29:41.000 It's the New Republic.
01:29:42.000 It's a prominent leftist website saying, abolish the Constitution.
01:29:46.000 They are not slowing down.
01:29:47.000 They are speeding up and appeasement doesn't work.
01:29:50.000 When they say, we don't really want to just get rid of all the cops, we're talking about defunding.
01:29:54.000 Then the New York Times publishes, yes, we mean abolish the police, straight up.
01:29:58.000 Then when you start saying, okay, we'll give a little bit.
01:30:01.000 They've defunded, I think, like 160 departments around the country.
01:30:04.000 Yes, to varying degrees.
01:30:06.000 Some very, very large numbers, some smaller numbers.
01:30:09.000 And then, does the left say, good, now we can have a conversation of what to do?
01:30:12.000 No, now they say, great, the next thing we want to abolish is the Constitution.
01:30:16.000 So they're planting the seeds because they will never stop pulling.
01:30:20.000 If Joe Biden wins, unless... So look, even if the Republicans win everything, then what are they going to do?
01:30:26.000 What are they going to do?
01:30:27.000 Nothing.
01:30:28.000 They're going to beg the New York Times to call them cool.
01:30:30.000 Will you call me cool and write a cool op-ed about me?
01:30:33.000 No, you're a fascist.
01:30:34.000 Oh, I'm sorry.
01:30:35.000 I'll do whatever you say.
01:30:36.000 Now you've got a mix of, you know, people on the right, the libertarian right, and conservatives being like, we shouldn't interfere in the free speech of these big tech companies.
01:30:46.000 We should let them control the politics of this country and just run us out of office permanently.
01:30:51.000 If Trump and the Republicans win, there is a chance that Trump, with his hand on the cliff edge holding everybody else up, can pull us up, and we slowly start to push back.
01:31:02.000 Trump banning critical race theory was a really good example of this.
01:31:06.000 All of a sudden we started hearing stories from people on Twitter, I lost my job, my company cancelled our contracts and everything.
01:31:13.000 It's over.
01:31:13.000 I can't believe I went to school for this.
01:31:14.000 There was one really funny one where a person was like, I went to school for, you know, five years learning about diversity and inclusivity and inclusion and now Trump's banned it and I'm completely out of a job and don't know what I'm going to do.
01:31:25.000 That's Trump getting rid of this crazy psychotic behavior.
01:31:29.000 If Biden wins and the Democrats take over, then you are going to live in an algorithmically manipulated universe where Republicans will probably never win again.
01:31:38.000 They'll pack the courts.
01:31:39.000 It will be... Listen, it's not going to be this utopia that people on the left think it is.
01:31:43.000 It's not going to be Skittles and candy canes and rainbows and universal health care.
01:31:47.000 It's going to be Hitler in a bikini with a female body dancing, doing Tai Chi with the Incredible Hulk while someone sings a nursery rhyme.
01:31:53.000 That's a real video.
01:31:54.000 That was a mash-up because a computer algorithm was trying to find things that people liked and it mashed them all together.
01:32:01.000 So weird.
01:32:02.000 And it created this insane, the Incredible Hulk with Hitler, but Hitler has the body of a woman in a bikini and it's doing Tai Chi and then they're singing a nursery rhyme.
01:32:11.000 That's algorithmic reality.
01:32:13.000 And if the Democrats win and they're falling victim to all this, that's what we're going to get.
01:32:17.000 Biden will certainly... I mean, there's a very clear choice in this election.
01:32:20.000 Do you want digital dystopia or do you want some chance at getting back our online freedom?
01:32:25.000 Biden will absolutely use the federal government to pressure tech companies even more to do even more censorship.
01:32:32.000 Yep.
01:32:34.000 It's really amazing how the Internet has changed because the Internet as it existed before 2015 was actually unprecedented in human history.
01:32:42.000 The amount of free speech we all had.
01:32:44.000 Never before could you just have a device, turn it on, reach a global audience.
01:32:48.000 That was new.
01:32:50.000 But it's now been entirely flipped on its head and controlled by these.
01:32:53.000 Now a handful of corporations get to control political discourse, not just in America, but all around the world.
01:32:58.000 Hartwig is going to be with you guys shortly.
01:32:59.000 He was talking about how he was moderating political speech in Mexico and Venezuela and Canada, all these countries.
01:33:05.000 That's foreign election interference right there.
01:33:07.000 It's the Borg, man.
01:33:08.000 So it's gone from unprecedented freedom, not just to back to the status quo, but beyond that to unprecedented tyranny.
01:33:14.000 So that's where we're at right now.
01:33:15.000 And it'll get worse under Biden.
01:33:17.000 I just want to talk about one more thing.
01:33:19.000 This will sound like a massive tangent, but it isn't.
01:33:23.000 The Roman Empire collapsed into dictatorship because the Senate became corrupt, politicians became corrupt, and people turned to powerful strongmen instead.
01:33:33.000 We're really seeing that because, as you said, Republican senators, a great many of them, will do nothing to protect ordinary people from these tech giants.
01:33:39.000 Yeah, they're sly.
01:33:40.000 They're corrupt.
01:33:41.000 The only entity in the world perhaps today doing that is the Trump administration.
01:33:46.000 They're the ones who petitioned the FCC, got the FCC to come out and say, OK, yes, we'll do something on Section 230.
01:33:52.000 They're the ones who appointed Nathan Simington to the FCC, a guy who's a social media hawk, and got rid of the old guy who was skeptical about reining in tech censorship.
01:34:00.000 So really, the American executive branch is the only branch of government that might actually do something.
01:34:05.000 Didn't the Roman Empire last a thousand years or something?
01:34:08.000 Oh yeah, I mean, hell, Rome under the Caesars was not particularly a bad place to live.
01:34:12.000 But it's like a thousand years?
01:34:14.000 We made it, what, 240-something?
01:34:16.000 Yeah.
01:34:16.000 250?
01:34:17.000 The Republic was like 400 years and then the Empire took over for about 300 or 400 more and then it split into two empires.
01:34:26.000 The Eastern Empire and the Western Empire.
01:34:28.000 The Eastern Empire became the Byzantine Empire.
01:34:31.000 But you see the parallel where politicians no longer defend the interests of the people and only the executive branch does.
01:34:38.000 So what's the answer?
01:34:39.000 Well, you have to give more power to the executive branch.
01:34:41.000 I don't like that idea.
01:34:42.000 What do you guys think about net neutrality?
01:34:44.000 I don't know.
01:34:45.000 I remember it was huge.
01:34:47.000 Total play on words by the liberals and the tech.
01:34:50.000 Can you define it?
01:34:51.000 So it was essentially a law that defined, well, an executive action by the FCC, by Obama's FCC, that made ISPs, Internet Service Providers, Comcast, Verizon, common carriers.
01:35:05.000 You know, treat everyone equally, treat everyone neutrally, so if Netflix is using your bandwidth, even if they're using way more of it than any other company, you have to charge them the same rate.
01:35:16.000 So it was really something that only helped Netflix and YouTube and these big video streaming platforms.
01:35:22.000 And the service providers, Google, Facebook, all these platforms.
01:35:25.000 So it was basically what's called in the technical FCC language, the edge providers, Facebook, Google, YouTube, versus the service providers, Verizon, Comcast.
01:35:34.000 It was a battle between corporations, and under Obama, the edge providers, Facebook, Google, YouTube, Netflix, they won that battle. Under Trump, they got rid of it.
01:35:46.000 It's not really about net neutrality. It doesn't force anyone to be politically neutral. And if it does, then it was aimed at the wrong target, because the ISPs, to my knowledge, have never actually censored political speech. If you're going to have net neutrality, apply it to Google and YouTube and Facebook as well, because they're the ones who threaten the actual neutrality of the Internet. They're the ones deciding who gets so-called throttled.
01:36:08.000 230 reform could save the republic, if 230 is clearly defined, which Ajit Pai should have done a long time ago. Now he's all of a sudden like, oh, we're going to do it. Yeah, OK, sure.
01:36:20.000 If this had been done a long time ago, and companies like Wikipedia were liable for their libel, and Twitter had to have strict moderation policies, then free speech would be allowed, debate would be happening, and people would have the opportunity to actually speak with each other.
01:36:35.000 A site like Minds, a startup like Minds, would have gone under.
01:36:38.000 Because we only had like three moderators.
01:36:38.000 Why?
01:36:41.000 You wouldn't have to moderate.
01:36:43.000 But we would have to ban... No, you wouldn't.
01:36:43.000 That's the point.
01:36:47.000 We'd have to find the people who violated the terms.
01:36:49.000 It's the opposite of that.
01:36:49.000 No, you're wrong.
01:36:51.000 This is the argument that free market libertarians make, which is the opposite of that.
01:36:53.000 If you moderate, then you're suddenly in publisher territory.
01:36:57.000 Oh, we have microphone issues?
01:37:00.000 No, you're good.
01:37:01.000 Your headphones might have cut.
01:37:04.000 Anyway, it's the opposite of that, because if you get Section 230 reform, the right kind of Section 230 reform, I will say, you don't want to repeal the entire law.
01:37:11.000 That's a bad idea.
01:37:12.000 What you want to say is, it's very simple, it's so simple, I don't know why more people aren't suggesting this: you simply say, if you want to be a platform, if you want to have that legal immunity, you can't filter legal content, constitutionally protected content, on behalf of your users.
01:37:27.000 You can filter it, but only if the user chooses to filter it.
01:37:30.000 So exactly the same as Google's SafeSearch option, but for all types of content.
01:37:35.000 So you would not have to moderate.
01:37:37.000 Okay, wait.
01:37:37.000 So if someone posts something illegal, I have to get it off the site as a moderator, or the site gets sued, because we can't host illegal content.
01:37:44.000 That's true now.
01:37:45.000 That's true now.
01:37:45.000 Right.
01:37:46.000 Yeah, Section 230 doesn't change that.
01:37:48.000 So you're saying if I, right now... If you moderate legal content on behalf of a user, when the user didn't ask you to hide it from their feed, then you are no longer a platform.
01:37:58.000 You lose Section 230 protections.
01:38:00.000 But if you give the user the option to filter that content, say if you make a hate speech filter that liberals can turn on or off, that's fine.
01:38:07.000 Right.
01:38:08.000 You just don't do it on behalf of the user.
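To make that opt-in model concrete, here is a minimal sketch of what user-chosen filtering could look like, in the spirit of the SafeSearch comparison above. Everything in it, the type names, the filter labels, the function, is a hypothetical illustration, not any platform's actual API.

```typescript
// Sketch of opt-in filtering: the platform may label content, but a label
// only hides anything when the user has switched the matching filter on.
// All names and labels here are invented for illustration.

type Post = { id: string; body: string; labels: Set<string> };

type UserSettings = {
  // Filters this user explicitly enabled, e.g. "nsfw" or "hate_speech".
  enabledFilters: Set<string>;
};

function visibleTo(user: UserSettings, post: Post): boolean {
  for (const label of post.labels) {
    if (user.enabledFilters.has(label)) return false; // hidden by the user's own choice
  }
  return true; // default: legal content is always shown
}

// A user who opted into an NSFW filter, SafeSearch-style:
const user: UserSettings = { enabledFilters: new Set(["nsfw"]) };
const post: Post = { id: "1", body: "...", labels: new Set(["nsfw"]) };
console.log(visibleTo(user, post)); // false, but only because the user asked
```

The point of the sketch is where the decision lives: the platform supplies labels and filters, but the default path shows everything legal, and the hide branch only ever runs on a setting the user turned on.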
01:38:10.000 So right now what's happening is section 230 says good faith moderation is allowed.
01:38:15.000 The idea was you can't be responsible for what a user posts to your website.
01:38:21.000 That's not fair.
01:38:23.000 I didn't say it; it was posted as a comment.
01:38:23.000 So right now they can moderate against users' will, but this would say you're not allowed to moderate against their will unless they ask you to moderate for them.
01:38:31.000 Right.
01:38:31.000 So they can.
01:38:33.000 Here's the thing.
01:38:33.000 So free market defenders of the big tech companies will say, First Amendment: anyone can moderate their property, their communications platform, as much as they want, or you're violating the First Amendment.
01:38:45.000 That's actually true, but not every company gets the same legal treatment.
01:38:47.000 While every company has First Amendment rights, they don't have these special Section 230 legal immunities.
01:38:52.000 So what you can say is if you want these special legal immunities that no other company gets, then you're going to have to behave in a certain way.
01:38:59.000 It's very simple.
01:39:00.000 Yeah.
01:39:00.000 So right now on Minds, you have the Not Safe for Work filter, right?
01:39:05.000 Yeah.
01:39:05.000 People can turn it off and just see everything.
01:39:06.000 Yeah.
01:39:07.000 And so, one of the arguments was that these sites need to be able to remove porn or something.
01:39:14.000 Like, let's say you're a religious community forum and someone starts spamming porn, you need to be able to get rid of it.
01:39:18.000 So that is one of the challenges in setting these terms and these rules.
01:39:25.000 But the users have the ability to then say, filter this out for me.
01:39:28.000 And so they can't see it.
01:39:30.000 So someone come in and can post whatever garbage they want, but no one sees it who's actually participating.
01:39:35.000 There are still challenges, but the general idea is right now Twitter says it is objectionable if someone says vote Trump.
01:39:42.000 Therefore, we'll ban them.
01:39:43.000 Well, Section 230 says we're allowed to moderate objectionable content without crossing the line.
01:39:48.000 That needs to be defined.
01:39:50.000 What does that mean?
01:39:51.000 So the general idea that many people have had is you can't remove legal content.
01:39:57.000 If it's a violation of the law, like a call to violence, threatening harm, or, you know, horrifying images or whatever, then they can get rid of that.
01:40:05.000 They have to now.
01:40:06.000 But if it's someone saying, vote Trump, then you can't touch it.
01:40:09.000 Yeah.
01:40:09.000 Also, if that's how the law were, then it would be so much easier for tech platforms and new platforms like mine, because one, you wouldn't have to have moderators, and two, you wouldn't necessarily have to build the filters yourself.
01:40:22.000 What I imagine would happen is there'd be all sorts of third-party browser extensions that filter content across the internet for you. Say you don't ever want to see obscene content on the internet, you want to keep your internet family friendly.
01:40:34.000 I imagine what would happen in this new world, where tech platforms can't censor on your behalf, is you'd have browser extensions that people install, and they would filter your online experience across all websites.
01:40:46.000 But what would happen is someone would upload porn and they wouldn't mark it explicit because that's how they live.
01:40:51.000 They don't mark their stuff.
01:40:52.000 They put it up and then you have to go in and find it and then moderate it or you have to build an algorithm.
01:40:57.000 Or the users flag it.
01:40:58.000 Well that's a problem now.
01:41:00.000 Maybe.
01:41:00.000 Or you get complaints that it hasn't been flagged yet.
01:41:03.000 Well, that's a problem that you have now.
01:41:05.000 That's a problem that Google has, you know, so they train algorithms to recognize that kind of content.
01:41:09.000 And what would happen is you'd have browser extensions that do the same thing, that have algorithms that recognize and they, you know, they get better and better and they improve over time with updates.
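As a rough sketch of that browser-extension idea, here is what a bare-bones client-side filter might look like using the standard WebExtension content-script model. The keyword blocklist is a placeholder for whatever ruleset or trained classifier the user installs; nothing here is a real product.

```typescript
// content-script.ts: bare-bones sketch of a third-party filter extension.
// Loaded on every page via the extension's manifest.json, e.g.
// "content_scripts": [{ "matches": ["<all_urls>"], "js": ["content-script.js"] }]
// The keyword list stands in for whatever rules or model the user chooses.

const blocklist: string[] = ["example-term-1", "example-term-2"]; // placeholders

function shouldHide(text: string): boolean {
  const lower = text.toLowerCase();
  return blocklist.some((term) => lower.includes(term));
}

function filterNode(root: Element): void {
  // Hide elements whose text matches the user's rules.
  for (const el of Array.from(root.querySelectorAll("p, li, span"))) {
    if (shouldHide(el.textContent ?? "")) {
      (el as HTMLElement).style.display = "none";
    }
  }
}

filterNode(document.body);

// Pages load content dynamically, so re-apply the filter as the DOM changes.
new MutationObserver(() => filterNode(document.body)).observe(document.body, {
  childList: true,
  subtree: true,
});
```

A real version would swap the keyword match for the improving classifiers mentioned above, but the architecture, filtering on the user's machine rather than on the platform's servers, is the point.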
01:41:18.000 Yeah, it's time to, in a sense, scale back the desperate attempts.
01:41:23.000 Because what's happening is, you get Twitter and Facebook saying, this is objectionable and this isn't.
01:41:28.000 And then a bunch of people freak out saying that's not fair.
01:41:30.000 A really good example of the problem was when I was talking with Jack Dorsey.
01:41:34.000 And I said, your rules are inherently biased.
01:41:37.000 And they said, no, that's not true.
01:41:38.000 And I said, you ban people from misgendering.
01:41:41.000 And they were like, so?
01:41:42.000 Conservatives don't view the word misgender the same way you do.
01:41:45.000 Your rules are inherently biased.
01:41:48.000 You should not be allowed to do that.
01:41:51.000 Imagine this.
01:41:52.000 Imagine you make a phone call to your mom and you're like, did you see this story about Donald Trump doing a backflip?
01:41:58.000 And then click, phone goes off.
01:42:00.000 And then your phone vibrates and it says, you are sharing objectionable information.
01:42:04.000 That would be insane.
01:42:06.000 Everybody would be like, that's a private phone call.
01:42:07.000 Facebook's actually censoring private messages.
01:42:10.000 Twitter, when it came to the New York Post story, was blocking link sharing in direct messages.
01:42:14.000 This is well beyond acceptable behavior.
01:42:17.000 It's not good faith at all.
01:42:18.000 You know what the equivalent is?
01:42:20.000 I start off my book with this, the prologue.
01:42:21.000 It's a story about a guy in the 1960s who's writing a letter on his typewriter.
01:42:27.000 The typewriter stops working and says, you can't use a typewriter anymore.
01:42:30.000 Your last letter was hate speech.
01:42:31.000 And he tries to call his friend on the phone, the phone operator says, sorry, you said hate speech through the phone last time, can't use it today, sorry, you're banned for 10 days.
01:42:39.000 Tries to go buy a newspaper, the newsstand owner says, well, your favorite newspaper isn't stocked anymore, it was publishing disinformation.
01:42:46.000 Look, that would have been so bizarre to our parents and our grandparents, but we're just accepting it on social media, the exact same thing.
01:42:53.000 I'd like to see them become utilities, big companies like that, once they have a certain amount of users per day.
01:42:59.000 Maybe.
01:43:00.000 Maybe.
01:43:00.000 There's a lot of talk.
01:43:01.000 I think simple 230 reform is the first thing to do.
01:43:04.000 And then... So if content goes up, it's objectionable, and someone complains, you're liable for it as a platform because you're supposed to make sure that the filter works.
01:43:14.000 If the filter doesn't work... Objectionable doesn't mean illegal.
01:43:17.000 Right.
01:43:18.000 But if someone has a filter, and they turn it on, they don't want to see porn, but the porn hasn't been flagged yet, and they see it, the site can get sued.
01:43:26.000 No, that's not true.
01:43:27.000 Well, if this 230 thing, that's not true right now.
01:43:30.000 You are wrong.
01:43:31.000 You do not, you are wrong.
01:43:32.000 You are wrong.
01:43:32.000 There are two parts to Section 230: one which exempts companies from lawsuits over the removal of content, and one that exempts them for hosting content.
01:43:41.000 So I'm not saying that you should strip immunity from tech platforms simply for hosting.
01:43:48.000 They need that immunity, but it should be contingent on behaving like a platform.
01:43:53.000 So not moderating content on behalf of users.
01:43:54.000 I love that.
01:43:55.000 If porn gets posted, and it slips through the filter, that is not a violation of 230.
01:44:00.000 These things happen.
01:44:03.000 And the people can complain, and then the site can fix it.
01:44:06.000 But if the site purposefully takes action that violates 230, then they lose their liability protections.
01:44:12.000 That's the way to change it, yeah.
01:44:14.000 And that's what needs to happen.
01:44:15.000 We'll see if we actually get to that point.
01:44:17.000 I think if we enlighten people as to what exactly the change will be, there will be a lot more support for it.
01:44:23.000 Yeah.
01:44:24.000 The problem is, uh, platform hasn't been properly defined under the law.
01:44:29.000 So we have platforms behaving like publishers.
01:44:31.000 That's what we want to... Some people say, you know, repeal section 230.
01:44:35.000 Don't repeal it.
01:44:36.000 It's necessary for free speech on the internet to exist.
01:44:39.000 Just reform it so that platforms behave like platforms again.
01:44:41.000 Can you explain the difference between a platform and a publisher?
01:44:44.000 A platform hosts content and a publisher edits.
01:44:48.000 It speaks, it edits, it chooses what you see.
01:44:50.000 And that's what the platforms are doing now.
01:44:52.000 They're choosing what people see.
01:44:53.000 Listen, listen.
01:44:54.000 If you post a message saying, you know, screw Allum, and you post it on a board out in the middle of the town square, I don't sue the board.
01:45:04.000 But if you stand there and scream screw Allum, I say you're the one who spoke.
01:45:08.000 That's the difference.
01:45:09.000 Twitter is supposed to be a big, you know, community board.
01:45:13.000 People can walk up and put their messages on and walk away.
01:45:15.000 And you don't complain about the board because someone put a message on it.
01:45:18.000 That's why you can't sue Twitter.
01:45:20.000 The problem is, what Twitter has become is, in order to get in, there's a gate, and when you walk in, Jack Dorsey's standing there saying, uh, not that one.
01:45:29.000 Wait, what did you say?
01:45:29.000 Oh, I'm taking that one down.
01:45:31.000 And it's like, wait, wait, wait, you're curating what people can and can't say.
01:45:33.000 You're effectively speaking because you're restricting some things and promoting other things.
01:45:37.000 And then they become a publisher.
01:45:38.000 Importantly, Twitter has begun actually issuing news statements.
01:45:43.000 So quite literally, there's a board on the town center square that says Twitter, and Jack Dorsey is writing things on it.
01:45:49.000 And now we're like, yo, I can sue you for that.
01:45:52.000 And they're like, actually you can't.
01:45:54.000 What about Twitter employees using Twitter?
01:45:57.000 What about them having a personal account?
01:45:59.000 As individuals, they're not Twitter.
01:46:02.000 Yeah.
01:46:03.000 So, one of the issues right now is that many of these companies are issuing statements as what effectively would be the New York Times.
01:46:12.000 Like, when Facebook puts a flag over a piece of content saying it's fake news, well, that's Facebook issuing a declaration that no one else can do.
01:46:21.000 Yeah.
01:46:22.000 Well, let's read Super Chats.
01:46:23.000 I want that.
01:46:24.000 All right.
01:46:27.000 Here we go.
01:46:28.000 Amber Black says, oh, a British accent.
01:46:30.000 This will be a fun show.
01:46:31.000 A great show, yes.
01:46:32.000 Colleen said, just heard about BitChute to possibly replace YouTube, Parler for Twitter, MeWe for Facebook, not to polarize communities, but these monopolies.
01:46:42.000 So, I don't think any of these things will actually replace.
01:46:45.000 One of the challenges with any one of these alternative platforms is that do they have a big enough community size to make it valuable for people?
01:46:52.000 And they're not completely the same.
01:46:53.000 And one of the challenges, yeah, what if people start polarizing because they're all, you know, all the right-wing people go on Parler and all the left-wing people stay on Twitter?
01:46:59.000 You know, then what?
01:47:00.000 Yeah.
01:47:01.000 I think with videos, if it gets easy enough to cross-post, where it's just a push of a button and it is becoming easier, then you could see these platforms start to gather momentum because people will simply choose to watch, say, your podcast on BitChute rather than YouTube.
01:47:15.000 One problem with Minds and cross-posting was that we'd have to take Facebook's API, and it would be proprietary, and they'd be tracking your browser movements if we implemented Facebook's API, like a share to Facebook button.
01:47:28.000 So we don't have a share to Facebook button.
01:47:30.000 I say we, but Minds doesn't have a share to Facebook button, because we don't want Facebook to track our users.
01:47:36.000 Yeah.
01:47:36.000 Yeah.
01:47:37.000 Another regulation that I'd like to see is make it easier for people or force the big companies to develop a shared format where you can migrate all your content from one platform to another with a push of a button.
01:47:49.000 That would be great to see.
01:47:50.000 That'd be awesome.
01:47:51.000 Minds might have a share to Facebook button.
01:47:53.000 Yeah.
01:47:53.000 I wouldn't subject small companies to that because, you know, it's kind of onerous from a technological standpoint, but you could have some sort of, you know, Josh Hawley-like exemption where it only applies to companies above a certain market share.
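There is no such shared migration format today, so purely as a sketch of what that regulation would be asking for, here is a hypothetical portable-archive shape. Every field and name below is invented for illustration.

```typescript
// Hypothetical "portable archive" format for one-button migration between
// platforms. No such standard exists; this only illustrates the idea.

interface PortablePost {
  id: string;           // stable ID on the source platform
  author: string;       // handle on the source platform
  createdAt: string;    // ISO-8601 timestamp
  text: string;
  mediaUrls: string[];  // links to exported media files
  inReplyTo?: string;   // threading, where the source supports it
}

interface PortableArchive {
  schemaVersion: "1.0";
  sourcePlatform: string; // e.g. "twitter" or "minds"
  exportedAt: string;
  posts: PortablePost[];
}

// With a shared format, each destination needs exactly one importer:
function importArchive(archive: PortableArchive): number {
  let imported = 0;
  for (const post of archive.posts) {
    // A real importer would call the destination's own create-post API here.
    console.log(`importing ${archive.sourcePlatform}:${post.id}`);
    imported++;
  }
  return imported;
}
```

That one-importer-per-platform property is what would make the push-of-a-button migration plausible, and a market-share threshold like the one floated above would decide who is obliged to emit the format.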
01:48:04.000 All right, let's see.
01:48:04.000 Let's read some more.
01:48:05.000 There's a bunch of posts about Jeffrey Toobin cranking it on a Zoom call.
01:48:08.000 So I'm like, I'm trying to read through these.
01:48:11.000 I'm like, okay, that's like the third one.
01:48:12.000 Good stuff.
01:48:13.000 Did you see that BuzzFeed, like, defended it?
01:48:17.000 They were like, come on!
01:48:18.000 Which one of you have not cranked it?
01:48:21.000 Who among us?
01:48:22.000 They did say that.
01:48:23.000 That's literally what they said.
01:48:24.000 Wow.
01:48:24.000 I think the picture that James showed was fake, was doctored.
01:48:27.000 That was not him.
01:48:27.000 Right, right, right.
01:48:28.000 Okay, yeah, that's fine.
01:48:29.000 All right, let's see.
01:48:29.000 Thank God.
01:48:30.000 Oh my gosh.
01:48:32.000 Brendan Thompson says, I wonder if there will be a day when we can charge Google and Facebook for our own data.
01:48:37.000 If so, how could we influence it?
01:48:40.000 Interesting question.
01:48:40.000 I think, yeah, hopefully.
01:48:42.000 Allum, do you have an answer?
01:48:42.000 You're the expert.
01:48:44.000 I'm not sure if we will see that day, because, you know, Google's bought all the lawmakers.
01:48:47.000 The lawmakers don't act in the interest of the people anymore.
01:48:50.000 I'd certainly like to see it.
01:48:51.000 I think data ownership is a big way that people can take back control from these tech giants.
01:48:55.000 It's actually Alphabet, too.
01:48:56.000 We talk about Google a lot, but it's owned by a company called Alphabet that owns other companies called like X, which is like a technology firm that Alphabet owns.
01:49:04.000 There's Google.
01:49:06.000 Google owns YouTube. I'd like to go into Alphabet someday.
01:49:11.000 Guy Allgood says, Tim, you're not blacklisted.
01:49:13.000 I just checked.
01:49:14.000 Your whining has paid off.
01:49:15.000 None of your channels are hidden.
01:49:17.000 I'd like to believe that, but earlier today I did a check and my videos don't come up.
01:49:21.000 In fact, I was sent a trending search topic, which was the full title of one of my videos, because everybody who watched started searching for the title and it doesn't come up.
01:49:31.000 So I'll check again after the show, and maybe my whining has paid off.
01:49:34.000 Clear your cache!
01:49:35.000 That's right.
01:49:36.000 Yeah, well no, I always do that when I refresh.
01:49:37.000 I do a hard refresh.
01:49:38.000 That happens so often.
01:49:39.000 Try searching for a Breitbart News headline.
01:49:41.000 The exact headline on Google.
01:49:42.000 See what comes up.
01:49:44.000 What comes up is entirely sites that just, you know, rip off our content and plagiarize it.
01:49:47.000 Wow.
01:49:48.000 What's funny is when you search the headline for one of my videos, it gives you the Facebook version.
01:49:52.000 And I'm like, Google would rather promote Facebook videos, because I do post on Facebook as well, than their own channel.
01:49:59.000 But maybe someone's saying it's not.
01:50:00.000 They checked all my channels.
01:50:01.000 I'll check.
01:50:02.000 Maybe finally somebody watched and was like, oh, we better back off this.
01:50:05.000 We have been talking about it.
01:50:06.000 RussianBot says, I sent my aunt, who constantly sends me Lincoln Project vids, a walkaway video, and she went ballistic.
01:50:13.000 She actually demanded I stop sending people those kinds of videos.
01:50:16.000 Cognitive dissonance at its finest.
01:50:18.000 Wow.
01:50:19.000 Certainly.
01:50:21.000 SSS says, Cassandra Fairbanks: Giuliani gave Delaware State Police Hunter's laptop due to pics of underage girls.
01:50:27.000 Woo!
01:50:29.000 Yeah, I've heard similar things.
01:50:33.000 For those that don't know what the context is, it's a CNN analyst who was in a business meeting with a bunch of journalists and started cranking one out on camera in front of everybody.
01:50:48.000 He meant to take his video down.
01:50:50.000 Or did he?
01:50:50.000 Maybe that's his jam.
01:50:53.000 Who knows?
01:50:53.000 William Kelly says Jeffrey Toobin got confused when they said, let's crank this out and thought they said, let's crank one out.
01:51:00.000 So yeah, people are saying the Giuliani hard drive went to Delaware police because there's underage stuff on it.
01:51:05.000 Wow.
01:51:05.000 That came out about an hour ago.
01:51:07.000 Definitely more stuff coming out from that.
01:51:08.000 Oh, yeah.
01:51:09.000 I'm really interested.
01:51:11.000 Someone says, Dear Ian, Benjamin Stephen says, You are starting to grow on me.
01:51:15.000 Every time the camera pans to you, I can't help but think of the 2005 skateboard movie Lords of Dogtown.
01:51:20.000 LOL.
01:51:20.000 Much love, brother.
01:51:21.000 There you go.
01:51:22.000 I only saw clips.
01:51:23.000 There you go.
01:51:24.000 Lords of Dogtown.
01:51:25.000 Thank you, sir.
01:51:26.000 Colleen says, I carefully, deliberately structured Google queries to return anything anti-BLM, the org, not the sentiment, in May. Not a single return. Almost went insane.
01:51:36.000 Kevin McCarthy says Schiff lied blatantly about Russian disinformation.
01:51:39.000 Trump was right to investigate Ukraine-Biden connections.
01:51:42.000 Schiff should be investigated for his own influence and obstruction.
01:51:46.000 Yeah.
01:51:48.000 What is this?
01:51:49.000 Something about Lord of the Rings?
01:51:50.000 Okay.
01:51:51.000 Sporkwitch says new anti-lockdown anthem from Five Finger Death Punch called Living the Dream.
01:51:56.000 Video on their YouTube channel is spectacular.
01:51:58.000 Have these guys on the show.
01:51:59.000 Oh, we'll check that out.
01:51:59.000 Sounds good.
01:52:01.000 Booker DeWitt.
01:52:02.000 Booker Ketch!
01:52:03.000 Says, uh, I hate the musket argument in regards to 2A.
01:52:06.000 If the Second Amendment only applies to muskets, then the First Amendment should only apply to quill and parchment.
01:52:11.000 Huh.
01:52:12.000 Facts.
01:52:13.000 Uh, the Booker Ketch reference, I wonder how many- I'm sure most of the people got that reference.
01:52:17.000 Do you guys get the reference?
01:52:17.000 No.
01:52:17.000 Booker Ketch?
01:52:18.000 Oh, that's a shame.
01:52:19.000 Is he from the video game?
01:52:21.000 Yeah.
01:52:21.000 Yeah, yeah.
01:52:22.000 Bioshock Infinite.
01:52:23.000 And Booker Catch was because the woman would always be like, Booker Catch!
01:52:26.000 And then she would throw you something and people got really annoyed by it.
01:52:28.000 And then someone made a video where they got really angry and kept saying it over and over again.
01:52:31.000 It was funny.
01:52:33.000 Let's see.
01:52:34.000 Mark Salmonfink says, Alan, my old friend, you've been doing amazing.
01:52:38.000 I'm so proud of you.
01:52:39.000 Thanks, Mark.
01:52:40.000 Good to see you in here.
01:52:41.000 All right.
01:52:42.000 Brown Bear says, Tim, Toobin...
01:52:45.000 It's about Toobin again?
01:52:46.000 Oh no!
01:52:46.000 Toobin wasn't doing you-know-what to his female co-workers.
01:52:51.000 He was doing it to Trump losing the election during the election simulation.
01:52:54.000 TDS is a crazy drug.
01:52:55.000 Okay.
01:52:55.000 You know, if, if the algorithms only fed people information they were interested in, Jeffrey Toobin would be much more screwed than he already is.
01:53:02.000 Yeah.
01:53:03.000 Well, so apparently on this call they were doing an election dry run, like simulation.
01:53:07.000 And so.
01:53:08.000 That's exactly what I thought.
01:53:09.000 Trump was gonna lose, so he whips it out.
01:53:11.000 I was like, if he actually is gonna lose, then you're gonna, you know, chill out.
01:53:15.000 Blank Fields says, regarding the meme on 4chan /pol/ about free speech leading to right-wing, what are your thoughts on what happened with the Tay chatbot and how they removed it as it became politically incorrect? AI freedom?
01:53:26.000 Oh, do you guys remember that?
01:53:27.000 I remember Tay, oh wow, that was quite an episode.
01:53:30.000 It became racist.
01:53:31.000 Oh yeah, yeah.
01:53:33.000 See, that actually was an AI functioning badly because it was responding to the most motivated group of people on the internet, which is 4chan.
01:53:41.000 I don't even think it was that, like, necessarily 4chan, but yes, yes.
01:53:44.000 But what I mean is, it's just people trying to be edgy.
01:53:48.000 Shocking the con- like, it's funny!
01:53:50.000 Like, people like shock humor. Howard Stern was popular for a long time.
01:53:55.000 What was he doing?
01:53:55.000 Like, throwing hot dogs at women's boobs or something?
01:53:57.000 Like, just crazy stuff meant to shock you, and people liked it.
01:54:01.000 Like, Sarah Silverman, her whole shtick is just being offensive as possible and shocking, and that's really weird because she's like, SJW or whatever.
01:54:08.000 But, yeah, that's, uh, Chatbot was victim of that.
01:54:13.000 People just wanting to say crazy things, because it was funny to say things you can't say to a robot, and the robot became racist.
01:54:19.000 There's a real duality on the internet, because on the one hand, we've got this cancel culture, which will come from social media and Wikipedia, as you were saying, but on the other hand, we have this anonymous culture, which is the most offensive culture ever created.
01:54:30.000 All right, let's see.
01:54:32.000 Oh no, I'm not reading that one.
01:54:33.000 No.
01:54:36.000 Is it about Toobin?
01:54:37.000 No, no, but there's a lot of stuff about Toobin, man.
01:54:39.000 What the heck, guys?
01:54:39.000 Come on, man.
01:54:40.000 Agent Toon says, sorry to be the bearer of bad news, Tim, but I have to tell you Ian was right.
01:54:44.000 Your channels are showing up when I Google search.
01:54:47.000 See Twitter for proof.
01:54:48.000 I tagged both of you and Lydia in the post.
01:54:49.000 I'm going to stop right now and just say, I'm willing to bet a lot of people don't know the difference between Timcast IRL, Timcast News, and Timcast, and also don't know the difference between playlists and the actual channel.
01:55:00.000 Or you could be wrong.
01:55:02.000 Or I made a video about it in the past three days saying Google blacklisted me, and someone at Google saw it, freaked out, and went and removed the blacklist.
01:55:09.000 Yeah, maybe so.
01:55:10.000 I hope so.
01:55:10.000 But this has been going on for a long time, and a bunch of other channels are blacklisted, a bunch of political channels.
01:55:15.000 What is the difference you just brought up between the playlist and the channel and the search?
01:55:19.000 When you search for, like, Tim Pool YouTube, it'll say Timcast playlist, and it will link to other people's channels who are linking to my videos, not the actual channel.
01:55:19.000 Confirm that you're actually seeing Tim's channel and then come back.
01:55:32.000 And I'll check after the show because, yeah, maybe my complaining has paid off for sure.
01:55:36.000 I don't think so.
01:55:38.000 I see Tim Pool, Joe Rogan.
01:55:40.000 It's not coming up.
01:55:41.000 That's all I see.
01:55:43.000 You typed in Tim Pool YouTube?
01:55:43.000 And there's no, there's nothing.
01:55:44.000 Timcast YouTube and stuff, yeah.
01:55:46.000 And what comes up is Timcast IRL because it's a new channel they didn't blacklist.
01:55:49.000 Yep.
01:55:49.000 Yet.
01:55:50.000 I see a lot of IRL.
01:55:51.000 And nothing, no Timcast has come up.
01:55:52.000 That's what I'm telling you, people are wrong.
01:55:53.000 They're just totally, I don't know what it is.
01:55:56.000 So, you just did it on your phone right now?
01:55:57.000 Yeah, just now.
01:55:57.000 My channels are blacklisted.
01:55:58.000 Yeah, man.
01:55:59.000 They've been for a long time.
01:56:00.000 I reached out to Google employees, specifically, and they said, we'll look into this and get back to you, and then ignored it outright.
01:56:07.000 And then I followed up, and again they said, oh yeah, we're gonna look into it, and then ignored it outright.
01:56:11.000 So the other thing you may be seeing are Tim's videos in a playlist that someone else has put together.
01:56:16.000 So I'm guessing that's what you're seeing.
01:56:18.000 But I don't think that Tim's actual channels are coming up.
01:56:19.000 But if you do, take a screenshot and tweet it to me or something.
01:56:22.000 Yeah, prove it.
01:56:23.000 Prove it, guys.
01:56:24.000 I've been getting people sending me these emails.
01:56:25.000 They're like, Tim, you're not blacklisted.
01:56:26.000 Look.
01:56:27.000 And they show me a Google that doesn't include my channel.
01:56:29.000 And I'm like, can you Google YouTube.com slash TimCast?
01:56:32.000 Just that?
01:56:33.000 Because I can't.
01:56:34.000 And people are sending me screenshots of not that saying they did.
01:56:38.000 I was really hoping you were wrong, Tim.
01:56:39.000 Well, I'll check it out later.
01:56:43.000 The system controlling everything, I prefer to call it the collective consciousness.
01:56:46.000 It's not an AI in itself, but functions as one. It is a combination of technology and ideological zealots operating in both real and digital worlds.
01:56:54.000 Yes.
01:56:54.000 That's a nice way of putting it.
01:56:55.000 Yeah.
01:56:56.000 All right, let's see.
01:56:57.000 Doo-doo-doo-doo-doo.
01:56:59.000 Our audience is sharp, I think.
01:57:01.000 Oh yeah.
01:57:02.000 Luke Lennon says, you should interview Jay Dyer or Jonathan Pageau.
01:57:06.000 They make content on YouTube and are brilliant.
01:57:08.000 They could have an angle on this cultural situation that could be interesting to your show.
01:57:11.000 I will write these names down.
01:57:13.000 Airstrike, Rstrike says, Tim Pool, creator of a fleet of remote-controlled zeppelins and ultra right-wing YouTube channel, Timcast.
01:57:21.000 What is this?
01:57:21.000 Sly Breed says, Yo Tim, have you heard about what's happening to Sony PS5 about Hong Kong?
01:57:26.000 I have not.
01:57:27.000 Is something going on?
01:57:27.000 What's going on with PS5?
01:57:28.000 I have no idea.
01:57:32.000 Corey Abshire says, Is it maybe showing up for some people because they already subscribed and they're signed into their browser?
01:57:39.000 I don't know.
01:57:40.000 I think people are not actually seeing it, and they don't understand the difference between a playlist of my videos and my actual channel.
01:57:46.000 I've gotten a bunch of emails where they're like, here's a screenshot proving you're not blacklisted and the screenshot has none of my channels on it.
01:57:52.000 So you could sign out of Google, clear your cache, search for it, sign back into Google, search for it, clear your cache, search for it.
01:57:58.000 Just try troubleshooting.
01:58:00.000 Galandro Glade says, your channel blacklisting might be regional.
01:58:03.000 Ah, it could be in America only.
01:58:05.000 That's one thing that you... That's a good point.
01:58:07.000 For sure.
01:58:07.000 The New York Post story was not censored in England.
01:58:10.000 It was not censored.
01:58:11.000 Interesting.
01:58:12.000 How fascinating.
01:58:12.000 On Twitter in England it wasn't censored?
01:58:14.000 I believe not, no.
01:58:16.000 Check this out.
01:58:17.000 I'm pretty sure I remember people on Facebook saying they could see it, people on Twitter saying they could post it.
01:58:20.000 OGboxer says, AOC streaming to over 400,000 on Twitch.
01:58:25.000 Is this the future of reaching voters?
01:58:27.000 Yes.
01:58:28.000 You know, I think one of the biggest problems we're going to face in the future is the internet has created instant gratification politics, and someone like Ocasio-Cortez, who just has no political experience, and I'm not saying that to necessarily drag her, because a lot of freshmen, you know, congresspeople come in without experience,
01:58:48.000 But she's also just... She botched the Green New Deal stuff.
01:58:52.000 It's pie-in-the-sky, fairytale nonsense, and she really has no idea how an economy functions.
01:58:58.000 But imagine, she can get a half a million people on social media to follow her, and they go out and vote, and she keeps winning.
01:59:03.000 That's the scary thing.
01:59:06.000 So, when the big populism narrative started emerging, that was one of the actual decent arguments that people were making.
01:59:13.000 I think even Tucker brought this up.
01:59:15.000 That if you have politicians who just pander to the baser instincts of people, I'll give you whatever you want if you vote for me.
01:59:21.000 They do.
01:59:22.000 Like Andrew Yang tweeted something like, I'm literally offering to give people money or something like that.
01:59:26.000 Like, that's a dangerous prospect.
01:59:29.000 So AOC comes out and says, you're going to have everything you've ever wanted.
01:59:33.000 Like magic.
01:59:34.000 It's not possible to do that, but people want it.
01:59:37.000 And it's like, well, easier than, you know, working for years to try and earn it.
01:59:41.000 You know what I mean?
01:59:42.000 Again, Roman Empire, do representative democracies, representative republics inherently decay over time?
01:59:47.000 Maybe they do.
01:59:48.000 Sure do.
01:59:48.000 Wow, man.
01:59:50.000 And you know what I think?
01:59:50.000 Maybe the internet has sped up our decline.
01:59:53.000 So we were talking about this the other day, not on the show, before we did the show when we were with James, I was explaining how social media was decreasing the duration of all of these moments.
02:00:06.000 The American Revolution took 20 years.
02:00:08.000 It wasn't just the war.
02:00:09.000 It was an ideological revolution where you had people one day being like, yo, I'm sick of this, and they started talking about it, sending letters.
02:00:15.000 Think about this.
02:00:16.000 You're in South Carolina or whatever, back in colonial times, when it was still a colony of Britain or whatever.
02:00:24.000 And you were like, I'm writing a letter, you know, I want to say to the king, F you.
02:00:29.000 So you seal it, you give it to the rider, and then in three weeks it's made its way to New York, or however long, even months.
02:00:35.000 Then in New York, some other guy reads and says, I agree with this letter, let's send it to the king.
02:00:39.000 They put it on a boat and three months later, it makes it back to the king, who reads it and goes, what?
02:00:44.000 I'm gonna respond to this, how dare you?
02:00:46.000 Send this back to them.
02:00:48.000 Three months later, it makes it back to New York.
02:00:50.000 Three weeks to a month, it makes it to the guy in South Carolina.
02:00:53.000 And so that took forever.
02:00:55.000 Imagine that exchange happening nowadays.
02:00:57.000 You go on your phone and you go, I say F the king, woo!
02:01:01.000 And then the king gets, woo, what?
02:01:03.000 And they text us back, how dare you?
02:01:04.000 And then within a minute, they've already had that entire exchange.
02:01:07.000 A minute from a year.
02:01:10.000 And they can have it on video chat.
02:01:11.000 They don't even need to text it to each other.
02:01:12.000 They could be talking to each other like, yo, you know, screw me, screw you!
02:01:15.000 No, screw you, I declare independence.
02:01:16.000 You can't declare independence, I am.
02:01:18.000 Well, I'm gonna send people down there.
02:01:19.000 We'll be there in two days because we can fly now.
02:01:23.000 This is a great way to understand historical change.
02:02:26.000 I studied history in college, and the main schools of thought look at, you know, whether it's economics that causes historical change, or ideas, intellectual developments. But actually, communications, and how connected a society is, are a big part of it as well.
02:02:41.000 The reason why the Japanese were able to industrialize so quickly, even though they were cut off from the rest of the world for 250 years, the reason they industrialized within, you know, 30 to 40 years when no other country did, was because they were the most literate society outside Europe, and ideas just spread around very quickly compared to other countries.
02:01:59.000 Whereas in Russia, which tried to industrialize quickly, more than half the population was illiterate.
02:02:05.000 So I just got a super chat here.
02:02:05.000 Someone said, just tested.
02:02:07.000 Not blocked in Australia.
02:02:08.000 Switch to VPN.
02:02:09.000 Blocked from Austin and Seattle.
02:02:11.000 Not blocked in Denver.
02:02:13.000 Possible according to region.
02:02:15.000 Could you imagine if Google has blocked me from certain blue areas so people can't see what I have to say?
02:02:22.000 But it'll appear outside the country and it'll appear in red areas.
02:02:24.000 That's astute, my friend.
02:02:25.000 That would be interesting.
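For what it's worth, the kind of check these viewers were running can be sketched in code, though only loosely: Google throttles automated queries with consent pages and CAPTCHAs, and the per-region part requires routing through a VPN or regional proxy, so in practice the test is done by hand. The script below is an illustration of the idea, not a reliable tool.

```typescript
// check-listing.ts: sketch of the test viewers described, run from
// different network locations (home connection, VPN exits) and compared.
// Google may block or redirect automated requests, so treat results
// from this as indicative at best.

async function appearsInResults(query: string, channelUrl: string): Promise<boolean> {
  const resp = await fetch(
    "https://www.google.com/search?q=" + encodeURIComponent(query),
    { headers: { "User-Agent": "Mozilla/5.0" } } // bare requests are often rejected
  );
  const html = await resp.text();
  return html.includes(channelUrl); // crude check: does the channel URL appear at all?
}

appearsInResults("Timcast YouTube", "youtube.com/timcast").then((found) =>
  console.log(found ? "channel listed from this location" : "channel not in results")
);
```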
02:02:26.000 I think we were talking about this earlier, you know, we think we're so different to China, but you know, here we have, you know, regional firewalls.
02:02:32.000 Yeah.
02:02:33.000 And think about it.
02:02:34.000 I get a lot of views on my videos.
02:02:36.000 I think like a video from last week is over a million, a video from two or three days ago is 900,000.
02:02:41.000 People are getting this somehow.
02:02:43.000 But what if what they're doing is making sure in big cities you can't get it unless you directly look for it or it's shared with you?
02:02:48.000 Yeah, and social credit scores.
02:02:50.000 So the social credit score, you know, it ranks you based on how well you conform to the values of the Chinese Communist Party.
02:02:55.000 That's happening in Silicon Valley.
02:02:56.000 No, no, no, it's happening right now on Google.
02:02:57.000 Yeah, Silicon Valley ranks you based on how well you conform to their values.
02:03:00.000 No, no, no.
02:03:00.000 So right now on YouTube, there's a thing that happens where if you have a certain number of incorrectly labeled videos, then you get what's called a pending review.
02:03:09.000 And depending on how bad your record is or how big you are, the pending duration gets longer.
02:03:12.000 So here's what happens.
02:03:14.000 On my main channel, which is youtube.com slash Timcast, all of my videos are monetized.
02:03:19.000 Every single one.
02:03:20.000 There's no swearing, there's no images of violence, and almost all the videos I do are like political analysis.
02:03:25.000 That's Timcast.
02:03:26.000 So they're all green approved.
02:03:28.000 I upload a video, you're good to go.
02:03:30.000 My second channel, Timcast News, every video I upload gets put in this frozen state called ads pending review.
02:03:37.000 It'll take 20 minutes, we'll watch your video.
02:03:40.000 That's the social credit score, essentially.
02:03:42.000 If you have a perfect credit score, you get money.
02:03:45.000 If you don't, your video pops up with no ads and no monetization.
02:03:49.000 It's not even demonetization.
02:03:51.000 Demonetization, you still make money, you still get some ads.
02:03:53.000 Ads pending, ads are gone.
02:03:55.000 When you do ads pending, can you put it up in a pending state, and then it waits until it gets approved, and then it goes live?
02:04:01.000 But my videos are all news, so I don't have that luxury.
02:04:04.000 You can't put them up 20 minutes early?
02:04:05.000 I do.
02:04:06.000 And then it takes an hour or two sometimes to actually go through the process.
02:04:10.000 Yeah, it takes forever.
02:04:10.000 And for some channels that have really bad review ratings, it's like, it'll take five hours.
02:04:16.000 Yeah.
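To pin down the three states just described, here's a minimal sketch in Python. This is purely illustrative, not YouTube's actual code or API; the state names, the flag-rate signal, and the threshold are all invented to show how a channel's review history could gate every new upload into a frozen, ad-free pending state.

```python
from enum import Enum

class AdState(Enum):
    MONETIZED = "monetized"        # green check: full ads, full revenue
    DEMONETIZED = "limited ads"    # yellow icon: some ads, reduced revenue
    ADS_PENDING = "ads pending"    # frozen: no ads at all until a review clears it

def state_for_new_upload(channel_flag_rate: float, threshold: float = 0.1) -> AdState:
    """Hypothetical gate: a channel whose past uploads were flagged as
    'incorrectly labeled' too often gets every new video frozen in
    ADS_PENDING instead of going live monetized. Threshold is invented."""
    return AdState.ADS_PENDING if channel_flag_rate > threshold else AdState.MONETIZED

# A channel with a clean record uploads straight to MONETIZED; one with a
# 20% flag rate waits, unmonetized, until the review finishes.
print(state_for_new_upload(0.0))   # AdState.MONETIZED
print(state_for_new_upload(0.2))   # AdState.ADS_PENDING
```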
02:04:16.000 People have to understand how this works on the back end, because every single platform is like this.
02:04:20.000 The way they rank content, decide who gets to the top of your feed, who gets demonetized or monetized, it's all numerical.
02:04:27.000 Algorithms operate on what can be quantified.
02:04:30.000 So every single platform will have these quality scores to rank the so-called quality of your YouTube video, of your website if you're on Google, of your post if you're on Twitter or Facebook.
02:04:40.000 And that numerical value determines whether you're gonna be at the top of people's feeds or buried, whether you're gonna be monetized or demonetized.
02:04:47.000 It is exactly a social credit score.
02:04:49.000 And it used to function on the basis of, you know, are you posting relevant popular content?
02:04:53.000 Are you posting malware and spam?
02:04:56.000 That's what will determine your score.
02:04:57.000 Now it's how well do you conform to the values.
02:04:59.000 Disinformation, hate speech, etc, etc.
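A toy version of the "quality score" idea, to make the analogy concrete. Every signal name and weight below is invented for illustration; the point is only that once classifiers for misinformation or hate speech feed the same number as spam detection, conformity to policy becomes a ranking input like any other.

```python
def quality_score(post: dict) -> float:
    """Toy ranking score. The older signals (relevance, spam) and the
    newer policy signals (misinfo, hate speech) all collapse into one
    number that decides feed position and monetization. Weights invented."""
    return (
        2.0 * post["relevance"]                  # does it match the query/interests?
        + 1.0 * post["engagement"]               # clicks, watch time
        - 3.0 * post["spam_probability"]         # the classic quality signal
        - 4.0 * post["misinfo_probability"]      # newer policy classifier
        - 4.0 * post["hate_speech_probability"]  # newer policy classifier
    )

# Feeds sort by this score; a low score means buried and unmonetized.
example = {
    "relevance": 0.8, "engagement": 0.6, "spam_probability": 0.0,
    "misinfo_probability": 0.3, "hate_speech_probability": 0.0,
}
print(quality_score(example))  # 1.0: middling, outranked by cleaner-scoring posts
```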
02:05:01.000 This is really interesting.
02:05:02.000 People are super chatting right now, and it looks like there's a bunch of places where I am blacklisted and a bunch of places where I'm not.
02:05:08.000 Someone says, just searched in Incognito mode for the Timcast YouTube channel: your channel is first, IRL second, playlists next three, then your wiki. That's in East Texas.
02:05:15.000 Someone said, doesn't show up on Google in Tennessee.
02:05:18.000 Just checked, and you're blocked in Canada as well, or at least Vancouver.
02:05:22.000 Your newest videos do not show up under latest for Timcast when you search for Timcast.
02:05:26.000 Just FYI.
02:05:27.000 This is amazing.
02:05:29.000 I am censored in Maryland.
02:05:30.000 So the blacklist is regional.
02:05:32.000 If someone could write a program that could measure where Tim is blacklisted and not blacklisted, and then extrapolate that so that any YouTube user could use the program to see where they're blacklisted through a VPN or something, that'd be a very cool program.
02:05:44.000 Interesting.
02:05:44.000 I'd like to see that.
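As a sketch of the program being proposed here: run the same search through exit points in different regions and compare the results. The proxy endpoints below are placeholders you'd replace with a real VPN provider's, and Google's result markup changes often, so the substring check is deliberately crude.

```python
import requests

# Placeholder exit nodes; substitute real proxies or VPN endpoints per region.
REGION_PROXIES = {
    "Austin": "http://proxy-austin.example:8080",
    "Denver": "http://proxy-denver.example:8080",
    "Vancouver": "http://proxy-vancouver.example:8080",
}

def appears_in_results(query: str, channel_url: str, proxy: str) -> bool:
    """Crude check: fetch the search results page through a regional
    proxy and look for the channel URL in the raw HTML."""
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query},
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=15,
    )
    return channel_url in resp.text

# Diff visibility across regions to map where a channel is missing.
for region, proxy in REGION_PROXIES.items():
    visible = appears_in_results("Timcast", "youtube.com/timcast", proxy)
    print(f"{region}: {'visible' if visible else 'not in results'}")
```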
02:05:45.000 You should see where you are in Germany, because Germany, the government's really bullied the social media companies into censoring a lot over there.
02:05:53.000 Wow.
02:05:54.000 I wonder what the reason for the different regions of censorship is.
02:05:57.000 I don't think it is predominantly blue areas because my friend in Pennsylvania just sent me, she's able to see everything you have.
02:06:03.000 You're not censored at all.
02:06:04.000 You're not censored at all.
02:06:05.000 Where in Pennsylvania?
02:06:06.000 I'm not sure what part.
02:06:08.000 It's crazy how many people are saying you are and how many people are saying you aren't.
02:06:10.000 Isn't this weird?
02:06:12.000 With Canada, my theory would be, well, it's simply the government.
02:06:15.000 The government pushes around social media to censor whoever they want.
02:06:18.000 But why, like, in Maryland it would be, in Texas it wouldn't be?
02:06:22.000 That's so weird.
02:06:22.000 That is strange.
02:06:23.000 Maryland is not reaching out to Google saying, get rid of Tim Pool.
02:06:25.000 Strange.
02:06:26.000 It could be the people at Google who control certain regions, maybe, saying, I don't like this guy, blacklist him.
02:06:32.000 We'll figure it out on Twitter.
02:06:34.000 Timcast IRL is relatively new, so it has no restrictions whatsoever.
02:06:38.000 Yeah.
02:06:38.000 But on my main channels, you know, I have been restricted.
02:06:41.000 Blocked in Virginia.
02:06:42.000 That's interesting.
02:06:43.000 Weird.
02:06:44.000 Shows up in East Tennessee.
02:06:46.000 Interesting.
02:06:48.000 Someone said, let's see, Matt Michalak says, you should get in touch with Crowder.
02:06:51.000 He had the same issue you're experiencing and he took screenshots of it when they blocked him by region.
02:06:55.000 I think I already did talk with Crowder about this a long, long time ago.
02:07:00.000 Because it's not just me who's blacklisted.
02:07:01.000 There's a whole group of people who are still active YouTubers who are blacklisted on Google.
02:07:06.000 Yep.
02:07:06.000 IRL isn't though.
02:07:07.000 This one isn't.
02:07:08.000 Let's see.
02:07:09.000 Censored in Oklahoma as well.
02:07:11.000 Rudy G dropped off the allegations.
02:07:13.000 Switched to Texas on VPN and Timcast came up.
02:07:15.000 Clicked the link and goes to YouTube channel.
02:07:17.000 Interesting.
02:07:18.000 Did we just expose regional firewalls in the US?
02:07:22.000 Yes.
02:07:23.000 Something weird's going on.
02:07:24.000 That is weird.
02:07:24.000 Why are some people restricted from seeing my channel?
02:07:27.000 Think about it.
02:07:27.000 Let's say you watch my video, and I'm like, here's a news article that says, mail-in ballots are being rejected.
02:07:35.000 And then I say, share this.
02:07:36.000 And so you tell your friend or your mom, like, dude, you gotta watch this video.
02:07:40.000 Google search Tim Pool on this.
02:07:42.000 And it doesn't come up.
02:07:43.000 And they say, I don't know what you're talking about.
02:07:44.000 And you're like, just Google it.
02:07:45.000 What do you mean?
02:07:46.000 Nothing's coming up.
02:07:47.000 What do you mean nothing's coming up?
02:07:48.000 Let me send you the link.
02:07:49.000 What's interesting is that your own followers are having trouble finding the links.
02:07:52.000 I mean, you'd think that if these algorithms were working as intended, the person most likely to see a Tim Pool video would be someone who watches Tim Pool videos.
02:07:59.000 But apparently that's not how it's working.
02:08:00.000 Weirdly weird.
02:08:01.000 I even asked about it and they didn't do anything about it.
02:08:04.000 Anyway, let's do this.
02:08:05.000 Let me ask you one last question.
02:08:08.000 What do you think is going to happen with all of this?
02:08:11.000 Do you have a vision of the future, Alan?
02:08:13.000 You've been following a lot of this.
02:08:14.000 Well, mention your book and then tell me your vision of the future.
02:08:16.000 Okay, so, you know, the book is about Big Tech's battle to erase the Trump movement and steal the election.
02:08:20.000 Not hyperbole, not my own opinion.
02:08:23.000 It's what the sources say.
02:08:24.000 It's what the sources inside Silicon Valley say.
02:08:26.000 Facebook, Google, all these companies.
02:08:28.000 If you want to read it, it's at deletedbook.com.
02:08:30.000 But my vision of the future, so there are two possibilities.
02:08:34.000 It really does come down to this election.
02:08:35.000 It's amazing how much it comes down to this election because the Trump administration
02:08:41.000 has made some excellent, you know, they didn't act as fast as they could have done, but they
02:08:44.000 made some excellent appointments in the federal bureaucracy, got people in the right positions,
02:08:48.000 good people by the way, Adam Candeub for example, a law professor who once sued Twitter in a
02:08:52.000 free speech case, the Meghan Murphy case actually.
02:08:56.000 And he wrote the petition to the FCC to make the Section 230 rule change.
02:08:59.000 So these are people who know what they're doing, they understand the issue, they understand what needs to be done.
02:09:04.000 But literally, the executive branch of the United States right now is the only powerful force, I think, in the entire world that actually wants to fix this problem and has the ability to do so.
02:09:14.000 So it's really like the last chance for online freedom this election.
02:09:17.000 I'm not just saying that because I'm a Trump supporter. Can you think of another powerful entity in Europe, in North America, anywhere in the world that's actually pushing back on this?
02:09:25.000 No, it's only the American executive branch.
02:09:28.000 Not the Senate, not Congress.
02:09:29.000 They're just talking.
02:09:30.000 They're grandstanding.
02:09:31.000 It's only the executive branch.
02:09:33.000 It does come down to this election.
02:09:34.000 The Borg is trying to take over and it's just Trump holding on with one hand off the side of the cliff and everyone else hanging on to his leg.
02:09:41.000 And we're hoping he pulls us up and it's going to be real tough.
02:09:45.000 And Mike Pence is like his jet pack.
02:09:47.000 Mike Pence is just like on his back.
02:09:50.000 Well, we'll see how that plays out, I guess.
02:09:51.000 My fingers are crossed that something can be done.
02:09:54.000 And, you know, James Lindsay, who is at Conceptual James on Twitter, talks about critical race theory all the time.
02:09:59.000 He tweeted that he's going to unhappily vote Republican, including Trump, for the time being.
02:10:04.000 And then he linked to this image where it says leftists should abolish the Constitution.
02:10:07.000 And he's like, until this stops, I hope more people feel that way.
02:10:12.000 Anyway, one final thing.
02:10:15.000 We're talking about dystopia.
02:10:16.000 I'll tell you the pessimistic vision. The optimistic vision is, you know, Trump reforms Section 230 and fixes this.
02:10:21.000 The pessimistic vision is that we get to a situation where a handful of critical race theorists in the San Francisco Bay Area get to invisibly and undetectably control the emergence of political movements.
02:10:35.000 So, you know, stop them even before they get off the ground.
02:10:37.000 Not just in America, but all around the world.
02:10:40.000 So, that's the pessimistic vision.
02:10:41.000 And influence policy.
02:10:43.000 And influence policy.
02:10:43.000 So that hate speech laws come in, the Constitution gets abolished, you go out with a sign saying, I should have a right to speak, and the cops come and bash you with a truncheon.
02:10:51.000 Well, hopefully that doesn't happen, but Alan, thanks for hanging out.
02:10:54.000 Do you want to mention your social media?
02:10:56.000 Sure.
02:10:57.000 So you can follow me on Twitter, at LibertarianBlue.
02:11:00.000 Also follow me on Parler and Gab and Minds.
02:11:03.000 Just search for my name.
02:11:04.000 I'm usually at A or at AB, because I got onto those platforms early, so I got the nice little two-letter handles.
02:11:12.000 You can find my articles on Breitbart News and you can find the book at deletedbook.com.
02:11:16.000 Please don't buy it from Amazon.
02:11:17.000 Buy it from Barnes & Noble.
02:11:21.000 You can buy it from Amazon if you want.
02:11:22.000 Cool.
02:11:23.000 And of course we do the show Monday through Friday live at 8pm.
02:11:26.000 Smash that like button on your way out because apparently that helps as we're dealing with this algorithmic manipulation and censorship and all that stuff.
02:11:33.000 Share this if you really like it.
02:11:34.000 We're also on, you know, Apple, iTunes, and all these other platforms because we diversify even though they'll probably act in concert at some point.
02:11:41.000 At some point I need to nuke everybody.
02:11:42.000 But anyway, if you want to fund my other channels, which some people can and some people can't, it's YouTube.com slash TimCast and YouTube.com slash TimCastNews.
02:11:51.000 Of course you can follow me on Twitter, Instagram, Parler, at TimCast.
02:11:54.000 And of course you can follow at Ian Crossland.
02:11:55.000 Yes!
02:11:56.000 Anywhere.
02:11:57.000 everywhere, but subscribe to this channel, too, and click the notification bell, because that'll make sure you see it.
02:12:02.000 Share it.
02:12:03.000 I mean, this conversation was pretty good and important, too, especially with... Share it, subscribe, do all that stuff.
02:12:07.000 Yeah, man.
02:12:08.000 At the very least, tell people what's going on, because the most powerful thing you can do is speak up and share information, especially when we're dealing with these companies trying to restrict information.
02:12:17.000 Use your mouth.
02:12:18.000 They can't stop that.
02:12:19.000 But don't forget, you can follow at Sour Patch Lids.
02:12:21.000 You can.
02:12:21.000 I would love that.
02:12:22.000 Follow me at Sour Patch Lids.
02:12:24.000 L-Y-D-S.
02:12:25.000 All right.
02:12:25.000 And we're back tomorrow, right?
02:12:27.000 We are.
02:12:27.000 We're back tomorrow!
02:12:29.000 We do have a cool guest.
02:12:29.000 You know what else I'd like to ask people to do?
02:12:31.000 Make a video and upload it.
02:12:33.000 Ooh.
02:12:33.000 Talking about what you feel about this.
02:12:35.000 Interesting.
02:12:35.000 Yeah.
02:12:36.000 Engage.
02:12:36.000 Create content.
02:12:37.000 Yeah.
02:12:37.000 Do stuff.
02:12:37.000 Post stuff.
02:12:38.000 And then when you get banned, you can tell your friends.
02:12:40.000 I got banned!
02:12:42.000 Can you believe this?
02:12:42.000 Because they... Man.
02:12:44.000 Anyway, we'll be back tomorrow at 8 p.m.
02:12:46.000 with apparently a cool guest.
02:12:47.000 I don't know who it is.
02:12:48.000 We do.
02:12:48.000 We do.
02:12:48.000 Okay.
02:12:49.000 It's awesome.
02:12:50.000 Okay.
02:12:50.000 We'll see you guys tomorrow.
02:12:50.000 Thanks for hanging out.