In this episode, we're joined by technology reporter Allum Bokhari to talk about his new book, "#Deleted: Big Tech's Battle to Erase the Trump Movement and Steal the Election." We also talk about the Justice Department's new antitrust lawsuit against Google, and how far-left extremists are allowed to run wild on the internet.
00:02:13.000I'm the senior technology correspondent at Breitbart News.
00:02:16.000I've been covering technology for them for four years, five years actually now.
00:02:20.000And yeah, so this book is basically a summation of the past five years of work, this descent into internet censorship that we've all witnessed and experienced.
00:02:29.000You've actually exposed a bunch of this stuff, too.
00:02:31.000You know, we'll get into all this, but you covered, I think, The Good Censor.
00:02:36.000Was it you who released that footage of them, like, crying after Trump won?
00:03:04.000I was co-founder of Minds.com and an admin on the site for eight years, doing a lot of behind the scenes.
00:03:12.000Hands-on with, you know, censoring and not censoring, and realizing that censoring doesn't mean it's bad.
00:03:18.000Sometimes you have to censor things. What we're seeing now is political manipulation for power, versus censoring things when you have to, because they might be illegal.
00:03:57.000We have, uh, one of their stories now, this guy's basically saying, like, oh, we definitely can censor, you know, right-wing individuals, and... It's just, it just, it pains me to say, it feels so obvious.
00:04:07.000You know, it's great to get the confirmation, don't get me wrong, and I, I, you know, it's, it's awesome that we have Veritas doing this work.
00:04:13.000But man, to see it again and again and again and to know.
00:04:31.000We do the show Monday through Friday live at 8 p.m.
00:04:33.000And, I don't know, let's just get back into the conversation.
00:04:35.000So I don't even know where to begin because let's do this.
00:04:38.000Let's talk about this DOJ antitrust suit.
00:04:42.000So I've got the story right here from Fox.
00:04:44.000Lawmakers hail DOJ antitrust lawsuit against Google as long overdue.
00:04:49.000Senator Hawley called it the most important antitrust case in a generation.
00:04:53.000Today's lawsuit is the most important in a generation, Senator Josh Hawley said.
00:04:57.000Google and its fellow big tech monopolists exercise unprecedented power over the lives of ordinary Americans, controlling everything from the news we read to the security of our most personal information.
00:05:08.000And Google in particular has gathered and maintained that power through illegal means.
00:05:13.000The DOJ suit alleges that Google has used its dominance in online search and advertising to stifle competition and boost profits.
00:05:19.000The suit could be an opening shot in a battle against a number of big tech companies in coming months.
00:05:24.000A Google spokesperson told Fox News, Today's lawsuit by the Department of Justice is deeply flawed.
00:05:29.000People use Google because they choose to, not because they're forced to,
00:05:32.000or because they can't find alternatives.
00:05:34.000We have a fuller statement this morning.
00:05:36.000Later, the tech giant released a lengthy blog post in which it said the DOJ complaint
00:05:41.000relies on dubious antitrust arguments.
00:05:44.000This lawsuit would do nothing to help consumers.
00:05:46.000To the contrary, it would artificially prop up lower quality search alternatives, raise phone prices, and make it harder for people to get the search services they want to use.
00:05:54.000The lawsuit says, for years Google has entered into exclusionary agreements,
00:05:58.000including tying arrangements and engaged in anti-competitive conduct to lock up distribution
00:06:03.000channels and block rivals. American consumers are forced to accept Google's policies,
00:06:07.000privacy practices, and use of personal data, and new companies with innovative business models
00:06:12.000cannot emerge from Google's long shadow. For the sake of American consumers, advertisers,
00:06:16.000and all companies now reliant on the internet economy, the time has come to stop Google's
00:06:20.000anti-competitive conduct and restore competition, it says.
00:06:24.000I think that was a whole lot of hot air and it is the wrong target and it's going to miss because I
00:06:28.000think Google makes a really good point.
00:06:32.000I do use DuckDuckGo sometimes, you know.
00:06:34.000So there are choices when it comes to search.
00:06:37.000The issue with Google is they're more than just a search engine and the DOJ lawsuit is very much focused on search and how they manipulate search.
00:06:46.000No, the issue is that YouTube, for instance, which we're on, is heavily subsidized by other areas of Google.
00:06:52.000Because of that, no other video platform can compete.
00:06:55.000And some argue, no video platform could survive with how expensive it is to do streaming, to do this.
00:07:01.000I mean, I'll tell you guys, we're doing this show right now, cost me nothing.
00:07:04.000To press live on this stream, so that all of you can watch, nothing.
00:07:09.000Google just says, if we make money off it, we'll take a cut.
00:07:12.000So that can't exist on a lot of other platforms, because it's really expensive.
00:07:25.000You were telling me before you don't think this antitrust stuff is going to do anything.
00:07:28.000Well, it's important and it's good that the Republicans are punishing Google and being seen to punish Google because I think the idea that a lot of tech companies have had for a long time is we don't have to listen to the warnings of Republicans about censorship because they're never going to regulate us.
00:08:26.000So it'll be very good for companies like Yelp, but it doesn't really focus on, I think, the real problem of Google, which is its vast power over political information, its ability to swing elections, and its ability to destroy people's livelihoods.
00:08:39.000Like, if you're a website, if you run a website and it goes from page one of Google to page 1,000, it's going to have a huge impact on your business.
00:09:38.000I guess kind of like what you're saying.
00:09:40.000But then when you have Google, Alphabet I suppose now, controlling all of these different industries in one big massive corporation, maybe the issue isn't search.
00:09:50.000Maybe the issue is search needs to be separate from video, needs to be separate from email, needs to be separate from, you know, Google Drive and all that stuff.
00:09:57.000If you really want competition among the search engines to give, you know, smaller players like DuckDuckGo an advantage, one of the things one of my Google sources told me is that what you need to do is separate the data from the rest of Google.
00:10:08.000I have a friend who's doing this big push for owning your own data.
00:10:46.000It's also the basis of their ability to politically manipulate people, which we can get into.
00:10:50.000Actually, you know, I guess in that regard, I would say their search is a monopoly.
00:10:55.000It's an interesting phenomenon we're seeing with social media, Twitter, Facebook, and Google, where the only reason they're a monopoly, they dominate these spaces, is because no one uses anything else.
00:11:05.000And because no one uses anything else, there can be no competition.
00:11:11.000I can't blame Twitter because everybody's on Twitter, but because everybody's on Twitter, no one does anything else and it's creating monopolistic problems.
00:11:18.000There's an argument that these are natural monopolies because you want to go to a social network that has the most amount of people on it.
00:11:25.000That's where you'll get maximum viewership for your content.
00:11:29.000And with search, the advantage gets locked in once you have that amount of data, because once you have it, your competitors by definition don't have it.
00:11:39.000This is why they've gone into smartphones, they've gone into browsers, they've gone into laptops, because that's just more ways to gather data.
00:11:52.000But I also think you have to free their software code because if it's private code and they give you data, it could be false data and you wouldn't know.
00:12:00.000So the only way to verify that the data is real is if you understand what the code is measuring.
00:12:35.000Well, there needs to be a lot more transparency.
00:12:38.000What I would like to see, especially with regards to political manipulation, is how do they train their so-called hate speech algorithms and their disinformation algorithms.
00:12:46.000So when you're training an algorithm, the way you do it is you're training it to recognize language if you're trying to detect certain types of speech.
00:12:53.000And you'd have to train it on a set of data.
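To make that concrete, here's a minimal, generic sketch of training a text classifier on a human-labeled dataset (an illustration of the general technique, not Google's or Facebook's actual system; the example texts and labels are invented). It demonstrates the point being made here: the model simply learns to reproduce whatever judgments the labelers baked into the training set.

```python
# Generic sketch of training a text classifier on human-labeled examples.
# The labels are supplied by whoever curates the training set -- the model
# learns to reproduce those judgments at scale.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: 1 = "flag", 0 = "allow" (labels chosen by the trainers)
texts  = ["example post one", "example post two", "example post three", "example post four"]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# The classifier now scores new posts against the labelers' definition of "flag".
print(model.predict(["a brand new post to score"]))
```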
00:14:18.000There are some senators who are very good on the issue, Hawley, Cruz, Blackburn, but like the vast majority of Republican senators, they can't be trusted on this issue either.
00:14:28.000But I will say the second point, the left supporting censorship, this is extremely naive.
00:14:32.000If you think these technologies that have been developed by Silicon Valley won't one day be turned against the left, the anti-war left especially, Yep.
00:15:28.000And that includes, for the most part, the larger group, the conservatives, those who support Trump.
00:15:32.000So, most of the book is based on stuff from Silicon Valley sources.
00:15:36.000But, by the way, if you want to get it, it's called Deleted, and you can find it at deletedbook.com.
00:15:40.000But there's one source I talked to from the government, and he's been following the censorship issue for a long time.
00:15:45.000And the way he put it, I think, was very chilling.
00:15:47.000He said, Up until 2015-2016, Western establishment elites saw internet-free speech as a good thing because it helped them regime change and destabilize countries abroad.
00:15:58.000That was, you know, Twitter, the Arab Spring, Eastern Europe as well, color revolutions.
00:16:03.000But, he said, as soon as you had Brexit, as soon as you had Donald Trump, establishment elites realized, oh no, this technology can be used to regime change us.
00:17:12.000So this is Stratfor, Strategic Forecasting, I guess it's like- So I do remember this, the big Stratfor thing, yeah, yeah, I do remember this.
00:17:17.000And so, in it, we learned that, I think the US Air Force was purchasing sock puppet accounts.
00:17:22.000These are, a sock puppet account is a dummy, it's a bot, that's what they say, bots.
00:17:27.000It's an account with a fake picture and a fake name, and then one person controls 50 of them, they're called sock puppets, to create the perception of popular opinion.
00:17:35.000This is one of the biggest problems that's affecting big corporations today.
00:17:39.000Like, Pepsi will put out a commercial and it's like a guy drinking, you know, and throwing a football, and they'll get inundated with like 50 to 100 messages on Twitter saying, you're racist, and then all of a sudden they're like, we're being attacked, what do we do?
00:18:03.000Middle Eastern intelligence operations and persuasion, psychological operations, and things like that.
00:18:08.000So, this has been an ongoing thing, and at some point, you had the political parties realize they can do it, and I gotta be honest, I think the Democrats figured it out way before the Republicans.
00:18:19.000They were using Facebook and all that stuff.
00:18:21.000First, I mean, ActBlue, which you're familiar with.
00:18:23.000ActBlue is the progressive, you know, fundraising—it's a progressive GoFundMe, essentially.
00:18:28.000And then Republicans only launched WinRed recently, and that's the, you know, the Republican version of it, so the Democrats have had You know, have been on the forefront of this.
00:18:37.000But something weird happened in the Trump era, like in 2015 with the Trump army, the Trump train, the memesmiths.
00:18:45.000All of a sudden you had an organic explosion of people who are actively producing content and propping up Trump.
00:18:50.000Something that they couldn't pay for, you know?
00:18:52.000Yeah, and you hit the nail on the head.
00:18:55.000This is stuff the Obama campaign was doing in 2012.
00:18:58.000Cambridge Analytica was just a smaller scale version of what Obama was doing in 2012 with Facebook's help.
00:19:04.000And it was a pretty senior Obama campaign person who revealed that, who was working directly on it.
00:19:10.000She said that Facebook actually helped them and gave them assistance in 2012.
00:19:26.000The reason there was this panic after Trump's win was because all of these people, the foreign policy establishment, had been doing this for years.
00:19:34.000They started accusing everyone else of, you know, using bots and sockpuppets and disinformation.
00:19:38.000Of course they were terrified that that would happen to them because they'd be doing it for so long.
00:19:42.000So they couldn't imagine that this movement might actually be organic, might actually be real people.
00:19:47.000They thought it was just the same thing they'd been doing for years and years.
00:19:52.000It probably was like a kid in Russia in an apartment, and they traced some IP, and they got like one.
00:20:01.000And they were like, oh, the Russians, the Russians, it's the government, the Russian government.
00:20:04.000And you're like, no, actually, it was just some dude who maybe had a Russian IP.
00:20:08.000Well, the story now, I guess, is that it's such a weird and complicated conspiracy of nonsense.
00:20:16.000Was it Ratcliffe who issued the statement basically saying that Russian intelligence believed Hillary Clinton was going to create a fake campaign accusing Trump of working with them to get the press cycle off of her email story, and that Obama was briefed on this, so they knew?
00:20:37.000At least the insinuation is Hillary Clinton created the fake idea just out of the blue.
00:20:40.000It seemed like it came right after her email scandal dropped.
00:20:52.000It convinced, well, maybe they were in on it, I don't know, but all these well-funded foreign policy NGOs, European NGOs, American NGOs. Just a few days ago, we published a story at Breitbart News about the German Marshall Fund.
00:21:06.000This is one of these extremely well-funded foreign policy NGOs, very focused on Russia, very focused on the Balkans.
00:21:12.000They're the ones who have been pushing the disinformation narrative, the Russian disinformation narrative, for four years now, and pressuring the tech companies to censor what they call disinformation.
00:21:23.000Another one of those groups, the Atlantic Council, are working directly with Facebook.
00:21:28.000And this other one, the German Marshall Fund, they've actually said that Breitbart News, Fox News, The Blaze, basically every conservative news website needs to be suppressed by Facebook.
00:21:37.000They need to apply friction, because in the words of this think tank, which by the way is funded by the US government, and foreign governments, and the German government, sounds like foreign interference to me, I don't know how that word's defined these days, but they said Facebook has to apply friction, which is their word for suppression of all these so-called deceptive news sites.
00:21:56.000There's a funny meme that, someone posted this on 4chan a long time ago, that any sufficiently free internet space becomes right-wing, and the left can only maintain the space if they have hard moderation and censorship.
00:22:11.000If you look at Facebook, in spite of the censorship, the top posts are still Daily Wire, Dan Bongino, Fox News, etc.
00:22:19.000I think that's right, and I think it's in any period of history where you have one faction, social faction, political faction, whatever, trying to maintain an ideology based on complete myth.
00:22:31.000Which many of you know, the assumptions of the cultural left are, you know, gender is a social construct, all of these things.
00:22:39.000As soon as you have free speech, that'll be threatened, and the most popular people will be the people who challenge that.
00:22:44.000The only way they can maintain it is by suppressing the people who challenge it, and that's what cancel culture is.
00:22:49.000You know, you see what happened in San Francisco with that guy?
00:22:51.000At the free speech rally, he got his teeth knocked out?
00:22:54.000Think about how crazy it is that this guy decided to step up and say the billionaires are bad, so Antifa showed up and punched him in the face.
00:23:19.000You've got the neocon never-Trumpers who ran full speed to the Democrats after Trump booted them out.
00:23:24.000You've got Antifa beating people up on behalf of them.
00:23:29.000A guy comes out and he's like, yo, it's really bad that international, you know, multinational billion-dollar corporations are restricting the rights of the people.
00:24:00.000I think the most frustrating thing about the social justice warrior left, the Antifa left, the left that'll punch you in the face, is that they pretend to be anti-establishment.
00:24:09.000They've been doing that for years, but actually they've found themselves in alliance with elites and with corporations because they're the only powers in society that actually suppress the truth.
00:24:20.000And they need the truth to be suppressed because their entire ideology is based on nonsense.
00:24:31.000Yeah, I think they're being manipulated by the powers because they're not smart.
00:24:36.000I don't want to say they're stupid because it's a broad generalization, but anyone that's out on the street just doing random street violence is pretty dumb.
00:25:16.000Bernie was fairly moderate on gun issues when he said it was a rural versus urban debate.
00:25:21.000Because he's in Vermont and he's like, we don't have the same problems.
00:25:23.000Because people in Vermont like their guns.
00:25:25.000You had him saying the TPP, the Trans-Pacific Partnership, was a problem, and we've got to...
00:25:31.000What happened to that guy? All of a sudden now he's like, the billionaires, but they're all
00:25:34.000funding Joe Biden, so go vote for him. It's like, he immediately just said, please let me in the
00:25:39.000establishment. Trump on the other hand kicked the door in, you know? You know, an alliance between
00:25:43.000the populist left and the populist right is probably the thing that terrifies the establishment the
00:25:47.000most. I kind of think that's... If you look back through history, there's always, you know,
00:25:52.000a new political consensus is formed to replace the old one.
00:25:56.000The Reagan consensus, the neoliberal consensus replaced the FDR consensus.
00:25:59.000There's a new consensus waiting to happen, the populist left and the populist right are forming the nucleus of it, but it's being held back by this establishment class that simply won't let go.
00:26:09.000I think that was starting to happen during Occupy Wall Street.
00:26:12.000So, the early days of Occupy Wall Street, when I was down there, I remember seeing, like, a 60-year-old man and woman, and they were sitting in chairs with an American flag.
00:26:20.000Nobody complained, and they had it set up, and people were talking about it, because the core of what brought people out was, the big banks on Wall Street are corrupting our system, it's revolving-door politics, the whole thing is broken, and you had conservatives, libertarians, and leftists who had shown up.
00:26:40.000The older people who could not live in the park ended up leaving, the younger people did, and then these wealthy college-educated Brooklynite kids with trust funds, not all of them but a handful of them, I know them personally, they are trust fund kids, started preaching intersectionality and identity politics.
00:26:55.000And then all of a sudden, you end up with someone, you know, Serena Williams, who is a black woman worth tens of millions of dollars, and she's oppressed, and the homeless guy sleeping under the bridge is an oppressor because he's white.
00:27:06.000What that did was it fractured any possibility of the populist left and right coming together.
00:27:11.000Because now you got AOC who is full on board with overtly racist ideology.
00:27:16.000That is like... It's freaky how racist these people are.
00:27:19.000They're calling it anti-racism and then literally saying, but we should have racial discrimination.
00:27:28.000It's funny because the ideology they espouse, they're like racism is prejudice plus power.
00:27:34.000Therefore, only white people can be racist.
00:27:37.000Anti-racism is quite literally the same thing: you discriminate against specific minorities and other racial groups for the benefit of particular racial groups.
00:28:07.000Machine learning fairness is an entirely new field set up in universities, and it's being funded by Silicon Valley, and they have their own machine learning fairness departments now.
00:28:16.000The point is to bring computer science and critical race theory together, and left-wing sociology and left-wing feminism.
00:28:25.000They want the assumptions of those left-wing fields to be embedded in computer science.
00:28:40.000And pattern recognition, noticing data, analyzing data, is actually inherently right-wing, because it busts politically correct narratives.
00:28:47.000Imagine if you train a machine to look at crime data, or you train a machine to look at, you know, the data of women going into certain fields or certain educational fields, or, you know, train a machine to predict who's more likely to commit a terrorist attack around the world.
00:29:03.000Like, that's going to spit out some very politically incorrect conclusions.
00:29:06.000So this is why they created machine learning fairness, to bias the algorithms.
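For what it's worth, "machine learning fairness" covers a family of published techniques; one of the simplest is reweighing training examples so that group membership and the label look statistically independent to the learner. The sketch below is a generic illustration of that idea (Kamiran & Calders-style reweighing) with made-up data, not anything from a specific company's pipeline.

```python
# Minimal sketch of one common "fairness" technique: reweighing training
# examples so that group membership and the label look statistically
# independent to the learner (Kamiran & Calders-style reweighing).
# Group names, labels, and data are made up for illustration.
from collections import Counter

rows = [
    # (group, label)
    ("A", 1), ("A", 1), ("A", 0), ("A", 0), ("A", 0),
    ("B", 1), ("B", 0), ("B", 0), ("B", 0), ("B", 0),
]

n = len(rows)
p_group = Counter(g for g, _ in rows)
p_label = Counter(y for _, y in rows)
p_joint = Counter(rows)

# weight = P(group) * P(label) / P(group, label); up-weights combinations
# that are under-represented relative to statistical independence.
weights = [
    (p_group[g] / n) * (p_label[y] / n) / (p_joint[(g, y)] / n)
    for g, y in rows
]
print(weights)  # these would be passed as sample_weight when fitting a model
```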
00:29:37.000If you go on Wikipedia, which is a very formulaic and logic-based system, granted it's subject to manipulation and bias, but the way it works is very simple.
00:29:50.000So, you go to Wikipedia, and you have this web of all of these different articles that interlink with each other.
00:29:57.000Go to Woman on Wikipedia, and what does it say?
00:30:00.000It says, a woman is an adult human female.
00:30:02.000And then you click female, and it says, barring, you know, certain abnormalities, irregularities, females are the members of the, you know, species that produce ova, etc, etc.
00:30:12.000And then if you go to male, it says males produce sperm, and then it says a man is an adult human male.
00:30:19.000Then go to trans woman on Wikipedia, and it gives you this very different definition.
00:30:25.000So, it specifically says that... the point I'm trying to make is, how do you have the idea that a trans woman is a woman, but these two articles can't connect to each other?
00:30:37.000So if a woman is an adult human female, then a trans woman can't, you know, according to Wikipedia, be a woman because a woman produces ova, so it doesn't make sense.
00:30:45.000The logic of the system is impeded by the fracturing of what the ideology represents.
00:30:51.000Which is why so often you hear anti-SJW types say, define woman.
00:30:55.000Because we can define it very easily using a system like Wikipedia, but in critical theory you can't.
00:31:01.000So the point I'm getting to is if you try... Well, this is why they have to invent the rhetorical trick of, you know, gender and sex are two separate things.
00:31:06.000So you can still have the biology, but you also have this concept of gender.
00:31:12.000The problem is that when you have Wikipedia telling you... So, okay, what they need to do now is they need to go in and they change the definition of what woman is.
00:31:19.000So that's why you get anti-SJW people saying define woman, because the official definition on Google, Merriam-Webster, and Wikipedia is adult human female.
00:31:27.000What they should do is make trans woman one word and have a completely different definition.
00:32:15.000What we're trying to do with- Well, there will be some difficulties like that, but at the end of the day, what these leftists are trying to do is this: every definition that these algorithms are trained on, everything they're trained to recognize, whether it's hate speech or misinformation, the people doing the defining, the people training the algorithms how to recognize those things, will be left-wingers.
00:32:35.000There will be some cases, like gender, where the definition is very, very tricky for the reasons you mentioned, but there will be other cases where the algorithm will only recognize the left-wing definition of something.
00:32:44.000Is this all stemming from liberal college kids like Zuckerberg, Larry, and Sergei just happening to be the ones who coded it from the beginning?
00:33:00.000And Peter Boghossian, for instance, he believes, and this was a while ago we had this conversation, so I don't, you know, maybe his opinion's changed.
00:33:06.000that intersectionality emerged through the colleges in the eighties and all that stuff.
00:33:12.000My argument was maybe, but it only exists today because of the accident of the Facebook algorithm.
00:33:42.000As the site grew bigger and bigger, people started following and liking more and more pages.
00:33:45.000You end up with someone who, on average, had 300 friends, but also liked 300 pages, and their feed was so fast, they couldn't read through it.
00:33:52.000So Facebook said, let's create an algorithm to make sure we're showing them what they're most likely to enjoy and click on.
00:33:59.000What ended up happening was, around the same time, digital blogs started popping up, and it was partly because of Facebook they started making money becoming successful.
00:34:07.000The articles were getting shared on Facebook, Facebook liked that because it was creating activity on the platform, and then these companies like Huffington Post, et cetera, started making money.
00:34:16.000But something interesting happens when you incorporate Facebook constantly updating its algorithm to work better and better and better.
00:34:22.000We ended up getting waves of police brutality videos.
00:34:25.000There was a period where there was one website that was ranked global top 400 or something, when it was nothing but police brutality videos.
00:34:33.000Because it was what people were clicking on.
00:34:35.000So Facebook kept shoving it in people's faces, just beating them over the head with it, like, this is what you want!
00:34:40.000And it was shockingly crazy footage of cops just beating people and beating people.
00:34:45.000So what happens is the news organizations, the blogs, that started writing shock content started seeing more traffic and making more money.
00:34:53.000Eventually they learned, wait a minute, there's more to life than just police brutality.
00:34:57.000There's also racism and sexism, right?
00:34:59.000So some article, some website started writing racism, racism.
00:35:16.000Police brutality targeting, you know, black people.
00:35:19.000And boom, Black Lives Matter pops into existence.
00:35:21.000So what happens is these news organizations realized, and whether it was on purpose or not, they made more money when they wrote about all of these subjects combined.
00:35:32.000Like, Vice had one article that was really funny that it was like, it was like black trans women of color fighting back against police brutality for Black Lives Matter, and it's like, they just jammed every possible keyword in there, because Facebook would then share it with more communities, they would get more traffic and make more money.
00:35:47.000Thus, intersectionality was perfect for this algorithm.
00:35:51.000It started to emerge because the algorithm was trying to find what, you know, people would want to see the most.
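Facebook's real ranking code isn't public, but the dynamic described here is easy to model: if a ranker scores posts by predicted engagement summed across audience segments, content that stacks more interest categories wins. A toy illustration with invented numbers, not the actual algorithm:

```python
# Toy model of engagement-based ranking (not Facebook's real algorithm).
# Each post declares which audience "interest segments" it touches; the
# score is just the sum of invented engagement rates for those segments.
segment_engagement = {  # hypothetical click-through rates per segment
    "police_brutality": 0.08,
    "racism": 0.07,
    "feminism": 0.06,
    "lgbt": 0.06,
    "sports": 0.03,
}

posts = {
    "football highlights": ["sports"],
    "police brutality video": ["police_brutality"],
    "intersectional headline": ["police_brutality", "racism", "feminism", "lgbt"],
}

def score(tags):
    return sum(segment_engagement[t] for t in tags)

for title, tags in sorted(posts.items(), key=lambda kv: -score(kv[1])):
    print(f"{score(tags):.2f}  {title}")
# The post that stacks the most segments floats to the top of the feed.
```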
00:36:34.000But isn't that how politics has always worked though?
00:36:37.000It's always been reaction and anti-reaction.
00:36:39.000I think what was happening on social media was the things people wanted to talk about that maybe the establishment class didn't want to talk about.
00:36:45.000They were sort of out of touch by then.
00:36:49.000But it was still a battle between two distinct sides, and I think the anti-SJW side was actually winning because they had the facts, they had the arguments on their side.
00:36:57.000Their content was going just as viral.
00:36:59.000They were the ones who got Trump elected.
00:37:01.000So if the social media companies hadn't clamped down, I think this would have resolved itself naturally.
00:37:06.000I don't know if it would have resolved itself.
00:37:48.000One where the fringe elements of the right have been purged and all that's left are suit-wearing, you know, milquetoast vanilla conservatives.
00:37:55.000Yeah, who won't do anything about social media censorship, by the way.
00:37:57.000But the left is... have you seen this video? It's six videos of women in their cars screaming at the top of their lungs.
00:38:08.000So you have these videos where these women are just, damn, they're screaming!
00:38:15.000I tell you this, you take a milquetoast vanilla conservative wearing a suit saying, you know, I just think that, you know, we should work hard and make money and then you put that next to the women screaming at the top of their lungs, what's a regular person gonna think?
00:38:51.000The fact that you have these women in their cars screaming like lunatics, it's because they're so heavy-handed in their manipulation.
00:38:58.000That these people are trapped in a paranoid and delusional reality where Trump is literally Hitler and he's taking kids and throwing them in cages and they're freaking out and panicking because they see it all day non-stop and they can't break out of this just trap from big tech.
00:39:13.000Well, I mean, the liberals' media model since 2016 has been, you know, Trump shock content.
00:39:17.000We're going to shock everyone with how evil and bad Trump is, and that's what's leading to the screaming women in their cars.
00:39:24.000But I will push back on the idea that only allowing the reasonable conservatives to stay on social media is somehow a help to the movement, because They're precisely the ones that are least likely to push back on this craziness.
00:39:36.000They're the most likely to just apologize, say, oh you called me a racist, I can explain 10 reasons why I'm not a racist, and that's not convincing to people, that's not persuasive to people, it just looks weak and stupid, and you don't really challenge the arguments.
00:39:49.000Certainly for like elections it might be a help because voters, undecided voters, moderates will just see the craziness of the left And respond accordingly.
00:39:59.000But ultimately, I think, you know, if Trump loses the election, it will be big tech that stole it.
00:40:04.000And this is the whole thesis of the book.
00:40:07.000When you start manipulating search results on Google so that people, you know, it all comes down to undecided voters.
00:40:13.000So you might be like a heavy news consumer, especially if you're watching this show.
00:40:17.000You're very clued up on all the issues, very hard to manipulate you.
00:40:19.000But think about what an undecided voter, someone who doesn't follow politics every day, is seeing when they go onto Google to find out about the two candidates.
00:40:50.000The journalists are in, you know, it's really funny, it's like, here's the way I see it, they're all in the ivory tower, and they're sitting in this room with each other, all laughing and sharing fake news with each other, thinking they're in the real world.
00:41:38.000So this is a quote from a Facebook source.
00:41:41.000The plan for polarization is to get people to move closer to the center.
00:41:45.000We have thousands of people on the platform who have gone from far right... This is the way Facebook defines far right, by the way, not necessarily the way we define it.
00:41:54.000Who have gone from far right to center in the past year.
00:41:58.000So we can build a model from those people and try to make everyone else on the right follow the same path.
00:42:06.000He goes on to explain how this would work.
00:42:07.000Let's say everyone who goes from far right to center watched video X. Then maybe we adjust the priority of video X in the feed.
00:42:14.000But it's probably going to be much more complicated than that.
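Taking that quoted description at face value, the mechanism it implies is simple to sketch: find content that users who made the desired move watched disproportionately often, then boost it for similar users who haven't. This is a rough illustration of that logic with invented data, not Facebook's actual implementation:

```python
# Rough sketch of the mechanism described in the quote: up-rank items that
# users who "moved from far right to center" watched more often than users
# who didn't. All data here is invented for illustration.
moved  = [["video_x", "video_y"], ["video_x"], ["video_x", "video_z"]]
stayed = [["video_y"], ["video_z"], ["video_y", "video_z"]]

def watch_rate(histories, item):
    return sum(item in h for h in histories) / len(histories)

catalog = {"video_x", "video_y", "video_z"}
boost = {v: watch_rate(moved, v) - watch_rate(stayed, v) for v in catalog}

# Items with a positive "boost" would get higher priority in the feeds of
# users profiled as similar to the "moved" group.
print(sorted(boost.items(), key=lambda kv: -kv[1]))
```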
00:42:18.000We can see the bans of the New York Post, we can see these high-profile bans.
00:42:22.000We don't see this very subtle manipulation that's going on behind the scenes, pinpointing exactly what your beliefs are and tailoring the content to those people.
00:42:32.000They used to do that, as you say, just based on how interested you are in a certain topic.
00:42:36.000That's what would drive the algorithm.
00:42:37.000But it's gotten much more specific than that, and much more focused on changing people's opinions.
00:42:43.000This is like their attempt at brainwashing, whether they work or not is another question.
00:42:54.000So what they're doing is you've got like a regular, you got a working class dad who's like firing up the grill and having a burger and they're like, oh, that far right guy, he's a white supremacist.
00:43:02.000Better feed him content to make him a centrist.
00:43:26.000And that's why you see a video of these women screaming at the top of their lungs, crying and like, ah!
00:43:32.000You know, look, there are a lot of us, I think the people who are watching this for the most part, you have sought out information.
00:43:39.000It's the big difference between those who watch the mainstream media and get wrapped up in the fake news, and the people who watch shows like this.
00:43:47.000Sometimes YouTube does recommend the clips we do every day, but like I mentioned, my main two channels, Timcast and Timcast News, are not on Google.
00:44:25.000This is another area where we've actually literally caught them red-handed.
00:44:29.000So back in early 2019, I got a leak from a Google insider, and he literally called this leak the smoking gun proving Google's bias.
00:44:38.000Obviously lawmakers did nothing about it because, you know, we don't have enough smoking guns.
00:44:42.000But this was YouTube's controversial search query blacklist.
00:44:46.000It's an actual list they have inside YouTube of so-called controversial search queries, and whenever they add a term to that list, it tells the algorithm to prioritize all the videos from the mainstream media and shut out anyone that doesn't meet YouTube's approval.
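The behavior described, a hand-maintained list of "controversial" queries that flips search into an approved-sources-only mode, is easy to sketch. Everything below (list contents, source names, scoring) is invented for illustration; it is not YouTube's actual code:

```python
# Sketch of the described behavior: if a query is on a hand-maintained
# "controversial" list, restrict results to an approved-source whitelist.
# List contents, source names, and the ranking itself are all invented.
CONTROVERSIAL_QUERIES = {"abortion"}            # manually edited list
APPROVED_SOURCES = {"BigNewsNetwork", "LegacyPaper"}

def rank(query, candidates):
    """candidates: list of (title, source, relevance) tuples."""
    if query.lower() in CONTROVERSIAL_QUERIES:
        candidates = [c for c in candidates if c[1] in APPROVED_SOURCES]
    return sorted(candidates, key=lambda c: -c[2])

results = [
    ("Independent creator video", "SmallChannel", 0.95),
    ("Network segment",           "BigNewsNetwork", 0.70),
]
print(rank("abortion", results))   # the higher-relevance independent video is gone
print(rank("recipes", results))    # ordinary relevance ranking applies
```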
00:45:03.000And the reason I found out about this list was because a left-wing journalist had reached out to Google saying, look at all these pro-life videos in the top 10 search results for abortion.
00:45:13.000Within hours, Google goes in, they alter that file, they reorder the search results completely.
00:45:19.000Do you know the name of that journalist?
00:45:20.000It was a Slate journalist, if I recall.
00:45:44.000I send a message saying, I couldn't help but notice that the Proud Boys were eating your nachos.
00:45:49.000Do you support white supremacy, and why do you support white supremacy?
00:45:53.000And they immediately respond with, we hereby disavow, we want nothing, and then the journalist puts out a message, Nacho Chip Company disavows the Proud Boys.
00:46:00.000And this is what they did to PewDiePie, right?
00:46:53.000So I talked to people, I was talking to these guys recently, and they wanted to get like a photograph and stuff, and I said, I'm not interested in doing photos with other companies.
00:47:01.000And I was like, because of the political ramifications.
00:47:03.000And they're like, no, no, no, no, trust me, these companies don't care.
00:47:05.000And I said, no, no, no, you don't understand.
00:47:07.000When the company ends up getting inundated by Antifa and the far left, and then, not caring about me, issues some stupid statement insulting and defaming me, then I have to deal with them, not the wacko far left.
00:47:19.000If I'm seen in a photograph or, you know, I don't think I have to worry about this too much, but if there's somebody who is seen in a photograph with, say, Coke, a Coca-Cola executive, then Coke's gonna be like, we don't know who this guy is or care, disavow.
00:47:32.000And they'll put out a statement saying white supremacy is wrong and we hereby disavow this individual, thus creating a newspaper story calling you a white supremacist.
00:47:57.000So before the Internet, if there was a political scandal, if you were in the news for some bad thing, first of all, it generally only happened to politicians and celebrities, people who were public figures.
00:48:06.000Second of all, you know, it's in the news for a day, it's on the TV, it's in a newspaper, then the newspaper gets chucked in the bin, TV moves on to the next thing, it goes away.
00:48:15.000On the internet, it doesn't go away, the internet is forever.
00:48:17.000As soon as any, you know, online news site, BuzzFeed, the Daily Beast, write something about you, and by the way, they do it to everyone now, not just public figures, as soon as they write it about you, it's on your Google results forever, and Google will prioritize it, and Wikipedia will cite it, and they won't cite articles from right-wing media potentially debunking it.
00:48:34.000So this is what I call the defamation engine.
00:48:37.000It's the connection between the media, Wikipedia, and Google.
00:48:46.000I think Wikipedia is in serious trouble.
00:48:48.000We were talking a little bit about this.
00:48:50.000I'm going to be a little light on the details, just because...
00:48:54.000I don't want to exacerbate anything, but there is a group of individuals who have a forum that's very active and they've come up with an operation that's very, very, very clever.
00:49:03.000The idea is to go on Wikipedia to random articles, nothing important, and put edits into random articles that are inane and arbitrary.
00:49:13.000For instance, they'll go to the Wikipedia page for cardboard and put, you know, Hans Schmidt in 1932 developed a new means of manufacturing cardboard.
00:49:27.000And so what happens is enough of these edits bypass the vandalism filters from the actual Wikipedia editors.
00:49:34.000So if someone goes to, say, Joe Biden's Wikipedia page and says, you know, Joe Biden is an abusive father whose son's a crackhead or whatever, they'll immediately jump in and say, get rid of this vandalism.
00:49:43.000If someone goes in and says, you know, Joe Biden was using his son as an intermediary to make money off of Chinese equity investment and cite Breitbart, they'll say Breitbart is unreliable.
00:49:55.000But if someone goes to the Wikipedia page for cheddar cheese and writes, Allum Bokhari was a famous dairy farmer in 1871 who developed a new process for manufacturing cheese that lowered the price dramatically, nobody notices that, nobody cares, and it could sit there forever.
00:50:11.000Now imagine you get thousands of people doing this.
00:50:14.000So that's essentially one of the ideas.
00:50:17.000Wikipedia becomes completely unreliable because it's full of fluff.
00:50:21.000Fake factoids that no one can tell is real or fake.
00:50:24.000So the whole thing is just questionable now.
00:50:45.000Adler, who edited Wikipedia as a really prolific editor for over 15 years.
00:50:51.000So you're calling your own former editors unreliable, if that's what you're saying.
00:50:55.000It's completely broken when you look at, like, Vox is credible and Breitbart isn't.
00:50:59.000And I'm like, Vox has published ridiculous things, you know, and without getting into naming people, there are some people whose opinions flip depending on what's politically expedient.
00:51:11.000Very famous individuals, lefty, Vox, high ups, higher ups, and they'll tweet something like, we should do this with the Supreme Court, and then like a day later, like, we should do the opposite.
00:51:21.000Whatever works for their politics, it's very, very obvious.
00:52:30.000Sometimes you'll hear a story about a lawsuit and they lose and it's like, I get it, but people need to actually start making these challenges.
00:52:36.000You know what's going on with Patreon?
00:52:39.000I know there's some legal action happening there.
00:52:40.000They actually did make a legal breakthrough, if I remember correctly.
00:52:43.000So I don't know exactly where we're at recently, but this was like a month or two ago, that basically Patreon's gonna be on the hook for tens of millions of dollars because of arbitration.
00:52:53.000Whether they win or lose, they have to pay up front.
00:52:56.000So Patreon could be over, as far as I know.
00:53:00.000And I don't have all the full details in front of me, but the general idea was: they banned several people, and they had this provision that you couldn't sue them, you had to go to arbitration.
00:53:09.000So then, these people were like, hey everybody, file a complaint, sue them for five bucks or whatever.
00:53:14.000So Patreon then says, we're going to arbitration, and they say, great, you've got to front X dollars to the arbiters, to the arbitration.
00:53:22.000We're talking about people who have tens of thousands of followers.
00:53:25.000And so all of those lawsuits, all those arbitration claims, and then all of a sudden Patreon's on the hook for tens of millions of dollars they can't pay.
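The "tens of millions" figure is just multiplication once claims pile up; the numbers below are hypothetical placeholders, since real arbitration fees vary by provider and case:

```python
# Back-of-the-envelope illustration only; both numbers are hypothetical.
claimants = 10_000                 # e.g., a creator's followers each filing a claim
upfront_fee_per_claim = 5_000      # placeholder for filing plus arbitrator fees
print(claimants * upfront_fee_per_claim)  # 50,000,000 -- "tens of millions" up front
```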
00:53:40.000It's because Patreon is a financial company in some way as well.
00:53:43.000With Wikipedia, I think it's a lot more difficult, because they can claim traditional Section 230 protections, because they're just hosting what all these anonymous editors say.
00:54:23.000They know that their case is bunk, but then the lawyer's like, listen, it'll be 10 grand to settle, it'll be 200 to go to court, so just pay them off.
00:54:30.000So right now what's happening is Patreon's like, we're going to- So this is why they sued.
00:54:34.000They said to the judge, we're gonna win this.
00:55:02.000It may work on Wikipedia, actually, because the Wikipedia Foundation has a few hundred million, I think, but it's not as insanely wealthy as a Google or a Facebook.
00:56:08.000Think about how amazing it is on Wikipedia that you could do an hour-long lecture on the dangers of white supremacy and the evil of white supremacy, and then they will call you a white supremacist or alt-right.
00:56:21.000And when you try telling them that you actually campaign against it, and part of my work is criticizing it, they say, you're not reliable.
00:56:28.000Well, actually, they'll give a concession to the left on this.
00:56:31.000If you look at... I did a chapter on Wikipedia, and while I was researching the chapter, I was comparing the Wikipedia pages of people like Sean Hannity and Lou Dobbs to people like Rachel Maddow.
00:56:43.000And at the top of Rachel Maddow's page, you see Rachel Maddow, in an interview with... I can't remember the exact channel, described herself as, you know, a political moderate with some left-wing beliefs, blah, blah, blah.
00:58:16.000It's like the Encyclopedia Britannica is liable for defamation.
00:58:19.000If the Encyclopedia Britannica defames you, you can sue them.
00:58:22.000Wikipedia is a more powerful and more widely read version of that.
00:58:26.000The fact that it's not liable is just insane.
00:58:29.000And you know, like I said, I agree when it comes to Twitter and Facebook: they publish, well, they don't publish, but they host millions and millions of pieces of content, millions and millions of posts, new ones every day.
00:58:38.000So yeah, sure, if you held them liable for all of those posts, of course their business model would not make sense.
00:58:43.000But Wikipedia is just big pages like an encyclopedia.
00:58:48.000It would not be crippling to them if they were liable.
00:58:50.000When I go to the page, Allum Bokhari, there are no users.
00:58:55.000It says, Wikipedia, Allum Bokhari, 19th century dairy farmer.
00:58:59.000That is a statement of fact under the Wikipedia banner, and no one else's name is on it.
00:59:04.000They would create more stringent posting policies and verification.
00:59:08.000And they would create a two-tiered system.
00:59:10.000Right now, anyone can go in and make an edit.
00:59:12.000We were just talking about how this operation is going to go in and add inane factoids.
00:59:18.000Oh, so maybe on Wikipedia it could say, after every statement of fact, it would show who wrote it and when. Like a little subscript.
00:59:23.000So what would need to happen is, if they could be sued for defamation, then you would submit your edits and they would go to an editor for review,
00:59:35.000and that would protect them more so from liability, though they'd still be responsible for suits.
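The "little subscript" idea amounts to attaching an attribution record to every claim in an article. A minimal sketch of what such a record might contain, with hypothetical field names and data:

```python
# Hypothetical per-statement attribution record for the "little subscript" idea.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AttributedStatement:
    text: str            # the claim as it appears in the article
    author: str          # verified account responsible for the claim
    added_at: datetime   # when the claim was added
    source_url: str      # citation backing the claim

stmt = AttributedStatement(
    text="Cheddar cheese originates in the English village of Cheddar.",
    author="verified-editor-123",
    added_at=datetime(2020, 10, 22, tzinfo=timezone.utc),
    source_url="https://example.org/citation",
)
print(f"{stmt.text} [{stmt.author}, {stmt.added_at.date()}]")
```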
00:59:39.000You could make the editors liable for defamation, except they're all anonymous.
00:59:43.000And I'm generally in favor of online anonymity.
00:59:45.000I think it allows people to challenge taboos and have discussions you can't have in public,
00:59:49.000especially in the age of cancel culture, anonymity is important.
00:59:51.000But the last people who should have anonymity is a Wikipedia editor, who have more power
00:59:55.000over people's reputation than anyone else.
00:59:57.000They should be accountable for what they write.
00:59:59.000It could destroy someone's reputation.
01:00:01.000I was at an event in the UK years and years and years ago.
01:00:06.000And they found me through Occupy stuff and they wrote the bio for my introduction off of Wikipedia and it was one of the funniest things.
01:00:17.000Tim Pool invented a zeppelin and I was just like, excuse me sir, you clearly just pulled that off Wikipedia.
01:00:24.000You have no idea who I am, you have no idea what I've done, and there were other instances where people had been cancelled because of negative articles on their Wikipedia pages.
01:00:33.000The first thing people see when they search you is the card on Google that says Wikipedia.
01:00:42.000It doesn't matter what you want to credit, it matters that people just use it.
01:00:46.000Which is why there's a group of people now that want to add inane and innocuous random edits to completely break the system because then people are going to be like, what?
01:01:06.000It was this guy who made it up, and they said, calling it the Gell-Mann Amnesia effect makes it sound more prestigious and official, and it was kind of a joke.
01:01:13.000What it means is, you pull up a newspaper, and let's say you're an expert on big tech censorship.
01:01:19.000And you see a story on the front page of the Washington Post that says, there is no big tech censorship.
01:01:23.000And you laugh, saying, I know that's not true because I've actually talked to these people.
01:01:44.000So what this operation ends up doing is, eventually someone who is a dairy farmer and knows how to make cheddar cheese is going to be like, there was no Allum Bokhari.
01:04:24.000And it has been weaponized, that's for sure.
01:04:26.000I know people whose job is called reputation management, and they know how to manipulate, and they have high-powered Wikipedia editors who look like regular people who just want to, you know, help out.
01:04:37.000When in fact they're paid seven figures to control Wikipedia.
01:04:41.000And so there are princes and princesses and politicians across Europe and the US who hire these companies to go in.
01:04:48.000They know how to remove articles from the front page of Google.
01:04:51.000They know how to manipulate the search engine algorithms to get them off.
01:04:54.000They know how to spam Google to get other sources pushed to the top.
01:04:57.000And they know how to place stories in reputable sources so they can control your reputation.
01:05:04.000Now if you're not rich enough to pay them then they just call you, you know, alt-right and whatever and then there you go.
01:05:09.000It's interesting too because it feels like Wikipedia is a place where wealthy people of prominence can control the narrative, but poor people who become prominent just end up as white supremacists.
01:05:32.000Let's talk about anonymity, because that's one of the big issues with Wikipedia that you brought up, and we talked a little bit about this earlier.
01:05:38.000I'm torn on whether or not people should be allowed to be anonymous on the internet, because what ends up happening is sock puppets.
01:05:45.000You get one guy controlling a hundred accounts, convincing Oreo to, like, abandon their, you know, pumpkin spice flavor because it's offensive to Native Americans or some other garbage.
01:06:04.000And they're less likely, and they're not going to get away with sock puppeting and manipulating these companies.
01:06:08.000The problem is that more respectful also means more respectful of the people in power, the people who control the narrative, the people who can destroy them.
01:06:16.000And that'll be fake respect, by the way: yes, I respect the Soviet Commissar, because if I don't respect the Soviet Commissar, I'm going to the gulag.
01:06:24.000That's what happens when you make everyone accountable for what they say and you don't give people private spaces to discuss controversial topics.
01:06:31.000I think in the age of cancel culture anonymity couldn't be more important.
01:06:35.000Even back at the founding, the founding fathers thought it was important to, you know, have a free and open debate about the Constitution.
01:06:40.000They wrote under various pseudonyms in the Federalist Papers, right?
01:06:45.000French dissidents under the monarchy in the 1700s wrote under pseudonyms, you know, Voltaire being the most famous.
01:06:51.000So it's been used for a long time by dissident thinkers to express controversial ideas while avoiding the consequences from the powerful forces of the day.
01:07:10.000So one of the most terrifying sources I spoke to in the book, by the way, deletedbook.com if you want to get it, is someone who doesn't work for one of the big companies, someone who works for the ad tech industry.
01:07:20.000Now the ad tech industry has a huge interest in de-anonymizing people because they want to find out exactly what people are interested in, even when they're posting anonymously.
01:07:29.000So what she told me is that they've actually paid people to scrape, say, YouTube comments.
01:07:33.000People commenting under this video might think they're anonymous, but actually, your comments may well be being scraped by one of these ad tech companies, stored on a database, so even if you delete it, they'll still have it.
01:07:44.000And then they're scanning the text to match your writing style to, say, your Facebook post or your Twitter post, because they just want to find out everything about you.
01:07:51.000It's quite a benign motivation, but you can easily see it being applied in non-benign ways.
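Stylometric matching of the kind she describes doesn't require anything exotic; character n-gram profiles plus cosine similarity already go a long way. A minimal sketch of the generic technique, not any particular ad-tech vendor's pipeline, with invented usernames and text:

```python
# Minimal stylometry sketch: compare character trigram profiles of an
# "anonymous" comment against known writing samples using cosine similarity.
# Generic technique for illustration; not any specific company's system.
from collections import Counter
from math import sqrt

def trigrams(text):
    text = text.lower()
    return Counter(text[i:i+3] for i in range(len(text) - 2))

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

known_samples = {
    "user_alpha": "I genuinely can't believe they shipped this update, it's honestly baffling.",
    "user_beta":  "lol no way bro that update is straight trash ngl",
}
anonymous_comment = "Honestly baffling that they'd ship something like this, I can't believe it."

profile = trigrams(anonymous_comment)
scores = {u: cosine(profile, trigrams(s)) for u, s in known_samples.items()}
print(max(scores, key=scores.get), scores)  # best stylistic match
```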
01:08:15.000So imagine you throw away your laptop, you throw away your mobile phone, you change your
01:08:18.000name, you travel to Outer Mongolia, you log on to a new laptop in Outer Mongolia, you
01:08:24.000start typing and instantly they know who you are because of your typing style.
01:08:28.000So what you do is you create a physical mechanism that you type on your keyboard, and then it translates to actual physical robotic hands that perfectly separate each keystroke by a millisecond.
01:08:41.000Yeah, I will say that the point I make in the book is the privacy guys, the people who value privacy and anonymity, they've got to get ahead of this stuff.
01:09:22.000They just know too much about you, to the point where they can predict seemingly innocuous, unrelated things. So what happens is, I remember I went to Walmart and they had these TVs on sale for, like, 200 bucks, and I remember looking at them. Then I go home, I'm on the computer, and I see on Facebook the ad for those Walmart TVs in the middle of the aisle, just like I had seen, and I was like... And so it's really simple.
01:09:46.000My location data gave away that I was at a Walmart.
01:09:58.000I mean, I don't know how precise they can get, but the general idea is 18 to 35 year old male at a Walmart probably wants to play video games.
01:10:40.000Facebook's machines know the average duration between when a person of a certain height, weight, and age has to go to the bathroom.
01:10:47.000They know when you wake up, because they know when the phone's not moving and when the phone's moving.
01:10:51.000They know when the phone goes out for lunch.
01:10:54.000You're at your office, then all of a sudden you're at Hardee's or whatever, you know, they have your location services.
01:11:00.000And if they don't have yours, they have your friend, who's saying, hey, I'm getting you the burger at Hardee's, or, hey, I'm at the counter right now, meet me at the front, I'll order for you.
01:13:18.000So every third person gets the X. One of the most important things in the censorship story is that most people getting banned are small accounts.
01:13:28.000For obvious reasons, there's more of them.
01:13:52.000Yeah, people are celebrating that the New York Post story got more distribution than it would have otherwise received because of the censorship, so it was a beneficiary of the Streisand effect.
01:14:58.000When you do that, all of a sudden now, Facebook has a list, a very interesting list, of phone numbers, and on yours, it says, mom, dad, and then, you know, your siblings, Janet, Bill, or whatever.
01:15:45.000They know who your dad is based on your mom calling him Jim and you calling him dad and having the same phone number and it's a game of Sudoku.
01:15:52.000All of these things, they take all these little bits of data, combine it, and if you've never signed up for any of these websites, everyone else has given your data to them.
01:16:03.000And their addresses and their phone numbers, well obviously, and their email address.
01:16:21.000And even if you don't have the Facebook Messenger app, or if you just visit a website that has like one of the little Facebook buttons, one of the little Twitter buttons, it's getting your data.
01:16:32.000And it will build a profile for you and everyone you know.
01:16:35.000And you don't know exactly what data is being given to them.
01:16:38.000And the craziest thing about it is, when I tell people this, like, listen: your contact list has your significant other listed as, you know, GF or girlfriend or something.
01:16:49.000It now knows you're in a relationship and it knows who you're in a relationship with because that phone number then correlates to someone who calls your girlfriend Becky.
01:16:57.000Then someone has, you know, little sister and they're like, now we know who your sister is.
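The "game of Sudoku" is just cross-referencing contact uploads: the same phone number labeled differently by different people lets a platform infer relationships for someone who never signed up. A toy illustration with invented numbers and names:

```python
# Toy illustration of shadow-profile inference from uploaded contact lists.
# Phone numbers and names are invented.
from collections import defaultdict

uploads = {
    "you":      {"+15550001": "Mom", "+15550002": "Dad", "+15550003": "Becky GF"},
    "your_mom": {"+15550002": "Jim", "+15550009": "Janet"},
    "stranger": {"+15550003": "Becky Smith"},
}

labels_by_number = defaultdict(list)
for uploader, contacts in uploads.items():
    for number, name in contacts.items():
        labels_by_number[number].append((uploader, name))

# The same number appearing as "Dad" in one list and "Jim" in another lets
# the platform infer that Jim is your father -- even if Jim has no account.
for number, labels in labels_by_number.items():
    if len(labels) > 1:
        print(number, "->", labels)
```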
01:17:01.000The phone number thing is really obvious to us because we can understand how to correlate that data, but they can build a shadow profile from you off of really innocuous data like when you went to Burger King.
01:17:12.000And then they can see where your contacts go.
01:17:17.000I'm saying you could carry your phone into a Burger King and it can be like this phone and this phone were in the Burger King at the same time and moved around at the same time.
01:17:25.000And now I'm thinking about NFC, near-field communication.
01:18:07.000Thankfully, I don't think it's got to that point yet, but I do think this election will be a test of, you know, can big tech swing an election?
01:18:48.000And so it can actually backfire and amplify these ideas, like we saw with the New York Post.
01:18:54.000Now, the best thing they can do is nuke the movement from the smallest grassroots first.
01:19:00.000And then let the higher... Like, you know, I think banning Milo, for instance, was a really, really bad idea.
01:19:05.000They should have slowly just started banning his followers until he had no influence anymore.
01:19:09.000Because now he's an account with no interaction, no engagement, and then he has to change up his strategy and his tactics to do... They could have manipulated it that way.
01:19:15.000Instead, they went from the top and it caused the grassroots to freak out, like, whoa, they banned this guy and it created a big story.
01:19:21.000Not that I'm saying he's better off for it.
01:19:23.000I think banning him was bad for him in the long run.
01:19:26.000But what's happening now is they're trying to actively censor.
01:19:29.000But when they do, Alex Jones still has his own website and his own infrastructure and he can build his own thing.
01:19:34.000They're not strong enough to erase someone from the internet yet.
01:19:37.000Though they have started removing people from society in general, like Enrique Tarrio of the Proud Boys, for instance.
01:19:42.000He gets banned from his banks, he gets banned from Uber, he gets banned from MailChimp, which I guess he doesn't even use and never even signed up for.
01:19:48.000They ban him from things he doesn't use at all.
01:19:51.000That's like total removal from society.
01:19:53.000So the issue right now is, Well, they're working on their method, right?
01:19:57.000Because one of the things I know they're working on is link banning.
01:20:00.000You know, so Alex Jones can have his own platform, but if you try and share his links on these mainstream platforms, they're not going to let that happen.
01:21:35.000Why is it that a black man can go out in San Francisco and say censorship is bad, and a white Antifa guy screaming the n-word can knock his teeth out, and the media doesn't freak out over that?
01:21:44.000It was acceptable in this weird algorithmic universe they've created.
01:21:51.000The AI that they built and their manipulation, because it's human beings at the end of the day, can't get into your home and they can't get on that stage with you to stop you.
01:22:00.000They can't censor you when you have a constitutional right to speak.
01:22:04.000But they can inflame some Antifa people, allow them to organize violent riots, and show up to your home and punch you in the face and knock your teeth out.
01:22:12.000They didn't go to that guy's house, I'm just saying.
01:22:14.000They've been going to people's houses.
01:22:16.000So, they really need cancel culture to scare people into adhering to the manipulations that they want.
01:22:22.000And they really, really need Antifa to knock people out.
01:22:26.000Otherwise, what does the enforcement actually mean in the real world when people defy you?
01:22:31.000If they want to end the defiance in the physical realm, they'll punch you in the face.
01:22:34.000They can already end the defiance in the digital world by just banning you and nuking you.
01:22:58.000Another smoking gun to a long list of smoking guns.
01:23:01.000You know what I wish would be awesome?
01:23:03.000There's some 70-year-old dude who owns and runs all of these companies, and he's sitting in a room, and it was 1977 when he accidentally invented a sentient computer that's telling him what to do, and then he's going and giving his underlings orders.
01:24:00.000And so imagine that, but imagine they're connected in a big circle, and that's what it is.
01:24:03.000So Jack Dorsey creates a platform where he lets the left run rampant, and then eventually the whole political world is infected with this psychotic far-left ideology that he allowed, and then it infects his own brain, and then he gives money to it, a guy calling for racial segregation, he gives money to him.
01:24:19.000He's created this world where it's literally, he's encouraging people to make sludge and refuse, and then he's accidentally eating it, and corrupting his own mind and body.
01:24:29.000And that money eventually comes back to pressure Twitter to censor someone.
01:24:43.000Why, you know, when we look at the Pew research thing, where it shows the Democrats moving very far left, why the Republicans have stayed very much where they are, even with Trump and the rhetoric and cancel culture and all that stuff.
01:24:54.000I think it's because, I think Jonathan Haidt talks about this, you know, he's the psychologist who does all this research about the roots of our political beliefs.
01:25:01.000He mentions, you know, conservatives are quite good at understanding liberals, often because, you know, they were former liberals, but liberals are terrible at understanding conservatives.
01:25:10.000So this is why I think Trump still has a fighting chance.
01:25:12.000We have all this censorship, we have all this propaganda, but because liberals don't understand conservatives very well, don't understand, you know, normal non-liberal people very well, a lot of the propaganda is just bad.
01:25:23.000Well, so what I think is happening is there's another chart that I love to share where it shows moderates.
01:25:30.00060% of the moderate news diet is liberal and 30% is conservative, or it's like 66 to 33.
01:25:35.000Conservatives is the other way around.
01:25:37.000It's 66% conservative and 33% liberal.
01:25:44.000So I think one of the things we saw with Gallup recently is that if you compare party affiliation from their latest data set in September to 2016, Democrats went from a D plus five advantage to an R plus one advantage.
01:25:58.000People are leaving the Democratic Party and it's because, I always hear it, they started researching on their own.
01:26:05.000The people who are in these videos screaming like lunatics, they're not doing any research at all.
01:26:12.000The reason why Biden is a sort of a strong candidate for the Democrats, you know, stronger than the alternatives, is for the same reasons that the liberals would never imagine.
01:26:18.000and we go, let me check that, I don't know about that.
01:26:59.000I think the threats of violence, the screaming banshees on social media has scared people.
01:27:05.000And there was this research I've cited quite a bit in the past couple weeks, because we're getting close to the election, showing that 10% of Trump voters will lie about who they're voting for.
01:27:39.000So there's a lot of states where Democrats should have a massive advantage and they only have a tiny advantage, or a moderate advantage, suggesting they're underperforming.
01:27:48.000Dude, if Trump wins, are we going to have to deal with this again?
01:27:58.000But if he wins, which I hope he will, do we have to deal with these screaming, no offense, women in the car or whoever, screaming young people, uneducated people that are freaking out?
01:28:10.000Why won't people just chill out and build something?
01:28:13.000Because they're addicted and they're on Facebook and Facebook is beating them over the head and screaming in their faces and they're not smart enough to do a Google search.
01:28:22.000So we need a unified enemy and maybe it can be the tech oligopolies.
01:28:26.000You'd think COVID would have been there, but no.
01:28:45.000She's come on Breitbart News Radio and talked about it.
01:28:47.000So she's happy to work across the aisle.
01:28:49.000And I think, especially the anti-war left, as you said, Tim, they're getting censored as well.
01:28:54.000So there's some recognition in some parts of the left that big tech, the power of big tech giants is a problem.
01:29:01.000The question is, will that ever gain momentum?
01:29:04.000Because, you know, for the left now, it's all about freak... The way to gain followers on the left is to freak out about Trump, to freak out about so-called hate speech, demand even more censorship.
01:29:14.000So, um... I think if Trump wins, and Republicans don't get the Senate or the House, Trump's out.
01:30:36.000Now you've got a mix of many of these, you know, you've got people on the right, libertarian right, and conservatives being like, we shouldn't interfere in the free speech of these big tech companies.
01:30:46.000We should let them control the politics of this country and just run us out of office permanently.
01:30:51.000If Trump and the Republicans win, there is a chance that Trump, with his hand on the cliff holding up everybody else, can pull us up and we can slowly start to push back.
01:31:02.000Trump banning critical race theory was a really good example of this.
01:31:06.000All of a sudden we started hearing stories from people on Twitter, I lost my job, my company cancelled our contracts and everything.
01:31:13.000I can't believe I went to school for this.
01:31:14.000There was one really funny one where a person was like, I went to school for, you know, five years learning about diversity and inclusivity and inclusion and now Trump's banned it and I'm completely out of a job and don't know what I'm going to do.
01:31:25.000That's Trump getting rid of this crazy psychotic behavior.
01:31:29.000If Biden wins and the Democrats take over, then you are going to live in an algorithmically manipulated universe where Republicans will probably never win again.
01:31:39.000It will be... Listen, it's not going to be this utopia that people on the left think it is.
01:31:43.000It's not going to be Skittles and candy canes and rainbows and universal health care.
01:31:47.000It's going to be Hitler in a bikini with a female body dancing, doing Tai Chi with the Incredible Hulk while someone sings a nursery rhyme.
01:32:02.000And it created this insane, the Incredible Hulk with Hitler, but Hitler has the body of a woman in a bikini and it's doing Tai Chi and then they're singing a nursery rhyme.
01:32:34.000It's really amazing how the Internet has changed because the Internet as it existed before 2015 was actually unprecedented in human history.
01:33:17.000I just want to talk about one more thing.
01:33:19.000This will sound like a massive tangent, but it isn't.
01:33:23.000The Roman Empire collapsed into dictatorship because the Senate became corrupt, politicians became corrupt, and people turned to powerful strongmen instead.
01:33:33.000We're really seeing that because, as you said, Republican senators, a great many of them, will do nothing to protect ordinary people from these tech giants.
01:33:41.000The only entity in the world perhaps today doing that is the Trump administration.
01:33:46.000They're the ones who petitioned the FCC, got the FCC to come out and say, OK, yes, we'll do something on Section 230.
01:33:52.000They're the ones who appointed Nathan Simington to the FCC, a guy who's a hawk on social media censorship, and got rid of the old guy who was skeptical about reining in tech censorship.
01:34:00.000So really, the American executive branch is the only branch of government that might actually do something.
01:34:05.000Didn't the Roman Empire last a thousand years or something?
01:34:08.000Oh yeah, I mean, hell, Rome under the Caesars was not particularly a bad place to live.
01:34:51.000So it was essentially a law that defined, well, an executive action by the FCC, by Obama's FCC, that made ISPs, Internet Service Providers, Comcast, Verizon, common carriers.
01:35:05.000You know, treat everyone equally, treat everyone neutrally, so if Netflix is using your bandwidth, even if they're using way more of it than any other company, you have to charge them the same rate.
01:35:16.000So it was really something that only helped Netflix and YouTube and these big video streaming platforms.
01:35:22.000And the edge providers, Google, Facebook, all these platforms.
01:35:25.000So it was basically what's called in the technical FCC language, the edge providers, Facebook, Google, YouTube, versus the service providers, Verizon, Comcast.
01:35:34.000It was a battle between corporations, and under Obama, the edge providers, Facebook, Google, YouTube, Netflix, they won that battle. Under Trump, they got rid of
01:35:46.000it. It's not really about net neutrality. It doesn't force anyone to be politically
01:35:49.000neutral. And if it does, then it was aimed at the wrong target because the ISPs, to my
01:35:55.000knowledge, have never actually censored political speech. If you're going to have net
01:35:59.000neutrality, apply it to Google and YouTube and Facebook as well, because they're the ones
01:36:03.000who threaten the actual neutrality of the Internet. They're the ones deciding who gets so-called
01:36:20.000If this had been a long time ago, and companies like Wikipedia were liable for their libel, and Twitter had to have strict moderation policies, then free speech would be allowed, debate would be happening, and people would have the opportunity to actually speak with each other.
01:36:35.000A site like Minds, a startup like Minds, would have gone under.
01:36:38.000Because we only had like three moderators.
01:37:04.000Anyway, it's the opposite of that, because if you get Section 230 reform, the right kind of Section 230 reform, I will say, you don't want to repeal the entire law.
01:37:12.000What you want to say is, it's very simple, it's so simple, I don't know why more people aren't suggesting this: you simply say, if you want to be a platform, if you want to have that legal immunity, you can't filter legal content, constitutionally protected content, on behalf of your users.
01:37:27.000You can filter it, but only if the user chooses to filter it.
01:37:30.000So exactly the same as Google's SafeSearch option, but for all types of content.
01:37:37.000So if someone posts something illegal, I have to get it off the site as a moderator, or the site gets sued because we can't host illegal content.
01:37:48.000So you're saying if I, right now... If you moderate legal content on behalf of a user, meaning the user didn't ask you to hide it from their feed, then you are no longer a platform.
01:38:00.000But if you give the user the option to filter that content, say if you make a hate speech filter that liberals can turn on or off, that's fine.
01:38:23.000So right now they can moderate against users' will, but this would say you're not allowed to moderate against their will unless they ask you to moderate for them.
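As a rough sketch of the opt-in model being proposed here, picture a platform that keeps its filters but only applies the ones a given user has switched on, SafeSearch-style, while still removing genuinely illegal content itself. The filter names and the isIllegal check below are placeholders, not a real API.

```typescript
// Sketch of the opt-in moderation model described above: filters exist,
// but they only run if the individual user turned them on.
// Filter rules and the isIllegal() check are toy placeholders.

type Post = { id: number; text: string };
type Filter = (post: Post) => boolean;        // true = hide this post

const availableFilters: Record<string, Filter> = {
  profanity: (p) => /\b(damn|hell)\b/i.test(p.text),   // toy rule
  hateSpeech: (p) => p.text.includes("[slur]"),        // placeholder rule
};

// Content that isn't legal to host is still removed by the platform itself.
const isIllegal = (p: Post) => p.text.includes("[illegal]");

function buildFeed(posts: Post[], enabledFilters: string[]): Post[] {
  return posts
    .filter((p) => !isIllegal(p))                      // platform's obligation
    .filter(
      (p) => !enabledFilters.some((name) => availableFilters[name]?.(p))
    );                                                 // user's choice only
}

// A user who enabled nothing sees all legal content;
// a user who opted into "profanity" gets that filter and nothing more.
const posts = [{ id: 1, text: "what the hell" }, { id: 2, text: "hello" }];
console.log(buildFeed(posts, []).length);              // 2
console.log(buildFeed(posts, ["profanity"]).length);   // 1
```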
01:38:33.000So free market defenders of the big tech companies will say, First Amendment: anyone can moderate their property, their communications platform, as much as they want, or you're violating the First Amendment.
01:38:45.000That's actually true, but not every company.
01:38:47.000While every company has First Amendment rights, they don't have these special Section 230 legal immunities.
01:38:52.000So what you can say is if you want these special legal immunities that no other company gets, then you're going to have to behave in a certain way.
01:39:51.000So the general idea that many people have had is you can't remove legal content.
01:39:57.000If it's a violation of the law, like a call to violence, threatening harm, or, you know, horrifying images or whatever, then they can get rid of that.
01:40:09.000Also, if that's how the law were, then it would be so much easier for tech platforms and new platforms like mine, because one, you wouldn't have to have moderators, and two, you wouldn't necessarily have to build the filters yourself.
01:40:22.000What I imagine would happen is there would be all sorts of third-party browser extensions that filter content across the internet for you. Say you don't ever want to see obscene content on the internet, you want to keep your internet family friendly.
01:40:34.000I imagine what would happen in this new world where tech platforms can't censor on your behalf is you'd have browser extensions that people install and it would filter your online experience across all websites.
01:40:46.000But what would happen is someone would upload porn and they wouldn't mark it explicit because that's how they live.
01:41:00.000Or you get complaints that it hasn't been flagged yet.
01:41:03.000Well, that's a problem that you have now.
01:41:05.000That's a problem that Google has, you know, so they train algorithms to recognize that kind of content.
01:41:09.000And what would happen is you'd have browser extensions that do the same thing, that have algorithms that recognize and they, you know, they get better and better and they improve over time with updates.
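Here is a hedged sketch of the filtering logic such a third-party extension might run on whatever page you're viewing. The data-post selector and the keyword rules are stand-ins for whatever markup a real site uses and whatever classifier a real extension would ship and update over time.

```typescript
// Rough sketch of a filtering extension's content script: scan posts on the
// current page and hide the ones that match rules the user chose.
// Selector and rules are hypothetical; a real extension would use a trained
// classifier and receive updates.

const userRules: RegExp[] = [/\bexample-blocked-term\b/i];

// Placeholder for a smarter classifier that improves with updates.
function looksObjectionable(text: string): boolean {
  return userRules.some((rule) => rule.test(text));
}

function filterPage(): void {
  // Assume posts carry a data attribute; real sites differ.
  document.querySelectorAll<HTMLElement>("[data-post]").forEach((el) => {
    if (looksObjectionable(el.innerText)) {
      el.style.display = "none";             // hidden client-side, not deleted
    }
  });
}

// Re-run as new content loads, since feeds render continuously.
new MutationObserver(filterPage).observe(document.body, {
  childList: true,
  subtree: true,
});
filterPage();
```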
01:41:18.000Yeah, it's time to, in a sense, scale back the desperate attempts.
01:41:23.000Because what's happening is, you get Twitter and Facebook saying, this is objectionable and this isn't.
01:41:28.000And then a bunch of people freak out saying that's not fair.
01:41:30.000A really good example of the problem was when I was talking with Jack Dorsey.
01:41:34.000And I said, your rules are inherently biased.
01:42:31.000And he tries to call his friend on the phone, the phone operator says, sorry, you said hate speech through the phone last time, can't use it today, sorry, you're banned for 10 days.
01:42:39.000Tries to go buy a newspaper, the newsstand owner says, well, your favorite newspaper isn't stocked anymore, it was publishing disinformation.
01:42:46.000Look, that would have been so bizarre to our parents and our grandparents, but we're just accepting it on social media, the exact same thing.
01:42:53.000I'd like to see them become utilities, big companies like that, once they have a certain number of users per day.
01:43:01.000I think simple 230 reform is the first thing to do.
01:43:04.000And then... So if content goes up, it's objectionable, and someone complains, you're liable for it as a platform because you're supposed to make sure that the filter works.
01:43:14.000If the filter doesn't work... Objectionable doesn't mean illegal.
01:43:18.000But if someone has a filter, and they turn it on, they don't want to see porn, but the porn hasn't been flagged yet, and they see it, the site can get sued.
01:43:32.000There are two parts to Section 230: one which exempts companies from lawsuits over the removal of content, and one that exempts them for hosting content.
01:43:41.000So I'm not saying that you should, uh, strip immunity from tech platforms simply for, uh, for hosting.
01:43:48.000They need that immunity, but it should be contingent on behaving like a platform.
01:43:53.000So not moderating content on behalf of users.
01:44:15.000We'll see if we actually get to that point.
01:44:17.000I think if we enlighten people as to what exactly the change will be, so they understand it, there will be a lot more support for it.
01:44:54.000If you post a message saying, you know, screw Allum, and you post it on a board out in the middle of the town square, I don't sue the board.
01:45:04.000But if you stand there and scream screw Allum, I say you're the one who spoke.
01:45:20.000The problem is, what Twitter has become is, in order to get in, there's a gate, and when you walk in, Jack Dorsey's standing there saying, uh, not that one.
01:46:03.000So, one of the issues right now is that many of these companies are issuing statements as what effectively would be the New York Times.
01:46:12.000Like, when Facebook puts a flag over a piece of content saying it's fake news, well, that's Facebook issuing a declaration that no one else can do.
01:46:32.000Colleen said, just heard about BitChute to possibly replace YouTube, Parler for Twitter, MeWe for Facebook, not to polarize communities, but these monopolies.
01:46:42.000So, I don't think any of these things will actually replace.
01:46:45.000One of the challenges with any one of these alternative platforms is that do they have a big enough community size to make it valuable for people?
01:46:53.000And one of the challenges, yeah, what if people start polarizing because they're all, you know, all the right-wing people go on Parler and all the left-wing people stay on Twitter?
01:47:01.000I think with videos, if it gets easy enough to cross-post, where it's just a push of a button and it is becoming easier, then you could see these platforms start to gather momentum because people will simply choose to watch, say, your podcast on BitChute rather than YouTube.
01:47:15.000One problem for Minds with cross-posting was that we'd have to take Facebook's API, which is proprietary, and they'd be tracking your browser movements if we implemented something like a Share to Facebook button.
01:47:28.000So we don't have a share to Facebook button.
01:47:30.000I say we, but Minds doesn't have a Share to Facebook button, because we don't want Facebook to track our users.
01:47:37.000Another regulation that I'd like to see is make it easier for people or force the big companies to develop a shared format where you can migrate all your content from one platform to another with a push of a button.
01:47:53.000I wouldn't subject small companies to that because, you know, it's kind of onerous from a technological standpoint, but you could have some sort of, you know, Josh Hawley-like exemption where it only applies to companies above a certain market share.
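No shared migration format like this exists today, but as a sketch of what one could look like, roughly in the spirit of the industry's Data Transfer Project, here is a hypothetical export schema and import hook. Every field name below is invented for illustration.

```typescript
// Hypothetical portable-archive format; nothing here is a real standard.

interface PortablePost {
  author: string;
  createdAt: string;        // ISO 8601 timestamp
  text: string;
  mediaUrls: string[];
  inReplyTo?: string;       // id of the parent post, if any
}

interface PortableArchive {
  formatVersion: "1.0";
  sourcePlatform: string;
  exportedAt: string;
  posts: PortablePost[];
  followers: string[];      // handles, so the social graph can be rebuilt elsewhere
}

// Each platform above the market-share threshold would implement an export on
// the source side and an import like this on the destination side.
function importArchive(archive: PortableArchive): void {
  for (const post of archive.posts) {
    console.log(`importing post from ${post.author}: ${post.text}`);
  }
}

const archive: PortableArchive = {
  formatVersion: "1.0",
  sourcePlatform: "example-platform",
  exportedAt: new Date().toISOString(),
  posts: [
    { author: "alice", createdAt: "2020-10-20T00:00:00Z", text: "hello", mediaUrls: [] },
  ],
  followers: ["bob"],
};
importArchive(archive);
```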
01:48:56.000We talk about Google a lot, but it's owned by a company called Alphabet, which owns other companies, like X, a technology firm.
01:49:17.000I'd like to believe that, but earlier today I did a check and my videos don't come up.
01:49:21.000In fact, I was sent a trending search topic, which was the full title of one of my videos, because everybody who watched started searching for the title and it doesn't come up.
01:49:31.000So I'll check again after the show, and maybe my whining has paid off.
01:50:33.000For those that don't know what the context is, it's a CNN analyst who was in a business meeting with a bunch of journalists and started cranking one out on camera in front of everybody.
01:52:55.000You know, if, if the algorithms only fed people information they were interested in, Jeffrey Toobin would be much more screwed than he already is.
01:53:09.000Trump was gonna lose, so he whips it out.
01:53:11.000I was like, if he actually is gonna lose, then you're gonna, you know, chill out.
01:53:15.000Blank Fields says, regarding the meme on 4chan /pol/ about free speech leading to right-wing politics, what are your thoughts on what happened with the Tay chatbot and how they removed it as it became politically incorrect? AI freedom?
01:53:33.000See, that actually was an AI functioning badly because it was responding to the most motivated group of people on the internet, which is 4chan.
01:53:41.000I don't even think it was that, like, necessarily 4chan, but yes, yes.
01:53:44.000But what I mean is, it's just people trying to be edgy.
01:53:55.000Like, throwing hot dogs at women's boobs or something?
01:53:57.000Like, just crazy stuff meant to shock you, and people liked it.
01:54:01.000Like, Sarah Silverman, her whole shtick is just being offensive as possible and shocking, and that's really weird because she's like, SJW or whatever.
01:54:08.000But, yeah, that's, uh, Chatbot was victim of that.
01:54:13.000People just wanting to say crazy things, because it was funny to say things you can't say to a robot, and the robot became racist.
01:54:19.000There's a real duality on the internet, because on the one hand, we've got this cancel culture, which will come from social media and Wikipedia, as you were saying, but on the other hand, we have this anonymous culture, which is the most offensive culture ever created.
01:54:48.000I tagged both of you and Lydia in the post.
01:54:49.000I'm going to stop right now and just say, I'm willing to bet a lot of people don't know the difference between Timcast IRL News and Timcast and also don't know the difference between playlists and the actual channel.
01:55:02.000Or I made a video about it in the past three days saying Google blacklisted me, and someone at Google saw it, freaked out, and went and removed the blacklist.
01:55:10.000But this has been going on for a long time, and a bunch of other channels are blacklisted, a bunch of political channels.
01:55:15.000What is the difference you just brought up between the playlist and the channel and the search?
01:55:19.000When you search for, like, Tim Pool YouTube, it'll say Timcast playlist, and it will link to other people's channels who are linking to my videos, not the actual channel. Confirm that you're actually seeing Tim's channel and then come back.
01:55:32.000And I'll check after the show because, yeah, maybe my complaining has paid off for sure.
01:57:40.000I think people are not actually seeing it and they don't understand the difference between a playlist of my videos and my actual channel.
01:57:46.000I've gotten a bunch of emails where they're like, here's a screenshot proving you're not blacklisted and the screenshot has none of my channels on it.
01:57:52.000So you could sign out of Google, clear your cache, search for it, sign back into Google, search for it, clear your cache, search for it.
01:58:28.000You know, I think one of the biggest problems we're going to face in the future is the internet has created instant gratification politics, and someone like Ocasio-Cortez, who is just no political experience, and I'm not saying that to necessarily drag her, because a lot of freshmen, you know, congresspeople come in, don't have experience,
01:58:48.000But she's also just... She botched the Green New Deal stuff.
01:58:52.000It's pie-in-the-sky, fairytale nonsense, and she really has no idea how an economy functions.
01:58:58.000But imagine, she can get a half a million people on social media to follow her, and they go out and vote, and she keeps winning.
01:59:50.000Maybe the internet has sped up our decline.
01:59:53.000So we were talking about this the other day, not on the show, before we did the show when we were with James, I was explaining how social media was decreasing the duration of all of these moments.
02:00:06.000The American Revolution took 20 years.
02:00:09.000It was an ideological revolution where you had people one day being like, yo, I'm sick of this, and they started talking about it, sending letters.
02:00:16.000You're in South Carolina or whatever, in pre, you know, in colonial, in the colonies when it's still, you know, it was still a colony of Britain or whatever.
02:00:24.000And you were like, I'm writing a letter, you know, I want to say to the king, F you.
02:00:29.000So you seal it, you give it to the rider, and then in three weeks it's made its way to New York, or however long, into months.
02:00:35.000Then in New York, some other guy reads and says, I agree with this letter, let's send it to the king.
02:00:39.000They put it on a boat and three months later, it makes it back to the king, who reads it and goes, what?
02:00:44.000I'm gonna respond to this, how dare you?
02:01:18.000Well, I'm gonna send people down there.
02:01:19.000We'll be there in two days because we can fly now.
02:01:23.000This is a great way to understand historical change.
02:01:26.000I studied history in college and the main schools look at, you know, is it economics that causes historical change, you know, ideas, intellectual developments, but actually communications and how connected to society is a big part of it as well.
02:01:41.000The reason why the Japanese were able to industrialize so quickly is because even though they were cut off from the rest of the world for 250 years, the reason why they industrialized within, you know, 30-40 years and no other country did, was because they were the most literate society outside Europe and ideas just spread around very quickly compared to other countries.
02:01:59.000Whereas in Russia, which tried to industrialize quickly, more than half the population was illiterate.
02:02:26.000I think we were talking about this earlier, you know, we think we're so different to China, but you know, here we have, you know, regional firewalls.
02:03:00.000So right now on YouTube, there's a thing that happens where if you have a certain number of incorrectly labeled videos, then you get what's called a pending review.
02:03:09.000And depending on how bad you are or how big you are, you get a bigger duration.
02:04:16.000People have to understand how this works on the back end, because every single platform is like this.
02:04:20.000The way they rank content, decide who gets at the top of your feed, who gets demonetized or monetized, it's all numerical.
02:04:27.000Algorithms operate there, on what can be quantified.
02:04:30.000So every single platform will have these quality scores to rank the so-called quality of your YouTube video, of your website if you're on Google, of your post if you're on Twitter or Facebook.
02:04:40.000And that numerical value determines whether you're gonna be at the top of people's feeds or buried, whether you're gonna be monetized or demonetized.
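A toy illustration of that kind of scoring: engagement gets multiplied by an opaque quality number, and the result decides both feed position and monetization. The weights and the 0.5 cutoff below are made up, not any platform's real formula.

```typescript
// Invented scoring sketch: engagement combined with an opaque "quality"
// multiplier decides feed order and monetization. Not a real platform formula.

type Item = {
  title: string;
  clicks: number;
  watchTimeMinutes: number;
  qualityScore: number;     // 0..1, assigned by some internal classifier
};

function rankingScore(item: Item): number {
  const engagement = item.clicks + 2 * item.watchTimeMinutes;
  return engagement * item.qualityScore;   // a low multiplier buries anything
}

function buildFeed(items: Item[]): Item[] {
  return [...items].sort((a, b) => rankingScore(b) - rankingScore(a));
}

const monetized = (item: Item) => item.qualityScore >= 0.5;  // arbitrary cutoff

// Identical engagement, different quality score: one lands on top and is
// monetized, the other is buried and demonetized.
const feed = buildFeed([
  { title: "A", clicks: 1000, watchTimeMinutes: 500, qualityScore: 0.9 },
  { title: "B", clicks: 1000, watchTimeMinutes: 500, qualityScore: 0.1 },
]);
console.log(feed.map((i) => `${i.title} monetized=${monetized(i)}`));
```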
02:05:02.000People are super chatting right now, and it looks like there's a bunch of places where I am blacklisted and a bunch of places where I'm not.
02:05:08.000Someone says, just searching in incognito mode for TimCast YouTube channel, your channel is first, IRL second, playlists next three, then your wiki, in East Texas.
02:05:15.000Someone said, doesn't show up on Google in Tennessee.
02:05:18.000Just checked, and you're blocked in Canada as well, or at least Vancouver.
02:05:22.000Your newest videos do not show up under latest for Timcast when you search for Timcast.
02:05:32.000If someone could write a program that could measure where Tim is blacklisted and not blacklisted, and then extrapolate that so that any YouTube user could use the program to see where they're blacklisted through a VPN or something, that'd be a very cool program.
02:05:45.000You should see where you are in Germany, because Germany, the government's really bullied the social media companies into censoring a lot over there.
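One rough way to approximate the program being suggested is to query the public YouTube Data API's search endpoint with different regionCode values and check whether a given channel shows up in the top results. Caveat: the public API may not reflect what a signed-in user sees in regular search, and the query, channel ID, and API key below are placeholders you would supply yourself.

```typescript
// Sketch only: checks whether a channel appears in top search results per region.
// Requires Node 18+ (built-in fetch) and a YouTube Data API v3 key of your own.
// Public API results are only an approximation of what regular search shows.

const API_KEY = "YOUR_API_KEY_HERE";          // placeholder
const QUERY = "timcast";                      // hypothetical search term
const CHANNEL_ID = "UC_CHANNEL_ID_HERE";      // channel you want to check
const REGIONS = ["US", "CA", "DE", "GB", "AU"];

async function channelVisible(region: string): Promise<boolean> {
  const url =
    "https://www.googleapis.com/youtube/v3/search" +
    `?part=snippet&type=video&maxResults=25&q=${encodeURIComponent(QUERY)}` +
    `&regionCode=${region}&key=${API_KEY}`;
  const res = await fetch(url);
  const data = (await res.json()) as {
    items?: { snippet?: { channelId?: string } }[];
  };
  // Did any of the top results come from the channel we're checking?
  return (data.items ?? []).some((item) => item.snippet?.channelId === CHANNEL_ID);
}

async function main(): Promise<void> {
  for (const region of REGIONS) {
    const visible = await channelVisible(region);
    console.log(`${region}: ${visible ? "appears" : "missing"} in top results`);
  }
}

main().catch(console.error);
```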
02:07:49.000What's interesting is that your own followers are having trouble finding the links.
02:07:52.000I mean, you'd think that if these algorithms were working as intended, the person most likely to see a Tim Pool video would be someone who watches Tim Pool videos.
02:07:59.000But apparently that's not how it's working.
02:08:24.000It's what the sources inside Silicon Valley say.
02:08:26.000Facebook, Google, all these companies.
02:08:28.000If you want to read it, it's at deletedbook.com.
02:08:30.000But my vision of the future, so there are two possibilities.
02:08:34.000It really does come down to this election.
02:08:35.000It's amazing how much it comes down to this election because the Trump administration
02:08:41.000has made some excellent, you know, they didn't act as fast as they could have done, but they
02:08:44.000made some excellent appointments in the federal bureaucracy, got people in the right positions,
02:08:48.000good people by the way, Adam Candeub for example, a law professor who once sued Twitter in a
02:08:52.000free speech case, the Meghan Murphy case actually.
02:08:56.000And he wrote the petition to the FCC to make the Section 230 rule change.
02:08:59.000So these are people who know what they're doing, they understand the issue, they understand what needs to be done.
02:09:04.000But literally, the executive branch of the United States right now is the only powerful force, I think, in the entire world that actually wants to fix this problem and has the ability to do so.
02:09:14.000So it's really like the last chance for online freedom this election.
02:09:17.000I'm not just saying that because I'm a Trump supporter. Can you think of another powerful entity in Europe, in North America, anywhere in the world that's actually pushing back on this?
02:09:25.000No, it's only the American executive branch.
02:09:34.000The Borg is trying to take over and it's just Trump holding on with one hand off the side of the cliff and everyone else hanging on to his leg.
02:09:41.000And we're hoping he pulls us up and it's going to be real tough.
02:10:16.000I'll tell you both the pessimistic vision and the optimistic vision. The optimistic vision is, you know, Trump reforms Section 230 and fixes this.
02:10:21.000The pessimistic vision is that we get to a situation where a handful of critical race theorists in the San Francisco Bay Area get to control, invisibly and undetectably, control the emergence of political movements.
02:10:35.000So, you know, stop them even before they get off the ground.
02:10:37.000Not just in America, but all around the world.
02:10:43.000So that hate speech laws come in, the Constitution gets abolished, you go out with a sign saying, I should have a right to speak, and the cops come and bash you with a truncheon.
02:10:51.000Well, hopefully that doesn't happen, but Allum, thanks for hanging out.
02:10:54.000Do you want to mention your social media?
02:11:23.000And of course we do the show Monday through Friday live at 8pm.
02:11:26.000Smash that like button on your way out because apparently that helps as we're dealing with this algorithmic manipulation and censorship and all that stuff.
02:11:34.000We're also on, you know, Apple, iTunes, and all these other platforms because we diversify even though they'll probably act in concert at some point.
02:11:41.000At some point I need to nuke everybody.
02:11:42.000But anyway, if you want to find my other channels, which some people can and some people can't, it's YouTube.com slash TimCast and YouTube.com slash TimCastNews.
02:11:51.000Of course you can follow me on Twitter, Instagram, Parler, at TimCast.
02:11:54.000And of course you can follow at Ian Crossland.
02:12:08.000At the very least, tell people what's going on, because the most powerful thing you can do is speak up and share information, especially when we're dealing with these companies trying to restrict information.