RFK Jr. The Defender - December 21, 2023


Censorship Industrial Complex with Matt Taibbi


Episode Stats

Length: 42 minutes
Words per Minute: 153.7
Word Count: 6,547
Sentence Count: 434
Misogynist Sentences: 2
Hate Speech Sentences: 3


Summary

In this episode, Matt Taibbi talks with Robert F. Kennedy Jr. about the Colorado Supreme Court ruling that throws Trump off the ballot under the 14th Amendment, and why both see it as part of a larger pattern of lawfare: disenfranchising voters and substituting the judgment of unaccountable elites for that of the electorate. The conversation ranges across the erosion of the Bill of Rights from the war on terror through COVID, the censorship industrial complex uncovered in the Twitter files, the Virality Project and the invented category of "malinformation," the government and private money behind anti-disinformation NGOs, Elon Musk's opening of Twitter's files, Jack Dorsey's case for transparent, user-chosen algorithms and the Nostr protocol, and the Center for Countering Digital Hate's roots in Labour Together. Taibbi is the author of four New York Times bestsellers, including Insane Clown President, The Divide, Griftopia, and The Great Derangement, and an award-winning columnist for Rolling Stone. His recent book, Hate Inc., is a turbocharged take on how the media twists the truth to pit us against each other. He now publishes at Racket News (www.racket.news).


Transcript

00:00:00.000 Hey, everybody.
00:00:01.000 I've got one of my favorite guests here, Matt Taibbi.
00:00:03.000 Matt is the author of four New York Times bestsellers and an award-winning columnist for Rolling Stone.
00:00:11.000 His recent book, Hate Inc., is a turbocharged take on how the media twists the truth to pit us against each other.
00:00:19.000 In talks, Taibbi paints an alarming portrait of politics, media, culture, while providing a way forward against our most urgent crises.
00:00:29.000 Taibbi also wrote the New York Times bestsellers Insane Clown President, The Divide, Griftopia, and The Great Derangement.
00:00:37.000 He is a winner of the 2008 National Magazine Award for Columns and Commentary, and he is one of the most, I think, eloquent deconstructors of Wall Street and the banking system and all of the big, big grifters in our country and our international economy, from the Fed on down.
00:01:01.000 Anyway, welcome back, Matt.
00:01:04.000 Thank you.
00:01:04.000 Thank you very much for having me.
00:01:07.000 The Supreme Court case just came out, and it's a Colorado Supreme Court case that probably by the time this shows, everybody will have rehashed and rehashed.
00:01:16.000 Tell me what your take is on it.
00:01:18.000 It throws Trump off the ballot under the 14th Amendment.
00:01:24.000 Because of this statute that was passed after the Civil War to keep Confederate veterans, the Confederate leadership, from holding office in future elections, to protect Reconstruction in the South.
00:01:40.000 That's really why it was passed.
00:01:42.000 I'm not an attorney, so I'd be curious actually to hear your opinion on this, because it feels to me that this is not the purpose for which that law was designed.
00:01:56.000 It sounds crazy to me.
00:01:58.000 It's very much in line with this sort of lawfare phenomenon that we've seen.
00:02:02.000 In the last seven years where, you know, rather than leaving the fate of America to the auspices of voters, whom I think elites increasingly distrust in this country, they are looking for other techniques.
00:02:17.000 It's not dissimilar from what happened, you know, in Florida, and what happened with you in some other Democratic states.
00:02:27.000 But this is, I think, I think it's a huge and crazy step.
00:02:30.000 I'd be curious to hear what you think.
00:02:32.000 I feel the same way that you do.
00:02:34.000 I think it's bizarre.
00:02:37.000 And then the second point that you make, that it's part of this pattern of disenfranchising voters and choosing our political leaders through appointment of the party leadership and the elites is...
00:02:54.000 I'm not a fan of Trump.
00:02:55.000 I'm running against him.
00:02:56.000 But I don't want to win this way.
00:02:58.000 I don't want to win by kneecapping my opponent.
00:03:01.000 I want to be able to win an election fair and square and not leave half the country angry that they didn't get to vote for their candidate.
00:03:09.000 This is the kind of thing that you see in Pakistan or Iran or some other country that has a pretend democracy where popular leaders are...
00:03:22.000 kept out so that they can't participate, or they get rid of them one way or another so that they can't run.
00:03:29.000 And we look at that and we say, "Yeah, that's not a real democracy." Well, that's happening here now.
00:03:34.000 And like I say, I'm not a fan of Trump's, but if you don't believe in the people as the demos, the demos of democracy, you can't say that you believe in democracy if you distrust the people to choose their own leaders.
00:03:53.000 Yeah, it's shocking.
00:03:55.000 I mean, obviously we saw hints of this in 2016, 2015, when we started to see the spate of punditry where they were talking about there's just too much democracy.
00:04:06.000 We have to go back to the smoke-filled room.
00:04:09.000 We can't allow the rabble to be making these consequential decisions or irrational decisions.
00:04:15.000 But even going back before that, I think some of the roots of this you have to put at the feet of the war on terror, and think about all of the major decisions that we made, sort of casually throwing out Bill of Rights protections that have been central to American life for hundreds of years. I mean, we threw out habeas corpus.
00:04:39.000 Due process is gone in a lot of these procedures.
00:04:44.000 If you add COVID, which is kind of the culmination of the war on terror, it started with 9/11 and ended with COVID, where they completely shut the door on the U.S. Constitution, and they got rid of free speech, the First Amendment.
00:04:58.000 They got rid of the right to assemble with social distancing.
00:05:01.000 Who could have imagined that it would be a crime to assemble?
00:05:06.000 That seemed like a...
00:05:08.000 Well, why even put that in the Constitution, because who's going to tell you you can't get together with your friends?
00:05:15.000 Well, you know, they did it.
00:05:17.000 And then the religious freedom, they closed every church in this country for a year with no scientific citation and no notice and comment rulemaking, no public hearing, no environmental impact statement.
00:05:32.000 And then they got rid of jury trials under the Seventh Amendment by saying if one of these companies hurt you with a vaccine or some other intervention, you can't sue them.
00:05:41.000 And then the whole Fourth Amendment prohibition against warrantless searches and seizures was just obliterated by all the track-and-trace surveillance, and you've got to show your medical records before you go into a building, and all of this.
00:05:56.000 It was really, it's like they targeted the Bill of Rights one after the other for obliteration.
00:06:02.000 Very odd.
00:06:03.000 Very alarming that nobody cared.
00:06:05.000 The Fourth Amendment case, the one involving getting geolocation information from phone providers.
00:06:12.000 When this decision came down, I immediately thought of the sort of internal memos that the Obama administration wrote.
00:06:21.000 In support of the drone program, where there was this very curious part in one of them that talked about how due process didn't necessarily have to involve the party in question, that any kind of process could be due process.
00:06:38.000 That's kind of what this is like.
00:06:39.000 It's sort of like due process, but yet the important party isn't invited.
00:06:44.000 It's nuts, but there's a generation of people, I think, that's growing up that thinks this is normal.
00:06:51.000 And, you know, it's astonishing.
00:06:54.000 I mean, the whole country needs a civics lesson.
00:06:57.000 It's really...
00:06:58.000 I don't know how this...
00:07:00.000 My kids' generation, they think it's normal for the government to be...
00:07:04.000 Like, they're not freaked out at all about the government reading all their emails and everything else.
00:07:09.000 They shrug.
00:07:10.000 And then, you know, when COVID came...
00:07:12.000 I mean, the biggest liberals that I knew... and, you know, the word liberal...
00:07:19.000 It means freedom.
00:07:20.000 It was drawn from freedom of speech and the idea that they could start censoring people.
00:07:29.000 They started with the vaccines and with these medical interventions, but then they kept expanding it to the Ukraine war.
00:07:36.000 It's exactly the stuff you want people talking about.
00:07:40.000 Criticizing government policies is the reason we fought the revolution in the first place, because people wanted to be able to criticize their government without...
00:08:12.000 It's just extraordinary.
00:08:13.000 I mean, they're not supposed to be propagandizing the public, and they just told them a lie straight out right before the election.
00:08:22.000 Oh, I know.
00:08:23.000 I mean, this was a big theme of all the stuff that, you know, we've been working on in the last year, which is that all of those prohibitions against intelligence agencies propagandizing the domestic population, you know, from the Smith-Mundt Act to the charters for the State Department and the CIA, they're all just being violated willy-nilly, all of them.
00:08:47.000 And there's not even any kind of embarrassment about it or any suggestion that anybody's even worried about it.
00:08:55.000 You have agencies like the Global Engagement Center, which is a State Department agency, and they're openly involved in censoring domestic content or doing content moderation domestically.
00:09:08.000 And they have no legal remit to do that.
00:09:10.000 And it's just considered, you know, people just shrug, as you say.
00:09:15.000 And it's remarkable.
00:09:17.000 I don't want to insert my situation into the conversation, but part of this is why the Secret Service denial to me, you know, which is the first time in history that anybody who asked for Secret Service protection has been denied...
00:09:35.000 And I exceed all the parameters and metrics for which they've routinely given Secret Service protection.
00:09:42.000 And, you know, it's a minor, minor issue.
00:09:45.000 But it's now all of these agencies.
00:09:45.000 When my dad went into the Justice Department his first week in there, and Arthur Schlesinger talks about this in his biography, he gave a speech to all the division heads and branch heads saying nothing is going to be politicized here.
00:10:01.000 You know, we don't go after people.
00:10:03.000 We don't ask their political party.
00:10:05.000 None of that's going to happen in my Justice Department.
00:10:08.000 And that was a routine speech that every attorney general gave
00:10:14.000 to the Justice Department, because it was so important that Americans have faith in the institutions of democracy, that they aren't politicized, and that the whole world saw institutions...
00:10:28.000 That could not be corrupted.
00:10:30.000 And now you have this, you know, it's really the thing that disturbs me of all the things that, you know, disturb me about President Biden right now.
00:10:39.000 You know, the war and all of that.
00:10:41.000 The war is kind of part of his DNA. He's always been kind of a, you know, a warmonger.
00:10:46.000 And, you know, he's always been kind of pushing for the war in Ukraine.
00:10:51.000 But the thing that really...
00:10:54.000 That really irks me is all this politicization of these institutions without him ever saying anything about it.
00:11:03.000 He's just going along with it.
00:11:06.000 I feel like they've completely lost touch with what America is supposed to look like.
00:11:12.000 People don't even know what it's supposed to look like anymore.
00:11:16.000 Yeah, I mean, absolutely.
00:11:18.000 The Secret Service decision in your case, I mean, it jibes with this overall trend where you have this odd contradiction among sort of the upper-class, blue-leaning intellectuals in this country.
00:11:35.000 On the one hand, they're furiously angry that there's been this collapse of trust in elite institutions.
00:11:42.000 And they want desperately to remedy that.
00:11:45.000 But at the same time, they keep taking these steps that guarantee that nobody will trust those institutions.
00:11:52.000 And it's everything from what you're talking about, like with the obvious politicization of the Secret Service to this Supreme Court case to the news media not correcting major errors.
00:12:05.000 You know, year after year after year to the censorship issue.
00:12:10.000 If you want people to listen to the national news media, they've got to stop getting things wrong and they have to, you know, admit it when they do.
00:12:21.000 And they refuse to do it.
00:12:23.000 And then they're surprised that there's a loss of trust.
00:12:25.000 It's baffling to me.
00:12:27.000 I don't really understand it.
00:12:28.000 I mean, I'd be curious.
00:12:30.000 Well, people say to me all the time, you know, accusingly, a lot of the mainstream media, you know, you've been bashing, telling everybody not to trust the institutions of government, and you've got people to doubt NIH and CDC. And if you then get into the presidency after you've destroyed the trust in all these institutions, how are you going to govern?
00:12:51.000 And I say to them, I'm going to make people trust the institutions again.
00:12:56.000 How are you going to do that?
00:12:57.000 I'm going to make them trustworthy.
00:12:58.000 That's what you've got to do.
00:13:01.000 You can't just force people to trust stuff. You'd be out of your mind to trust the government or the mainstream media today.
00:13:08.000 If you trust government, if you trust the mainstream media, you're not paying attention.
00:13:13.000 You're not paying attention to anything.
00:13:17.000 Yeah, and again, this goes back to your situation came up in exactly this way in probably one of the last Twitter files reports that we did that was about the Virality Project, which was this Stanford effort backed by the Department of Homeland Security and the Global Engagement Center and some other government agencies to root out what they called COVID misinformation, disinformation and malinformation.
00:13:47.000 But what they did is there was this weird drift.
00:13:50.000 They would start off talking about things that, you know, would kind of obviously fall under the category of misinformation, you know, like the idea of microchips being, you know, implanted in vaccines or something like that.
00:14:04.000 But then, very quickly, they would come to define anything that undermines confidence in government policies, anything that undermines confidence in individual officials like Anthony Fauci, or anything that would, you know, quote-unquote promote hesitancy.
00:14:23.000 They'd define that as misinformation.
00:14:26.000 And there was a passage that particularly referred to you that I thought was one of the most striking things, where they talked about how repeat offenders, people like you, are almost always reportable.
00:14:41.000 You know, to me that was striking because as a journalist, you know, we're trained that if we get something wrong, that you punish the speech, not the speaker.
00:14:50.000 You don't sort of decide that somebody is inherently libelous or, you know, is more prone to libel than somebody else.
00:14:59.000 You deal with each specific case, but that's not how they do it.
00:15:03.000 They're assessing people and sort of making these binary decisions, trustworthy, not trustworthy, and there's no process.
00:15:11.000 They just kind of put people in baskets.
00:15:14.000 It's a crazy way to go about things, and it's totally contrary to the spirit of this country, which is everybody gets a chance to defend themselves.
00:15:22.000 Everybody gets a chance to give their side, and nah, they're not into that.
00:15:27.000 Yeah, I mean, you mentioned an interesting word, malinformation, which I think they made up.
00:15:32.000 And they applied that to me a lot because of what I was writing about.
00:15:37.000 Listen, if somebody said to me, you got this wrong, and there's a couple things I got wrong, and we immediately corrected them.
00:15:43.000 Somebody says to me, you got this wrong, this description, this study didn't happen.
00:15:48.000 There was a Filipino study that I wrote about that I got a fact wrong and I immediately corrected it.
00:15:54.000 But nobody could point to a piece of information that I had put that was false.
00:16:00.000 And we managed that because, you know, we had a major fact-checking operation at CHD. We had 350...
00:16:09.000 PhD scientists and MD physicians who were on this, you know, scientific advisory board, and everything that went out from us was cited and sourced.
00:16:19.000 We were very, very careful, and nobody pointed to an actual erroneous statement that I made.
00:16:26.000 But they invented this new word called malinformation, which is not misinformation or disinformation.
00:16:33.000 Disinformation is somebody deliberately seeding the dialogue with a manipulative piece of misinformation.
00:16:42.000 Misinformation is just you got it wrong.
00:16:44.000 Disinformation, you deliberately got it wrong.
00:16:47.000 Malinformation is information that is factually correct, but it is inconvenient, nevertheless, to government officials.
00:16:55.000 And they had to make up this word, which has some bizarre pedigree, I don't know, some etymological root.
00:17:04.000 Yeah, I'm still looking for the first use of that.
00:17:08.000 Yeah, because when we first encountered it, you know, the DHS, the Department of Homeland Security, they had something called the MDM subcommittee.
00:17:17.000 So that's the misinformation, disinformation, and malinformation subcommittee.
00:17:22.000 Malinformation ended up becoming one of the categories that could be applied in the virality project and some other sort of anti-disinformation schemes that went on.
00:17:35.000 But it was definitely used by the Department of Homeland Security. So it has to have come from somewhere, and you're right.
00:17:43.000 It's a word that was specifically invented to deal with things that are true, that aren't false, but that they want to treat as false.
00:17:57.000 We saw plans for discussions about sort of future DHS activity where they talked about building resilience in the population against what they call despair-inducing MDM. So despair-inducing malinformation can just be things that are true that make the population dissatisfied.
00:18:21.000 Yeah.
00:18:21.000 Me saying, you know, this individual is corrupt, this government official is corrupt.
00:18:26.000 Right.
00:18:27.000 That is depressing.
00:18:28.000 Right.
00:18:29.000 Yes, exactly.
00:18:30.000 Exactly.
00:18:30.000 But you can see how very quickly a person who is put in the position of evaluating all this and thinks that they're doing God's work, they will come around pretty quickly to starting to define all kinds of things as malinformation.
00:18:44.000 And that's the big problem with this stuff.
00:18:48.000 They start doing things like, oh, there's somebody on Facebook who's talking about a relative who got myocarditis after getting the shot.
00:19:00.000 And this person might even be pro-vaccine, but they'll call that malinformation because it's, you know, quote unquote, promotes hesitancy.
00:19:07.000 And yeah, they had a whole list of those incidents, but it's a crazy concept.
00:19:14.000 It's the kind of thing that Orwell would have invented.
00:19:17.000 And I know that's a cliche, but in this case, it's really, it's very apt, I think.
00:19:22.000 Yeah, I mean, one of the amazing things that you're now encountering, like with this institution at Stanford, is this question about where the money is actually coming from for all of these. It looks like a lot of it is coming through intelligence agencies, and that it's routed through these, you know, these bizarre chains that people...
00:19:48.000 Tell us what you found and who is funding and who is really behind the censorship industrial complex and who are the characters who are, you know, who they brought into this whole thing.
00:20:00.000 Have you ever run across Avril Haines?
00:20:03.000 Oh, right.
00:20:04.000 Yes, of course.
00:20:05.000 Yeah.
00:20:06.000 This is part of the reason that the, you know, the Twitter files story and then, you know, some of the other stuff that we've done since then...
00:20:13.000 It's incredibly confusing because this new industry, this kind of censorship industrial complex, as Michael Schellenberger calls it, it comes from a lot of different places.
00:20:24.000 It grew out of sort of the counter messaging operations in the war on terror.
00:20:31.000 So you had groups within the Pentagon that were doing anti-disinformation work.
00:20:37.000 They were targeting, in Arabic, ISIS and Al-Qaeda, and they were funded pretty heavily over the years, but they switched.
00:20:48.000 You know, they went from this, what one person called, one former agent called CT to CP. It's counterterrorism to counterpopulism.
00:20:55.000 So you move...
00:20:56.000 from the CSCC, which was strictly anti-terrorist, to the Global Engagement Center, which is kind of strictly anti-disinformation, and they're one of the partners for Stanford.
00:21:07.000 So we found money from the Department of Defense, from the State Department.
00:21:12.000 From the National Science Foundation, all of those contributed to the Stanford programs.
00:21:18.000 Also, we found a significant amount of money that came from private donors like Reid Hoffman.
00:21:23.000 It was a big one, Pierre Omidyar.
00:21:25.000 I don't know if he was involved with the Stanford programs exactly, but he's involved with a lot of these programs.
00:21:30.000 Explain who those guys were.
00:21:33.000 Reid Hoffman's a LinkedIn billionaire.
00:21:36.000 There's the Newmark Foundation, which is Craigslist; Pierre Omidyar's is eBay.
00:21:41.000 A lot of these tech billionaires are major funders of what they call anti-disinformation
00:21:50.000 NGOs. I think that's sometimes an overly generous term, because a lot of them aren't really non-governmental.
00:21:58.000 They're pretty explicitly partnered with governments.
00:22:00.000 But there's a lot of private money that gets mixed in with these operations.
00:22:08.000 I think the Stanford Election Integrity Partnership was really sort of a prototype for how to do these things.
00:22:16.000 It was kind of started by the Department of Homeland Security.
00:22:20.000 The idea seems to have come from there, but they can't do it because they don't have the legal authority.
00:22:26.000 This is said openly by the people who run the program: we have to do this because DHS, they even said, the exact quote was, they kind of don't have the legal authority to do this, or the funding.
00:22:41.000 So you need kind of a private face so you can step in and do this work that would be absolutely illegal if the government did it directly.
00:22:51.000 So the funding, you know, is sometimes routed in this indirect way. You know, Stanford gets a number of government grants, but they're not directly for these programs.
00:23:04.000 They also get some support from the Newmark Foundation, right?
00:23:08.000 Or from somebody like Reid Hoffman will contribute to a group like New Knowledge, which does this kind of work.
00:23:08.000 But the consistent pattern is always the involvement in some way of a government agency or intelligence agency, a big pile of money that comes at least in part from the private sector, and then sometimes, like, a veneer of an academic project on top of it.
00:23:35.000 And that seems to be what the standard pattern is.
00:23:39.000 So talk about some of the highlights of what you found most recently in the Twitter files.
00:23:46.000 Yes.
00:23:47.000 So, in working on this, we had a bunch of whistleblowers come forward from different...
00:23:52.000 And just lay the groundwork for people who don't know what the Twitter files are.
00:23:56.000 These are all documents that were made available by Elon Musk when he purchased Twitter, correct?
00:24:04.000 Yeah, so Elon Musk, when he bought Twitter, one of the things he did is he invited a bunch of independent journalists to San Francisco, had a surprisingly brief meeting with all of us, basically said, I'm going to open up all the files for one of...
00:24:22.000 America's largest corporations and you can do what you want with that material.
00:24:27.000 And basically did that for about two and a half months.
00:24:31.000 We were just sort of rooting around with no supervision in Twitter's files or with limited supervision, I would say.
00:24:38.000 And we found all kinds of stuff that I think he didn't even know was there, in particular about the relationship between the FBI, the Department of Homeland Security, and all these platforms.
00:24:50.000 We found that was a very sophisticated, constant relationship where they were flagging lots of content.
00:24:57.000 So we did a lot of those reports and they made a lot of noise last year around this time.
00:25:03.000 But since then we've had other people come forward with other documents from other kind of censorship operations.
00:25:13.000 There's this thing called the CTI files that we did a couple of weeks ago.
00:25:19.000 And this was a group that was also put together, ostensibly, to police COVID misinformation and disinformation. It was founded by a British data scientist who has some former defense ties, and then somebody who was still working for the Pentagon at the Special Operations Command as a quote-unquote technologist at the time.
00:25:45.000 And they organized this group of volunteers, quote-unquote volunteers, that were largely from the tech world, not only to review content, but to do things like create sock puppet accounts to infiltrate groups online.
00:26:03.000 We have training videos where some of the people involved in this group are talking to the new recruits and saying, we're going to do all the things the bad guys do, but for good reasons.
00:26:16.000 So that includes using fake accounts, infiltration, repetition, creating false news stories.
00:26:27.000 It's all kinds of documents like this, and they're openly saying, we want you to create more sock puppet accounts for Twitter and Facebook.
00:26:36.000 We want you to use burner phones.
00:26:38.000 I mean, it's stuff like this.
00:26:39.000 So we just put that out a couple of weeks ago, but the key takeaway is that the Twitter files are mainly about the defensive aspect of this, which is censorship and de-amplification, but we're also now finding out that there's an offensive component to some of these operations, where they're sort of COINTELPRO-style, you know, creating the appearance of things that aren't real on the internet.
00:27:04.000 So you might have fake accounts that are trolling individuals.
00:27:09.000 I'm sure your account is subject to it all the time on Twitter.
00:27:13.000 And my guess is that we're going to find more of that as we go forward.
00:27:17.000 Just take a moment to comment on Elon Musk.
00:27:22.000 It's pretty unique because there's no corporate CEO in our country who would allow anything like that to happen.
00:27:31.000 And his lawyers would lock him in a padded cell if he tried it.
00:27:37.000 It's amazing that he led you into this treasure trove of, like...
00:27:43.000 actionable information that makes the company look terrible.
00:27:48.000 Oh, exactly.
00:27:48.000 I mean, as an attorney, I'm sure you can appreciate this.
00:27:52.000 The first meeting I had with him, there was a fairly senior attorney in the room.
00:27:58.000 And he's sort of going on and on about, yeah, you can look at this and that.
00:28:03.000 And the attorney sort of gently raises a hand and says, we're not talking about privileged material, though, are we?
00:28:11.000 And he's like...
00:28:12.000 Yeah, we are.
00:28:13.000 Why not?
00:28:13.000 I looked over and you can imagine the look on that person's face.
00:28:19.000 And the look on that person's face was actually one of the things that reassured me that this story was for real.
00:28:25.000 And actually, all throughout the project, the kind of look of abject horror on every attorney's face that we saw in the Twitter offices was an indication that the stuff we were getting was deeply upsetting to them.
00:28:39.000 Because in addition to all the things that were pertinent about censorship and cultural issues, there was all kinds of stuff in there about ongoing litigation and financial information that we could have just dumped out there and it could have made a tremendous headache for the company if we wanted to.
00:28:58.000 But he didn't care.
00:28:59.000 In fact, the only thing he ever really did...
00:29:01.000 I mean, I remember this one moment...
00:29:03.000 Very clearly, there were like 10 of us in a room, just clacking away, looking at all this stuff.
00:29:10.000 And he sort of poked his head in like, you know, the show Fawlty Towers and he said, does anyone need any coffee or anything like that?
00:29:17.000 And, you know, then he disappeared after that.
00:29:21.000 That was his contribution to the whole thing.
00:29:24.000 There was none of this kind of overlord, you know, hanging over our shoulder watching what we were doing. It was really weird, Robert, but it was amazing.
00:29:34.000 I mean, again, I'd be curious to hear, I mean, as an attorney, I can't imagine that they would have gone to bed at night thinking anything but just pure horror about the whole thing.
00:29:44.000 Yeah, I'm stunned because I'm sure that they were screaming at him.
00:29:51.000 I've never heard anything like this.
00:29:52.000 It's completely irresponsible, but it was so beautiful that he did it.
00:29:56.000 And he's a South African and he loves our country enough and the whole idea of free speech.
00:30:03.000 He's only doing it here, by the way.
00:30:06.000 In other countries, right now, all of these companies are bowing and scraping to foreign leaders.
00:30:13.000 And they're all over Europe.
00:30:15.000 They're censoring everything.
00:30:16.000 And, you know, they're spying and censoring.
00:30:19.000 But he's carved out this country and said, you know, we're going to keep this as kind of an oasis of free speech.
00:30:25.000 Because he has to.
00:30:26.000 He can't.
00:30:26.000 In Europe, they passed these terrible laws that say if you violate them by putting up content questioning vaccines, that kind of stuff, you pay these huge penalties.
00:30:37.000 They're ruinous.
00:30:38.000 They'll bankrupt you.
00:30:39.000 Of course, in China, you know, nobody can operate in China without doing exactly what the government tells them to do.
00:30:46.000 So he's doing it over there.
00:30:48.000 You know, he's keeping his business model alive by, you know, by caving in to them over there.
00:30:53.000 But it's just unbelievable what he's done here in terms of protecting free speech.
00:30:58.000 And think about...
00:30:59.000 where we'd be right now in this country if it weren't for Elon Musk, because Facebook is not going to lift the censorship.
00:31:06.000 Google's not going to lift it.
00:31:08.000 Instagram, YouTube, they're all heavily censored.
00:31:11.000 And it's the only place that free speech survives, and everybody else has got corporate control.
00:31:17.000 You don't hear anything.
00:31:19.000 And Twitter's the only place where there's free speech left on a big platform in our country.
00:31:25.000 Yeah, I mean, and obviously, look, Elon has his foibles.
00:31:28.000 I mean, he's having a spat with the company where I publish right now, and some of the, you know, so he's suppressing some links there.
00:31:37.000 But overall, yeah, absolutely.
00:31:39.000 The only way...
00:31:41.000 the public, A, would know anything about any of the stuff that's going on, and then B, that there would be any chance for any kind of carve-out in what turns out to be a pretty ironclad informational cartel, not just in America, but all around the Western world, is if a highly eccentric billionaire decides to opt out.
00:32:04.000 I mean, this is like the one scenario that they didn't, you know, account for when they built this system.
00:32:10.000 And even then, I don't know how long he's going to be able to hold out because they have so many different ways of applying pressure.
00:32:17.000 And the law you referred to in Europe, the Digital Services Act, that's going to be the prototype of the kind of thing they're going to install everywhere.
00:32:26.000 They obviously want to do it here.
00:32:28.000 It'll be harder because we have a different tradition, but you're right.
00:32:32.000 The penalties are crippling for even one violation of that act.
00:32:35.000 So it's going to be interesting to see what happens there.
00:32:39.000 Yeah, you know, I talked to Jack Dorsey about it, and he really admires Elon.
00:32:46.000 He's very interesting about it because he's critical about some of the financial choices Elon made at the beginning.
00:32:53.000 He thinks he should have unloaded Twitter and then maybe bought it back after.
00:32:59.000 Because it was clear that it was going to plummet.
00:33:02.000 But he said a couple of interesting things.
00:33:04.000 One is he said, ultimately, they're going to make Elon cave because there's so many ways they can come after him.
00:33:11.000 And they can, first of all, get rid of all the advertisers, which is what they're doing.
00:33:15.000 And he's been really courageous about that, saying, go ahead, do it.
00:33:20.000 Hey, I loved that great moment, right?
00:33:22.000 The go-after-yourself moment, which was fantastic.
00:33:25.000 The other thing that Jack Dorsey said, because I asked him, what are you, you know, what's the solution to all this manipulation that's going on in the internet where these sites are, you know, are censoring and manipulating the way we think about things, the way we see the world, the way that we experience the world.
00:33:45.000 And, you know, everything... Things can be programmed.
00:33:53.000 Societies can be programmed.
00:33:55.000 And this instrument of the internet is the perfect way, as it turns out, to program human beings for compliance.
00:34:04.000 And what Jack Dorsey said is the answer to that is to make all the algorithms transparent so that you can choose your own algorithm.
00:34:15.000 So right now, if you're a Republican and you ask a question, you'll get a different set of information than your neighbor who's a Democrat, because the algorithm is trying to figure out how to accomplish certain things, mainly to maximize the amount of time that you're going to spend on this site.
00:34:34.000 And they do that by feeding you information that fortifies your existing worldview and your existing beliefs.
00:34:41.000 But a lot of the manipulation is taking place involuntarily.
00:34:44.000 They're trying to make us see the world in a certain way.
00:34:48.000 And he said that the only way to counter that is to make all the algorithms transparent and allow you to choose your own algorithm.
00:34:56.000 So you can say, you know, I want a Republican algorithm.
00:35:00.000 I want a Democrat algorithm.
00:35:01.000 I want an algorithm that feeds me stories about biology that does this, this and the other.
00:35:07.000 So at least, you know, you're in charge of your own manipulation.
00:35:12.000 And I said to him, that's actually a great idea.
00:35:14.000 Yeah.
00:35:15.000 And he said, I've said it five times.
00:35:17.000 I've testified in front of Congress.
00:35:19.000 This is the way to solve the problem.
00:35:21.000 And he said, they just, they know everything.
00:35:24.000 It just doesn't even make a ripple.
00:35:26.000 But anyway, it's an interesting idea.
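To make Dorsey's transparent-algorithm idea concrete, here is a minimal sketch in TypeScript of what user-selectable feed ranking could look like. The Post shape and the two scoring functions are hypothetical illustrations, not any platform's real API; the point is only that a ranking algorithm can be a small, inspectable function the user picks, rather than an opaque system the platform imposes.

  // A feed item. This shape is a hypothetical illustration, not a real API.
  interface Post {
    id: string;
    author: string;
    text: string;
    postedAt: number; // Unix epoch seconds
    likes: number;
  }

  // An "algorithm" here is just a transparent scoring function over posts.
  type RankingAlgorithm = (post: Post, nowSeconds: number) => number;

  // Two example algorithms a user might choose between.
  const algorithms: Record<string, RankingAlgorithm> = {
    // Pure reverse chronology: no engagement signal at all.
    chronological: (post, _now) => post.postedAt,
    // Engagement-weighted: roughly the default platform behavior,
    // boosting liked posts and decaying them by age.
    engagement: (post, now) => {
      const ageHours = (now - post.postedAt) / 3600;
      return post.likes / (1 + ageHours);
    },
  };

  // The user, not the platform, decides which scoring function runs.
  function buildFeed(posts: Post[], choice: string): Post[] {
    const now = Date.now() / 1000;
    const rank = algorithms[choice] ?? algorithms["chronological"];
    return [...posts].sort((a, b) => rank(b, now) - rank(a, now));
  }

Since each algorithm is a few readable lines, a "Republican algorithm," a "Democrat algorithm," or a biology-heavy algorithm would just be more entries in that table, each open to inspection before the user opts in.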
00:35:28.000 Well, he helped develop a social media platform called Nostr, which is really fascinating, because the concept of it is to make the social media program not a full-service program like Twitter, but more like a protocol, like email.
00:35:50.000 So everybody uses email, but everybody can use their own version of it.
00:35:56.000 You can have Gmail if you want.
00:35:58.000 You can have Yahoo Mail if you want.
00:36:00.000 It'll have different features to it.
00:36:02.000 It will sort your mail in a different way or whatever.
00:36:05.000 And the idea behind Nostr is that the protocol would basically be non-manipulable, but you could overlay your own filters and algorithms.
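As a rough sketch of that protocol-versus-platform split: in a Nostr-style design, relays only store and forward signed events, and all filtering runs in the client. The event shape below loosely follows Nostr's published NIP-01 event format (tags omitted), but the filters and the applyFilters helper are hypothetical illustrations, not code from any real client.

  // A Nostr-style event, loosely following the NIP-01 shape.
  interface NostrEvent {
    id: string;         // hash of the serialized event
    pubkey: string;     // author's public key
    created_at: number; // Unix epoch seconds
    kind: number;       // in NIP-01, kind 1 is a short text note
    content: string;
    sig: string;        // author's signature; a relay can't forge or alter it
  }

  // In this model, every filter is the user's own code, not the server's.
  type ClientFilter = (ev: NostrEvent) => boolean;

  const onlyTextNotes: ClientFilter = (ev) => ev.kind === 1;

  const muted = new Set<string>(["pubkey-of-spammer"]); // hypothetical mute list
  const notMuted: ClientFilter = (ev) => !muted.has(ev.pubkey);

  // Runs on the user's machine: a relay can withhold events it stores,
  // but it can't reorder or rewrite what the client chooses to display.
  function applyFilters(events: NostrEvent[], filters: ClientFilter[]): NostrEvent[] {
    return events.filter((ev) => filters.every((f) => f(ev)));
  }

  // Usage: applyFilters(eventsFromRelays, [onlyTextNotes, notMuted])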
00:36:18.000 I think that's a really great idea.
00:36:21.000 If they can make that functional, then that's terrific.
00:36:25.000 Because the problem right now is that anything that's owned, they're going to be able to manipulate it.
00:36:31.000 Even with Elon sitting there trying desperately to keep control over his own company, they're able to impact the revenue In so many different ways.
00:36:42.000 And we even said this to each other in the first days of the Twitter files, the reporters.
00:36:46.000 We were like, whatever this is, it's temporary.
00:36:49.000 They're going to pressure this company and this thing is going to get locked down.
00:36:55.000 So yeah, I think you have to find a way to make it so that it's not controllable and not susceptible to manipulation.
00:37:04.000 And Jack's right.
00:37:06.000 Do you use Nostr?
00:37:08.000 You know, I like Nostr a lot.
00:37:10.000 I would use it more.
00:37:12.000 The functionality isn't perfect for what I do.
00:37:14.000 There are some features that are missing.
00:37:17.000 It's not easily searchable for news, which makes it pretty hard for somebody like me to use it as a primary social media tool.
00:37:27.000 But I think in the near future, they'll figure out a way to get around that, and then that's where people will go.
00:37:36.000 This is one of the interesting things about the internet in this period is that we're seeing that audiences are capable of mass moving from one place to another pretty quickly.
00:37:46.000 And you have to learn how to navigate that landscape.
00:37:49.000 I mean, as a political candidate, you must have to be thinking about that right now.
00:37:54.000 Because the strategies are totally different than they were even a year ago.
00:37:58.000 Yeah.
00:37:59.000 I mean, we have a whole team.
00:38:00.000 And luckily, Amaryllis is running the campaign.
00:38:03.000 You know, she was like a marketing director or something for Twitter and had her own tech company.
00:38:08.000 And so she's in kind of the perfect position to do this.
00:38:11.000 But I use Nostr.
00:38:13.000 And, you know, it's kind of like all the cool people now are using Nostr.
00:38:18.000 Yeah, I definitely like it.
00:38:20.000 I think it's cool.
00:38:20.000 Yeah.
00:38:22.000 Anyway, we've got to get off because I'm being told we've got another thing, which I hate to leave you because there's so much more to talk about.
00:38:30.000 Tell people how they can find you.
00:38:31.000 You're on Rumble, right?
00:38:33.000 I'm actually on Substack.
00:38:35.000 So it's at www.racket.news and you can find our stuff there.
00:38:43.000 It's R-A-C-K-E-T. Racket.
00:38:47.000 Like a tennis racket.
00:38:49.000 Yeah, you know, we're publishing a lot these days and have a lot of stuff coming out about the subject.
00:38:56.000 I've got Labour Party files coming out about the CCDH and some other things.
00:39:01.000 So, yeah, which I think you'll find interesting.
00:39:03.000 That's the Center for Countering Digital Hate.
00:39:06.000 Yes, they're big fans of yours, I'll put it that way.
00:39:10.000 I'm on their Disinformation Dozen list.
00:39:14.000 People have been doing a lot of work.
00:39:15.000 I think Paul Thacker has been doing some good work, and he was part of your crew on the Twitter files.
00:39:21.000 He was.
00:39:21.000 Yeah, he came in.
00:39:23.000 I've known Paul a long time, back to the days when he was a Senate staffer.
00:39:28.000 Terrific journalist, great investigator.
00:39:30.000 Yeah, and that Center for Countering Digital Hate, that has a really interesting pedigree.
00:39:37.000 And the money sources of that, most of them are very obscure.
00:39:42.000 And anyway, that'll be interesting as that begins to unravel.
00:39:46.000 Just quickly about that.
00:39:47.000 I mean, that's one of the things we've got in these documents is sort of concrete proof that the CCDH started as a project of something called Labour Together, which is...
00:39:58.000 Destroyed the left wing of the Labour Party in Britain.
00:40:01.000 Exactly.
00:40:03.000 And they basically are doing the same thing to the Democratic Party here.
00:40:06.000 They took the Labour Party in Britain and turned it over.
00:40:11.000 They empowered the corporate wing, it was corporate controlled, and they destroyed all the progressive wing, and they did the same thing to the Democratic Party here.
00:40:21.000 Yeah, it's like an exaggerated version of what the DLC did to the Democratic Party, right?
00:40:26.000 But they did it with the aim of getting rid of Jeremy Corbyn first, but now this project, the CCDH, has morphed into this massive, extremely ambitious thing that impacts quite a lot in the world.
00:40:40.000 So, anyway, you know that.
00:40:42.000 They've paid a lot of attention to you.
00:40:44.000 So, more on that is coming.
00:40:46.000 All right, Matt.
00:40:47.000 Wonderful to have you.
00:40:48.000 I hope we can get back soon.
00:40:49.000 Absolutely.
00:40:50.000 Thank you, Robert.
00:40:51.000 Thanks for everything you do.
00:40:52.000 Campaign trail.
00:40:54.000 Take care.
00:40:55.000 Thank you, Matt.
00:40:56.000 Thanks, man.
00:40:57.000 That was great.
00:40:58.000 Good luck.
00:41:00.000 What do you know about Noah Shachtman?
00:41:03.000 Oh my god, a total idiot.
00:41:07.000 You know, so all the people who are at Rolling Stone working under him now.
00:41:13.000 Yeah, I only know a few people left from that editorial staff, but they were...
00:41:17.000 He's not a popular editor, let's put it that way.
00:41:20.000 But he's a hardcore, hashtag resistance believer.
00:41:25.000 Not a journalism guy, he's a political guy.
00:41:29.000 Are you having a thing with him?
00:41:31.000 Oh yeah, he has a vendetta against me.
00:41:35.000 We also published an article at CHD showing his intelligence agency ties.
00:41:40.000 Oh really?
00:41:41.000 You should look it up.
00:41:42.000 Dick Russell did a two-part piece.
00:41:44.000 It's called The Belly of the Daily Beast or something, and it's about the CIA takeover of, I think, the Daily Beast and Daily Kos, and it shows the intelligence agency pedigree of Shachtman.
00:42:03.000 There's some interesting stuff in that.
00:42:05.000 It's by Dick Russell, and it's in The Defender, and it's a two-part series, but there's some interesting things in there about him.
00:42:11.000 It's called The Belly of the Daily Beast.
00:42:14.000 Belly of the Daily Beast, wow.
00:42:16.000 And John Avlon, you know, all of these guys who have agency ties.
00:42:24.000 Interesting.
00:42:25.000 Excellent.
00:42:26.000 Can't wait to look.
00:42:27.000 Well, thanks again.
00:42:28.000 And I definitely want to check you out on the campaign trail, so we should talk about that.
00:42:32.000 Oh, that'd be great.
00:42:33.000 Love that.
00:42:34.000 Thanks, Matt.
00:42:35.000 All right.
00:42:35.000 Take care now.