In this episode, Matt Taibbi talks about the Colorado Supreme Court ruling that throws Trump off the ballot under the 14th Amendment, and why that's a huge mistake. He also talks about how the ruling is part of a larger pattern of disenfranchising voters and replacing them with unaccountable elites who believe the rabble can't be trusted with consequential decisions, and about what it would take to get back on track with our democratic ideals. Matt is the author of four New York Times bestsellers, including The Great Derangement, The Divide, and Griftopia, and an award-winning columnist for Rolling Stone. His recent book, Hate Inc., is a turbocharged take on how the media twists the truth to pit us against each other. He is one of the most eloquent deconstructors of Wall Street, the banking system, and the big grifters in our economy, from the Fed on down, and this conversation paints an alarming portrait of politics, media, and culture while pointing a way forward against our most urgent crises. It is a must-listen for anyone who wants to understand what's really going on in our economy, politics, media, and culture. Hate Inc. is available on Amazon and Audible. Thank you, Matt.
00:00:01.000I've got one of my favorite guests here, Matt Taibbi.
00:00:03.000Matt is the author of four New York Times bestsellers and an award-winning columnist for Rolling Stone.
00:00:11.000His recent book, Hate Inc., is a turbocharged take on how the media twists the truth to pit us against each other.
00:00:19.000In it, Taibbi paints an alarming portrait of politics, media, and culture, while providing a way forward against our most urgent crises.
00:00:29.000Taibbi also wrote the New York Times bestsellers Insane Clown
00:00:33.000President, The Divide, Griftopia, and The Great Derangement.
00:00:37.000He is a winner of the 2008 National Magazine Award for Columns and Commentary, and he is one of the most, I think, eloquent deconstructors of Wall Street and the banking system and all of the big, big grifters in our country and our international economy, from the Fed on down.
00:01:07.000The Supreme Court case just came out, and it's a Colorado Supreme Court case that probably by the time this shows, everybody will have rehashed and rehashed.
00:01:18.000It throws Trump off the ballot under the 14th Amendment.
00:01:24.000Because of this provision that was passed after the Civil War to keep the Confederate leadership from holding office in future elections, to protect Reconstruction in the South.
00:01:42.000I'm not an attorney, so I'd be curious actually to hear your opinion on this, because it feels to me that this is not the purpose for which that law was designed.
00:01:58.000It's very much in line with this sort of lawfare phenomenon that we've seen.
00:02:02.000In the last seven years where, you know, rather than leaving the fate of America to the auspices of voters, whom I think elites increasingly distrust in this country, they are looking for other techniques.
00:02:17.000It's not dissimilar from what happened, you know, in Florida, and what happened with you in some, you know, some other Democratic states.
00:02:27.000But this is, I think, a huge and crazy step.
00:02:30.000I'd be curious to hear what you think.
00:02:37.000And then the second point that you make, that it's part of this pattern of disenfranchising voters and choosing our political leaders through appointment by the party leadership and the elites is...
00:02:58.000I don't want to win by kneecapping my opponent.
00:03:01.000I want to be able to win an election fair and square and not leave half the country angry that they didn't get to vote for their candidate.
00:03:09.000This is the kind of thing that you see in Pakistan or Iran or some other country that has a pretend democracy, where popular leaders are barred
00:03:22.000so that they can't participate, where they get rid of them one way or another so that they can't run.
00:03:29.000And we look at that and we say, "Yeah, that's not a real democracy." Well, that's happening here now.
00:03:34.000And like I say, I'm not a fan of Trump's, but you can't say that you believe in democracy if you distrust the people, the demos of democracy, to choose their own leaders.
00:03:55.000I mean, obviously we saw hints of this in 2016, 2015, when we started to see this spate of punditry where they were talking about there's just too much democracy.
00:04:06.000We have to go back to the smoke-filled room.
00:04:09.000We can't allow the rabble to be making these consequential decisions or irrational decisions.
00:04:15.000But even going back before that, I think some of the roots of this you have to put at the feet of the war on terror, and think about all of the major decisions that we made, sort of casually throwing out Bill of Rights protections that have been central to American life for hundreds of years. I mean, we threw out habeas corpus.
00:04:39.000Due process is gone in a lot of these procedures.
00:04:44.000If you add COVID, which is kind of the culmination of the war on terror, it started with 9-11 and ended with COVID, where they completely shut the door on the U.S. Constitution, and they got rid of free speech, the First Amendment.
00:04:58.000They got rid of the right to assemble with social distancing.
00:05:01.000Who could have imagined that it would be a crime to assemble?
00:05:17.000And then religious freedom: they closed every church in this country for a year with no scientific citation, no notice-and-comment rulemaking, no public hearing, no environmental impact statement.
00:05:32.000And then they got rid of the jury trials of the Seventh Amendment by saying if one of these companies hurt you with a vaccine or some other intervention, you can't sue them.
00:05:41.000And then the whole Fourth Amendment prohibition against warrantless searches and seizures was just obliterated by all the track-and-trace surveillance, and you've got to show your medical records before you go into a building, and all of this.
00:05:56.000It's like they targeted the Bill of Rights, one amendment after the other, for obliteration.
00:06:05.000The Fourth Amendment case, the one involving getting geolocation information from phone providers.
00:06:12.000When this decision came down, I immediately thought of the sort of internal memos that the Obama administration wrote.
00:06:21.000In support of the drone program, where there was this very curious part in one of them that talked about how due process didn't necessarily have to involve the party in question, that any kind of process could be due process.
00:07:20.000It was drawn from freedom of speech and the idea that they could start censoring people.
00:07:29.000They started with the vaccines and with these medical interventions, but then they kept expanding it to Ukraine war.
00:07:36.000It's exactly the stuff you want people talking about:
00:07:40.000government policies. That's the reason we fought the revolution in the first place, because people wanted to be able to criticize their government without...
00:08:23.000I mean, this was a big theme of all the stuff that, you know, we've been working on in the last year, which is that all of those prohibitions against intelligence agencies propagandizing the domestic population, you know, from the Smith-Mundt Act to the charters for the State Department and the CIA, they're all just being violated willy-nilly, all of them.
00:08:47.000And there's not even any kind of embarrassment about it or any suggestion that anybody's even worried about it.
00:08:55.000You have agencies like the Global Engagement Center, which is a State Department agency, and they're openly involved in censoring domestic content or doing content moderation domestically.
00:09:08.000And they have no legal remit to do that.
00:09:10.000And it's just considered, you know, people just shrug, as you say.
00:09:17.000I don't want to insert my situation into the conversation, but part of this is why the Secret Service denial matters to me. You know, it's the first time in history that anybody who has asked for Secret Service protection has been denied.
00:09:35.000And I exceed all the parameters and metrics for which they've routinely given Secret Service protection.
00:09:42.000And, you know, it's a minor, minor issue.
00:09:49.000When my dad went into the Justice Department his first week in there, and Arthur Schlesinger talks about this in his biography, he gave a speech to all the division heads and branch heads saying nothing is going to be politicized here.
00:10:05.000None of that's going to happen in my Justice Department.
00:10:08.000And that was a routine speech that every attorney general gave
00:10:14.000to the Justice Department, because it was so important that Americans have faith in the institutions of democracy, that they aren't politicized, and that they looked well to the whole world.
00:10:30.000And now you have this. Of all the things that, you know, disturb me about President Biden right now, it's really the thing that disturbs me the most.
00:11:18.000The Secret Service decision in your case, I mean, it jibes with this overall trend where you have this odd contradiction among sort of the upper-class, blue-leaning intellectuals in this country.
00:11:35.000On the one hand, they're furiously angry that there's been this collapse of trust in elite institutions.
00:11:42.000And they want desperately to remedy that.
00:11:45.000But at the same time, they keep taking these steps that guarantee that nobody will trust those institutions.
00:11:52.000And it's everything from what you're talking about, like with the obvious politicization of the Secret Service to this Supreme Court case to the news media not correcting major errors.
00:12:05.000You know, year after year after year to the censorship issue.
00:12:10.000If you want people to listen to the national news media, they've got to stop getting things wrong and they have to, you know, admit it when they do.
00:12:30.000Well, people say to me all the time, you know, accusingly, a lot of the mainstream media: you've been bashing, telling everybody not to trust the institutions of government, and you've got people to doubt NIH and CDC. And if you then get into the presidency after you've destroyed the trust in all these institutions, how are you going to govern?
00:12:51.000And I say to them, I'm going to make people trust the institutions again.
00:13:01.000You can't just force people to trust stuff. You'd be out of your mind to trust the government or the mainstream media today.
00:13:08.000If you trust government, if you trust the mainstream media, you're not paying attention.
00:13:13.000You're not paying attention to anything.
00:13:17.000Yeah, and again, this goes back to your situation came up in exactly this way in probably one of the last Twitter files reports that we did that was about the Virality Project, which was this Stanford effort backed by the Department of Homeland Security and the Global Engagement Center and some other government agencies to root out what they called COVID misinformation, disinformation and malinformation.
00:13:47.000But what they did is there was this weird drift.
00:13:50.000They would start off talking about things that, you know, would kind of obviously fall under the category of misinformation, you know, like the idea of microchips being, you know, implanted in vaccines or something like that.
00:14:04.000But then, very quickly, they would come to define anything that undermines confidence in government policies, anything that undermines confidence in individual officials like Anthony Fauci, or anything that would, you know, quote-unquote promote hesitancy.
00:14:26.000And there was a passage that particularly referred to you that I thought was one of the most striking things, where they talked about how repeat offenders, people like you, are almost always reportable.
00:14:41.000You know, to me that was striking because as a journalist, you know, we're trained that if we get something wrong, that you punish the speech, not the speaker.
00:14:50.000You don't sort of decide that somebody is inherently libelous or, you know, is more prone to libel than somebody else.
00:14:59.000You deal with each specific case, but that's not how they do it.
00:15:03.000They're assessing people and sort of making these binary decisions, trustworthy, not trustworthy, and there's no process.
00:15:11.000They just kind of put people in baskets.
00:15:14.000It's a crazy way to go about things, and it's totally contrary to the spirit of this country, which is everybody gets a chance to defend themselves.
00:15:22.000Everybody gets a chance to give their side, and nah, they're not into that.
00:15:27.000Yeah, I mean, you mentioned an interesting word, malinformation, which I think they made up.
00:15:32.000And they applied that to me a lot because of what I was writing about.
00:15:37.000Listen, if somebody said to me, you got this wrong, and there's a couple things I got wrong, and we immediately corrected them.
00:15:43.000Somebody says to me, you got this wrong, this description, this study didn't happen.
00:15:48.000There was a Filipino study that I wrote about that I got a fact wrong and I immediately corrected it.
00:15:54.000But nobody could point to a piece of information that I had put out that was false.
00:16:00.000And the reason is, you know, we had a major fact-checking operation at CHD, with some 350
00:16:09.000PhD scientists and MD physicians who were on this, you know, scientific advisory board, and everything that went out from us was cited and sourced.
00:16:19.000We were very, very careful, and nobody pointed to an actual erroneous statement that I made.
00:16:26.000But they invented this new word called malinformation, which is not misinformation or disinformation.
00:16:33.000Disinformation is somebody deliberately seeding the dialogue with a manipulative piece of misinformation.
00:16:42.000Misinformation is just you got it wrong.
00:16:44.000Disinformation, you deliberately got it wrong.
00:16:47.000Malinformation is information that is factually correct, but it is inconvenient, nevertheless, to government officials.
00:16:55.000And they had to make up this word, which has some bizarre pedigree, I don't know, some etymological root.
00:17:04.000Yeah, I'm still looking for the first use of that.
00:17:08.000Yeah, because when we first encountered it, you know, the DHS, the Department of Homeland Security, they had something called the MDM subcommittee.
00:17:17.000So that's the misinformation, disinformation, and malinformation subcommittee.
00:17:22.000Malinformation ended up becoming one of the categories that could be applied in the virality project and some other sort of anti-disinformation schemes that went on.
00:17:35.000But it was definitely used by the Department of Homeland Security. So it has to have come from somewhere, and you're right.
00:17:43.000It's a word that was specifically invented to deal with things that aren't false, but that they want to treat as false.
00:17:57.000We saw plans for discussions about sort of future DHS activity where they talked about building resilience in the population against what they call despair-inducing MDM. So despair-inducing malinformation can just be things that are true that make the population dissatisfied.
00:18:30.000But you can see how very quickly a person who is put in the position of evaluating all this and thinks that they're doing God's work, they will come around pretty quickly to starting to define all kinds of things as malinformation.
00:18:44.000And that's the big problem with this stuff.
00:18:48.000They start doing things like, oh, there's somebody on Facebook who's talking about a relative who got myocarditis after getting the shot.
00:19:00.000And this person might even be pro-vaccine, but they'll call that malinformation because it's, you know, quote unquote, promotes hesitancy.
00:19:07.000And yeah, they had a whole list of those incidents, but it's a crazy concept.
00:19:14.000It's the kind of thing that Orwell would have invented.
00:19:17.000And I know that's a cliche, but in this case, it's really, it's very apt, I think.
00:19:22.000Yeah, I mean, one of the amazing things that you're now encountering, like with this institution at Stanford, is this question of where the money is actually coming from for all of these... It looks like a lot of it is coming through intelligence agencies, and that it's routed through these, you know, these bizarre chains that people...
00:19:48.000Tell us what you found and who is funding and who is really behind the censorship industrial complex and who are the characters who are, you know, who they brought into this whole thing.
00:20:00.000Have you ever run across Avril Haines?
00:20:06.000This is part of the reason that, you know, the Twitter files story, and then, you know, some of the other stuff that we've done since then,
00:20:13.000is incredibly confusing: this new industry, this kind of censorship industrial complex, as Michael Shellenberger calls it, comes from a lot of different places.
00:20:24.000It grew out of sort of the counter messaging operations in the war on terror.
00:20:31.000So you had groups within the Pentagon that were doing anti-disinformation work.
00:20:37.000They were targeting, in Arabic, ISIS and Al-Qaeda, and they were funded pretty heavily over the years, but they switched.
00:20:48.000You know, they went from what one person, one former agent, called CT to CP: counterterrorism to counterpopulism.
00:20:56.000From the CSCC, which was strictly anti-terrorist, to the Global Engagement Center, which is kind of strictly anti-disinformation, and they're one of the partners for Stanford.
00:21:07.000So we found money from the Department of Defense, from the State Department.
00:21:12.000From the National Science Foundation, all of those contributed to the Stanford programs.
00:21:18.000Also, we found a significant amount of money that came from private donors like Reid Hoffman.
00:21:33.000Reid Hoffman's a LinkedIn billionaire.
00:21:36.000There's the Newmark Foundation, which is Craigslist; Pierre Omidyar, eBay.
00:21:41.000A lot of these tech billionaires are major funders of what they call anti-disinformation.
00:21:50.000NGOs, I think that's sometimes an overly generous term, because a lot of them aren't really non-governmental.
00:21:58.000They're pretty explicitly partnered with governments.
00:22:00.000But there's a lot of private money that gets mixed in with these operations.
00:22:08.000I think the Stanford Election Integrity Partnership was really sort of a prototype for how to do these things.
00:22:16.000It was kind of started by the Department of Homeland Security.
00:22:20.000The idea seems to have come from there, but they can't do it because they don't have the legal authority.
00:22:26.000This is said openly by the people who run the program: we have to do this because DHS, the exact quote was, they kind of don't have the legal authority to do this, or the funding.
00:22:41.000So you need kind of a private face that can step in and do this work that would be absolutely illegal if the government did it directly.
00:22:51.000So the funding, you know, is sometimes routed in this indirect way. You know, Stanford gets a number of government grants, but they're not directly for these programs.
00:23:04.000They also get some support from the Newmark Foundation, right?
00:23:08.000Or somebody like Reid Hoffman will contribute to a group like New Knowledge, which does this kind of work.
00:23:14.000But the consistent pattern is always the involvement in some way of a government agency or intelligence agency, a big pile of money that comes at least in part from the private sector, and then sometimes a veneer of an academic project on top of it.
00:23:35.000And that seems to be what the standard pattern is.
00:23:39.000So talk about some of the highlights of what you found most recently in the Twitter files.
00:23:47.000So, in working on this, we had a bunch of whistleblowers come forward from different...
00:23:52.000And just lay the groundwork for people who don't know what the Twitter files are.
00:23:56.000These are all documents that were made available by Elon Musk when he purchased Twitter, correct?
00:24:04.000Yeah, so Elon Musk, when he bought Twitter, one of the things he did is he invited a bunch of independent journalists to San Francisco, had a surprisingly brief meeting with all of us, basically said, I'm going to open up all the files for one of...
00:24:22.000America's largest corporations and you can do what you want with that material.
00:24:27.000And basically did that for about two and a half months.
00:24:31.000We were just sort of rooting around with no supervision in Twitter's files or with limited supervision, I would say.
00:24:38.000And we found all kinds of stuff that I think he didn't even know was there, in particular about the relationship between the FBI, the Department of Homeland Security, and all these platforms.
00:24:50.000We found that it was a very sophisticated, constant relationship where they were flagging lots of content.
00:24:57.000So we did a lot of those reports and they made a lot of noise last year around this time.
00:25:03.000But since then we've had other people come forward with other documents from other kind of censorship operations.
00:25:13.000There's this thing called the CTI files that we did a couple of weeks ago.
00:25:19.000And this was a group that was also put together, ostensibly, to police COVID misinformation and disinformation. It was founded by a British data scientist who has some former defense ties, and then somebody who was still working for the Pentagon at Special Operations Command as a quote-unquote technologist at the time.
00:25:45.000And they organized this group of volunteers, quote-unquote volunteers, largely from the tech world, to not only review content, but to do things like create sock puppet accounts to infiltrate groups online.
00:26:03.000We have training videos where some of the people involved in this group are talking to the new recruits and saying, we're going to do all the things the bad guys do, but for good reasons.
00:26:16.000So that includes using fake accounts, infiltration, repetition, creating false news stories.
00:26:27.000There are all kinds of documents like this, and they're openly saying, we want you to create more sock puppet accounts for Twitter and Facebook.
00:26:39.000So we just put that out a couple of weeks ago. The key takeaway from it is that the Twitter files are mainly about the defensive aspect of this, which is censorship and de-amplification, but we're also now finding out that there's an offensive component to some of these operations, where they're sort of COINTELPRO style, you know, creating the appearance of things that aren't real on the internet.
00:27:04.000So you might have fake accounts that are trolling individuals.
00:27:09.000I'm sure your account is subject to it all the time on Twitter.
00:27:13.000And my guess is that we're going to find more of that as we go forward.
00:27:17.000Just take a moment to comment on Elon Musk.
00:27:22.000It's pretty unique because there's no corporate CEO in our country who would allow anything like that to happen.
00:27:31.000And his lawyers would lock him in a padded cell if he tried it.
00:27:37.000It's amazing that he led you into this treasure trove of, like,
00:27:43.000actionable information that makes the company look terrible.
00:28:13.000I looked over and you can imagine the look on that person's face.
00:28:19.000And the look on that person's face was actually one of the things that reassured me that this story was for real.
00:28:25.000And actually, all throughout the project, the kind of look of abject horror on every attorney's face that we saw in the Twitter offices was an indication that the stuff we were getting was deeply upsetting to them.
00:28:39.000Because in addition to all the things that were pertinent about censorship and cultural issues, there was all kinds of stuff in there about ongoing litigation and financial information that we could have just dumped out there and it could have made a tremendous headache for the company if we wanted to.
00:29:03.000Very clearly, there were like 10 of us in a room, just clacking away, looking at all this stuff.
00:29:10.000And he sort of poked his head in like, you know, the show Fawlty Towers and he said, does anyone need any coffee or anything like that?
00:29:17.000And, you know, then he disappeared after that.
00:29:21.000That was his contribution to the whole thing.
00:29:24.000There was none of this kind of overlord, you know, hanging over our shoulder watching what we were doing, which was really weird, Robert, but it was amazing.
00:29:34.000I mean, again, I'd be curious to hear, I mean, as an attorney, I can't imagine that they would have gone to bed at night thinking anything but just pure horror about the whole thing.
00:29:44.000Yeah, I'm stunned because I'm sure that they were screaming at him.
00:30:26.000In Europe, they passed these terrible laws that say if you violate them by putting up posts questioning vaccines, that kind of stuff, you pay these huge penalties.
00:31:41.000The only way the public, A, would know anything about any of the stuff that's going on, and then, B, that there would be any chance for any kind of carve-out in what turns out to be a pretty ironclad informational cartel, not just in America but all around the Western world, is if a highly eccentric billionaire decides to opt out.
00:32:04.000I mean, this is like the one scenario that they didn't, you know, account for when they built this system.
00:32:10.000And even then, I don't know how long he's going to be able to hold out because they have so many different ways of applying pressure.
00:32:17.000And the law you referred to in Europe, the Digital Services Act, that's going to be the prototype of the kind of thing they're going to install everywhere.
00:32:28.000It'll be harder because we have a different tradition, but you're right.
00:32:32.000The penalties are crippling for even one violation of that act.
00:32:35.000So it's going to be interesting to see what happens there.
00:32:39.000Yeah, you know, I talked to Jack Dorsey about it, and he really admires Elon.
00:32:46.000He's very interesting about it because he's critical about some of the financial choices Elon made at the beginning.
00:32:53.000He thinks he should have unloaded Twitter and then maybe bought it back afterward,
00:32:59.000because it was clear that it was going to plummet.
00:33:02.000But he said a couple of interesting things.
00:33:04.000One is he said, ultimately, they're going to make Elon cave because there's so many ways they can come after him.
00:33:11.000And they can, first of all, get rid of all the advertisers, which is what they're doing.
00:33:15.000And he's been really courageous about that, saying, go ahead, do it.
00:33:20.000Hey, I lived in that great moment, right?
00:33:22.000The go-after-yourself moment, which was fantastic.
00:33:25.000The other thing that Jack Dorsey said, because I asked him, you know, what's the solution to all this manipulation that's going on on the internet, where these sites are, you know, censoring and manipulating the way we think about things, the way we see the world, the way that we experience the world.
00:33:45.000And, you know, everything is... people can be programmed.
00:33:55.000And this instrument of the internet is the perfect way, as it turns out, to program human beings for compliance.
00:34:04.000And what Jack Dorsey said is the answer to that is to make all the algorithms transparent so that you can choose your own algorithm.
00:34:15.000So right now, if you're a Republican and you ask a question, you'll get a different set of information than your neighbor who's a Democrat, because the algorithm is trying to figure out how to accomplish certain things, mainly to maximize the amount of time that you're going to spend on this site.
00:34:34.000And they do that by feeding you information that fortifies your existing worldview and your existing beliefs.
00:34:41.000But a lot of the manipulation is taking place involuntarily.
00:34:44.000They're trying to make us see the world in a certain way.
00:34:48.000And he said that the only way to counter that is to make all the algorithms transparent and allow you to choose your own algorithm.
00:34:56.000So you can say, you know, I want a Republican algorithm.
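To make that idea concrete, here is a minimal sketch in Python of a user-selectable, transparent feed algorithm. It is only an illustration: the Post class, the scoring functions, and the registry names are invented for this example and are not any platform's actual code or API.

# Illustrative sketch only: invented names, not any real platform's API.
# The point is that each ranking rule is a small, published function the
# user explicitly picks, instead of one hidden engagement-maximizing score.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Post:
    author: str
    text: str
    likes: int
    age_hours: float

def chronological(post: Post) -> float:
    # Newest first, no other weighting.
    return -post.age_hours

def engagement(post: Post) -> float:
    # Roughly what opaque feeds optimize: interactions per hour.
    return post.likes / (1.0 + post.age_hours)

# A public registry of ranking functions the user can choose from.
ALGORITHMS: Dict[str, Callable[[Post], float]] = {
    "chronological": chronological,
    "engagement": engagement,
}

def build_feed(posts: List[Post], algorithm: str) -> List[Post]:
    score = ALGORITHMS[algorithm]  # the user's explicit, inspectable choice
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("a", "older but popular", likes=900, age_hours=30.0),
    Post("b", "new and quiet", likes=3, age_hours=1.0),
]
print([p.text for p in build_feed(posts, "chronological")])  # new post first
print([p.text for p in build_feed(posts, "engagement")])     # popular post first

Because both scoring functions are published and swappable, two neighbors can read the same pool of posts through different, openly declared rules rather than through one opaque, involuntary one.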
00:35:28.000Well, he helped develop a social media platform called Nostr, which is really fascinating, because the concept of it is to make the social media program not a full-service program like Twitter, but more like a protocol, like email.
00:35:50.000So everybody uses email, but everybody can use their own version of it.
00:36:02.000It will sort your mail in a different way or whatever.
00:36:05.000And the idea behind Nostr is that the protocol would basically be non-manipulable, but you could overlay your own filters and algorithms.
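As a rough sketch of that protocol-versus-client split, assuming a generic event model rather than Nostr's actual event format or relay API (a simple hash stands in here for a real cryptographic signature): the base layer only carries verifiable events and never ranks or removes anything, while each client overlays whatever filter its user chooses.

# Simplified sketch of a protocol/client split; a hash stands in for a real
# signature, and this is not Nostr's actual wire format or relay API.
import hashlib
from dataclasses import dataclass
from typing import Callable, List

@dataclass(frozen=True)
class Event:
    pubkey: str    # who published it
    content: str
    event_id: str  # hash of pubkey + content; any client can verify it

def make_event(pubkey: str, content: str) -> Event:
    event_id = hashlib.sha256((pubkey + content).encode()).hexdigest()
    return Event(pubkey, content, event_id)

def verify(event: Event) -> bool:
    # The protocol layer only checks integrity; it never ranks or censors.
    expected = hashlib.sha256((event.pubkey + event.content).encode()).hexdigest()
    return event.event_id == expected

# Client-side policy: entirely the user's choice, swappable at will.
def my_filter(event: Event) -> bool:
    return "spam" not in event.content.lower()

def render_feed(events: List[Event], keep: Callable[[Event], bool]) -> List[str]:
    return [e.content for e in events if verify(e) and keep(e)]

events = [make_event("alice", "hello world"), make_event("bob", "buy spam now")]
print(render_feed(events, my_filter))  # -> ['hello world']

The manipulation-resistant part lives in the shared, verifiable event layer; everything subjective, the filters and algorithms, stays on the user's side where it can be replaced at will.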
00:36:21.000If they can make that functional, then that's terrific.
00:36:25.000Because the problem right now is that anything that's owned, they're going to be able to manipulate it.
00:36:31.000Even with Elon sitting there trying desperately to keep control over his own company, they're able to impact the revenue in so many different ways.
00:36:42.000And we even said this to each other in the first days of the Twitter files, the reporters.
00:36:46.000We were like, whatever this is, it's temporary.
00:36:49.000They're going to pressure this company and this thing is going to get locked down.
00:36:55.000So yeah, I think you have to find a way to make it so that it's not controllable and not susceptible to manipulation.
00:37:12.000The functionality isn't perfect for what I do.
00:37:14.000There are some features that are missing.
00:37:17.000It's not easily searchable for news, which makes it pretty hard for somebody like me to use it as a primary social media tool.
00:37:27.000But I think in the near future, they'll figure out a way to get around that, and then that's where people will go.
00:37:36.000This is one of the interesting things about the internet in this period is that we're seeing that audiences are capable of mass moving from one place to another pretty quickly.
00:37:46.000And you have to learn how to navigate that landscape.
00:37:49.000I mean, as a political candidate, you must have to be thinking about that right now.
00:37:54.000Because the strategies are totally different than they were even a year ago.
00:38:22.000Anyway, we've got to get off because I'm being told we've got another thing, which I hate to leave you because there's so much more to talk about.
00:39:47.000I mean, one of the things we've got in these documents is sort of concrete proof that the CCDH started as a project of something called Labour Together, which
00:39:58.000destroyed the left wing of the Labour Party in Britain.
00:40:03.000And they basically are doing the same thing to the Democratic Party here.
00:40:06.000They took the Labour Party in Britain and turned it over.
00:40:11.000They empowered the corporate wing, it was corporate controlled, and they destroyed all the progressive wing, and they did the same thing to the Democratic Party here.
00:40:21.000Yeah, it's like an exaggerated version of what the DLC did to the Democratic Party, right?
00:40:26.000They did it with the aim of getting rid of Jeremy Corbyn first, but now this project, the CCDH, has morphed into this massive, extremely ambitious thing that impacts quite a lot in the world.
00:41:44.000It's called The Belly of the Daily Beast or something, and it's about the CIA takeover of, I think, the Daily Beast and Daily Kos, and it shows the intelligence agency pedigree in Slack.
00:42:03.000There's some interesting stuff in that.
00:42:05.000It's by Dick Russell, and it's in The Defender, and it's a two-part series, but there are some interesting things in there about him.
00:42:11.000It's called The Belly of the Daily Beast.