On today's show, Megyn Kelly and Matt Taibbi discuss the Biden 2020 campaign and the new administration's push to make the internet a place that is under some kind of governmental control. Plus, a story about a recent Fox interview that was pulled because the host demanded an edit.
00:02:07.680And now they are attacking us, the random American citizens with access to a computer or a microphone, as disinformation purveyors.
00:02:18.580And the chief, the chief of this effort is going to be a woman named Nina Jankowicz, who will be the executive director.
00:02:26.220As far as I can tell so far, her most notable accomplishment is calling the Hunter Biden laptop a, quote, fairy tale, a, quote, Trump campaign product.
00:02:36.260And this is the person who now will be in charge of regulating what is and is not, quote, disinformation.
00:02:43.760Well, this has been going on for five or six years now.
00:02:47.760There's been, ever since Donald Trump was elected, a pretty concerted effort on the part of mainstream politicians, really in both parties, but particularly in the Democratic Party, to make the Internet a place that will be under some kind of governmental control.
00:03:08.700And this began in 2017 when we had members of the Senate calling up executives from Facebook, Google, and Twitter to the Hill and essentially demanding that they come up with strategies to prevent what they called the foment of discord.
00:03:26.880Back then, the bread and butter, right, exactly.
00:03:31.640The boogeyman back then was Russian disinformation.
00:03:37.520Then it was disinformation about the pandemic.
00:03:42.480You know, now we're circling back to Russian disinformation with regard to the Ukrainian conflict.
00:03:48.780And, you know, I think the problem is we're in a generation of people who agree that there's a problem with disinformation in the media landscape, but they don't understand that the biggest lies are always official lies.
00:04:05.440And the only real defense against that is free speech.
00:04:09.520And so they want this top-down system of control, which I think is very, very dangerous.
00:04:17.400If you hear the Barack Obama remarks from late last week, he was longing nostalgically for the days in which it was just ABC, NBC and CBS and information was controlled.
00:04:30.120Right. We didn't have all these Internet hacks and trolls out there pushing so-called disinformation.
00:04:35.700And I'm sure that was a much more delightful time for people in the position like he had at the White House.
00:04:42.300But think of all the lies that have been told to us over the years from people in that post that the evening news, for whatever reason, went along with or had an incentive not to check too far into.
00:05:05.040Right. But the problem is people have an alternative now.
00:05:09.640They have a way to get around that, which they didn't have before, as President Obama noted.
00:05:15.040And I remember this pretty graphically because I was a campaign reporter in 2004 and 2008, back in those alleged salad days, or I guess towards the end of them.
00:05:28.040And I would be on the bus listening to journalists talk about which candidates they thought were serious or electable and which ones weren't.
00:05:39.960So, you know, you'd be in a bus full of CNN and Fox and MSNBC anchors, and they'd be scoffing at Dennis Kucinich, saying, no, we're not going to take him seriously.
00:05:51.540And then there would be some other candidate, like John Kerry: oh, he's electable. And they made those judgments, and they were important judgments, because what they signaled to audiences back then had an enormous impact on how voters behaved at the ballot box.
00:06:09.160It's different now. Ironically, Barack Obama was a beneficiary.
00:06:14.260He was one of the first people to lose the so-called invisible primary, the primary of donors, and still win the nomination.
00:06:25.800But then when when Trump broke through in 2016, that was really when the chokehold of those networks collapsed.
00:06:32.180And they missed that. They really did.
00:06:35.480And so how do we think this is going to work?
00:06:37.440I mean, this woman can't really crack down on anything. Like, what, the DHS is going to come try to censor what happens on your show, on this show, on your Substack?
00:06:49.920Like, how on earth is this going to work?
00:06:52.760I don't know. I mean, I think we've already seen that they'll go to pretty extraordinary lengths to try to have influence over information that's online.
00:07:04.660And we've seen in the last six years that there's been pretty extraordinary cooperation between the Senate, bodies like the CDC and the FDA, and platforms like Facebook, YouTube and Twitter.
00:07:23.960I mean, I did a story randomly about a podcaster who was having trouble with YouTube.
00:07:33.440And when I called them up for comment and asked how they decided what was misinformation and what wasn't, they just told me outright that they made those decisions in consultation with federal agencies.
00:07:46.360So I think this is the world we're going to be living in: a basically privatized speech landscape, but with political actors from the government influencing the moderation decisions of those platforms.
00:08:02.780It's very, you know, we're getting a lot closer to censorship of blatantly protected First Amendment speech. The reason Twitter and Facebook and YouTube can censor content is because they're not the government.
00:08:18.500They're pretty close. They're pretty close to having the power and certainly the fingerprints of the government all over their editorial decisions.
00:08:25.180But technically, still, the law does not recognize them as the equivalent of or certainly as an actual government actor.
00:08:32.040That's not the case for the DHS; they're not allowed to censor our speech.
00:08:35.860So I don't really know what they think they can do, but they may be sad to realize it's written right there in the First Amendment.
00:08:42.620They're not allowed to censor speech. There's a teeny tiny category they get to touch and the vast majority of what they're going to object to won't be in it.
00:08:50.220That's true. That's true. But somebody still has to do the test case.
00:08:55.560They have to to file that lawsuit and win that First Amendment case.
00:09:01.120And, you know, who's going to do that? The reality is these government agencies have already been meddling with speech on private platforms,
00:09:11.460whether it's the FDA and the CDC, you know, sort of encouraging platforms like YouTube to go by their guidelines in deciding what's misinformation and what isn't.
00:09:23.740Or it's the FBI, which has been in consultation with some of these platforms about things like hate speech and which groups might need cracking down on.
00:09:34.440So, yes, you're right. I think there's already a powerful First Amendment argument that they've crossed the line, but that has to be challenged.
00:09:44.700Then who's going to issue that challenge? It's a very difficult road ahead.
00:09:49.160I mean, possession is nine tenths of the law. And if they're already doing this, it has to be undone.
00:09:55.820It doesn't matter if it's illegal. Hmm. Well, speaking of possession being nine tenths of the law, people on the right half of the aisle, or really just the sane half,
00:10:04.960and it's not just all righties, it's a lot of centrist lefties, too, are rejoicing that Elon Musk is taking over Twitter, is going to buy Twitter.
00:10:12.900And so it appears. And there's a stock problem at the moment.
00:10:15.540And people we have to keep our eye on the Twitter shares and on the Tesla shares because he does need money in order to buy it.
00:10:21.480But Peter Schiff came on and explained all that to us yesterday. I don't totally understand it, but it's not a done deal.
00:10:27.500It's not a totally done deal. He's very, very rich, but he does need his actual forty-four billion dollars, or at least twenty-one billion of it, to buy it.
00:10:34.140So they're happy, but he's already taking crap from, I mean, all corners of Twitter, and the Twitter top executives. The chief legal officer, the general counsel, was reportedly in tears upon learning on Monday that he was actually going to close the sale.
00:10:49.540You know, they agreed to the sale and Elon, I guess, like to tweet or took a shot at her yesterday and then took all sorts of crap in response.
00:11:02.740Right. Like, he shared a meme Wednesday that mocked her response to accusations of the company's political bias.
00:11:11.120And it was this thing involving Tim Pool and Joe Rogan, when they all went on there, and how she's got this circular reasoning where she denies viewpoint discrimination.
00:11:17.460But we all know they do it. And this is the thing that, you know, the Twitter CEO from 2010 to 2015 responded to:
00:11:25.560What's going on? You're making an executive of the company you just bought the target of harassment and threats.
00:11:30.700Bullying is not leadership. And Elon kind of defended himself, saying, you know, what are you talking about?
00:11:35.400So this is the thing. He doesn't behave like your normal CEO or owner when he takes over these massive corporations.
00:11:43.240You know, he's being sued right now for some of the tweets he sent out about Tesla.
00:11:46.960But this is how they get you. You offered your opinion.
00:11:51.240His opinion is that this woman is biased and has been wrongfully manipulating content on Twitter.
00:11:58.780He said it publicly. The way they shut you down is how dare you? How dare you?
00:12:04.860Online harassment, sexual harassment, comments threatening. Right.
00:12:09.640We talked about this yesterday with Vivek Ramaswamy, who said they smuggle in viewpoint discrimination through the guise of hate speech and threats.
00:12:21.580You know, words are violence. All of those principles which are in active form at Twitter.
00:12:26.320Yeah, absolutely. I'm one of the people who's enjoying a schadenfreude moment this week, you know, watching the reaction to Musk potentially buying Twitter.
00:12:41.740Because, first of all, I'm one of the few reporters who for years now has been covering the phenomenon of content moderation, who took it seriously from the very beginning and warned that this was going to become a bigger and bigger part of American life.
00:13:01.700And I was laughed at by, you know, a lot of my colleagues, particularly my left-leaning colleagues, who I thought were free speech advocates, after the Alex Jones episode, which I thought was deeply troubling, not because I like Alex Jones,
00:13:18.240but because of the precedent of big companies like, you know, Apple and Facebook and Spotify getting together and making sort of an ex parte decision to kick somebody off the Internet.
00:13:32.720That was a radically different approach to policing speech than what I grew up with, which is: if you make a mistake, you get sued, and there's an open forum, there's a process, and it's all transparent.
00:13:45.360What I said from the beginning is that the issue isn't who is being censored.
00:13:52.260The issue is how it's being done. And if you were in favor of a handful of mega-wealthy executives back then kicking people like Alex Jones, and eventually Donald Trump, off the Internet,
00:14:08.460then you can't now cry that there's a different billionaire, one you just happen not to like, sitting in that same chair and meddling with speech in a way that you don't like.
00:14:20.420You had to have objected on principled grounds before, and they didn't. And so, you know, look, I don't have any sympathy for these people.
00:14:28.760They had a chance to stand up for speech principles once upon a time, and they didn't do it because they wanted to censor people.
00:14:38.880And now they're getting a taste of their own medicine.
00:14:45.480Let's talk about the Alex Jones case for one second, because I reported on all of that and lived through that in a weird way myself.
00:14:53.820But I know very well that Alex Jones was in a different place from most people who get targeted on the Internet, because he had been serially unleashing very personal, in-your-home threats, pretty much on purpose, on the most sympathetic group of people in our country, namely the grieving Newtown parents.
00:15:22.120So people who had their first graders shot to death in class kept getting harassed by his listeners, whom he kept telling to believe this was all a hoax, that it was made up.
00:15:39.560It was found in court that they had been receiving death threats from lunatics inspired by Alex Jones.
00:15:44.240Many of them had been having to deal with the Alex Jones listeners for years in deeply painful ways.
00:15:51.820And honestly, Matt, I don't think you can compare that to, you know, James O'Keefe with his secret camera catching dishonest New York Times reporters saying one thing in a bar and putting something else in the pages of the Times.
00:16:10.200I just think he's in a class of his own. And it wasn't just the Newtown families.
00:16:13.960I could go down the list for you of people who have been actually hurt by people he intentionally inflamed.
00:16:20.840It's much closer legally to what we know as incitement, which is not protected speech.
00:16:25.700Well, I don't think anything that he said with regard to Newtown was protected speech.
00:16:32.120And I said that at the time. I also said that it was probably pretty obvious that he violated the terms of service of each one of those platforms.
00:16:42.420Again, I had no interest in defending Alex Jones on any grounds.
00:16:48.520The issue for me had to do with the method. Right.
00:16:53.660So once upon a time, the way we would have dealt with speech like Alex Jones's was that he would have been sued, and the financial penalties would have been so great that he probably would not have emerged with a career at the end of it.
00:17:12.940That's happening. That's happening against him; it just took a long time.
00:17:16.500Right. And people were impatient to go through that process.
00:17:22.780And I understand that. Like, look, as this is happening, none of those private businesses want to deal with that.
00:17:30.600I totally get that. The problem is that by doing this, they opened the door for a new kind of speech policing that didn't involve any kind of open and transparent process.
00:17:46.500All these companies got together, clearly coordinated. You know, they all did it at the same time.
00:17:52.580And they essentially decided, you know, this person is no longer going to be on the Internet.
00:17:57.680So that opens the door. You know, the next thing is going to be O'Keefe.
00:18:03.720And then before you know it, it's the Babylon Bee. And that was the issue that I had.
00:18:09.380I mean, that is what happened. That's why it's hard, because for years they've been censoring these radical Islamists who wanted to show people how to build bombs and commit terrorist attacks.
00:18:21.620And I've got zero problem with that censorship. Go for it. We don't need that shit on the Internet.
00:18:26.200And, you know, people will die as a result of that. That's indisputably not OK to me.
00:18:32.720I don't see anybody defending that censorship by the big tech companies.
00:18:36.740But that's not even censorship, because those things are actually against the law.
00:18:40.960Like, you know, the authorities can come in and they can stop actual imminent incitement to violence.
00:18:48.800No, but I'm talking about, if I just sit in front of the camera and I show you how to make a dirty bomb, that's not against the law.
00:18:56.260No, it's not. But, you know, neither is hate speech.
00:19:00.060But I know that's part of what the American experiment is all about.
00:19:03.420They raise that bar very, very high for a reason.
00:19:06.500If you go back and look at those cases in the Supreme Court that decided what's legal speech and what is not, they made it pretty clear that they were willing to tolerate some pretty extreme stuff in order to protect the principle of free speech.
00:19:27.820No, I agree with that. But that's why this is so dicey, because if I were running YouTube, I would not allow that.
00:19:34.080I would not allow videos of how to make a dirty bomb to be posted, even if it isn't illegal.
00:19:38.960I mean, it's not unprotected.
00:19:42.380That is protected speech. But am I going to let somebody sit there and show people how to create that level of dangerous weapon, something that could kill a bunch of people, on my platform?
00:19:52.600I'm not because I'm not a government actor and I don't have to.
00:19:55.680So I would draw some lines, but I don't know.
00:20:00.700And would I have allowed the Alex Jones speech against the Newtown families over and over and over?
00:20:06.060I don't know. I mean, it's very sticky, right?
00:20:09.360It's like, there are gradations on this.
00:20:12.140And if you wind up canceling the Babylon Bee, you've gone too far.
00:20:16.040If you cancel James O'Keefe, you've gone too far.
00:20:18.700So I just feel like, why aren't there adults in the room who can distinguish between genuinely dangerous behavior, which can and has gotten people hurt or killed,
00:20:31.200and these false claims that words are violence, you know, like claiming that what was said about this Twitter general counsel is somehow the same as this other stuff?
00:20:42.180You know, look, I'm as close as one gets to a free speech absolutist.
00:20:59.000But even I, you know, grew up understanding that there was a whole range of things that, as a journalist, I can't say.
00:20:59.000Right. You know, we're trained that we can't commit libel.
00:21:09.000And we have to run things through lawyers before we publish and all that.
00:21:12.420And that's not the case on the Internet.
00:21:13.800And I understand that we have to come up with some kind of process for dealing with difficult speech.
00:21:20.940My criticism throughout this period has been that a lot of the people who are looking at this problem,
00:21:29.440I don't think they're really interested in solving those difficult issues that you talk about.
00:21:35.360But like, if you ask me, I think, you know, for something like Alex Jones or, you know, making bombs,
00:21:42.220I think there should be some kind of transparent, open process where, you know, you get to actually see how these things are decided.
00:21:51.340But what you've seen instead is a lot of politicians who seem very, very anxious to use the, you know, quasi-monopolistic power of these platforms to push speech in a certain direction.
00:23:13.260I would just quickly like to point out that when they started this campaign,
00:23:17.640obviously, a lot of the people who were discriminated against first, and you talk about viewpoint discrimination, a lot of them were on the right, but a lot of them were on the left, too.
00:23:30.180I mean, some of the media outlets that saw enormous drops in traffic when companies like Google were told that they had to prevent the foment of discord
00:23:43.080were outlets like Truthdig and the World Socialist Web Site and even Democracy Now, because the new algorithms essentially just favored large carriers over small ones.
00:23:55.980So I just wanted to point out that it's not only one side.
00:24:03.920But I do think that there has to be some way to do this that mimics the effectiveness of the litigation-based system that we had dating back to New York Times v. Sullivan.
00:24:17.920You know, as a journalist growing up in that era, I always felt like the system worked extremely well, because the rules were very clear about what we were and were not allowed to publish.
00:24:34.040There was a pretty high bar that you had to meet to prove that somebody had committed libel or slander.
00:24:40.540And yet when there was a real egregious violation, it was usually, if not career ending, close to it.
00:24:49.000And you just raised a good point, though, Matt, because back in those days, you know, these were the Barack Obama golden days,
00:24:55.800and in this way I see the point: there was a self-imposed high bar of class, of dignity, of not, you know, unfairly targeting one individual over and over or creating a circumstance where somebody could literally get hurt.
00:25:10.040You know, you wouldn't have had Alex Jones in print, you know, in the Times and in the Post back then.
00:25:18.020And those papers were more respectable.
00:25:20.640They still had a left-wing bias, but it was nothing compared to what they are today.
00:25:24.280You know, they were definitely more committed to trying to be fair.
00:25:29.180And then they would not have allowed these types of things to appear in their papers.
00:25:35.420So it was sort of a better approach on both sides.
00:25:38.620They were less censorious, but they had a higher bar for what could be printed and, you know, who could be targeted in the first place.
00:25:45.640Well, that's what I mean. I think that's what we're all striving for: a system where there's kind of a sensible self-censorship before you print something.
00:25:58.600I mean, the processes that we went through before we published things in major magazines, I always thought that was a good process. We weren't afraid to use strong language.
00:26:11.820We weren't afraid to say things about people if we had a strong opinion.
00:26:15.400But when it came down to facts, you know, we had to be accurate.
00:26:22.160And if it was a close call, we usually erred on the side of caution and left it out of the paper, because the penalties were high.
00:26:31.540Now, on the Internet, there's nothing like that right now.
00:26:34.700No. And there's plenty of people who don't do any fact checking at all.
00:26:37.540No, there's no fact-checking. And this has bled into the, quote unquote, mainstream media, which has learned that its audiences now forgive mistakes as long as they're in the right direction.
00:26:51.960So they're not careful anymore. They make constant factual errors.
00:26:57.820They don't worry about it. They don't worry about being sued for libel nearly as much as they would have once upon a time.
00:27:05.260Again, I'm not particularly sympathetic to Kyle Rittenhouse, but I was shocked by the way he was described in the first days of that story.
00:27:15.100Like there were major news outlets that were calling him a white supremacist.
00:27:19.180The president, or sorry, the future president, was calling him that.
00:27:24.320And again, once upon a time, you would have needed something to go on in order to use that terminology.
00:27:32.420And they didn't. They just did it because the landscape has changed so much.
00:27:36.800So, yeah, it's a big problem. I understand that something has to be done to fix all the craziness on the Internet, or at least address it.
00:27:49.420But what they're doing instead, I think, is leaving the system in place so that they can push speech in a certain direction.
00:27:58.480And that's the sense, the clear sense, that I get from them.
00:28:03.260I mean, I think if you asked me to sit there and say, what's the difference between, you know,
00:28:11.400threatening messages that actually could harm, physically harm, somebody (forget emotional harm, we just can't deal with that),
00:28:18.900I could tell the difference between a tweet that did that and something that just expressed a controversial view.
00:28:26.420And so, you know, I feel like maybe what Elon Musk needs is people who are just less ideological, you know, people who are committed to free speech as a principle,
00:28:37.800but people who are reasonable and don't want to see people get hurt unnecessarily because you've got some lunatic on the Internet continuing to dox somebody and call for violence, or come close enough to that line.
00:28:50.280But they don't have ideological diversity at these companies.
00:28:53.940And, you know, I've told my audience before, Matt, I went out to Silicon Valley in 2016.
00:28:59.220It was 2016, right before the election.
00:29:01.720And I met with the heads of a lot of these companies.
00:29:06.320I was meeting with the top executives.
00:29:07.660And they wanted to know my thoughts on how they could do better at what they recognized as their own ideological bias.
00:29:15.440And I told them all the same thing, which is get more ideological diversity on your boards, get more ideological diversity in your C-suite.
00:29:24.880And certainly, if you have any sort of a monitoring or censorship group, make sure it's totally even, totally even.
00:29:33.140You can't just have a bunch of people on one side of the aisle making all these calls and not expect that to be reflected in your decision making.
00:29:45.980And again, I think this gets to the fact that although some people ask you for your advice, mostly people don't want to do that kind of self-reflection.
00:29:58.500Mostly they want to exercise that authority in a certain way, which is unfortunate.
00:30:03.900And as for the clear line between threats and opinion: there's lots of stuff that's already illegal that's allowed on these platforms.
00:30:15.060And, you know, the platforms would do well if they just focused on eliminating the stuff that's already against the law.
00:30:26.480Let's try to cut down on threats because those are already against the law.
00:30:29.980Right. We don't need a special, you know, new policy to deal with that.
00:30:37.020There are laws about that. Where they get in trouble is where they try to establish things like factual truth and say that something is disinformation or misinformation, because that's a moving target that you basically can't get right in a way that's going to be fair.
00:30:56.380And, you know, or they try to define something that's an opinion as being beyond the pale and abusive and hurtful, when hurtfulness isn't a standard that can be applied in any consistent way.
00:31:20.880That's why it's like, OK, well, what is bullying?
00:31:23.720Perhaps if there's some large campaign, you know, directed at one person that just completely upends the person's life. But it would have to be massive, massive.
00:31:33.360Not just a few tweets from the Babylon Bee that get, you know, a bunch of likes.
00:32:10.940I love all these libs who are acting like it's been rainbows and unicorns.
00:32:14.100Walk a mile in my shoes on the Internet while you people run it, because it's been disgusting.
00:32:19.540Yeah, and a totally humorless and miserable experience for quite some time now.
00:32:26.320I also think they get into incredible trouble when they try to police misinformation and disinformation, because, I think, most journalists understand,
00:32:40.340I mean, Megyn, you know this, in the first days of any news story, there's always some error baked into the reporting that only comes out later.
00:32:49.680Right. So if you have some kind of star chamber of fact-checkers who are declaring this or that to be the truth, and everything else needs to be wiped out, inevitably what's going to happen is you're going to have fiascos like the lab leak business, where, you know, for some initial period, they're going to declare:
00:33:13.520Well, this is an untruth. This is a conspiracy theory.
00:33:17.680Oh, but six months later, it turns out it might be true.
00:33:25.400And once you do that, you lose all credibility with audiences.
00:33:28.660And now what's going to happen is they're going to distrust what you call the official, trusted version of reality.
00:33:37.940They're going to distrust it even more once you make a couple of mistakes like that, and they're going to drift even more towards conspiracy theories.
00:33:45.800So that, for me, is a fundamental misunderstanding of how news consumers work.
00:33:51.240If you try to weed out conspiracy theories and crackpots and all these other things in the name of truth, what you end up with most of the time is more of that.
00:34:02.660And I think that's not very well understood.
00:34:06.100Good. I'm going to squeeze in a break, but first I'll read this; it's very well worth your time.
00:34:54.200So the reason I stumbled on the intro is because I've got Joe Biden in my head.
00:34:57.600This just in: he made remarks this morning that Senator Tom Cotton of Arkansas is tweeting out as, quote, alarming, because of a little bit of slurring and a lot of stumbling.
00:35:12.360We're going to seize their yachts, their luxury homes and other ill-begotten gains of Putin's kleptocracy and the guys who are the kleptocracy.
00:35:34.400Well, I've told this story before, but trying to cover Biden's issues on that front was actually one of the reasons I ended up moving to Substack.
00:35:50.920Because I was doing a feature on Biden on the campaign trail for Rolling Stone, and I was noticing what everybody else was noticing: this guy's having trouble getting through sentences.
00:36:05.320Every time he has to ad-lib, he gets lost, he forgets where he is, he forgets what the question is. And I called back some of the people I had talked to for a story about the potential use of the 25th Amendment to get Donald Trump removed on the grounds that he was mentally incompetent.
00:36:28.420If you remember, there was a big drive to do that, and I was assigned to cover that story. Lots of psychiatrists were very happy to talk about it then, but nobody would talk about the Biden issue.
00:36:41.320And I just realized that we were in a completely different media environment where, you know, certain things were just sort of off limits.
00:36:49.060And I think we did the country kind of a disservice by not talking about this a whole lot before he was elected.
00:36:56.820Right. Did you see the Title 42 thing last week?
00:37:01.480Oh, oh, you've got to see it. We have it. So he was asked about, I think, about Title 42.
00:37:09.120My team will refresh me whether the question was about 42 or the mask mandate being struck down.
00:37:14.880It was one or the other. Hold on. Go ahead.
00:37:18.100OK, so the question was about the mask mandate being struck down by a federal district judge in Florida, and he answered it about Title 42, the COVID immigration regulation that allows our border agents to reject everyone who wants asylum.
00:37:37.320So he gets totally confused about the two. He starts meandering, he starts intertwining them. Just take a listen.
00:37:44.840On Title 42, sir. Are you considering delaying lifting Title 42?
00:37:49.360Now, what I'm considering is continuing to hear from my my first of all, there's going to be an appeal by the Justice Department, because as a matter of principle, we want to be able to be in a position where if, in fact, it is strongly concluded by the scientists that we need Title 42, that we'd be able to do that.
00:38:14.360Like, my God, so you hear he's asked about the mask mandate. He starts meandering all over about 42.
00:38:20.080He can't keep it straight. Vice versa. Neither can I right now.
00:38:23.360But I'm not the president and I wasn't facing the reporters and he had to issue a cleanup later in a written statement.
00:38:29.240We've seen it happen time and time again.
00:38:31.700Yeah, it's certainly not reassuring when you look up at the president of the United States.
00:38:38.160And the emotion that's being betrayed in his eyes is terror, because he's not quite sure what the question is, or whether he's answering appropriately.
00:38:52.320I've seen this with some other politicians in the past, but Biden got worse quickly during the last election.
00:39:04.480And again, I think the reporters just kind of decided not to talk about it because they had already decided that he was going to be taking on Donald Trump and they didn't want to give him ammunition, which I think was a huge mistake.
00:39:17.660Did those presidents' last names rhyme with Megyn? Because there was a real issue with one of them in his second term that went on to become quite a news story.
00:39:27.200Right. Yeah. Well, Reagan was one of the ones I was thinking of.
00:39:30.100You know, I've seen it. I saw it with Boris Yeltsin when I lived in Russia.
00:39:35.040You know, I think the issues there might have been a little bit different, but, you know, similarly, he had some cognitive issues.
00:39:41.980But look, you know, this is what happens when reporters start messing with things beyond their purview.
00:39:52.300Like, our job is just to tell you what we see and, you know, worry about whether it's right or wrong.
00:39:59.140And then it's up to the public to figure out what they think about it.
00:40:02.160But what started to happen in 2016 when Trump came on the scene is reporters suddenly were like looking at news stories.
00:40:09.920Just to take an example, there was that issue with Hillary Clinton not filling her halls.
00:40:16.460Right. So she was having trouble filling the halls, and reporters got together and kind of silently decided not to make an issue out of that, because they didn't want to make it look like her campaign was doing badly.
00:40:29.960But that ended up hurting her because it created a false sense of security in the campaign.
00:40:37.040And, you know, instead of doing something to try to fix it, they just kept going, and they ended up losing.
00:40:43.960So, you know, reporters should just tell us what they see and, you know, let the chips fall where they may.
00:40:51.740And they won't affect history in a negative way, at least that way.
00:40:55.860Well, and it's like, you know, when grandpa starts to lose his marbles, you know, when he starts to go south, grandpa can be easily manipulated.
00:41:05.060You know, we don't do that because we love grandpa.
00:41:07.100But this is the sitting president of the United States.
00:41:09.700And we were promised somebody who wasn't going to be some far left wokester.
00:41:19.620And we were promised somebody who said he was very skeptical of, quote, forgiving student loans because he understood the problems that would create and the fairness issues it would create.
00:42:02.340I didn't really do it either.
00:42:04.380But somebody needed to do that story, and somebody needs to do it now, too.
00:42:10.020And we're not really doing it.
00:42:12.460We know that there's some infighting, but we don't know.
00:42:15.700We don't know exactly how decisions are being made.
00:42:18.320Well, so Joe Biden is doing something that Trump didn't do.
00:42:21.980And that is, as the sitting president, he's about to go to the now, you know, reborn White House Correspondents Dinner, which is going to happen in Washington, D.C. this weekend.
00:43:35.360And the other sort of subplot to all this, Matt, is that Dr. Fauci was supposed to go, but bailed, because the four-time-vaccinated Fauci doesn't think this is safe.
00:43:45.020Yeah, I mean, that story is ridiculous on so many levels that it's just hard to even know where to begin.
00:43:53.700But they've been consistently irrational about this from the very beginning.
00:43:59.700You know, from the very start, they were saying to us that they didn't really think the vaccines worked.
00:44:10.500You know, why did we have to stay in lockdown if the vaccines were effective?
00:44:15.340Well, you know, they just don't really believe in them.
00:44:19.160And I think they're sending mixed messages, which, again, gets back to the point of, you know, when people stop trusting you, that's when they drift even more towards conspiratorial interpretations of things.
00:44:29.160So I think it sends a terrible message, what he's doing.
00:44:35.860It's like, aren't the vaccines supposed to protect us from severe illness or death and reduce covid to something rather mild that the average person can handle?
00:44:47.200So why are they behaving like this is the very first form of covid, which actually was far more severe than what we're dealing with now, Omicron, or whatever, the second version of Omicron?
00:44:57.280Why are they pretending like it's still that version and we have no vaccine and we have no therapeutics, right?
00:45:04.000They aren't going out and living their lives.
00:45:06.320Or maybe it's just all one big, massive virtue signal to try to cover for their overextended big-government hand, which is still literally over the mouths,
00:45:17.940in effect, I guess, not literally, of little children in New York City, two-year-olds who are masked.
00:45:22.520Yeah, clearly there were people who just loved all of the rules to a degree that was a little bit unseemly.
00:45:32.400Like there were lots of policies in the last two years where I thought, well, maybe I agree with that.
00:45:37.600It's possible that that might be the sensible thing to do.
00:45:41.320But I was put off by the glee with which people were, you know, glad to impose some of these restrictions, especially with schools and kids, where, you know, it suddenly became taboo to talk about the fact that kids didn't really get sick with this very much.
00:46:06.560Well, it's like Brian Stelter's "would you go to a party with no rules?" speaks for so many of them.
00:46:15.540OK, listen, when we come back, I'm going to play you Dr. Fauci, who literally in the course of a few hours declared the pandemic was over, only to reverse himself moments later.
00:46:23.960It's not over. It's over. Celebrate you.
00:46:30.920There's much, much more to go over, including the news we just got about what we're prepared to do in Ukraine, where Matt has had some good thoughts on Russia and what our potential role should be all along.
00:46:46.260So staying on the subject of Fauci, literally in the course of a few hours, he said the pandemic was over only to reverse himself and say, no, it's not over.
00:46:57.920It's never going to be over for Dr. Fauci, I'm sure.
00:47:01.040Take a listen to these butted soundbites.
00:47:03.400We are certainly right now in this country out of the pandemic phase.
00:47:26.300And again, this just gets back to why you can't have YouTube or Google or Facebook or Twitter relying upon government officials to tell you what the truth is about something.
00:47:42.500Because even they don't know, they change their minds every 10 seconds about stuff, including like really important things, like whether or not to wear a mask or, you know, whether the vaccine is actually going to protect you from getting infected.
00:47:56.500Like, that's why you cannot have top-down information controls, because, you know, the truth is always a moving target.
00:48:09.940I feel like he had a momentary slip, you know, when he said it's over, because I don't think he's ever going to say that and really mean it.
00:48:35.420Yeah, I think they took out the cattle prod and found a nice quiet room somewhere to set him straight about what the official message is.
00:49:17.960Do you think, you know, these politicians and bureaucrats and school administrators follow through with these things?
00:49:24.160The writer Christopher Lasch once said the essence of propaganda was keeping the public in an ongoing state of emergency.
00:49:35.060And I think, especially in the Trump years, we've fallen into the pattern of always being in an emergency, and of politicians finding ways to find that useful.
00:49:47.680The pandemic has been extremely useful to politicians.
00:49:52.960It has given them the ability to dictate all kinds of behaviors and to allow them to stick their fingers in things like the news and Internet content moderation.
00:50:07.440I don't think they want the emergency to end.
00:50:10.280I think they like this new normal, you know, and it's a problem.
00:50:15.360You know, the idea that there aren't people who are motivated to end crises is a big problem just generally, I think, in politics.
00:50:26.700So speaking of the vaccine mandates and how they've impacted people's lives, an interesting couple of cases in the news.
00:50:38.220Sage Steele of ESPN just filed a lawsuit against ESPN and its parent company, Walt Disney, alleging that the company treated her unfairly for comments she made on a podcast interview last September.
00:51:03.840OK, but since that interview, she says she's been sidelined from the prime assignments.
00:51:09.920She does continue to anchor the noon Sports Center broadcast, but quite a few things were taken away from her and she was pulled off the air for some big assignments, she says.
00:51:17.460So she had gone on former NFL quarterback Jay Cutler's podcast and shared her thoughts on ESPN's vaccine mandate, sexism in sports journalism, and Obama's ethnicity.
00:51:29.760The fact that he selected black as his ethnicity on the census even though he's biracial; she's also biracial and had some thoughts on it.
00:51:38.740So here's what she said on the Jay Cutler podcast that she's now alleging she was punished for.
00:52:15.760That said, we expect those points of view to be expressed respectfully in a manner consistent with our values and in line with our internal policies.
00:52:23.200She got hit by, of course, Jemele Hill, who just once again lost yet another show over there on CNN Plus.
00:52:33.020And then ESPN required her to issue an apology.
00:52:37.340So here's the thing about ESPN: normally they could punish her for her viewpoints, because they are not a government actor.
00:52:44.720But the state of Connecticut, where she is and where I am, they apparently have a law that actually says corporations can't always do that.
00:53:05.780I'm of two minds about this, because, you know, I remember when Liz Spayd, the former public editor of the New York Times, got in trouble some years ago, among other things, for talking about New York Times writers being on social media too much.
00:53:28.120And, you know, I understand the rationale for that because once upon a time, you know, in my father's day when he was on the news, viewers didn't really know a whole lot about the political views of reporters.
00:53:43.600And that actually added to their credibility: like, you know, if you didn't know whether a person was liberal or conservative and they were just delivering the news, it did kind of tend to make people more likely to believe that they were watching a news program.
00:54:06.080However, you know, nobody really is just a pure newsreader anymore and everybody has a social media presence.
00:54:13.800So I also think you can't punish them, especially at ESPN, for talking.
00:54:19.920They're encouraging these anchors to go out there, and they've forced moments of silence on them, and they've gotten very politically active on the air there.
00:54:29.760Yeah, I mean, and again, I know a lot of people in the news business who were outright told by their bosses, like, you have to get a Twitter handle.
00:54:42.220You've got to have more of a presence in social media.
00:54:46.040Clearly on ESPN, you know, they're trying to build up the brand, the individual brands of all of these on air personalities.
00:54:53.100So when they do that, but it comes out in a way that doesn't fit with some kind of orthodoxy, I don't think you can punish those people.
00:55:04.260It's once again, it's viewpoint discrimination.
00:55:06.380By the way, the Connecticut law, just to clarify what I said, it states companies cannot discipline employees for exercising their First Amendment rights as long as the comments do not directly impact their work performance or the company.
00:55:17.420She's arguing that her comments were made on a third-party podcast and that she should be considered a private citizen in this situation, making these comments.
00:55:25.040The thing is, like, I don't see how ESPN gets away with punishing just her, given its push to make its anchors go totally woke on the air.
00:55:35.180And now you have one person here who happens to be a woman of color who pushes back on some of the narrative.
00:56:15.620Same as, you know, some audience members may get offended by the incredibly woke, anti-patriotic statements coming out of the mouths of the anchors sitting on set during the big basketball games or the big football games.
00:56:57.400I mean, I think a lot of these companies have gotten away from what really works.
00:57:06.060You know, sportscasting used to be a really interesting and colorful and creative wing of the media world, because they were able to write with style.
00:57:20.020They were able to use humor and wit in ways that regular newscasters weren't really allowed to do.
00:57:27.300But it's become just as dreary in a lot of ways as the rest of media.
00:57:31.480And I don't really understand why they would voluntarily do that.
00:57:51.300One of these cases is the Amber Heard-Johnny Depp defamation case.
00:57:55.620In 2018, she claimed in The Washington Post that she was a domestic abuse victim.
00:58:00.460This is two years after she had made sure she was caught on camera by the paparazzi with what she claimed was a bruise on her face from a phone she said he threw at her.
00:58:23.800He got fired from, I guess it was the fifth installment of Pirates of the Caribbean, right after that, and lost millions of dollars, not to mention reputational damage.
00:58:33.360And he's filed a lawsuit for defamation against her.
00:58:36.520And the trial has not gone well for her.
01:00:14.380So that's her admitting basically on tape that she cut his finger off with a vodka bottle and him complaining about it and her kind of mocking him like, oh, go ahead.
01:00:35.880You know, I've obviously gotten in trouble over the subject in the past.
01:00:40.680And I do understand the idea that there needs to be an initial reaction that we believe women at least enough so that they get a hearing, you know, rather than not believing them at all.
01:01:11.920Which is what happened, you know, in the past.
01:01:14.440And that's something that definitely needs to be corrected.
01:01:16.900Yeah, I'm doing a story right now, I can't really talk about who it is, but there's a company that's, you know, gotten in a lot of trouble and had all sorts of issues financially, really over allegations and not really over substantiated conduct.
01:01:38.940And this is something that's just become a little bit too easy, I think, in modern media, which is, you know, we raise an allegation of something, or we imply that something happened.
01:01:50.480And before you know it, you know, Twitter takes off and turns it into a fact.
01:01:55.920And next thing you know, it's a reputational harm issue.
01:01:59.080And we can't have that person working at our company, because, you know, the staff will be upset about it.
01:02:09.680Like, I think there has to be some kind of happy medium, where you have to prove these things out before people really, you know, go through serious damage.
01:02:20.140I mean, you're not wrong, because I do think the "believe all women" thing was always a lie, and stupid, and absolutely un-American.
01:02:32.220The worst-case scenario is you're charged with a crime, and the system says you get a presumption of innocence because the state has such an advantage over you: you're sitting there in shackles, and he or she gets to go in on the other side in their suit saying, I represent the United States of America.
01:02:49.000For those reasons, because the deck is stacked against the defendants, we give them a presumption of innocence.
01:02:53.720We want to hold the system to account before we throw somebody in jail and take away their freedom.
01:02:57.920But you don't get that presumption of truth-telling in any forum, including a court.
01:03:05.380And so I'm glad he brought this case because she really was painted as just this poor victim who'd been abused by him.
01:03:15.560And definitely he suffered from it financially and otherwise.
01:03:18.080Not that he needs the money, but still, it's just the principle.
01:03:21.720And I think this trial has exposed that, at a minimum, these situations can be a lot more complicated than we admit.
01:03:30.820Well, and this has always been a big issue for me over the years, which is that a lot of reporters think that, you know, there's a playbook to news stories or that you can, you know, lapse into cliches when you report things.
01:03:54.040The reality is you have to clean your slate every time and approach every new story as a completely new set of facts.
01:04:02.000Because, you know, what might be a Matt Lauer story, you know, in one instance, you know, you might have a completely different fact pattern the next time.
01:04:11.700You can't carry over expectations from your previous reporting and just kind of shoehorn in a, you know, a cliched understanding of what happened.
01:04:22.780And I think we've gotten away from doing that, from just wiping the slate clean each time.
01:05:08.220Germany has now reversed itself on sending arms to Ukraine after claiming it would tap into its reserves.
01:05:15.440So some rollback from the Europeans, America sending more money.
01:05:19.560There's still some calls from Republicans and Democrats even for us to get more involved, more weapons.
01:05:27.060And even still, some people are saying no-fly zone and so on, though I don't think that's going to happen.
01:05:31.660So what do you make of where the United States is now and where this conflict is now?
01:05:36.360So, first of all, I was one of the people that got this wrong.
01:05:42.860Like, I never expected Russia to actually invade Ukraine, or at least the western part of Ukraine.
01:05:51.660And so I made a wrong call on that.
01:05:53.980And then you did something extraordinary.
01:05:55.720You admitted that you were wrong and you apologized to your listeners and your readers, which is all that's expected, but nobody does that anymore.
01:06:08.780You do have to do that, but, you know, I got that wrong, and it's an unpredictable situation.
01:06:13.580But I think what's happened over time is that we're not really reporting on what the United States' policy is.
01:06:28.300You know, Secretary of Defense Austin said this, I thought, really fascinating thing this week, where he said that, you know, basically our plan is to weaken Russia so that it can't do this to the next country.
01:06:43.900Now, that seems to me at cross purposes with Ukraine's mission in all this. I'm sure Ukraine wants to defeat Russia militarily, but they may also come to a point where they just want to end the conflict with minimal damage.
01:06:59.460And so if the United States is committed to a different policy, where we're not going to give them the ability to negotiate, for instance, the end of sanctions,
01:07:12.900then Russia is really at war with us, not with Ukraine.
01:07:17.140Like, if Ukraine doesn't have that autonomy, then this is an immensely complicated situation.
01:07:26.100And I also think the United States is delusional if they think that this is going to end in some kind of happy regime-change scenario in Russia. The much more likely outcome is that you're going to get a more hardline leader who's going to come in after Putin.
01:07:42.380And they're going to drop vacuum bombs on every city in Ukraine.
01:07:56.180I was joking the other day that they seem to think, Biden and, you know, those around him, that if they could just get rid of Putin, they'd get Jed Bartlet, you know, there he is, just waiting.
01:08:13.420I mean, were they not paying attention in the last 30 years?
01:08:17.560I mean, that's the thing that's amazing to me: the United States has already been around this track many times with Russia.
01:08:24.940Like, you know, we tried to foist, you know, an America-friendly leader on Russia.
01:08:32.800And those people were hugely unpopular, mainly because they were friendly with the West.
01:08:39.060And it was part of the reason we got Putin in the first place, because, you know, Boris Yeltsin was seen as too close to the United States.
01:08:47.780Putin was seen as somebody who stood up to us.
01:08:52.760So the person who comes in after Putin, if they think it's going to be like, you know, Emmanuel Macron or something like that, they're...