The Megyn Kelly Show - April 28, 2022


Elite Panic Over Alternative Media Power, and Press Ignores Biden's Mental Fitness, with Matt Taibbi | Ep. 310


Episode Stats

Length

1 hour and 10 minutes

Words per Minute

165

Word Count

11,618

Sentence Count

769



Summary

On today's show, Megyn Kelly and Matt Taibbi discuss the Biden 2020 campaign and the new administration's push to make the internet a place that is under some kind of governmental control. Plus, a story about a recent interview that was pulled because the guest demanded an edit.


Transcript

00:00:00.500 Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.700 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show. In just a second, I'm going to be joined by Matt Taibbi.
00:00:18.200 But first, I wanted to tell you that today we uploaded exclusive content on our podcast feed for all of our podcast listeners.
00:00:27.240 It has insider details on our show, my thoughts on how our first 18 months on the air have gone.
00:00:34.880 We've done 18 months of the podcast, and, I don't know how many months now, since September, with Sirius.
00:00:42.560 And we've included a story about a recent interview we pulled.
00:00:46.400 We did not air it because the guest, who champions themselves as a warrior for free speech,
00:00:55.740 lifting themselves above divisive politics, demanded an edit right after the interview that we refused to comply with.
00:01:03.680 Turns out the guest wasn't quite as courageous as they wanted our audience to believe, and we'll get into what happened after that.
00:01:10.880 So you might find it interesting. It's short and sweet, and I think you'll enjoy it.
00:01:16.100 Anyway, you can find the podcast on Apple, Spotify, Pandora, Stitcher, or wherever you get your podcasts for free.
00:01:21.920 And while you're there, you will find over 300 episodes. Our archives are all on our podcast feed.
00:01:28.080 So go ahead and download, enjoy, follow the show, so you can listen whenever you want.
00:01:33.680 If you're not able to catch us live on Sirius XM Triumph Channel 111.
00:01:37.680 Thanks for having me, Megyn.
00:02:07.680 And now they are attacking us as disinformation purveyors, the random American citizen who has access to a computer or a microphone.
00:02:18.580 And the chief of this effort is going to be a woman named Nina Jankowicz, who will be the executive director.
00:02:26.220 As far as I can tell so far, her most notable accomplishment is calling the Hunter Biden laptop a, quote, fairy tale, a, quote, Trump campaign product.
00:02:36.260 And this is the person who now will be in charge of regulating what is and is not, quote, disinformation.
00:02:42.740 What do we make of it?
00:02:43.760 Well, this has been going on for five or six years now.
00:02:47.760 There's been, ever since Donald Trump was elected, a pretty concerted effort on the part of mainstream politicians, really in both parties, but particularly in the Democratic Party, to make the Internet a place that will be under some kind of governmental control.
00:03:08.700 And this began in 2017 when we had members of the Senate calling up executives from Facebook, Google, and Twitter to the Hill and essentially demanding that they come up with strategies to prevent what they called the foment of discord.
00:03:26.880 Back then, the bread and butter, right, exactly.
00:03:31.640 The boogeyman back then was Russian disinformation.
00:03:35.760 Then it became hate speech.
00:03:37.520 Then it was disinformation about the pandemic.
00:03:42.480 You know, now we're circling back to Russian disinformation with regard to the Ukrainian conflict.
00:03:48.780 And, you know, I think the problem is we're in a generation of people who agree that there's a problem with disinformation in the media landscape, but they don't understand that the biggest lies are always official lies.
00:04:05.440 And the only real defense against that is free speech.
00:04:09.520 And so they want this top-down system of control, which I think is very, very dangerous.
00:04:16.020 Hmm. That's so true.
00:04:17.400 If you hear the Barack Obama remarks from late last week, he was longing nostalgically for the days in which it was just ABC, NBC and CBS and information was controlled.
00:04:30.120 Right. We didn't have all these Internet hacks and trolls out there pushing so-called disinformation.
00:04:35.700 And I'm sure that was a much more delightful time for people in the position like he had at the White House.
00:04:42.300 But think of all the lies that have been told to us over the years from people in that post that the evening news, for whatever reason, went along with or had an incentive not to check too far into.
00:04:53.400 And today it's no different.
00:04:54.880 You know, today it's like, OK, do we not think that the people at Fox were manipulated by Trump?
00:05:00.180 Do we not think that the people at every other network are manipulated by Biden?
00:05:03.980 Like, that's the way it works.
00:05:05.040 Right. But the problem is people have an alternative now.
00:05:09.640 They have a way to get around that, which they didn't have before, as President Obama noted.
00:05:15.040 And I remember this pretty graphically because I was a campaign reporter in 2004 and 2008, back in those alleged salad days, or I guess towards the end of them.
00:05:28.040 And I would be on the bus listening to journalists talk about which candidates they thought were serious or electable and which ones weren't.
00:05:39.960 So, you know, you'd be in a bus full of CNN and Fox and MSNBC anchors, and they'd be scoffing at Dennis Kucinich, saying, no, we're not going to take him seriously.
00:05:51.540 And then there would be some other candidate like John Kerry, like, oh, he's electable. And they made those judgments, and they were important judgments, because what they signaled to audiences back then had an enormous impact on how voters behaved at the ballot box.
00:06:09.160 It's different now. Ironically, Barack Obama was a beneficiary.
00:06:14.260 He was one of the first people to lose the so-called invisible primary, the primary of donors, and still win the nomination.
00:06:25.800 But then when when Trump broke through in 2016, that was really when the chokehold of those networks collapsed.
00:06:32.180 And they missed that. They really did.
00:06:35.480 And so how do we think this is going to work?
00:06:37.440 I mean, can this woman really crack down on anything? Like, is the DHS going to come try to censor what happens on your show, on this show, on your Substack?
00:06:49.920 Like, how on earth is this going to work?
00:06:52.760 I don't know. I mean, I think we've already seen that they'll go to pretty extraordinary lengths to try to have influence over information that's online.
00:07:04.660 And we've seen in the last six years that there's been pretty extraordinary cooperation between the Senate, between bodies like the CDC and the FDA, and platforms like Facebook, YouTube, and Twitter.
00:07:23.960 I mean, I did a story randomly about a podcaster who was having trouble with YouTube.
00:07:33.440 And when I called them up for comment and asked them how they decided what was misinformation and what wasn't, they just told me outright that they made those decisions in consultation with federal agencies.
00:07:46.360 So I think this is the world we're going to be living in: a basically privatized speech landscape, but with political actors from the government influencing the moderation decisions of those platforms.
00:08:02.780 It's very, you know, we're getting a lot closer to censoring blatantly protected First Amendment speech. The reason Twitter and Facebook and YouTube can censor content is because they're not the government.
00:08:18.500 They're pretty close. They're pretty close to having the power and certainly the fingerprints of the government all over their editorial decisions.
00:08:25.180 But technically, still, the law does not recognize them as the equivalent of or certainly as an actual government actor.
00:08:32.040 That's not the case for the DHS; they're not allowed to censor our speech.
00:08:35.860 So I don't really know what they think they can do, but they may be sad to realize it's written right there in the First Amendment.
00:08:42.620 They're not allowed to censor speech. There's a teeny tiny category they get to touch and the vast majority of what they're going to object to won't be in it.
00:08:50.220 That's true. That's true. But somebody still has to do the test case.
00:08:55.560 They have to file that lawsuit and win that First Amendment case.
00:09:01.120 And, you know, who's going to do that? The reality is these government agencies have already been meddling with speech on private platforms,
00:09:11.460 whether it's the FDA and the CDC, you know, sort of encouraging platforms like YouTube to go by their guidelines and deciding what's misinformation and what isn't.
00:09:23.740 Or it's the FBI, which has been in consultation with some of these platforms about things like hate speech and which groups might need cracking down on.
00:09:34.440 So, yes, you're right. I think there's already a powerful First Amendment argument that that they've crossed the line, but that has to be challenged.
00:09:44.700 Then who's going to issue that challenge? It's a very difficult road ahead.
00:09:49.160 I mean, possession is nine tenths of the law. And if they're already doing this, it has to be undone.
00:09:55.820 It doesn't matter if it's illegal. Hmm. Well, speaking of possession being nine tenths of the law, people on the right half of the aisle, or really just the sane half,
00:10:04.960 not just righties but a lot of centrist lefties, too, are rejoicing that Elon Musk is taking over Twitter, is going to buy Twitter.
00:10:12.900 And so it appears. And there's a stock problem at the moment.
00:10:15.540 And we have to keep our eye on the Twitter shares and on the Tesla shares, because he does need money in order to buy it.
00:10:21.480 But Peter Schiff came on and explained all that to us yesterday. I don't totally understand it, but it's not a done deal.
00:10:27.500 It's not a totally done deal. He's very, very rich, but he does need his actual forty-four billion dollars, or at least twenty-one billion of it, to buy it.
00:10:34.140 So they're happy, but he's already taking crap from, I mean, all corners of Twitter, and the Twitter top executives. The chief legal officer, the general counsel, was reportedly in tears upon learning on Monday that he was actually going to close the sale.
00:10:49.540 You know, they agreed to the sale, and Elon, I guess, took a shot at her in a tweet yesterday and then took all sorts of crap in response.
00:11:02.740 Right. He shared a meme Wednesday that mocked her response to accusations of the company's political bias.
00:11:11.120 And it was this thing involving Tim Pool and Joe Rogan, when they all went on there, and how she's got the circular reasoning and she denies viewpoint discrimination.
00:11:17.460 But we all know they do it. And this is the thing where, you know, the Twitter CEO from 2010 to 2015 responded:
00:11:25.560 What's going on? You're making an executive of the company you just bought the target of harassment and threats.
00:11:30.700 Bullying is not leadership. And Elon kind of defended himself saying, you know, what are you talking about?
00:11:35.400 So this is the thing. He doesn't behave like your normal CEO or owner when he takes over these massive corporations.
00:11:43.240 You know, he's being sued right now for some of the tweets he sent out about Tesla.
00:11:46.960 But this is how they get you. You offered your opinion.
00:11:51.240 His opinion is that this woman is biased and has been wrongfully manipulating content on Twitter.
00:11:58.780 He said it publicly. The way they shut you down is how dare you? How dare you?
00:12:04.860 Online harassment, sexual harassment, comments threatening. Right.
00:12:09.640 We talked about this yesterday with Vivek Ramaswamy, which is he said they smuggle in viewpoint discrimination through the guise of hate speech threats.
00:12:21.580 You know, words are violence. All of those principles which are in active form at Twitter.
00:12:26.320 Yeah, absolutely. And I'm one of the people who's enjoying a schadenfreude moment this week, you know, watching the reaction to Musk potentially buying Twitter.
00:12:41.740 Because, first of all, I'm one of the few reporters who for years now has been covering the phenomenon of content moderation, who took it seriously from the very beginning, warned that this was going to be a problem that was going to become a bigger and bigger part of American life.
00:13:01.700 And I was laughed at by, you know, a lot of my colleagues, particularly my left-leaning colleagues who I thought were free speech advocates, you know, after the Alex Jones episode, which I thought was deeply troubling, not because I like Alex Jones,
00:13:18.240 but because of the precedent of big companies like, you know, Apple and Facebook and Spotify getting together and making sort of an ex parte decision to kick somebody off the Internet.
00:13:32.720 That was a radically different approach to policing speech than what I grew up with, which is: if you make a mistake, you get sued, and there's an open forum, and there's a process, and it's all transparent.
00:13:45.360 What I said from the beginning is that the issue isn't who is being censored.
00:13:52.260 The issue is how it's being done. And if you were in favor of a handful of mega-wealthy executives back then kicking people like Alex Jones, and, you know, eventually Donald Trump, off the Internet,
00:14:08.460 then you can't now cry that there's a different billionaire who you just happen not to like sitting in that same chair and meddling with speech in a way that you don't like.
00:14:20.420 You had to have objected on principled grounds before, and they didn't. And so, you know, look, I don't have any sympathy for these people.
00:14:28.760 They had a chance to stand up for something like speech principles once upon a time, and they didn't do it because they wanted to censor people.
00:14:38.880 And now they're getting, you know, a taste of their own medicine.
00:14:45.480 Let's talk about the Alex Jones case for one second, because I reported on all of that and lived through that in a weird way myself.
00:14:53.820 But I know very well that Alex Jones was in a weird place versus most people who get targeted on the Internet, because he had been serially unleashing very personal, in-your-home threats, pretty much on purpose, on the most sympathetic group of people in our country, namely the grieving Newtown parents.
00:15:22.120 So people who had their first graders shot to death in class kept getting harassed by his listeners, who he kept telling to believe this was all a hoax, that it was made up.
00:15:36.720 And one family had to go into hiding.
00:15:39.560 It was found in court that they had been receiving death threats from this lunatic inspired by Alex Jones.
00:15:44.240 Many of them had been having to deal with the Alex Jones listeners for years in deeply painful ways.
00:15:51.820 And honestly, Matt, it was like that's I don't think you can compare that to, you know, James O'Keefe with his secret camera getting dishonest New York Times reporters saying something in a bar one way versus what they put in the pages of the Times another way.
00:16:10.200 I just think he's in a class of his own. And it wasn't just the Newtown families.
00:16:13.960 I could go down the list for you of people who have been actually hurt by people he intentionally inflamed.
00:16:20.840 It's much closer legally to what we know as incitement, which is not protected speech.
00:16:25.700 Well, I don't think anything that he said with regard to Newtown was protected speech.
00:16:32.120 And I said that at the time. I also said that I think it was probably pretty obvious that he violated the terms of service of each one of those platforms.
00:16:42.420 Again, I had no interest in defending Alex Jones on any grounds.
00:16:48.520 The issue for me had to do with the method, right?
00:16:53.660 So once upon a time, the way we would have dealt with speech like Alex Jones was he would have been sued and the penalties, the financial penalties would have been so great that he probably would not have emerged with a career at the end of it.
00:17:10.880 I mean, that's happening.
00:17:12.940 That's happening. It just took a long time against him.
00:17:16.500 Right. And people were impatient to go through that process.
00:17:22.780 And I understand that. Like, look, as this is happening, none of those private businesses want to deal with that.
00:17:30.600 I totally get that. The problem is that by doing this, they opened the door for a new kind of speech policing that didn't involve any kind of open and transparent process.
00:17:46.500 All these companies got together, clearly coordinated. You know, they all did it at the same time.
00:17:52.580 And they essentially decided, you know, this person is going to no longer be on the Internet.
00:17:57.680 So that opens the door. You know, the next thing is going to be O'Keefe.
00:18:03.720 And then before you know it, it's the Babylon Bee. And that was the issue that I had.
00:18:09.380 I mean, that is what happened. That's why it's hard, because for years they've been censoring, like, these radical Islamists who wanted to show people how to build bombs and commit terrorist attacks.
00:18:21.620 And I've got zero problem with that censorship. Go for it. We don't need that shit on the Internet.
00:18:26.200 And, you know, people will die as a result of that. And I just that's indisputably not OK to me.
00:18:32.720 I don't see anybody defending that censorship by the big tech companies.
00:18:36.740 But that's that's not even censorship, because those things are actually against the law.
00:18:40.960 Like, you know, the authorities can come in and they can stop actual incitement, imminent incitement to violence.
00:18:48.800 No, but I'm talking about, if I just sit in front of the camera and I show you how to make a dirty bomb, that's not against the law.
00:18:56.260 No, it's not. And, you know, neither is hate speech.
00:19:00.060 But that's part of what the American experiment is all about.
00:19:03.420 They raise that bar very, very high for a reason.
00:19:06.500 If you go back and look at those cases, you know, in the Supreme Court that decided what's legal speech and what is not, they made it pretty clear that they were willing to tolerate some pretty extreme stuff in order to protect the principle of free speech.
00:19:27.820 No, I agree with that. But that's why this is so dicey, because if I were running YouTube, I would not allow that.
00:19:34.080 I would not allow videos of how to make a dirty bomb to be posted, even if there weren't.
00:19:38.960 I mean, it's not unprotected.
00:19:42.380 That is protected speech. But am I going to let somebody sit there and show people how to create that level of dangerous weapon, that could kill a bunch of people, on my platform?
00:19:52.600 I'm not because I'm not a government actor and I don't have to.
00:19:55.680 So I would draw some lines, but I don't know.
00:20:00.700 And would I have allowed the Alex Jones speech against the Newtown families over and over and over?
00:20:06.060 I don't know. I mean, it's very sticky, right?
00:20:09.360 It's like, there are gradations on this.
00:20:12.140 And if you wind up canceling the Babylon Bee, you've gone too far.
00:20:16.040 If you cancel James O'Keefe, you've gone too far.
00:20:18.700 So I just feel like why aren't there adults in the room who can distinguish between genuinely dangerous behavior that can and has gotten people hurt or killed?
00:20:42.180 And these false claims of words are violence, like claiming what was said about this Twitter general counsel is somehow the same as this other stuff.
00:20:42.180 You know, look, I'm as close as one gets to a free speech absolutist.
00:20:49.520 But even I, you know, grew up understanding that there was a whole range of things that, as a journalist, I can't say.
00:20:59.000 Right. You know, we're trained that we can't commit libel.
00:21:02.620 We can't incite people.
00:21:05.580 We can't do a whole list of things.
00:21:09.000 And we have to run things through lawyers before we publish and all that.
00:21:12.420 And that's not the case on the Internet.
00:21:13.800 And I understand that we have to come up with some kind of process for dealing with difficult speech.
00:21:20.940 My criticism throughout this period has been that a lot of the people who are looking at this problem,
00:21:29.440 I don't think they're really interested in solving those difficult issues that you talk about.
00:21:35.360 But like, if you ask me, I think, you know, for something like Alex Jones or, you know, making bombs,
00:21:42.220 I think there should be some kind of transparent, open process where, you know, you get to actually see how these things are decided.
00:21:51.340 But what you've seen instead is a lot of politicians who seem very, very anxious to use the, you know, quasi-monopolistic power of these platforms to push speech in a certain direction.
00:22:06.260 They're attracted by that power.
00:22:09.980 And that's where the danger is, because as soon as somebody sees that, oh, wow, if I just flick a switch, this person's gone,
00:22:18.140 they're going to be tempted to do it, to take the next step and find the next person they don't like.
00:22:22.820 And that's how you end up with the Babylon Bee.
00:22:25.800 But here's the problem.
00:22:26.960 So here.
00:22:27.340 So, OK, let's say that they do.
00:22:29.700 They do make it more transparent.
00:22:31.360 You know, we're going to be more open about how we ban somebody or what have you.
00:22:36.920 They don't care.
00:22:38.520 They don't care about saying: you, Babylon Bee, said Rachel Levine is a man, and that's hate speech.
00:22:44.780 That's harassing.
00:22:45.860 We have a policy against harassing someone based on their gender identity.
00:22:48.900 That's what you're doing, in our view.
00:22:51.240 And then flash to the trans person on their board who says hateful.
00:22:55.200 You have no idea the suicide rate.
00:22:57.180 You get that.
00:22:58.720 Therefore, you're banned.
00:23:00.660 And I don't think they'd have any qualms about owning what we see as viewpoint discrimination.
00:23:06.260 But what they see is just this universal nonbullying campaign.
00:23:11.640 Yeah, I agree.
00:23:13.260 I would just quickly like to point out that when they started this campaign,
00:23:17.640 obviously, a lot of the people who were sort of discriminated against first, and you talk about viewpoint discrimination, a lot of them were on the right, but a lot of them were on the left, too.
00:23:30.180 I mean, some of the media outlets that saw enormous drops in traffic when companies like Google were told that they had to prevent the foment of discord.
00:23:43.080 They were outlets like Truthdig and the World Socialist website and even Democracy Now because the new algorithms essentially just favored large carriers over small ones.
00:23:55.980 So I just wanted to point that out, that it's not...
00:24:01.360 But no, I agree with you.
00:24:03.920 But I do think that there has to be some way to do this that mimics the effectiveness of the litigation-based system that we had dating back to New York Times v.
00:24:16.600 Sullivan in 1964.
00:24:17.920 You know, as a journalist growing up in that era, I always felt like the system worked extremely well, because the rules were very clear about what we were and were not allowed to publish.
00:24:34.040 There was a pretty high bar that you had to meet to prove that somebody had committed libel or slander.
00:24:40.540 And yet when there was a real egregious violation, it was usually, if not career ending, close to it.
00:24:49.000 And you just raised a good point, though, Matt, because back in those days, you know, these were the Barack Obama golden days.
00:24:55.800 And in this way I see the point: there was a self-imposed high bar of class, of dignity, of not, you know, unfairly targeting one individual over and over, or creating a circumstance where somebody could literally get hurt.
00:25:10.040 You know, you wouldn't have had Alex Jones in print, you know, in the Times and in the Post back then.
00:25:18.020 And those papers were more respectable.
00:25:20.340 Yeah, sure.
00:25:20.640 They still had a left-wing bias, but it was nothing compared to what they have today.
00:25:24.280 You know, they were definitely more committed to trying to be fair.
00:25:29.180 And then they would not have allowed these types of things to appear in their papers.
00:25:35.420 So it was sort of a better approach on both sides.
00:25:38.620 They were less censorious, but they had a higher bar for what could be printed and, you know, who could be targeted in the first place.
00:25:45.640 Well, that's what I mean. I think that's what we're all striving for: a system where there's a kind of sensible self-censorship before you print something.
00:25:58.600 I mean, the processes that we went through before we published things in major magazines, I always thought that was a good process, that we weren't afraid to use strong language.
00:26:11.820 We weren't afraid to say things about people if we had a strong opinion.
00:26:15.400 But when it came down to facts, you know, we had to be accurate.
00:26:20.520 We had to check.
00:26:22.160 And if it was a close call, we usually erred on the side of caution and left it out of the paper, because the penalties were high.
00:26:31.540 Now, on the Internet, there's nothing like that right now.
00:26:34.700 No. And there's plenty of people who don't do any fact checking at all.
00:26:37.540 No, there's no fact checking. And this has bled into, quote unquote, mainstream media, which has learned that its audiences now forgive mistakes as long as they're in the right direction.
00:26:51.960 So they're not careful anymore. They make constant factual errors.
00:26:57.820 They don't worry about it. They don't worry about being sued for libel nearly as much as they would have once upon a time.
00:27:05.260 Again, I'm not particularly sympathetic to Kyle Rittenhouse, but I was shocked by the way he was described in the first days of that story.
00:27:15.100 Like there were major news outlets that were calling him a white supremacist.
00:27:19.180 The president was calling him that, or, sorry, the future president was.
00:27:24.320 And again, once upon a time, you would have needed something to go on in order to use that terminology.
00:27:32.420 And they didn't. They just did it because the landscape has changed so much.
00:27:36.800 So, yeah, it's a big problem. I understand that there has to be something done to fix all the craziness on the Internet, or at least address it.
00:27:49.420 But what they're doing instead, I think, is they want to leave the system in place so that they can push speech in a certain direction.
00:27:58.480 And that's the clear sense that I get from them.
00:28:03.260 I mean, I think if you asked me to sit there and say what's the difference between, you know,
00:28:11.400 threatening messages that actually could harm, you know, physically harm somebody, forget emotional harm, we just can't deal with that,
00:28:18.900 but physical harm, I could tell the difference between a tweet that did that and something that just expressed a controversial view.
00:28:26.420 And so, you know, I feel like maybe what Elon Musk needs is people who are just less ideological, you know, people who are committed to free speech as a principle,
00:28:37.800 but people who are reasonable and don't want to see, you know, people get hurt unnecessarily because you've got some lunatic on the Internet continuing to dox somebody and call for violence, or, you know, come close enough to the line.
00:28:50.280 But they don't have ideological diversity at these companies.
00:28:53.940 And, you know, I've told my audience before, Matt, I went out to Silicon Valley in 2016.
00:28:59.220 It was 2016, right before the election.
00:29:01.720 And I met with the heads of a lot of these companies.
00:29:05.300 I was on the campuses.
00:29:06.320 I was meeting with the top executives.
00:29:07.660 And they wanted to know my thoughts on how they could do better at what they recognize as their own ideological bias.
00:29:15.440 And I told them all the same thing, which is get more ideological diversity on your boards, get more ideological diversity in your C-suite.
00:29:24.880 And certainly if you have any sort of a monitoring or a censorship group, make sure it's totally even, totally even.
00:29:33.000 Right.
00:29:33.140 You can't just have a bunch of people on one side of the aisle making all these calls and not expect that to be reflected in your decision making.
00:29:40.380 And guess what?
00:29:41.320 Nobody listened to me.
00:29:44.240 Well, of course.
00:29:45.340 Yeah.
00:29:45.820 No.
00:29:45.980 And again, I think this gets to the fact that although some people ask you for your advice, mostly people don't want to do that kind of self-reflection.
00:29:58.500 Mostly they want to exercise that authority in a certain way, which is unfortunate.
00:30:03.900 And there is a clear line between threats and opinion. There's lots of stuff that's already illegal that's allowed on these platforms.
00:30:15.060 And, you know, the platforms would do well if they just focused on: let's eliminate the stuff that's already against the law.
00:30:24.340 Let's try to cut down on libel.
00:30:26.480 Let's try to cut down on threats, because those are already against the law.
00:30:29.980 Right. We don't need a special, you know, new policy to deal with that.
00:30:37.020 There are laws about that. Where they get in trouble is where they try to establish things like factual truth and say that something is disinformation or misinformation, because that's a moving target that you basically can't get right in a way that's going to be fair.
00:30:56.380 And, you know, or if they're trying to define something that's an opinion as being beyond the pale and abusive and hurtful, like hurtfulness isn't a standard that can be applied in any way.
00:31:12.500 I think that's rational.
00:31:14.260 I agree.
00:31:15.280 It just it just it just can't hold up.
00:31:18.360 I agree.
00:31:19.020 As something that's consistent.
00:31:20.880 That's why it's like, OK, well, what is bullying?
00:31:23.720 Perhaps if there's some large campaign, you know, directed at one person that just completely upends the person's life. It would have to be massive, massive.
00:31:33.360 Not just a few tweets from the Babylon Bee that gets, you know, a bunch of likes.
00:31:36.600 Maybe I don't know.
00:31:38.760 But otherwise, like we can't really do feelings.
00:31:42.480 We can't do feelings.
00:31:43.560 We can definitely account for physical threats.
00:31:46.240 But words are violence and my feelings are hurt.
00:31:49.320 Stay off the Internet.
00:31:50.220 It's a cesspool.
00:31:51.140 If you don't know that, you know, I'm sorry, but big reveal.
00:31:55.100 And by the way, you can get smartphones that don't have an Internet button on them.
00:31:59.060 Like there are ways of protecting yourself in modern day America.
00:32:02.200 If you just choose not to engage with forums that, you know, are hurtful and toxic. Oh, they are under Elon?
00:32:08.460 They will be? Right now,
00:32:09.840 Twitter's totally toxic.
00:32:10.940 I love all these libs who are like, it's rainbows and unicorns.
00:32:14.100 Walk a mile in my shoes on the Internet while you people run it because it's been disgusting.
00:32:19.540 Yeah, and a totally humorless and miserable experience for quite some time now.
00:32:26.320 I also think they get into incredible trouble when they try to police misinformation and disinformation, because I think most journalists understand.
00:32:40.340 I mean, Megyn, you know this: in the first days of any news story, there's always some error baked into the reporting that only comes out later.
00:32:49.680 Right. So if you have some kind of star chamber of fact checkers who are declaring this or that to be the truth, and everything else needs to be wiped out, inevitably what's going to happen is you're going to have fiascos like the lab leak business, where, you know, for some initial period, they're going to declare:
00:33:13.520 Well, this is an untruth. This is a conspiracy theory.
00:33:17.680 Oh, but six months later, it turns out it might be true.
00:33:22.160 And like the covid lab leak theory.
00:33:24.460 Right. Yeah, exactly.
00:33:25.400 And once you do that, you lose all credibility with audiences.
00:33:28.660 And now what's going to happen with what you call the official trusted version of reality?
00:33:37.940 They're going to distrust that even more once you make a couple of mistakes like that, and they're going to drift even more towards conspiracy theories.
00:33:45.800 So that, for me, is a fundamental misunderstanding of how news consumers work.
00:33:51.240 If you try to weed out conspiracy theories and crackpots and all these other things in the name of truth, what you end up with most of the time is more of that.
00:34:02.660 And I think that's not very well understood.
00:34:06.100 Good. I'm going to squeeze in a break, but first I'll read this from the very well worth your time
00:34:11.700 Substack from Taibbi.
00:34:13.620 This site, talking about Twitter, used to be fun, funny, and a great tool for exchanging information.
00:34:18.860 Now it feels like what the world would be if the eight most vile people in Brooklyn were put in charge of all human life.
00:34:25.140 A giant, hyper-pretentious thought Starbucks.
00:34:29.640 So good.
00:34:31.440 All right. Stand by, Matt.
00:34:32.640 More with Matt Taibbi after a quick break.
00:34:35.600 Loving this conversation.
00:34:36.720 And we'll tell you about Dr.
00:34:38.060 Fauci's reversal and what Biden's doing that Trump never did before.
00:34:48.320 Back with me now, Matt Taibbi, editor of the TK News Substack.
00:34:53.920 All right.
00:34:54.200 So the reason I stumbled on the intro is because I've got Joe Biden in my head.
00:34:57.600 This just in, he made remarks this morning that Tom Cotton, Senator Tom Cotton of Arkansas is tweeting out as, quote, alarming because of the little bit of slurring and a lot of stumbling.
00:35:11.280 Take a listen for yourself.
00:35:12.360 We're going to seize their yachts, their luxury homes and other ill-begotten gains of Putin's kleptocracy and the guys who are the kleptocracy.
00:35:26.200 But these are bad guys.
00:35:31.100 Oh, my God, Matt.
00:35:32.880 I, too, find it alarming.
00:35:34.400 Well, I've told this story before, but trying to cover Biden's issues on that front was actually one of the reasons I ended up moving to Substack.
00:35:50.920 Because I was doing a feature on Biden on the campaign trail for Rolling Stone, and I was noticing what everybody else was noticing: this guy's having trouble getting through sentences.
00:36:05.320 Every time he has to ad lib, he gets lost, he forgets where he is, he forgets what the question is. And I called back some of the people I had talked to for a story about the potential use of the 25th Amendment to get Donald Trump removed on the grounds that he was mentally incompetent.
00:36:28.420 If you remember, there was a big drive to do that, and I was assigned to cover that story, and lots of psychiatrists were very happy to talk about that then, but nobody would talk about the Biden issue.
00:36:41.320 And I just realized that we were in a completely different media environment where, you know, certain things were just sort of off limits.
00:36:49.060 And I think we did the country kind of a disservice by not talking about this a whole lot before he was elected.
00:36:56.820 Right. Did you see the Title 42 thing last week?
00:37:00.620 No.
00:37:01.480 Oh, oh, you've got to see it. We have it. So he was asked about, I think, about Title 42.
00:37:09.120 My team will refresh me on whether the question was about 42 or the mask mandate being struck down.
00:37:14.880 It was one or the other. Hold on. Go ahead.
00:37:18.100 OK, so the question was about the mask mandate being struck down by a federal district judge in Florida, and he answered it about Title 42, the covid immigration regulation that allows our border agents to reject everyone who wants asylum.
00:37:35.960 Just saying, oh, it's covid. Get out.
00:37:37.320 So he gets totally confused about the two. They start meandering. He starts intertwining. Just take a listen.
00:37:44.840 On Title 42, sir. Are you considering delaying lifting Title 42?
00:37:49.360 Now, what I'm considering is continuing to hear from my my first of all, there's going to be an appeal by the Justice Department, because as a matter of principle, we want to be able to be in a position where if, in fact, it is strongly concluded by the scientists that we need Title 42, that we'd be able to do that.
00:38:12.700 But there has been no decision.
00:38:14.360 Like, my God, so you hear he's asked about the mask mandate. He starts meandering all over about 42.
00:38:20.080 He can't keep it straight. Vice versa. Neither can I right now.
00:38:23.360 But I'm not the president and I wasn't facing the reporters and he had to issue a cleanup later in a written statement.
00:38:29.240 We've seen it happen time and time again.
00:38:31.700 Yeah, it's certainly not reassuring when you look up at the president of the United States.
00:38:38.160 And the emotion that's being betrayed in his eyes is terror, because he's not quite sure what the question is, or whether he's answering appropriately.
00:38:52.320 I've seen this with some other politicians in the past, but Biden got worse quickly in the last election.
00:39:04.480 And again, I think the reporters just kind of decided to not talk about it, because they had already decided that he was going to be taking on Donald Trump and they didn't want to give him ammunition, which I think was a huge mistake.
00:39:17.660 Did those presidents' last names rhyme with Megan? Because there was a real issue with one of them in his second term that went on to become quite a news story.
00:39:27.200 Right. Yeah. Well, Reagan was one of the ones I was thinking of.
00:39:30.100 You know, I've seen it. I saw with Boris Yeltsin when I lived in Russia.
00:39:35.040 You know, I think the issues there might have been a little bit different, but, you know, similarly, he had some cognitive issues.
00:39:41.980 But look, you know, this is what happens when reporters start messing with things beyond their purview.
00:39:52.300 Like, our job is just to tell you what we see and, you know, worry about whether it's right or wrong.
00:39:59.140 And then it's up to the public to figure out what they think about it.
00:40:02.160 But what started to happen in 2016 when Trump came on the scene is reporters suddenly were like looking at news stories.
00:40:09.920 Just to take an example, there was that issue with Hillary Clinton not filling up her crowds.
00:40:16.460 Right. So she was having trouble filling the halls, and reporters got together and kind of silently decided not to make an issue out of that, because they didn't want to make it look like her campaign was doing badly.
00:40:29.960 But that ended up hurting her because it created a false sense of security in the campaign.
00:40:37.040 And, you know, instead of doing something to try to fix it, they just kept going, and they ended up losing.
00:40:43.960 So, you know, reporters should just, you know, tell us what they see and, you know, let the chips fall where they may.
00:40:51.740 And they won't affect history in a negative way, at least that way.
00:40:55.860 Well, and it's like, you know, when grandpa starts to lose his marbles, when he starts to go south, grandpa can be easily manipulated.
00:41:05.060 You know, we don't do that because we love grandpa.
00:41:07.100 But this is the sitting president of the United States.
00:41:09.700 And we were promised somebody who wasn't going to be some far left wokester.
00:41:13.440 And he has been.
00:41:14.700 And we were promised somebody who was going to be the voice of reason.
00:41:17.980 And he hasn't been.
00:41:19.620 And we were promised somebody who said he was very skeptical of, quote, forgiving student loans because he understood the problems that would create and the fairness issues it would create.
00:41:29.260 And now he's about to do it.
00:41:30.860 And one wonders, what did I buy?
00:41:34.080 What did I get?
00:41:35.260 Who is running the show, legitimately?
00:41:37.340 Who is making these decisions?
00:41:38.900 And if it is Joe Biden, who is manipulating him into these decisions?
00:41:42.640 Because I'm not sure I elected them.
00:41:45.340 Yeah.
00:41:45.500 And that was another question because there was so clearly a competency issue with Biden.
00:41:53.240 There should have been a secondary news story like who's actually going to be running the country if this guy gets elected.
00:41:58.380 And there weren't a whole lot of those stories.
00:42:00.680 I mean, I blame myself.
00:42:02.340 I didn't really do it either.
00:42:04.380 But somebody needed to do that story, and somebody needs to do it now, too.
00:42:10.020 And we're not really doing it.
00:42:12.460 We know that there's some infighting, but we don't know.
00:42:15.700 We don't know exactly how decisions are being made.
00:42:18.320 Well, so Joe Biden is doing something that Trump didn't do.
00:42:21.980 And that is, as the sitting president, he's about to go to the now, you know, reborn White House Correspondents Dinner, which is going to happen in Washington, D.C. this weekend.
00:42:31.360 Cue the vomit emoji.
00:42:35.220 I know that.
00:42:36.560 I call it the White House self-congratulation dinner, but go ahead.
00:42:40.280 It's disgusting.
00:42:41.080 They're awful.
00:42:41.760 My favorite was I went to one where Pamela Anderson was.
00:42:45.800 She was, you know, they always invite these celebrities.
00:42:47.560 George Clooney was there once.
00:42:48.700 He was like the biggest star ever there, bigger than any president.
00:42:52.640 Pam Anderson was at one.
00:42:54.080 And they said, so, you know, Ms. Anderson, what are you doing at the White House Correspondents Dinner?
00:42:58.340 And she said, oh, I'm sorry, I thought I was at the White Trash Correspondents Dinner.
00:43:07.220 Greatest thing ever to happen.
00:43:09.920 That's great.
00:43:11.000 That's great.
00:43:11.620 So Biden's going to go.
00:43:13.160 He's only going to sit.
00:43:14.500 He's not going to have the dinner out of COVID fears.
00:43:16.880 He wants to be responsible.
00:43:18.140 He's not going to sit for the actual dinner.
00:43:19.480 He's just going to go for the, you know, the humor and the roasts.
00:43:22.800 I mean, that's what everybody wants to do.
00:43:24.020 No one wants to sit for the damn dinner.
00:43:25.280 So he's basically just, you know, parachuting in for the comedian.
00:43:29.460 But then it turns out the comedian is Trevor Noah.
00:43:32.080 So who wants to see that?
00:43:33.860 We all know what we're going to get.
00:43:35.360 And the other sort of subplot to all this, Matt, is that Dr. Fauci was supposed to go, but bailed, because the four-time-vaccinated Fauci doesn't think this is safe.
00:43:45.020 Yeah, I mean, that story is ridiculous on so many levels that it's just hard to even know where to begin.
00:43:53.700 But they've been consistently irrational about this from the very beginning.
00:43:59.700 You know, from the very start, they were saying to us that they didn't really think the vaccines worked.
00:44:10.500 You know, why did we have to stay in lockdown if the vaccines were effective?
00:44:15.340 Well, you know, they just don't really believe in them.
00:44:19.160 And I think they're sending mixed messages, which, again, gets back to the point of, you know, when people stop trusting you, that's when they drift even more towards conspiratorial interpretations of things.
00:44:29.160 So I think it sends a terrible message, what he's doing.
00:44:34.540 It's so true. Right.
00:44:35.860 It's like, aren't the vaccines supposed to protect us from severe illness or death and reduce covid to something rather mild that the average person can handle?
00:44:46.000 Yes, is the answer.
00:44:47.200 So why are they behaving like this is the very first form of covid, which actually was far more severe than what we're dealing with now, Omicron, or whatever, the second version of Omicron?
00:44:57.280 Why are they pretending like it's still that version and we have no vaccine and we have no therapeutics, right?
00:45:04.000 They aren't going out and living their lives.
00:45:06.320 Or maybe it's just all one big, massive virtue signal to try to cover for their overextended big-government hand, which is still literally over the mouths,
00:45:17.940 in effect, I guess not literally, of little children in New York City, two-year-olds who are masked.
00:45:22.520 Yeah, clearly there were people who just loved all of the rules to a degree that was a little bit unseemly.
00:45:32.400 Like there were lots of policies in the last two years where I thought, well, maybe I agree with that.
00:45:37.600 It's possible that might be the sensible thing to do.
00:45:41.320 But I was put off by the glee with which people were, you know, glad to impose some of these restrictions, especially with schools and kids, where it suddenly became taboo to talk about the fact that kids didn't really get sick with this very much.
00:45:59.360 You know, that's disturbing.
00:46:00.760 I think there are people who just like it too much, like the rules too much.
00:46:04.320 And that's not a good thing.
00:46:06.560 Well, it's like Brian Stelter's "would you go to a party with no rules?" That speaks for so many of them.
00:46:15.540 OK, listen, when we come back, I'm going to play you Dr. Fauci, who literally in the course of a few hours declared the pandemic was over, only to reverse himself moments later.
00:46:23.960 It's not over. It's over. Celebrate.
00:46:26.340 We finally know it's not.
00:46:27.620 Is anyone surprised?
00:46:30.920 There's much, much more to go over, including the news we just got about what we're prepared to do in Ukraine, where Matt has had some good thoughts on Russia and what our potential role should be all along.
00:46:41.260 More with Matt coming up.
00:46:45.520 All right, Matt.
00:46:46.260 So staying on the subject of Fauci, literally in the course of a few hours, he said the pandemic was over only to reverse himself and say, no, it's not over.
00:46:57.920 It's never going to be over for Dr. Fauci, I'm sure.
00:47:01.040 Take a listen to these budded soundbites.
00:47:03.400 We are certainly right now in this country out of the pandemic phase.
00:47:08.280 Is the pandemic still here?
00:47:10.460 Absolutely.
00:47:10.940 So when I said phase, I probably should have said the acute stage of the pandemic phase.
00:47:19.600 I see you laughing.
00:47:22.080 It is laughable.
00:47:24.500 It is.
00:47:25.740 It is.
00:47:26.300 And again, this just gets back to why you can't have YouTube or Google or Facebook or Twitter relying upon government officials to tell you what the truth is about something.
00:47:42.500 Because even they don't know, they change their minds every 10 seconds about stuff, including like really important things, like whether or not to wear a mask or, you know, whether the vaccine is actually going to protect you from getting infected.
00:47:56.500 That's why you cannot have top-down information controls, because, you know, the truth is always a moving target.
00:48:08.180 Mm hmm.
00:48:09.320 I know.
00:48:09.940 I feel like either he had a momentary slip, you know, when he said it's over, because I don't think he's ever going to say that and really mean it.
00:48:18.440 He doesn't really want that.
00:48:20.040 Or he just got woodshedded.
00:48:21.480 He said it because it's actually a fact.
00:48:23.440 And he slipped into factual reporting for a second there only to get woodshedded by the administration.
00:48:27.280 They said, we're not admitting that. We have mandates in place.
00:48:30.980 We're still firing people for not complying. Like, no, it's not over.
00:48:34.440 Get back on message.
00:48:35.420 Yeah, I think they took out the cattle prod and found a nice quiet room somewhere to set him straight about what the official message is.
00:48:48.780 Yeah.
00:48:49.520 That's what it sounds like.
00:48:50.960 A lot of these vaccine mandates are still in place.
00:48:53.340 People are still getting fired.
00:48:55.160 I even wonder about my schools. They have vaccine mandates in our schools.
00:48:58.820 They don't kick in until they're 16 years old.
00:49:01.780 And I'm not there yet with my kids.
00:49:03.420 But I wonder, like, how do you justify that?
00:49:05.920 Right.
00:49:06.060 For the kids who are about to turn 15 to 16, you can't justify that anymore.
00:49:10.020 You've got Fauci on tape saying the pandemic phase, at least, is over.
00:49:16.180 It's over.
00:49:17.060 So what's going to happen?
00:49:17.960 Do you think, you know, do these politicians and bureaucrats and school administrators follow through with these things?
00:49:24.160 The writer Christopher Lasch once said the essence of propaganda was keeping the public in an ongoing state of emergency.
00:49:35.060 And I think, especially in the Trump years, we've fallen into the pattern of always being in an emergency, and politicians finding ways to find that useful.
00:49:47.680 The pandemic has been extremely useful to politicians.
00:49:52.960 It has given them the ability to dictate all kinds of behaviors and to allow them to stick their fingers in things like the news and Internet content moderation.
00:50:07.440 I don't think they want the emergency to end.
00:50:10.280 I think they like this new normal, you know, and it's a problem.
00:50:15.360 You know, the idea that there aren't people who are motivated to end crises is a big problem just generally, I think, in politics.
00:50:26.700 So speaking of the vaccine mandates and how they've impacted people's lives, an interesting couple of cases in the news.
00:50:34.900 One has to do with the mandates.
00:50:36.880 One doesn't.
00:50:38.220 Sage Steele of ESPN just filed a lawsuit against ESPN and its parent company, Walt Disney, alleging that the company treated her unfairly for comments she made on a podcast interview last September.
00:50:51.880 This made news at the time.
00:50:53.100 She had been one of the lead anchors for ESPN's flagship show SportsCenter.
00:50:58.900 I know you're big into the NFL draft and things like that.
00:51:00.820 I am not.
00:51:01.340 I know nothing about sports.
00:51:02.620 So I'm reading this.
00:51:03.840 OK, but since that interview, she says she's been sidelined from the prime assignments.
00:51:09.920 She does continue to anchor the noon Sports Center broadcast, but quite a few things were taken away from her and she was pulled off the air for some big assignments, she says.
00:51:17.460 So she had gone on former NFL quarterback Jay Cutler's podcast and shared her thoughts on ESPN's vaccine mandate, sexism in sports journalism, and Obama's ethnicity.
00:51:29.760 The fact that he selected black as his ethnicity on the census because he's biracial and she's also biracial and had some thoughts on it.
00:51:38.740 So here's what she said on the Jay Cutler podcast that she's now alleging she was punished for.
00:51:44.040 I respect everyone's decision.
00:51:45.960 I really do.
00:51:46.960 But to mandate it is sick and scary to me in many ways.
00:51:54.200 But I have a job, a job that I love and frankly, a job that I need.
00:52:00.840 But again, I love it.
00:52:02.500 I just I'm not surprised it got to this point, especially with Disney.
00:52:06.840 I mean, a global company.
00:52:08.900 So ESPN melted down.
00:52:12.120 We embrace different points of view.
00:52:13.920 Dialogue and discussion are great.
00:52:15.760 That said, we expect those points of view to be expressed respectfully in a manner consistent with our values and in line with our internal policies.
00:52:23.200 She got hit by, of course, Jemele Hill, who just once again lost yet another show over there on CNN Plus.
00:52:30.420 How many shows can Jemele Hill lose?
00:52:33.020 And then ESPN required her to issue an apology.
00:52:37.340 So the thing about ESPN: normally, they could punish her for her viewpoints, because they are not a government actor.
00:52:44.720 But the state of Connecticut, where she is and where I am, apparently has a law that actually says corporations can't always do that.
00:52:55.940 And she's taking advantage of that.
00:52:57.940 So what do you make of her fighting back against what this company allegedly did to her?
00:53:03.720 This is a difficult issue for me.
00:53:05.780 I'm of two minds about this, because, you know, I remember when Liz Spayd, the former public editor of the New York Times, got in trouble some years ago, among other things, for talking about New York Times writers being on social media too much.
00:53:28.120 And, you know, I understand the rationale for that because once upon a time, you know, in my father's day when he was on the news, viewers didn't really know a whole lot about the political views of reporters.
00:53:43.600 And that actually added to their credibility. Like, you know, if you didn't know whether a person was liberal or conservative and they were just delivering the news, it did kind of tend to make people feel like they were just watching a news program.
00:54:06.080 However, you know, nobody really is just a pure newsreader anymore, and everybody has a social media presence.
00:54:13.800 So I also think you can't punish people, especially at ESPN, for talking.
00:54:18.500 Yeah, especially at ESPN.
00:54:19.920 They're encouraging these anchors to go out there and they've they forced moments of silence on them and they've gotten very politically active on the air there.
00:54:28.320 So why single out Sage?
00:54:29.760 Yeah, I mean, and again, I know a lot of people in the news business who were outright told by their bosses, like, you have to get a Twitter handle.
00:54:42.220 You've got to have more of a presence in social media.
00:54:46.040 Clearly on ESPN, you know, they're trying to build up the brand, the individual brands of all of these on air personalities.
00:54:53.100 So when they do that, but somebody does it in a way that doesn't fit with some kind of orthodoxy, I don't think you can punish those people.
00:55:01.800 I think that's crazy.
00:55:04.260 It's once again, it's viewpoint discrimination.
00:55:06.380 By the way, the Connecticut law, just to clarify what I said, it states companies cannot discipline employees for exercising their First Amendment rights as long as the comments do not directly impact their work performance or the company.
00:55:17.420 She's arguing that her comments were made on a third-party podcast and that she should be considered a private citizen making these comments in this situation.
00:55:25.040 The thing is, like, I don't see how ESPN gets away with punishing just her, given its push to make its anchors go totally woke on the air.
00:55:35.180 And now you have one person here who happens to be a woman of color who pushes back on some of the narrative.
00:55:41.680 She didn't want to get the vaccine.
00:55:43.080 She didn't think it made sense.
00:55:44.100 She didn't say she didn't like Obama choosing black, just to clarify what she actually said.
00:55:50.240 She said Barack Obama chose black and he's biracial.
00:55:52.740 I'm like, well, congratulations to the president.
00:55:55.040 That's his thing.
00:55:56.140 I think it's fascinating considering his black dad was nowhere to be found, but his white mom and grandma raised him.
00:56:01.420 But, hey, you do you.
00:56:02.540 I'm going to do me.
00:56:03.880 Why is that an unfair point?
00:56:05.740 She's basically asking, why do you identify with one side of the family versus the other when it was the other that raised you?
00:56:11.300 OK, you can say I'm offended by that.
00:56:13.200 I don't like that.
00:56:13.940 It's her POV.
00:56:15.620 Same as, you know, some audience members may get offended by the incredibly woke, anti-patriotic statements coming out of the mouths of the anchors sitting on set during the big basketball games or the big football games.
00:56:27.540 And we've heard that, too.
00:56:28.980 ESPN has no problem with that.
00:56:30.120 Yeah, and what I would say is as a sports fan, I don't want to hear it.
00:56:37.980 Like, when I turn on ESPN, I'm turning it on, or I used to anyway, because I'm looking for an escape from politics.
00:56:45.580 Yeah, that's the thing.
00:56:46.420 She didn't do it in the anchor chair.
00:56:47.800 Unlike those guys.
00:56:48.800 She did it on a podcast.
00:56:51.520 Right.
00:56:51.820 Exactly.
00:56:53.380 You know, so I don't know.
00:56:56.140 I don't know what they're thinking.
00:56:57.400 I mean, I think a lot of these companies have gotten away from what really works.
00:57:06.060 You know, sportscasting used to be a really interesting and colorful and creative wing of the media world, because they were able to write with style.
00:57:20.020 They were able to use humor and wit in ways that regular newscasters weren't really allowed to do.
00:57:27.300 But it's become just as dreary in a lot of ways as the rest of media.
00:57:31.480 And I don't really understand why they would voluntarily do that.
00:57:34.080 But what did you call the Starbucks?
00:57:35.980 What did you call the Starbucks?
00:57:39.080 Thought Starbucks.
00:57:40.180 They're a thought Starbucks, too.
00:57:41.640 Now, nobody wants to be that.
00:57:44.080 OK, so the second lawsuit I wanted to ask you about, I realize you're not here in any legal capacity, but they're interesting.
00:57:50.080 People are talking about him.
00:57:51.300 These cases is the Amber Heard Johnny Depp defamation case.
00:57:55.620 She claimed in The Washington Post that she was a domestic abuse victim in 2018.
00:58:00.460 This is two years after she had made sure she was caught on camera by the paparazzi with what she claimed was a bruise on her face, from what she claimed was a phone thrown at
00:58:10.760 her face by Johnny Depp.
00:58:12.740 We've had witnesses.
00:58:13.920 So she's laying the foundation.
00:58:15.400 I'm an abuse victim at his hands.
00:58:17.840 The WAPO op-ed did not name Johnny Depp, but everybody knew that's who she meant.
00:58:21.400 And now he's sued her.
00:58:23.800 He got fired from, I guess it was the fifth installment of Pirates of the Caribbean, right after that, and lost millions of dollars, not to mention reputational damage.
00:58:33.360 And he's filed a lawsuit for defamation against her.
00:58:36.520 And the trial has not gone well for her.
00:58:39.780 It has not gone well at all.
00:58:40.960 There's been plenty of testimony about how they're both hot messes and they're both way into drugs and violent and weird.
00:58:48.300 But it's certainly established at a minimum she has attacked him repeatedly.
00:58:54.240 And so I mean, that's at a minimum.
00:58:56.040 OK, best case scenario for her is they attacked each other.
00:59:00.120 She did it more, but he a couple of times may have hit her, too.
00:59:04.140 That's best case scenario.
00:59:05.580 All inferences in her favor.
00:59:07.240 That does not necessarily support: I'm an abuse victim and I've had the, you know, the Internet unleashed against me.
00:59:15.220 When it's you who abused him repeatedly.
00:59:18.340 You cost him the end of his finger.
00:59:20.020 You or your friend actually defecated in your marital bed.
00:59:23.680 The evidence has shown.
00:59:25.080 I mean, it goes on, Matt.
00:59:26.720 And this is Johnny Depp's testimony in court this week.
00:59:30.380 In part, take a listen.
00:59:31.760 I lost a fucking finger, man.
00:59:34.400 Come on.
00:59:34.960 I had a fucking, I had a fucking jar of mineral spirits thrown at my nose.
00:59:42.880 I mean, you can please tell people that it was a fair fight and see what the jury and judge thinks.
00:59:49.220 Tell the world, Johnny.
00:59:50.620 Tell them Johnny Depp.
00:59:51.880 I joined that.
00:59:53.220 Matt, I'm I'm a victim to this.
00:59:55.100 And I know it's a fair fight.
00:59:57.800 It's these probably people believe or side with you.
01:00:00.520 And what did you say in response when Ms. Heard said, tell the world, Johnny, tell them, Johnny Depp?
01:00:06.580 I, Johnny Depp, a man.
01:00:08.240 I'm a victim, too, of domestic violence.
01:00:12.080 I said, yes, I am.
01:00:14.380 So that's her admitting basically on tape that she cut his finger off with a vodka bottle and him complaining about it and her kind of mocking him like, oh, go ahead.
01:00:25.580 Good luck.
01:00:26.100 Tell the world you're the you're the victim.
01:00:27.980 And him saying, you know what?
01:00:29.340 Alive.
01:00:29.940 I am.
01:00:32.060 Yeah, I don't know.
01:00:33.860 This is this is a tough one.
01:00:35.880 You know, I've obviously gotten in trouble over the subject in the past.
01:00:40.680 And I do understand the idea that there needs to be an initial reaction that we believe women, at least enough so that they get a hearing, you know.
01:00:58.220 Well, keep it open.
01:01:01.060 Yeah, like at least accept the seriousness of the accusation initially. Initially, you can't dismiss it like we used to.
01:01:11.420 Right.
01:01:11.920 Which is what happened, you know, in the past.
01:01:14.440 And that's something that definitely needs to be corrected.
01:01:16.900 Yeah, I'm doing a story right now about, I can't really talk about who it is, but there's a company that's, you know, gotten in a lot of trouble and had all sorts of issues financially, really over allegations, and not really about substantiated conduct.
01:01:38.940 And this is something that's just become a little bit, I think, too easy in modern media, which is, you know, we raise an allegation of something or we imply that something happened.
01:01:50.480 And before you know it, you know, the Twitter takes off and turns it into a fact.
01:01:55.920 And next thing you know, it's a reputational harm issue.
01:01:59.080 And we can't have that person working at our company because, you know, the staff will be upset about it.
01:02:08.200 That's just become too easy.
01:02:09.680 Like, I think there has to be some kind of happy medium where you have to prove these things out before people really, you know, go through serious damage.
01:02:19.660 Yeah.
01:02:20.140 I mean, you're not wrong, because I do think, like, the "believe all women" thing was always a lie and stupid and absolutely un-American.
01:02:29.080 Nobody gets a presumption of belief.
01:02:31.360 Nobody.
01:02:31.940 Right.
01:02:32.220 The worst case scenario is you're charged with a crime, and the system says you get a presumption of innocence because the state has such an advantage over you when you're sitting there in shackles and he or she gets to go in on the other side in their suit saying, I represent the United States of America.
01:02:49.000 For those reasons, because the deck is stacked against the defendants, we give them a presumption of innocence.
01:02:53.720 We want to hold the system to account before we throw somebody in jail, take away their freedom.
01:02:57.920 But you don't get that presumption of truth-telling in any forum, including a court.
01:03:05.380 And so I'm glad he brought this case because she really was painted as just this poor victim who'd been abused by him.
01:03:15.560 And definitely he suffered from it financially and otherwise.
01:03:18.080 Not that he needs the money, but still, it's just the principle.
01:03:21.720 And I think this trial has exposed that at a minimum, these situations can be a lot more complicated than we admit.
01:03:30.820 Well, and this has always been a big issue for me over the years, which is that a lot of reporters think that, you know, there's a playbook to news stories or that you can, you know, lapse into cliches when you report things.
01:03:54.040 The reality is you have to clean your slate every time and approach every new story as a completely new set of facts.
01:04:02.000 Because, you know, what might be a Matt Lauer story, you know, in one instance, you know, you might have a completely different fact pattern the next time.
01:04:11.700 You can't carry over expectations from your previous reporting and just kind of shoehorn in a, you know, a cliched understanding of what happened.
01:04:22.780 And I think that's, we've gotten away from doing that, of just wiping the slate clean each time.
01:04:28.380 That's good.
01:04:28.780 It's part of our drift toward collective guilt.
01:04:32.360 You know, this, oh, he must be guilty.
01:04:34.340 He did it.
01:04:34.820 He's a man.
01:04:35.440 He's a rich man.
01:04:36.100 He's a celebrity.
01:04:36.740 He did it.
01:04:37.760 And that's just not the way life works.
01:04:40.380 He doesn't have any collective guilt because of any, because of his gender, because of his celebrity status.
01:04:46.640 Okay.
01:04:47.080 Hard turn now.
01:04:47.840 Because I do, before we go, want to get your thoughts on Ukraine.
01:04:51.140 You've been really interesting on this whole conflict over there, which goes on.
01:04:56.000 And the news of the day is that Biden wants another $33 billion from Congress for Ukraine emergency funding.
01:05:05.780 It's a big price tag.
01:05:08.220 Germany has now reversed itself on sending arms to Ukraine after claiming it would deplete its reserves.
01:05:15.440 So some rollback from the Europeans, America sending more money.
01:05:19.560 There's still some calls from Republicans and Democrats even for us to get more involved, more weapons.
01:05:27.060 And even still, some people are saying no-fly zone and so on, though I don't think that's going to happen.
01:05:31.660 So what do you make of where the United States is now and where this conflict is now?
01:05:36.360 So, first of all, I was one of the people that got this wrong.
01:05:42.860 Like, I never expected Russia to actually invade Ukraine, or at least the western part of Ukraine.
01:05:51.660 And so I made a wrong call on that.
01:05:53.980 And then you did something extraordinary.
01:05:55.720 You admitted that you were wrong and you apologized to your listeners and your readers, which is all that's expected, but nobody does that anymore.
01:06:03.440 I mean, it's crazy.
01:06:05.080 Nobody, nobody takes accountability, responsibility.
01:06:07.420 Yeah, you do.
01:06:08.780 You do have to do that, but, you know, I got that wrong, and it's an unpredictable situation.
01:06:13.580 But I think what's happened over time is that we're not really reporting on what the United States' policy is.
01:06:28.300 You know, Secretary of Defense Austin said this, I thought, really fascinating thing this week, where he said that basically our plan is to weaken Russia so that it can't do this to the next Ukraine.
01:06:43.900 Now, that seems to me at cross purposes with Ukraine's mission in all this. I'm sure Ukraine wants to defeat Russia militarily, but they may also come to a point where they just want to end the conflict with minimal damage.
01:06:59.460 And so if the United States is committed to a different policy, where we're not going to give them the ability to negotiate, for instance, the end of sanctions,
01:07:12.900 then Russia is really at war with us, not with Ukraine.
01:07:17.140 Like, if Ukraine doesn't have that autonomy, then this is an immensely complicated situation.
01:07:26.100 And I also think the United States is delusional if they think that this is going to end in some kind of happy regime change scenario in Russia. The much more likely outcome is that you're going to get a more hardline leader who's going to come in after Putin.
01:07:42.380 And they're going to drop vacuum bombs on every city in Ukraine.
01:07:45.680 Like, that's my worry.
01:07:47.300 And the whole thing is that we're pursuing this with these sort of fairyland expectations about how it's going to end.
01:07:55.640 Yeah.
01:07:56.180 I was joking the other day that they seem to think, Biden and, you know, those around him, that if they could just get rid of Putin, they'd get Jed Bartlet, you know. There he is, just waiting.
01:08:06.140 He's dying for democracy.
01:08:07.760 If somebody could just take out Putin.
01:08:09.940 I could come in with all my liberal ideas.
01:08:12.800 Right.
01:08:13.420 I mean, were they not paying attention in the last 30 years?
01:08:13.420 I mean, that's the thing that's amazing to me: the United States has already been around this track many times with Russia.
01:08:23.560 I was there during this process.
01:08:24.940 Like, you know, we tried to foist, you know, an America-friendly leader on Russia.
01:08:32.800 And those people were hugely unpopular, mainly because they were friendly with the West.
01:08:39.060 And it was part of the reason we got Putin in the first place, because, you know, Boris Yeltsin was seen as too close to the United States.
01:08:47.780 Putin was seen as somebody who stood up to us.
01:08:50.620 And so he had popular backing.
01:08:52.760 Um, so the person who comes in after Putin, if they think it's going to be like, you know, Emmanuel Macron or something like that, they're...
01:09:20.480 Nobody liked it. His only friends were in America.
01:09:23.960 Oh, right.
01:09:26.040 It's been a pleasure, as always. Thank you so much for coming on. And to our audience,
01:09:30.060 go check out Matt's Substack now. TK News, well worth your time. All the best.
01:09:35.040 Thanks so much, Megyn, for having me on. Take care now.
01:09:37.180 Don't forget to join us tomorrow. Sharyl Attkisson will be here. Talk to you then.
01:09:42.380 Thanks for listening to The Megyn Kelly Show. No BS, no agenda, and no fear.