ManoWhisper
TRIGGERnometry
- June 07, 2023
How Twitter Censored COVID Dissent - David Zweig
Episode Stats
Length
1 hour and 12 minutes
Words per Minute
180.96
Word Count
13,165
Sentence Count
786
Hate Speech Sentences
9
Summary
Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript generated with Whisper (turbo).
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.120
So it's this idea of like, is a journalist just like a megaphone?
00:00:04.960
You know, legacy media outlets by and large were all on board with a certain narrative.
00:00:11.140
People needed a place like Twitter where they could be exposed to different ideas.
00:00:16.000
Were some of those ideas completely nuts?
00:00:18.160
Yes, but many of those ideas were absolutely legitimate.
00:00:22.300
This is a place for legitimate debate by people who understand the issues.
00:00:26.320
Those views were suppressed in many instances.
00:00:30.680
That is very troubling.
00:00:32.900
There was a Trump tweet about how he said something like, don't be afraid anymore of COVID.
00:00:39.000
It was something to that effect.
00:00:40.600
Does this go against our guidelines?
00:00:42.680
Shouldn't we be like, you know, labeling this as misleading or take it down?
00:00:46.580
And some of the other executives had to write back and say, optimism is not against our policy.
00:00:53.060
My God, this like super high level person actually even entertained the idea for something like that to be suppressed.
00:01:00.740
That alone was like kind of astonishing.
00:01:03.920
It was just a bizarre experience that I will never get over.
00:01:09.100
Like it's permanently altered me.
00:01:11.000
Hello and welcome to TRIGGERnometry on the Road from the USA.
00:01:26.540
I'm Francis Foster.
00:01:27.940
I'm Konstantin Kisin.
00:01:29.020
And this is a show for you if you want honest conversations with fascinating people.
00:01:34.420
Our brilliant guest today is an author and journalist, David Zweig.
00:01:37.300
Welcome to TRIGGERnometry.
00:01:38.540
Thanks for having me.
00:01:39.420
It's a real pleasure.
00:01:40.240
So before we get cracking, tell everybody who you are, how you are, where you are.
00:01:43.860
What has been the journey through life that leads you to be sitting here talking to us?
00:01:47.540
It all began on the subway.
00:01:50.000
Or that was the last leg of the journey.
00:01:51.780
Yeah.
00:01:53.020
I'm a writer.
00:01:54.160
I write books.
00:01:55.140
I write news articles, investigative pieces, commentary.
00:01:59.660
So I've been doing that for a number of years.
00:02:02.680
Written a few books.
00:02:04.100
Before that, I was playing music for a number of years.
00:02:06.980
The joke I like to tell people is, well, it wasn't a joke.
00:02:10.600
This was real.
00:02:11.280
That after I'd been trying to succeed as a musician for many years, I went home and I
00:02:17.820
said, Mom, Dad.
00:02:19.400
You know, and this is very hard for them, having a kid who's, you know, trying to do that.
00:02:22.280
And I said, look, I'm no longer trying to make it as a rock star.
00:02:25.460
And they said, oh, thank God.
00:02:27.580
I said, instead, I'm going to be writing a novel.
00:02:30.420
No!
00:02:31.400
It's like the worst.
00:02:33.080
So I choose uneasy paths, I guess.
00:02:37.260
But finally, it seems to have been working out.
00:02:40.280
So I've kind of hit my stride, I think, in the last decade working in nonfiction and doing
00:02:46.100
what I'm doing.
00:02:46.680
And here I am, I guess, chatting with you guys.
00:02:48.740
Yeah.
00:02:49.000
And one of the things we really wanted to talk to you about is the Twitter files.
00:02:52.540
And to see, I still don't know what I think about it, because on the one hand, I was one
00:02:57.400
of the people who was like super excited about seeing, you know, lifting the curtain, so
00:03:02.940
to speak.
00:03:03.340
But I also felt that when we did, we kind of only really saw what we already knew.
00:03:08.460
And so it was kind of pre-discounted.
00:03:10.360
Now, I'm not saying that's accurate, but that was my impression.
00:03:13.840
So what is it about the Twitter files that people should actually think, oh, that was worthwhile,
00:03:19.140
you know, all this hype and all the rest of it?
00:03:21.380
Yeah.
00:03:21.620
And it's a good question.
00:03:23.800
I can only speak about my own reporting, or at least I can only speak-
00:03:28.500
Tell us about the experience, first of all.
00:03:30.300
What was it like going in there and, you know, how did it work?
00:03:33.380
Yeah.
00:03:33.900
So I was brought out there by Bari Weiss, who, as most people probably know, Elon Musk gave
00:03:42.320
access to Matt Taibbi initially.
00:03:44.680
I think he was the first one.
00:03:46.100
And then some short period of time after that, he gave Bari access.
00:03:50.080
Bari runs basically a media empire.
00:03:53.220
Yes, she does.
00:03:54.020
I'm just a small piece of the empire, I guess.
00:03:56.540
And I had sent an email to Bari and one or two of her editors who I knew, because I had
00:04:02.260
written for them before.
00:04:03.240
And I said, because I've been writing about COVID stuff since the beginning of the
00:04:07.740
pandemic, I didn't know what they were doing, if they were going back.
00:04:11.620
I didn't know anything.
00:04:12.160
I just said, just trying to be helpful.
00:04:13.380
Look, if you guys are going to be sending someone else back there, or if you're going back
00:04:19.480
yourself, these are the things I might look for.
00:04:21.360
And I was kind of sheepish, actually.
00:04:23.380
Like, I thought it was maybe a little presumptuous of me. I'm like, who am I to tell them?
00:04:27.160
But I was just, these are a few things I would do, do, do, do, do, do.
00:04:30.160
And like an hour later, I get an email back, can you get on a plane to San Francisco?
00:04:34.060
I was like, okay.
00:04:35.940
So they, you know, they figured, you know, I guess based on that and everything that
00:04:40.260
they know about me, that I should be the one to go out and at least initially look into
00:04:44.340
stuff related to COVID.
00:04:46.160
Because I think up to that point, the reporting wasn't specific to the pandemic.
00:04:50.620
It was on a variety of other things, the Hunter Biden laptop, and other stuff.
00:04:54.680
So I went out there, really the sort of, I don't know if this is the right word, but the
00:04:59.000
specialist.
00:04:59.640
I'm like, this is the one lane that I'm going to travel in, which is related to COVID.
00:05:03.820
And so I was there, it was only a few days while I was there.
00:05:08.120
Michael Shellenberger was there, Leighton Woodhouse, Lee Fang.
00:05:12.120
So it was the three of them and me, and each of us working on our own things.
00:05:17.160
But everyone was helping each other as well, just because it was so challenging to look
00:05:22.320
through the material.
00:05:23.960
And that's it.
00:05:25.920
I mean, I could get into a little bit of detail about what the process was,
00:05:29.260
you're nodding like, yes, please do.
00:05:31.000
Um, yeah, because I think there's still a fair amount of misunderstanding or
00:05:37.600
confusion about what the process was.
00:05:40.380
People are like, why didn't you do this?
00:05:41.780
Or you should have done that.
00:05:43.040
I think what a lot of people don't know or realize is it's not like going into some,
00:05:48.440
I don't know, company or some law firm where they're like, open up the books and you can,
00:05:52.220
you know, just rifle through.
00:05:54.540
There were very specific ways that we were able to search.
00:05:57.440
There were no restrictions on what we could look up or publish.
00:06:01.500
Um, that's important to note, but the process of actually searching was really challenging.
00:06:07.360
This is not like what we're used to as regular citizens, like Googling something or whatever.
00:06:12.440
The systems in place to actually do the digging were, um, not very user-friendly.
00:06:19.420
So I would break it down: there were basically two separate paths that we could search down.
00:06:25.520
One was looking at what I would call like the log files of individual accounts.
00:06:30.420
Um, and there was an engineer in the room with us on a special laptop and we could give him
00:06:34.880
someone's name and say, I would like to look up this person's account.
00:06:38.420
We were not able to see any like private information on anyone's account, which is
00:06:42.320
important for me to note because there were tons of lawyers involved in this.
00:06:45.560
Um, but what we were able to see were on someone's account, their activity and what Twitter internally
00:06:53.060
had, if there were any flags or marks on certain things on the account as a whole or on certain
00:06:58.060
tweets.
00:06:58.420
And then there were tons of kind of files within files within files.
00:07:02.560
Um, so that was one thing to search.
00:07:04.300
And then the other one was looking up, um, internal communications or internal and external,
00:07:09.420
um, through email or through Slack channels.
00:07:12.380
And so those searches were not performed in the room; we had to basically
00:07:17.560
send an email to the people.
00:07:19.640
I don't know if they were like next door to us or what, but it wasn't directly
00:07:22.660
in there.
00:07:23.140
And then sometime later, a person would come in with a different laptop and say, here's
00:07:28.100
this stuff.
00:07:28.640
You can look at this now.
00:07:30.160
So if you imagine we're looking at, and we had to look up specific employees, you couldn't
00:07:34.580
just say, I want to find anything on myocarditis.
00:07:37.300
No, you had to give, and I'm not sure the reasoning behind this, but you had to pick certain
00:07:42.320
employees and they could look up that employee's emails and we could say, I want to see all
00:07:46.760
the emails from this person from this date to this date.
00:07:50.180
Um, and then, so we could get 4,000 emails from someone, and I
00:07:56.000
have, you know, X number of hours to look at this stuff.
00:07:58.740
So it's very challenging.
00:08:00.520
It's not just like, oh, let's just do a word search and we're done.
00:08:04.300
Um, although that was part of it.
00:08:05.980
So you're trying to sift through all this stuff to find what you're looking for.
00:08:11.700
Sometimes I had a specific idea.
00:08:13.620
Most of the time I knew what I was looking for and it was just seeing if it was there
00:08:16.780
or not there.
00:08:17.700
And other times you don't even know what you're looking for.
00:08:19.460
It's just like, I know this is an important person.
00:08:21.840
I know this particular topic is something that we might have reason to believe something
00:08:26.520
happened.
00:08:27.040
Let's see what's there.
00:08:28.720
If something's interesting or not interesting or newsworthy or not newsworthy.
00:08:32.020
And that was so, it was very laborious, very tedious to do these types of searches.
00:08:37.620
The stuff on the log files, I mean, it could take a very long time to like dig through and
00:08:42.320
you find a particular tweet, and then through there, there's all sorts of code
00:08:46.640
and different stuff written there.
00:08:47.880
And then the engineer was basically explaining to me, okay, here we go.
00:08:50.700
I can see this tweet, what's labeled on here, um, or not labeled.
00:08:55.040
Does that make sense how I'm describing it?
00:08:56.820
Yeah, very much so.
00:08:58.400
Okay.
00:08:58.680
And I was going to ask, so what role did Elon have in this?
00:09:03.020
Was he overseeing it?
00:09:04.620
Did you have to report to him?
00:09:06.860
Did you have to say, look, this is what we found?
00:09:09.660
How involved was he in this entire investigation?
00:09:12.400
I love the way you call him Elon, by the way.
00:09:14.320
He's your best mate.
00:09:15.820
Well, look, I'm being professional.
00:09:17.880
Between me and him, it's musky.
00:09:19.300
Yeah, I mean, again, at least from my experience, um, I had no personal interaction
00:09:27.160
with Musk at all.
00:09:29.080
Um, my understanding, and this was through talking with Bari, was that, um,
00:09:35.600
we had access to whatever we wanted.
00:09:37.340
There were no restrictions.
00:09:38.300
And, um, the only, uh, sort of rule was whatever we found, we had to publish it on Twitter first.
00:09:46.220
And, you know, and it didn't need to be far in advance, but it could be five minutes, but
00:09:50.020
it just needed to go on to Twitter first.
00:09:51.760
Other than that, it wasn't, if you find X, Y, or Z, you're not allowed to do this, et cetera.
00:09:56.080
And as far as I'm aware, unless there was some incredible conspiracy happening,
00:10:01.880
similarly, there were no constraints placed on us to get the information.
00:10:07.220
So, both in regards to what we were able to look for and then what we found, neither of
00:10:12.540
those had any restrictions on them whatsoever.
00:10:15.700
You know, this engineer is, I'm literally looking over the person's shoulder as they're
00:10:19.500
doing the searches.
00:10:20.240
I cannot imagine how that could have been in some way compromised without us knowing.
00:10:25.080
Some other guy had interviewed me and he kept saying, well, maybe the email searches
00:10:29.520
or the other thing, they were filtering it.
00:10:32.080
And again, we're talking about thousands of emails.
00:10:35.020
I can't imagine how they could possibly have done this.
00:10:38.280
And it took us, a team of four people looking in real time, so I don't see how, because they
00:10:42.600
didn't know what we were going to be searching for.
00:10:44.380
So, this all was happening in real time.
00:10:46.440
It was very kind of like by the seat of your pants, just trying to do it.
00:10:49.960
I saw Elon while we were there, um, but he did not interact with me,
00:10:54.820
or have any involvement, um, as far as what I was looking for and what I was able to publish.
00:10:59.760
And that being the case, with your findings, what was the thing that shocked
00:11:05.480
you the most?
00:11:06.060
What was the thing that surprised you?
00:11:08.560
Hmm.
00:11:10.180
I think, looking at some of the prior reporting, um, maybe I had a little
00:11:17.460
bit more of an assumption about the staff, you know, this guy Yoel Roth and
00:11:22.960
some other people.
00:11:23.580
And I have to say, um, in many, many instances, as I'm reading through these internal Slack
00:11:29.420
communications and in emails, in many instances, these people at Twitter really were trying
00:11:35.220
their best to moderate content in a way that they thought was reasonable.
00:11:41.340
Now, I think they were wrong in how they performed this, but there's that, you know,
00:11:47.440
this is kind of a harsh expression, but, you know,
00:11:49.220
what's the thing about not ascribing to, you know, nefariousness
00:11:53.560
what can, you know, actually just be sort of stupidity?
00:11:56.680
I'm not saying that they were stupid, but in the sense that I don't think this was some
00:12:00.600
big plot, that these people genuinely were trying to sift out things that they believed
00:12:06.300
were harmful.
00:12:07.800
I think they were completely out of their depth with what they were doing.
00:12:11.620
And I think structurally, this was totally inappropriate about how they handled a whole
00:12:16.620
variety of things.
00:12:18.180
Whereas, like, on an individual level, I saw a lot of people, with a lot of conversations,
00:12:24.200
pushing back against what they perceived the government wanted, pushing back against what
00:12:29.560
other coworkers wanted. In too many instances, bad choices were made, but it is not like there
00:12:35.440
was this thing constantly all the time where they were just, you know, taking out the red
00:12:40.000
pen against all sorts of tweets that they didn't like.
00:12:42.980
There was an internal debate that happened oftentimes.
00:12:46.240
And I reported on a few of them.
00:12:48.260
One of them was actually kind of both encouraging in the end result, but also revealing in the same
00:12:54.300
way about how problematic it was.
00:12:57.120
Jim, I forget the guy's name, but one of the lead attorneys at Twitter, there was a Trump
00:13:03.180
tweet about how he said something like, don't be afraid anymore of COVID.
00:13:07.940
It was something to that effect.
00:13:09.140
And he was emailing people internally saying, does this go against our guidelines?
00:13:13.960
Shouldn't we be like, you know, labeling this as misleading or take it down?
00:13:17.640
And some of the other executives had to write back and say, optimism is not against our policy.
00:13:25.140
So it was both like good that there was that stopgap, but also like, my God, this like super
00:13:30.440
high level person actually even entertained the idea for something like that to be suppressed.
00:13:35.240
That alone was like kind of astonishing.
00:13:38.240
Well, that is astonishing.
00:13:39.140
And also, I mean, I think if you remember, and most people want to forget that period of time,
00:13:45.340
because, you know, it was a bit, not a bit, it was very unpleasant for a lot of people and
00:13:49.800
for a lot of people it was traumatic.
00:13:51.260
One of the things, I think, for a lot of people, particularly those of us who felt that both
00:13:56.340
government and big tech became quite authoritarian during the pandemic, was this feeling that we're
00:14:04.380
not actually able to have a proper sensible discussion about something that actually really matters.
00:14:08.900
To what extent do you feel, based on what you saw, our ability to genuinely discuss one of the
00:14:18.340
most important issues in our lifetimes, actually, for a lot of people, was curtailed by what they
00:14:24.560
ended up doing at Twitter, whether it was maliciously motivated or whether it was incompetence
00:14:28.740
or stupidity, or they just thought that's the right way to do it.
00:14:32.040
How much of what happened was wrong, I guess, is what I'm asking you.
00:14:35.680
It's impossible to quantify, because I didn't do some sort of systematic, you know, analysis,
00:14:42.460
but I would characterize it, to my mind, as profoundly wrong, what happened,
00:14:50.100
at least in the things that I researched related to COVID.
00:14:52.680
Give us a breakdown.
00:14:53.560
Like, what are those things that were profoundly wrong?
00:14:56.680
Right. So, the thing about Twitter is, I think a lot of us had hoped or saw social media in general
00:15:03.680
as an alternative to sort of, I guess, what's called the mainstream media, legacy media.
00:15:08.460
This is a venue for people who are shut out of these other platforms, where they can
00:15:15.540
use their voice. And when I say people, I don't just mean regular people. I'm talking about
00:15:20.340
highly credentialed experts who just had views that were, you know, oppositional to what the
00:15:25.800
establishment wanted. And so, what I observed throughout the pandemic, and then what I was
00:15:32.520
able to basically prove, was there was a systematic suppression of content that went against the
00:15:43.900
establishment narrative, at least in America, about what was considered appropriate, and what
00:15:49.860
was considered correct. And anything, oftentimes, that went against that was labeled as misleading,
00:15:57.020
tweets were taken down, and accounts were suspended on numerous occasions. So, what I don't know is
00:16:03.820
the extent of that, because again, that would require some sort of analysis that, you know,
00:16:08.900
an individual reporter is not capable of doing. But what we do know is, I observed it as a
00:16:13.880
person deep in this for years, and what I wanted to do was kind of deconstruct, well, how did this
00:16:23.040
happen? You know, because so many times, I would see a tweet by someone like Martin Kulldorff, who's
00:16:28.620
this prominent infectious disease epidemiologist who's at Harvard Medical School, and I would see
00:16:35.560
stuff by him being labeled as misleading. I'm like, that's troubling. What's going on? So, man, when I got to
00:16:42.280
go to Twitter, I'm like, I can try to find out, well, how did that happen? What led to that? So,
00:16:47.620
I sort of wanted to work backwards to find out, because I was baffled. I'm like, what the hell is
00:16:52.500
going on? So, there were- What did you find? So, what I found, I view it as sort of like three
00:16:57.720
different, like, buckets about how they did this. There was a system of bots that they set
00:17:04.320
up, this sort of like AI system where, I guess they crawl through Twitter, and they're trained,
00:17:11.400
whether it's through certain keywords or other mechanisms, they're trained to look for certain
00:17:16.340
tweets that were problematic, that the system, the algorithm viewed as problematic. So, some of the
00:17:21.580
tweets that were perfectly legitimate, perfectly appropriate, were caught up in this bot system.
00:17:28.080
If you think of like a trawling net in the ocean, you're looking for a certain type of fish, but
00:17:34.580
inevitably, you're going to catch some dolphins in there. And I think this bot system probably did
00:17:39.600
catch a lot of garbage, you know, some crazy, you know, conspiratorial QAnon, you know, microchip
00:17:45.500
type of stuff. But in that process, they caught a hell of a lot of dolphins in that net, too. So,
00:17:51.820
one thing was these bots. The other thing, you had these independent contractors in places like
00:17:55.980
the Philippines, who were basically deciding the content, whether something is misleading or not.
00:18:02.760
They were given these decision trees, basically, this kind of thing where you would have a drop-down
00:18:06.740
menu. Maybe you see a tweet that has myocarditis in it. Maybe a bot would first flag the tweet. Then
00:18:12.020
it gets sent to this independent contractor, and they're like, myocarditis, click. Then there's a
00:18:15.960
drop-down menu, good or bad, click. And then, you know, there would be this system set up. But the idea
00:18:22.440
that some random guy sitting in a cube farm in the Philippines is going to adjudicate something as
00:18:29.020
complex as whether or not a tweet about myocarditis is misleading or not is insane. Like, of course,
00:18:35.340
they're not going to be able to do that. So, you had that. And then the third mechanism were people
00:18:40.180
themselves at Twitter. And there are many instances where either a particular tweet or account would be
00:18:45.000
escalated where people were looking at it, or sometimes they were just intervening, like that
00:18:49.660
example about the Trump tweet. So, you had real people doing this in Twitter, you know, at high
00:18:55.020
levels. You had these independent contractors, and then you had this algorithm. So, between those
00:19:00.200
three systems, that's how things were flagged. And, you know, we had tweets by people who were
00:19:06.800
physicians, who were, you know, licensed medical doctors, who were tweeting things from studies that
00:19:12.960
were published in peer-reviewed journals. Now, does it mean the study is well done? I don't know, but it's in a
00:19:17.620
peer-reviewed journal. It's certainly not for Twitter to decide that something that's published is not legitimate.
00:19:22.680
And nevertheless, they were flagged. And in every instance, these were things that only went in one
00:19:27.760
direction. There never seemed to be something that was too pro-lockdown, pro-vaccine. That would never
00:19:34.820
be flagged, as far as I saw. But anything that questioned that was, you know, pulled up in
00:19:41.380
the net. And so, there were many instances where things that were true or that were legitimate from
00:19:47.460
published sources that were tweeted by credentialed people that were nevertheless flagged. And that,
00:19:55.240
to me, is just, like, so profoundly disturbing when you think about the information environment and,
00:20:01.860
like, what we perceive to be real or not real, what information we're all getting. And for someone
00:20:07.400
like me, who I was knowledgeable about this stuff, it was frustrating and enraging. But what I'm more
00:20:13.600
worried about are, you know, regular people who have normal jobs, they're not following this stuff
00:20:17.480
closely, and they see misleading tweet, you know, like a label put on it. I mean, that's obviously going
00:20:22.520
to have an incredible amount of influence over the public conversation and the narrative.
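The three-tier flagging process he describes above, a keyword bot casting a wide net, a contractor working a coarse decision tree, and escalation to staff, can be sketched roughly as follows. This is a minimal illustration of that kind of pipeline, assuming made-up keywords and decision rules; none of the names or logic here come from Twitter's actual systems.

```python
# Hypothetical sketch of the three-tier flagging pipeline described above:
# (1) a keyword bot casts a wide net (the "trawling net"),
# (2) a contractor applies a coarse drop-down-style decision tree,
# (3) ambiguous cases escalate to staff review.
# All keywords and rules are illustrative assumptions, not real platform code.

FLAGGED_KEYWORDS = {"myocarditis", "lockdown", "vaccine"}

def bot_pass(tweet: str) -> bool:
    """Tier 1: crude keyword match -- catches dolphins along with the fish."""
    words = {w.strip(".,!?").lower() for w in tweet.split()}
    return bool(words & FLAGGED_KEYWORDS)

def contractor_pass(tweet: str) -> str:
    """Tier 2: a coarse decision tree a contractor might click through."""
    lowered = tweet.lower()
    if "study" in lowered or "peer-reviewed" in lowered:
        return "escalate"          # cites evidence: too complex to adjudicate here
    return "label_misleading"      # default branch: apply a label

def moderate(tweet: str) -> str:
    """Run a tweet through all three tiers and return the outcome."""
    if not bot_pass(tweet):
        return "no_action"
    decision = contractor_pass(tweet)
    if decision == "escalate":
        return "staff_review"      # Tier 3: real people at the platform decide
    return decision

print(moderate("Nice weather today"))                      # no_action
print(moderate("New peer-reviewed study on myocarditis"))  # staff_review
print(moderate("Myocarditis risk is exaggerated"))         # label_misleading
```

The sketch makes his complaint concrete: a legitimate tweet citing a peer-reviewed study only avoids a "misleading" label if the decision tree happens to have a branch for it; anything the keywords catch that the tree doesn't anticipate gets labeled by default.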
00:20:28.200
AI and risk-related compliance isn't tomorrow's idea, but rather today's edge. Moody's combines
00:20:34.820
advanced AI with one of the most comprehensive data estates, and automated workflows provide
00:20:39.700
faster insight, reduce bottlenecks, and drive more strategic action. Moody's helps banks,
00:20:45.040
corporates, and financial institutions navigate today's challenges and seize tomorrow's opportunities.
00:20:50.420
Stay ahead in an era of exponential risk. Visit moody's.com slash kyc slash ai dash study
00:20:56.640
and get in touch today.
00:20:58.200
Isn't the real problem here, I mean, to me there are two real problems, and I'll come
00:21:05.340
to the second one in a moment. But to me, this is the problem when you politicize
00:21:10.980
issues which have no basis in politics whatsoever. This is a virus released from a Chinese laboratory.
00:21:19.500
Allegedly. Allegedly.
00:21:21.280
Allegedly.
00:21:21.340
Allegedly.
00:21:21.420
Allegedly.
00:21:22.420
Allegedly.
00:21:23.420
Little asterisk.
00:21:24.420
But this-
00:21:25.420
Censored.
00:21:26.420
Censored.
00:21:27.420
Channel deleted.
00:21:28.420
But isn't the problem when we politicize issues like this, and this is what happens?
00:21:34.280
Well, yeah, it's politicized, and it's also, at least related, you know, to COVID specifically,
00:21:41.240
there was this default that whatever the CDC says, we're going to go with them. And on one level,
00:21:48.120
I appreciate that. I get it. It's like, well, am I going to believe some random, you know,
00:21:52.640
doctor or some random, not even doctor, some random person? Or am I going to trust the CDC?
00:21:57.280
So on one level, I get that. But that's not how free speech works. And by the way,
00:22:02.340
the experts are wrong all the fucking time. And neither the CDC nor a government. Remember the leader of
00:22:09.200
New Zealand? I think at one point she said, I am the truth. Maybe she didn't say I, but like,
00:22:13.860
the government, we are your one source.
00:22:15.100
I will be your single source of truth.
00:22:16.580
Yeah.
00:22:17.620
Remember that unless you hear it from us, it is not the truth. Dismiss anything else.
00:22:24.160
That is insane. That's horrible for any democracy, for anyone. So I think, you know,
00:22:37.340
it's like, you know, these are cliches, but sunlight is the best disinfectant. Like,
00:22:41.120
let the best ideas win out. Sometimes there will be bad ideas. But what I think this showed was that
00:22:47.040
our government has no confidence in regular people being able to be informed. There is this like
00:22:56.500
incredible sort of like paternalistic attitude of like, the masses are stupid, and they cannot be
00:23:04.580
trusted. So what I think ultimately happened was this idea of the noble lie, where over and over,
00:23:11.360
there would be these infectious disease experts and others within the government, where they were
00:23:17.200
saying something, whether it was about vaccines, whether it's about masks, where they were kind of
00:23:21.960
often just gilding the lily. There may have been the underlying truth, but then they would push over
00:23:26.340
the edge because they felt like they had to encourage people to quote, do the right thing. And so and I
00:23:31.880
think Twitter, most of the people who work there, I don't know now after the exodus when Elon took over,
00:23:37.020
but almost all of them were left leaning. And they were, you know, aligning themselves with
00:23:43.680
the Biden administration, which was aligned with the CDC. So there was this structural, you know, bias
00:23:50.540
toward a certain truth. But again, I mean, there were plenty of differences of opinion from experts
00:23:58.000
around the world on a lot of these ideas, and where different countries were doing things differently.
00:24:02.700
And within America, the idea that you're not supposed to listen to the opinion of a doctor at
00:24:11.040
Harvard Medical School, or someone at Stanford, or anyone for that matter. I think,
00:24:17.180
obviously, anyone should be able to say whatever they want in a country that values free speech. But as
00:24:23.080
far as a platform like Twitter, I don't think there should be zero constraints, because I think it will
00:24:28.820
create an environment where there's just tons of like violence and pornography and stuff that most
00:24:34.280
people probably don't want to be bombarded with that constantly. So it's not that there's no line.
00:24:39.440
But I think the line should be far, far over toward leniency, toward, like, a wide latitude. And they were
00:24:49.220
far, far too tight on how they were controlling and policing information. And it's profoundly wrong that
00:24:58.440
they did that. And as I said, you know, it just, it altered the information environment. And
00:25:04.380
because, you know, legacy media outlets, by and large, were all on board with a certain narrative,
00:25:13.060
people needed a place like Twitter, where they could be exposed to different ideas. Were some
00:25:19.220
of those ideas completely nuts? Yes. But many of those ideas were absolutely legitimate. This is a
00:25:25.700
place for legitimate debate by people who understand the issues. And nevertheless, those views were
00:25:32.200
suppressed in many instances. That is very troubling. David, let's... Francis? Yeah, I was going to
00:25:38.800
say, I have actually a great deal of empathy for big tech. And I can already see the comments,
00:25:44.640
how can you say that? But it's true, because just imagine. You're a cuck, man. Yeah, exactly. Anyway,
00:25:50.380
in the words of my ex-girlfriend. Now, but just put yourself in their shoes. So you've got this
00:25:58.100
pandemic. No one knows what the mortality rate is. We're seeing these images come from places
00:26:04.360
like Italy, where they look absolutely horrific. And then you have this incredibly powerful tool at
00:26:13.080
your fingertips, which I think it's fair to say, no one understands how powerful this thing is,
00:26:20.000
all the long-term implications of this technology. And you have to figure it out as you go along
00:26:26.780
through the pandemic. Isn't it natural that you would err towards a side of caution when dealing
00:26:32.540
with this particular matter? Because I would certainly do that. Yeah, I mean, I think as I started
00:26:39.020
out saying, I felt a lot of sympathy toward what these people were doing. And I think they were
00:26:44.980
trying their best. I think, you know, one of the things about society and institutions is that
00:26:53.520
ideally structures are put in place to help protect us from ourselves. Your instinct sounds like a
00:27:00.440
reasonable and natural one. Let's try to be cautious with like, let's try to, okay, the CDC says X,
00:27:06.440
Y, or Z. Let's stick with that. If someone's saying ABC, different from X, then maybe we should,
00:27:12.120
you know, the instinct is there and it comes from a good place, I think, for some people, but there needs
00:27:18.760
to be a structure in place to go against that instinct because that instinct is wrong oftentimes.
00:27:23.500
And, you know, there's just so many instances in society for, you know, all of history where the
00:27:32.620
experts thought one thing and they were wrong. Right. And if anything, with something like a pandemic,
00:27:38.620
we went in the opposite direction of where we should have gone. We should have, the public,
00:27:43.440
I really, and I'm writing a book about the pandemic, specifically about American schools
00:27:49.200
and what happened with them during the pandemic. But because that's part of the larger thing,
00:27:53.560
I spend a lot of time looking at it in sort of like a more panoramic view. And to me,
00:28:00.060
everything comes back to the public health authorities and that they created an environment
00:28:04.840
where a place like Twitter felt like they had to go with them because they spoke with an unearned
00:28:13.300
certainty about so many things. And as we saw over and over again, those things got walked back from
00:28:19.600
the masking guidance flip-flop to vaccines. We were told they would do one thing and then they did
00:28:24.200
something else. And the excuse of, well, the science changed. Sometimes that was true and sometimes
00:28:29.480
it wasn't. But the point isn't that you can't change your advice or guidance. The point was that
00:28:34.600
when they initially gave the guidance, when they initially told people about these things,
00:28:39.840
this was said with a strong degree of certainty. And it was not only was it politicized, but it was
00:28:45.840
moralized that you were a piece of shit. If you did not do what they said, if you didn't wear a mask,
00:28:52.120
you were a right-wing asshole. And well, the evidence on that has been pretty iffy for a long
00:28:58.420
time about mask mandates. The evidence on a whole variety of things, on school closures and all
00:29:03.300
these other things are actually not clear in the slightest bit. I'm writing, you know, a thousand-
00:29:08.300
page book showing that it wasn't the way they said. So that to me, it comes back to that.
00:29:12.900
Well, yes. But I guess the reason we're asking you this question, David, is I think we're all in a
00:29:17.900
position where we're trying to understand what the true impact of this new social media technology is
00:29:23.820
on the world. And one of the things that it has done, it has shattered our shared consensus of
00:29:29.900
reality completely. And the other thing is it puts into question, you know, the two of us, and I'm
00:29:35.580
sure you as well, really feel that free speech is a crucial part of what not only is, you know, you
00:29:40.480
could say the First Amendment in America, but it's not about that. It's about the fact that the entire
00:29:44.560
Western project is based on the principle that people are allowed to express themselves.
00:29:49.260
And at the same time, I have to recognize, you know, we run a YouTube channel which has, you know,
00:29:54.540
half a million subscribers on YouTube and probably the same on a podcast. If we thought that, you know,
00:30:00.780
the comments on our YouTube videos had the potential to kill 10,000 people, we would think about what we
00:30:07.540
do differently, right? Even if our impression of that was wrong. And I think we're all in the place now
00:30:14.420
where the pandemic was really the first major test case of what we now think we're supposed to do
00:30:21.800
about the fact that we have communication technology that allows one person to, some would argue,
00:30:28.400
incite a riot, or start, you know, we had David Icke in the UK who was talking about how COVID is
00:30:34.820
caused by 5G. And then the next day people are burning down telephone masts, right? So that I think
00:30:42.420
is probably where the good intentions of the people that you're talking about were. But at the same
00:30:47.760
time, like once you start censoring Nobel Prize winning scientists who are talking about their
00:30:52.360
field of expertise, that's, you know, way too far, right? So my point is the line is somewhere in
00:31:00.680
between and nobody knows exactly where it should be, do they? That's my view on it. The only thing that
00:31:07.360
I do know for certain is the line was drawn far, far too close towards censorship. And I think
00:31:13.720
while we're agreeing it shouldn't be no line at all, and, you know, something that is clearly
00:31:18.720
inciting violence or something, you know, in these types of circumstances, but by and large, we need
00:31:24.680
to let ideas be free. We need to let people communicate. And again, when you have credentialed
00:31:31.740
people, but we shouldn't only say credentialed experts. There's plenty of people who didn't have
00:31:35.460
credentials who are also correct on a lot of things. So that's, it becomes, this becomes more and more
00:31:41.240
complicated. So, you know, part of being able to live in a society where there is free speech is
00:31:49.020
it's the freedom to be wrong. And that's important too. So there, it is complicated. And I think some
00:31:56.700
people probably have a highly binary, simplistic view, let everything go through. But I think that's
00:32:02.500
fairly unrealistic and also unappealing to most people. Again, you know, so for me, I view my role
00:32:12.780
in this doing what I did, which was, I know a lot about this topic. I have access to internal files
00:32:19.480
at Twitter. Let me actually bring this to light for people so they can see what actually happened.
00:32:25.060
These different mechanisms I was explaining, how these things were done. There are a lot more granular
00:32:28.720
details, but that's it in a nutshell. And then that starts a conversation. That's the start. So I,
00:32:35.420
it would be, I feel like I would be an asshole to be, you know, making pronouncements
00:32:40.360
about, well, here's how everything should run now. You know, so I don't even feel comfortable saying
00:32:44.340
that. You can tell, you know, we're all sort of saying, we know what happened was wrong. We know
00:32:48.920
the line should be somewhere else, or maybe, maybe that's not even the right metaphor. Maybe it's not a
00:32:53.840
line, but it's like a zigzag. Who knows what? It's a levee over the, we can come up with different
00:32:58.220
metaphors, but we don't know exactly. But for my role was, let me just report on this and let's start
00:33:04.460
a conversation. And the idea that, that this is quote, like a nothing burger, as all these people were
00:33:11.280
saying, is so idiotic to me and so like phony, like bogus, naive, like everything about that was
00:33:22.220
wrong. I mean, I, I know that the last time I looked like my thread that I did on Twitter for the,
00:33:27.840
for my Twitter files that I did. I mean, it had more than 60 million views. And as far as I'm aware,
00:33:36.320
no one covered it in any sort of like legacy media. That was shocking to me. And so my view on this is
00:33:44.840
even if, and they would be wrong to say this, but even if someone said, well, what was reported on
00:33:51.020
here isn't important. Even if they thought that the mere fact that something had 60 million views
00:33:57.960
on it, hundreds of thousands of likes or whatever the, you know, whatever the statistics were that in
00:34:03.300
and of itself makes it newsworthy. I mean, that is the definition of news. When something is of great
00:34:09.860
interest to an enormous number of people, particularly on a topic like this, that is news in
00:34:15.880
and of itself, even if they felt like the content wasn't important. So that is something that as a
00:34:22.120
society or that people in media, I think need to reckon with. It is just a remarkable comment on like
00:34:31.680
the environment we're in today, the culture, that something, a thread, you know, and this isn't
00:34:37.220
about me. I mean, Matt Taibbi's reporting, their stuff was far more in depth than mine,
00:34:42.100
hasn't been covered. How is it possible that this stuff was basically ignored? That is remarkable.
00:34:48.860
I'll give you an example. We had it in the UK and I mentioned it at the time. The BBC on the day
00:34:54.060
that the first several instances of the Twitter files were released. They had a story about Elon
00:35:00.420
Musk firing cleaners who happened to be two women of colour. And they had a story, which was, it wasn't
00:35:07.340
a story, it was like a piece, which is who is the billionaire who's running Twitter, Elon Musk.
00:35:12.460
So they obviously felt that Twitter and Elon Musk were important and significant, but they didn't cover
00:35:17.360
the Twitter files at all. And that, to me, that was very telling. But the other thing I wanted to ask
00:35:22.120
you about, and this is obviously something that a lot of people felt along political lines prior
00:35:27.860
to the pandemic. And then during the pandemic, it, I think, strengthened how people felt in this way
00:35:34.480
about it, which was, you know, Twitter is a company headquartered in California. We all know that
00:35:41.140
California, the values of people in California are not necessarily representative of the entire country
00:35:46.420
or indeed the West more broadly, which is where this conversation was playing itself out.
00:35:50.560
Now, did you find, did, as you were reading this, did you feel that this was a group of people who
00:35:56.380
were almost captured by a particular worldview or just forget captured, that's a loaded term,
00:36:01.920
who held a particular worldview? And as a result of that, they were making decisions that were not
00:36:07.340
necessarily reflective of the worldview shared across the country.
00:36:10.520
A hundred percent. There were exceptions. And like I said, I saw a lot of robust exchanges within
00:36:19.800
Twitter amongst employees, but by and large, absolutely. I mean, again, the fact that they
00:36:25.300
went with, well, here's what the CDC says, that's the truth. I mean, that's crazy that you can't
00:36:32.540
function in a society where a government agency is the truth and everything else is considered
00:36:39.280
misleading or disinformation. And all these people, not all, most of them, and this is a known thing,
00:36:49.460
are left-leaning, for lack of a better term. And because the pandemic in America was so politicized,
00:36:59.480
you were on one team or another. I found myself, you know, sort of floating in space. I had no incentive
00:37:07.220
whatsoever to align myself with Republicans necessarily. For years, I had been a Democrat
00:37:13.520
loosely. I mean, I have a complex, as you guys probably do too, a complex range of opinions on
00:37:21.400
a complex range of topics. So I don't align myself- You think for yourself, shocking.
00:37:26.560
I try, you know. Which makes you far right. Yeah, far right.
00:37:29.460
Exactly. That means I am right-wing. I mean, I did an interview with, I forget who it was,
00:37:37.140
but someone who's like center-right. And I sent it to a group, this was, I don't know,
00:37:41.560
a year ago. And I texted with, I'm like a group chat with like a bunch of old friends of mine.
00:37:46.420
We've known each other for 20 or more years. And one of them wrote back to me and said,
00:37:50.160
are you in the Proud Boys now? You know, that like white supremacist group. And I was like,
00:37:55.020
come on, man. You know? And he was like, half joking, a Jew in the Proud Boys. But anyway,
00:38:01.360
it was, point being, it's like, if you questioned any of these things, you automatically were thrown
00:38:09.560
into that camp on the other team. And, you know, so for me, as, I think I'm considered one of the
00:38:18.520
only journalists or maybe the most prominent one who was writing for a lot of legacy media
00:38:25.780
publications that tend to be left-leaning. But I was writing from a very much sort of contrarian
00:38:33.760
viewpoint, not purposefully. I wasn't like, I'm going to be the contrarian. It was, that's where
00:38:37.760
the facts took me. And that's what launched me on this early on in the spring of 2020, when this all
00:38:43.860
began, that I saw everything that was happening. I, like most of my neighbors and everyone around me,
00:38:49.460
oh my God, there's this crazy virus. Let's, let's like, you know, lock the door, everyone.
00:38:54.640
But very quickly, because of my professional background as someone, I like reading scientific
00:39:00.860
journals. I read academic journals. I talk to scholars and research people all the time.
00:39:07.280
So I have that sort of professional background in doing that. And also for better and for worse,
00:39:11.920
my temperament, I'm always poking holes in things. I always see kind of, I'm always skeptical about
00:39:18.640
things. And I very quickly started, well, let me just look and see what the actual evidence is for
00:39:24.160
this. And very quickly, it took me in a direction that was different from the direction that I was
00:39:31.160
supposed to be in. And all of a sudden I found myself like, you know, cast out of the boat, cast out
00:39:38.720
of the thing with everyone else that I was, quote, supposed to be with, even though
00:39:43.560
there was zero political incentive for this. And so, but nevertheless, I was branded that way.
00:39:49.700
You know, I'm, I'm a right winger. What are you fucking talking about? You know, I was a Bernie
00:39:53.720
supporter years ago. So this, this incredibly infantile binary approach to life is just so
00:40:04.220
tiresome and problematic. But what I've come to believe is that I think that's just how,
00:40:11.860
and this is perhaps a banal observation, but that's just how society operates. Most people
00:40:16.820
want to be in a certain tribe and their default is like in group, out group. And so I spend a lot of
00:40:25.220
time writing about this and digging into this in my book that I'm working on about this. I just,
00:40:29.560
it's, it's, it's incredibly fascinating to me. I, you know, what I know about both of you guys,
00:40:34.260
I think we're in the same, we are in our own little, it's like the Venn diagram. We're in like
00:40:38.300
this tiny, I don't know how, how large is our group? I don't know. Not that we all share the same
00:40:43.180
opinion, but our group of people who are sort of like not, I think there's something, do you think
00:40:47.900
there's something in your personalities? Cause I think I wonder this about myself. Is there something
00:40:52.380
about us and people like us that we seem to have less allegiance to groups than it seems other
00:41:00.120
people do? Do you think there's something like, how did we end up where we ended up?
00:41:04.300
For me, it's quite easy because growing up in the Soviet Union, I kind of had a very early
00:41:08.820
experience of like, just cause everyone says, everyone goes along with something doesn't mean
00:41:13.600
it's true. And then it was replicated more generally. I mean, if you, if you're an intelligent
00:41:21.160
kid, you look around at the adults and you realize these people don't have a fucking clue
00:41:24.660
what's going on. Right. So that from a young age, I realized I'm going to have to think about
00:41:31.100
things for myself, you know, but I actually, you know, I think it can be very dispiriting
00:41:36.740
because online communication makes it feel like we are one of a handful. And then there's
00:41:43.760
the crazy people on the right and the crazy people in the left. But I think what we forget
00:41:48.540
and look, the success of our show is kind of reinforcement of this is there's a hell of
00:41:53.840
a lot of people in the middle who are never going to leave a YouTube comment, who don't
00:41:57.920
have a Twitter account, who just want to hear something because they are in the middle.
00:42:03.100
They're trying to make up their mind. They're not hard line. And because of that, you don't
00:42:08.600
see them. They don't show up. Right. And there are crucial points in the sort of trajectory of
00:42:14.540
society when, yes, you have to go in the voting booth and you have to pick red or blue.
00:42:19.340
But most of the time, I think most people aren't really that welded to those political ideologies.
00:42:26.980
And you don't see them as much because they're not on Twitter screaming because they're trying
00:42:30.560
to get their kid to school and put food on the table and whatever. So at least our hope,
00:42:35.720
certainly mine anyway, is we've always tried to speak to those people and stay away from
00:42:42.220
the crazies on both sides who are, you're right, at this point, shaping the debate.
00:42:47.660
No question about it.
00:42:48.540
I think you put this far better than I did. I think you're right.
00:42:51.560
Exactly.
00:42:52.400
Smash that.
00:42:53.900
I should be the guest. I'm the talent here.
00:42:55.980
Smash that like button.
00:42:57.540
Oh, subscribe. Good point.
00:42:59.140
You're right, because I've observed this so many times. But I guess what worries me about that a little
00:43:04.220
bit is, and again, maybe this is just human nature or society, most people are either
00:43:10.040
uninterested or disinclined to speak up. So I think you're both correct that there is a larger
00:43:17.260
number of people like us than it appears. But because most people, for whatever combination of
00:43:23.500
reasons, don't say, and I've witnessed this myself in my small town that I live in with my kids when
00:43:30.040
the schools were closed, there was this kind of very loud group of people on the local
00:43:35.840
parents' Facebook group, you know, and I was called a murderer for saying that I thought schools
00:43:40.560
should open, even though kids were in school in Europe at this point. Why do you want to murder
00:43:45.680
children? I'll never forget that. Someone responded to me that way. And I'm like, you're aware there
00:43:50.880
are millions of children in school elsewhere and they're all, they're fine. I think there, are there
00:43:55.240
still kids in Europe? They're still there. Just click. So that's the thing that there's sort of
00:44:02.300
this dual dynamic. There's both a lot of us, perhaps, but for whatever reason, and maybe a lot
00:44:08.180
of those reasons are good. Sometimes it makes sense why someone can't speak out for any number,
00:44:13.240
maybe their job, they're afraid. It's regrettable. I wish more people were a little more vocal or
00:44:20.240
involved than they were, because the forces particularly, and I guess I have, believe me,
00:44:27.620
I have no shortage of condemnation for people on the far right, but I'm far more sensitive about the
00:44:35.800
left for two reasons. One is that I think I used to associate myself with the left for most of my
00:44:41.380
adult life. So I have like a greater sense of betrayal, I feel, in the way some of
00:44:48.440
these people behave during the pandemic. So that's number one. Number two, by and large, the left
00:44:54.340
controls the major levers of society. Yes, the politicians in Washington or in different state
00:45:01.780
houses or whatever change, you know, at each election cycle, but technology, the movie industry,
00:45:09.920
media, publishing, fashion, so many major pillars of our society, the cultural sort of cornerstones
00:45:21.440
are largely dominated by the left. So that's why I have like a particular sensitivity of when the left,
00:45:30.040
by and large, moves in a certain direction, why that really needs to be kept in check so much.
00:45:36.480
Let's dive into the media a second, because we're talking about journalism here. And if you think
00:45:44.060
about movies, there was always the archetype of the intrepid journalist going where others feared
00:45:50.000
to tread in order to get the story. For example, if you look at superheroes, I don't think it's an
00:45:54.640
accident that Peter Parker, Spider-Man, is a photojournalist and Superman is a journalist.
00:45:59.260
Right. You know, they were the ones challenging authority. When did that stop? When did we start
00:46:07.580
seeing a journalist as someone to basically uphold what the government's saying?
00:46:15.100
One of the things that like blew my mind in the beginning of the pandemic, and as it went on, was
00:46:20.920
that I kept thinking and I've written a number of pieces that I believe it's fair to say that I kind
00:46:29.700
of blew something open in the sort of legacy media, stuff that wasn't being talked about. Whether it was when I
00:46:35.600
wrote very early on, at the very beginning of May, saying we should be opening schools. Here's why.
00:46:42.040
Here's a long compendium of data to show why this is a good idea.
00:46:47.500
There's this, I don't think this was common in Britain or elsewhere, but we had this hybrid
00:46:53.620
schooling model here where kids go to school for one or two days a week and they stayed home and they
00:46:57.860
had that. And I was the first person to write a big piece on this and I had a number of experts in
00:47:04.940
the article saying like, this is completely idiotic for multiple reasons. There's no evidence that this
00:47:11.160
is going to reduce transmission. So I wrote that. There was a piece that I wrote questioning the CDC
00:47:16.500
guidance. I'm going to get to my point in one second. The CDC had guidance before the summer
00:47:23.100
camp guidance where they wanted children to wear masks outdoors. And when I saw that the guidance
00:47:28.080
came out, I'm like, this is nuts. So I immediately, I reached out to some really prominent people in
00:47:33.040
the field. Dimitri Christakis is the editor-in-chief of JAMA Pediatrics, you know, the foremost pediatric
00:47:39.140
medical journal, an immunologist at Columbia University Hospital, these people. And I reached out to them,
00:47:44.600
and I said, am I crazy or is this like completely insane? And, and they were like, oh, this is,
00:47:50.140
this is horrible. And I had a handful of other pieces on myocarditis. I interviewed
00:47:55.780
the guy, the Israeli scientist who wrote, basically released the first major report. I don't even know
00:48:01.320
how I got ahold of him, but I'm, there I am texting with this guy on WhatsApp and we're, you know,
00:48:05.680
So there were a bunch of instances where I wrote about things that I hadn't seen exposed yet.
00:48:11.220
Also the hospitalizations, sorry, now I'm just, take your time. This is what we do. So, okay.
00:48:16.660
Don't rush, take your time. I mention this not as aggrandizement, but for a larger point that
00:48:21.540
I'm about to make. David is brilliant. Yes. I mean, he's so fucking good.
00:48:24.800
So, but there, people had been murmuring for quite a while that, you know, maybe these
00:48:31.100
hospitalization numbers aren't actually reflective of people who are in the hospital. Rather,
00:48:36.560
it's with COVID rather than from COVID. Now this also ties into our conversation about the suppression
00:48:43.440
and censorship on Twitter and elsewhere. For a long time, that was considered a conspiracy theory
00:48:47.880
and you're crazy. Well, it turns out that's true. And, and, and I reported on these two studies that
00:48:54.840
were done on pediatric admissions. And at the time, at least 40% of the pediatric COVID admissions
00:49:02.280
were unrelated to COVID. A kid, maybe they broke their foot or something and they had to test them
00:49:07.680
for COVID and they just incidentally had COVID, but it had nothing to do with why they were in the
00:49:12.300
hospital. But nevertheless, that was added to the tally. Just very quickly, I had, I had a joke that I
00:49:18.180
tweeted during the pandemic. I said, my, my grandfather died with his wife by his side, but it was the
00:49:23.700
middle of the pandemic. So they decided he died of her and put her in prison anyway. Exactly. It was absurd.
00:49:29.800
So, but people who said this were called conspiracy theorists, you're some asshole. But it was true.
00:49:35.900
So anyway, so the reason I mentioned all of these things, the school closures, the hybrid shit, the
00:49:39.920
myocarditis, and then a handful of other things over and over, I kept thinking, I need to like get
00:49:47.580
this thing out. Part of being a journalist, I generally like a slower pace, less in the news
00:49:53.520
cycle, but because I found myself in this, you want to be first. I'm like, this is exciting. And each time
00:49:58.500
I'd be in a panic because of the turnaround. First, I pitch an article to an editor. Then
00:50:04.280
if they accept it, then there's, you know, then I'm writing it, I'm researching it. The editors go
00:50:08.920
back. It could take a week or two weeks or longer before you finally put the thing out, depending how
00:50:13.260
in depth it is. And I, every, I just be nervous waiting that someone else is going to scoop me.
00:50:19.000
And no one did any of those times. And slowly I started seeing this pattern. And that's the thing.
00:50:25.440
This is a very long response to your, to your prompt about where's the intrepid reporter. And
00:50:30.740
I'm not saying I am the embodiment of that, but what I would say is there was an absence of that
00:50:36.940
elsewhere. When I wrote that thing about the summer camp guidance, I think a day or two before my piece
00:50:43.760
came out, there was a thing in the New York Times on it. And it was like the news, you know, CDC guidance
00:50:48.820
for camps is out. Here's how to keep your kids safe. And it just, unquestioningly, they had one
00:50:54.780
or two, you know, doctors who talked about it. No one questioned the, the, the validity of any of
00:51:01.320
these recommendations, including making a kid wear a mask while they're going to be playing tennis with
00:51:05.360
someone, you know, 30 feet away on the other side of a net. No, it wasn't questioned. So it's this idea
00:51:11.740
of like, is a journalist just like a megaphone? You know, maybe in some instances that's
00:51:18.540
appropriate, but by and large, that's what these people were doing. And I was kind of astonished
00:51:23.180
that over and over, it seemed like these large media operations with millions or hundreds of
00:51:31.540
millions of dollars behind them. These are huge institutions with large staffs of people and
00:51:37.200
editors and this whole operation. And they weren't questioning the guidance. They weren't questioning
00:51:44.120
what was said. They were simply reporting it. And they had the same crew of experts oftentimes. And
00:51:50.480
I use experts in quotes because oftentimes this is someone like there was a particular emergency room
00:51:54.880
physician who, I think, was at Brown at the time, who was quoted constantly in the New York
00:52:01.720
Times and CNN and these other places. This person had no expertise on infectious diseases,
00:52:06.600
had no expertise on mitigation measures and their effectiveness. Nevertheless, this person was
00:52:13.440
quoted. They're on like the speed dial at a place like the New York Times. So you had no questions
00:52:20.480
asked of Fauci or CDC. And then when there were sort of questions asked, it was almost always from the same
00:52:27.020
pool of, quote, experts who were basically just supporting what was being said anyway. So that was the
00:52:35.520
weird thing for me. I just was, now when I look back, it's funny, but you can imagine me, I was so
00:52:39.880
nervous I was going to get scooped on each of these things. Someone surely is going to write a piece in
00:52:45.900
some major place about like, why our schools aren't open here when they're open everywhere else. And I'm
00:52:50.720
like, no one wrote that. Someone surely is going to write about the hospitalizations. Someone sure, all these
00:52:56.160
things. And it wasn't happening. So it's just been, it was just a bizarre experience that I will never
00:53:04.280
get over. Like it's permanently altered me. Seeing this, you know, from my own experience as a writer
00:53:11.760
in that environment.
00:53:13.520
And it also has repercussions right the way down the line, because we've all agreed that the media
00:53:20.160
weren't being honest about COVID. The mainstream media weren't being honest. I think that's fair
00:53:25.380
to say. And then they report on Ukraine and everybody goes, well, how can I trust you? And
00:53:31.000
then you get a large portion of people coming out and going, well, they weren't honest about COVID,
00:53:36.840
AKA, they're not going to be honest about this. I don't believe it. And then what you have is a
00:53:42.880
fundamental breakdown between media and the people. And that's a really dangerous place to be in.
00:53:49.480
I mean, I think there's a couple things there. Journalism, in my view, should be adversarial
00:53:56.080
to authority. And for whatever complex range of reasons that, you know, we could go into,
00:54:06.080
but, again, I'm writing a thousand pages on the why. For a long range of reasons, that sort of
00:54:12.760
skepticism of institutions was absent. It vaporized.
00:54:19.360
We love big pharma now.
00:54:21.340
Right. Exactly. I mean, these are the, these are the things that traditionally-
00:54:25.080
It doesn't make any sense.
00:54:25.960
People on the left, they used to, the left traditionally was like-
00:54:30.060
Speak for yourself, man.
00:54:30.780
Yeah, exactly. It was like highly skeptical of these places, like big pharma, the government,
00:54:35.960
you know, these, but because, I think because it's so politicized, they then just threw their hat in
00:54:40.800
with this group. That's one piece of it. I don't think it's the only piece that that,
00:54:46.020
and then the public ends up not trusting it because they, or at least a large segment of the public.
00:54:51.780
And I think one of the things that's funny, not to get too far afield here, but, and if I'm not
00:54:58.280
already canceled, maybe this will cancel me, but like-
00:55:00.500
Excellent.
00:55:01.060
But here we go, is that there's so much talk about diversity, which is important, you know,
00:55:06.780
racial diversity and other types, but we don't see, and again, this is an observation many people
00:55:13.300
think, there's not an ideological diversity in a lot of these institutions. And like most of the people
00:55:19.500
who work at these, you know, at the Times or, you know, these major TV networks, these are, I mean,
00:55:26.000
I know a lot of these people personally, and I know of, they all went to Yale and Brown and Columbia
00:55:31.240
and Harvard. They all went to the same places. They're all there, inside there. Imagine if the
00:55:36.320
Times, instead of their diversity lens, instead, if it was trained on, well, let's, let's hire some
00:55:41.960
people who just went to some community college. Let's go and, let's get someone who grew up in,
00:55:46.580
in West Virginia and maybe didn't go to college at all, but is, like, real smart and has some sort
00:55:51.680
of grit. Imagine how different the coverage would be of all sorts of issues, whether it's related to
00:55:57.060
the pandemic or otherwise. So there's certain types of diversity, but you're missing out on,
00:56:03.260
on working class people. You know, journalism used to be more of a working class kind of profession
00:56:08.580
at one point in certain regards, you know, you know, a number of generations ago. Now it's,
00:56:13.020
it's far more, um, I think elitist. So these people, and, and I think that there's a wonderful
00:56:19.760
article, I think it's Thomas Frank from years ago in Harper's. Um, and he wrote about how, um,
00:56:26.100
Bernie Sanders when he was running, basically how his campaign was murdered, um, by the
00:56:34.160
mainstream media. And, and he roughly says this, and this is some of my own sort of projection onto
00:56:39.980
his thesis. But in order to get into a place like Yale or Brown, how do you get in there? Well,
00:56:46.400
you're a certain type of person who follows rules. Maybe some people get in there who are
00:56:50.960
iconoclasts. They're just brilliant. But most of the people you're getting in there because you're
00:56:54.420
an apple polisher, you're the worker bee, you know how to get straight A's, you know, 4.0 grade average,
00:57:00.560
you're a perfect student. And then you get in there and you're, you know how to work your way within
00:57:05.100
an institution and network and you get perfect grades there. And then maybe you go to Columbia
00:57:08.780
journalism school and you follow this path. So there, the way that these people became
00:57:14.500
successful was through a certain path. That's their life. That's their lived experience is,
00:57:20.880
is doing this. So it makes sense. So then they saw Bernie as this outsider. How could this guy win
00:57:27.740
or who dare he? You know, how should he do this? You know, he's not one of us. And I think,
00:57:33.720
and I think we also see that in coverage of a whole range of issues, including with the
00:57:38.660
pandemic, that if you have all these people who all know each other, they all have the same
00:57:44.480
kind of background, the same way that they became successful, that to veer from that,
00:57:51.680
I'm not even saying this is a conscious thing. I think it's just like, it makes sense. This is just
00:57:55.660
how they view things. So that, that, that's my, that's my thesis on, on that.
00:58:01.960
And we haven't even reached, as far as I'm concerned, the real punchline of the pandemic,
00:58:06.400
which is the effect on children's education and attainment. And also as well, what that is going
00:58:12.220
to be doing to the ever widening gap between attainment, between rich kids and poor kids.
00:58:18.020
Right. Well, you know, there's, um, my focus on reporting has not been on what people will call
00:58:28.540
sort of like victim porn on the horrible things that happen to children. Um, because to me,
00:58:34.340
it's just like, so self-evident that this was absurd. Um, but there's so much just obvious
00:58:41.840
evidence from this. This was obvious before it even happened. And now we have also from the academic
00:58:47.420
harms, um, you know, from kids. And we know that children who are in less privileged backgrounds
00:58:54.000
were going to do, we knew this was going to happen. I wrote a piece for the New York Times,
00:58:58.400
actually early on in the pandemic about what they call these pod schools. They probably didn't have
00:59:02.480
this in the UK where, um, you would pay what wealthy parents would pay generally wealthy.
00:59:07.960
You know, it could be $20,000 where a group of parents got together and they hired a teacher
00:59:12.120
and they had a pod of like, it could be five kids, 10 kids where they, because the schools were closed.
00:59:17.420
Um, so those kids who had the money, they had the pod or they had tutors. They have, you know,
00:59:23.820
they can get all this extra help, but some kid who's living in, you know, a tiny apartment in
00:59:28.980
the Bronx, um, with, with eight other people or whatever the circumstance may be. And there are
00:59:33.920
those kids, um, cause I've spoken to them and I've spoken to their parents and their teachers.
00:59:39.060
They, they were never going to learn jack shit. Um, they didn't have an internet connection.
00:59:44.600
There were kids who were sitting in a parking lot of a Taco Bell so they could have a Wi-Fi
00:59:49.700
connection to try to get on to some bullshit remote learning program. That is nuts. So for any,
00:59:55.980
so the idea, because I was branded a racist, as were many other people who advocated for schools
01:00:01.020
to open, it was considered, they said it's racist to want schools to open. It's white supremacist.
01:00:06.280
But the, the, the horrible irony of course, is that the greatest harms were, we knew this from,
01:00:13.460
this could be seen very easily from the beginning and it was obvious while it was happening. The
01:00:17.520
greatest harms were borne by the kids who were, uh, disproportionately black or brown, um,
01:00:25.320
and certainly moving away from race, just who had less resources, regardless of what your race was.
01:00:30.140
We knew the people who have the money, they're always going to be okay. Most of them. I mean,
01:00:34.540
those kids suffer too. Make no mistake. A lot of them, everyone's different. And it's not
01:00:38.500
automatically you're going to do horribly if you don't have money, but that's the broad trend.
01:00:42.560
And this was of course obvious. So school is more than just a place for learning math or something,
01:00:49.900
and particularly for younger kids. In America, school has a role of, of a societal function where
01:00:57.920
there are people, by the way, were dismissed and maligned for saying, Oh, you, you just want a
01:01:04.040
babysitter. It's a bunch of parents just want babysitters for their kids. Well, that's, that
01:01:09.500
should not be put down. Like care for children is really important. And that's a wonderful function
01:01:16.120
of school. That's not something that should have been like dismissed as like, you're an asshole for
01:01:21.040
wanting your, you know, six year old to go to a place while you're going to work. Like, but somehow these
01:01:27.140
people were branded as selfish. It's the sort of thing you say when you've got your own babysitter
01:01:31.380
that you can pay for. That's when you can say it. And that's where it comes from. But I know
01:01:36.960
you've got a book about this coming out. So tell everybody when it's coming out and what it's going
01:01:41.020
to be called. Well, it's called An Abundance of Caution. At least in the States, I feel like they
01:01:48.160
did in the UK too. This phrase was constantly used out of an abundance of caution, dot, dot, dot. And I
01:01:54.860
remember when they closed my kids' schools that the email was sent out and there it was
01:02:00.000
out of an abundance of caution, we need to close schools for a deep cleaning. And they
01:02:04.480
said, you know, meant nothing. But, um, so that phrase always stuck in my head because
01:02:09.680
the question is, well, caution in what direction? Um, and what, and, and, and what we, what I
01:02:19.700
spend a lot of time writing about, and God knows when the book's going to come out because
01:02:23.000
I'm still working on it now. It's probably not for a year or so. I saw that when I asked
01:02:26.240
David, there was that little pang. I know that; I've written a book. It's like.
01:02:29.820
The pain. Every night, like my son, he's like, dad, are you still working on that book? I'm
01:02:34.040
like, yes. Stop asking me. It's like, my temple is pulsing. Yeah. But, um, but you think
01:02:41.340
about a lot of these interventions, including school closures. I think there's good evidence
01:02:47.120
that the most important word that is not talked about enough about relation to the pandemic and
01:02:53.380
these interventions is time. And a lot of things can be effective over a very short period of time.
01:02:59.420
If the physician puts on a very tightly fitted N95 respirator and sees a patient for a 10 minute
01:03:06.200
checkup in a room, that mask may very well be protective to some extent. Um, but that's different
01:03:12.880
from wearing some bullshit cloth mask in a classroom with 20 other kids for eight hours.
01:03:18.340
And similarly, you know, pulling the master switch and keeping everyone home for a week or two weeks
01:03:23.640
or not everyone, but the people who can stay home, that can definitely have an effect. I think there's
01:03:28.320
good evidence. The problem is a pandemic's a marathon. It's not going to end and, you know,
01:03:34.400
quickly. And over time, we know that people's ability to comply with things wanes. You just
01:03:41.940
wearing a mask is fucking annoying for most people. Um, it's not natural. And whether you
01:03:47.360
purposely don't want to do it, or you're just like, even subconsciously, you're tugging at it.
01:03:51.220
People are rubbing their eyes. They've done studies on this. You can see people constantly touching
01:03:54.420
their face. Or whether it's something like school closures, people's ability to comply wanes.
01:03:59.560
And there's Google mobility data for this too. You could see people just started moving around
01:04:04.820
more and more as time went on. And you saw this happen even in places where the restrictions
01:04:09.800
were still in place, that that didn't stop it, that you can't like hammer people with
01:04:15.520
this stuff. Human beings, most of them are not comfortable being completely isolated from
01:04:21.100
each other. They can't do it. And so what happens is this, this confusion about, well,
01:04:27.940
this works or that doesn't work. Of course, if people don't go to school, there'll be less
01:04:31.600
transmission. Well, yeah, maybe for a week or two, but over a month or two months or three
01:04:36.700
months. And that's the main misconception I think that a lot of people have about these
01:04:41.120
interventions. That yes, I'm not someone who says masks don't work or this quote doesn't
01:04:47.220
work. That's too simplistic of a way of framing these things. It's some things may work to some
01:04:53.840
extent in this particular circumstance for a certain period of time. It's not a sexy answer.
01:04:59.460
It's not as easy to just tweet that, but that's the truth. You need all these little qualifiers.
01:05:04.740
So we knew that over time, this stuff was simply not going to be effective. And that's what I
01:05:10.040
believe the evidence shows. Well, I was going to wrap up there, but just maybe one more thing on
01:05:15.040
that. I think one of the things that you touched on there, but I think we could expand on a little
01:05:20.500
bit is that too many people forget that safety isn't the only thing that we are ever pursuing.
01:05:27.020
I mean, freedom for this country would not exist if people didn't get together who valued something
01:05:32.580
other than safety above safety, right? The United States was created by a bunch of people who put
01:05:38.540
themselves in harm's way. That is the very definition of being the opposite of safety for
01:05:43.400
something else. That something else is called freedom in their case, right? So one of the reasons
01:05:49.060
I think we're stuck in the place that we're stuck in is that people have forgotten that safety is not
01:05:54.820
the only value. It's not the only thing that we care about and we cannot care about. Otherwise,
01:06:00.380
we'd never leave the house. And I would say two things. One, agree with you a hundred percent on
01:06:05.240
that, that, you know, there's a reason why we, right. People go swimming, even though a certain
01:06:09.060
number of people drown every year swimming, we get in a car, we do everything. If you, the second you
01:06:14.020
step out of your house, there's some degree of danger, depending on what you're going to do,
01:06:16.960
but that's part of life, including we also endanger other people because the argument against what
01:06:21.860
you're saying, people say, well, that's different. That's you just endangering yourself. But with the
01:06:26.360
virus, you're endangering other people. Well, guess what? Every time you get in a car, you're
01:06:30.160
endangering other people too, but we all choose to partake in that. And we don't force everyone to
01:06:34.780
drive 10 miles an hour. We laugh. So there's a whole range of things we do where we are endangering
01:06:40.120
each other. And we accept that as a society because we value these other things for human flourishing
01:06:45.420
beyond just putting ourselves in bubble wrap. But the other thing is, staying home is not safe
01:06:51.500
either over time. And that's the other flip side, again, about this idea of caution that it's like,
01:06:56.180
well, there are people, and I wrote this piece about this church that got in a huge lawsuit with
01:07:01.360
a county in California where they were barred, all churches there were barred from having anyone
01:07:08.280
gather indoors for many, many, many months. And so this church ignored the order.
01:07:15.420
People got together anyway, and they accrued an extraordinary number of fines. It's an amazing
01:07:21.960
story. I mean, they were spied on. They were special health officers peering at them through
01:07:27.520
a chain-link fence. They were going into the church and monitoring private activities. It's an insane
01:07:33.540
story. But the message from it, though, to your point, is a lot of the people I interviewed,
01:07:41.100
these were people from the church, church members. These were people, and by the way,
01:07:45.120
I don't attend church. I had no skin in the game about support. But these are people suffering from
01:07:49.620
addiction, elderly people who were profoundly lonely. There was a guy who, I think there were more than
01:07:56.160
one, I spoke to one myself, who wanted to commit suicide because he was so profoundly lonely. He had
01:08:02.440
gone through a breakup. And for these people, church was their support system. Is it mine? No.
01:08:08.260
But I have to have understanding and respect for them. They needed this. So the idea that they're safe
01:08:14.840
because they're kept home was bullshit. These people, some of them were literally at risk of
01:08:20.380
dying themselves. And we know that there is epigenetics with how your environment affects
01:08:26.380
your immune system. Telling a bunch of people you need to be isolated, particularly elderly people,
01:08:31.720
oh, you're in an old age home, and your family can't visit you. Like, of course, that has a profound
01:08:37.600
effect on people's underlying health. So we're seeing this, and it's very complicated, but with
01:08:42.460
excess death rates and all sorts of other measures about what makes for a good life. So, you know, to
01:08:48.560
your point, this sort of like myopic focus on a virus and not looking at all these other aspects of what
01:08:57.520
makes for a flourishing life, again, maybe that made sense for a couple weeks in the beginning.
01:09:03.240
But very quickly, that, you know, the ratio changes about what is reasonable, both from a public
01:09:09.340
health standpoint and also a mental health standpoint. So that's how I view it.
01:09:15.400
David, it's been a fantastic interview. Thank you so much for coming on the show.
01:09:19.540
So the final question we always ask all our guests is, are you vaccinated? No, I'm joking.
01:09:27.760
That's great.
01:09:28.400
Because if you're not, you're killing people. No, you're a murderer.
01:09:30.900
You're a murderer and murdering kids. Anyway, it's what's the one thing we're not talking about
01:09:36.480
as a society that we really should be?
01:09:39.740
I would say, just moving away from this topic, one thing that has long interested me
01:09:48.540
is everyone talks about climate change as this macro concern. And I'm not even going to touch on
01:09:56.860
the validity of this. But one thing that I think is really important for people who care about the
01:10:02.160
environment, and I'm someone who's long cared about the environment very much, is climate change is a
01:10:08.860
very abstract idea. But there are very concrete things that are happening to our planet and to
01:10:14.900
animals that we actually can stop. Because it seems like we failed at climate change. Every year,
01:10:20.080
they have the meetings. They go to Davos or Kyoto or wherever else, and they talk about it. But
01:10:25.720
we still keep producing. This isn't working.
01:10:29.180
You have stolen my dreams.
01:10:30.960
Right, exactly. But the thing we can do is, there are an enormous number of sharks that are killed
01:10:38.920
every year so people can have soup with their fin. Are you fucking kidding me? And they slice the fin
01:10:45.460
off this majestic animal, and they just toss them back in the ocean. And they're left there to just
01:10:51.600
slowly bleed out and die. And they don't even use the rest of the animal. So shark fin soup, where you
01:10:58.560
think about, there are still multiple countries that do commercial whaling. I think, I might be wrong,
01:11:04.360
I think it's like Iceland, Norway, Japan. That is insane, to my view. And palm oil. I'll stop here.
01:11:14.240
You know, most of the garbage that we eat, you know, all these processed foods, they have palm oil in
01:11:19.340
them. That is a massive source of deforestation. They burn or chop down incredibly important and rich
01:11:27.380
rainforests or other really old-growth forests and put up a bunch of, you know, palm trees for making palm
01:11:33.900
oil. That is horrible for the species that used to live there, who are now dead, because they don't
01:11:38.200
have a home. Horrible for greenhouse gas emissions. So those are the things that I think get lost. Once it's, you know,
01:11:43.420
once again, it's easy to just be like, we need to help stop climate change. Well, guess what?
01:11:47.540
That messaging failed. It doesn't seem to work, because we still keep producing all this stuff.
01:11:51.960
So instead, I would love for people to focus more on how do we save specific animals? How do we stop
01:11:58.920
specific practices that, to my mind, are completely unacceptable? And in 50 years from now, we will
01:12:05.460
be embarrassed that these things were happening, from factory farms to the shark fins, etc. That's
01:12:11.060
what people should talk about. Absolutely, David. Thank you so much for coming on the show. Really
01:12:14.820
looking forward to reading your upcoming book. Of course, you have got this one as well, which we
01:12:18.880
didn't get to talk about. Buy this book, everybody. Buy this book, everybody. And we're going to head over
01:12:23.500
to Locals for your bonus questions. So join us there. All the links for David's work are in the
01:12:28.600
description. Take care. We'll see you on Locals very shortly.
01:12:33.200
Does David think the reason Twitter agreed to censor information was due to a willing alliance
01:12:37.580
with the government, out of fear of the government if it didn't comply, or out of ideological conviction
01:12:42.580
that censoring certain voices was correct?