The Jordan B. Peterson Podcast - May 04, 2023


354. Lockdown Horror: Targeting Children, Churches, and Your Freedom | David Zweig


Episode Stats

Length

1 hour and 42 minutes

Words per Minute

169.9

Word Count

17,390

Sentence Count

1,011



Summary

David Zweig talks about his involvement in the Twitter Files, the culture of silence and fear around discussing COVID-19, and the detrimental effects of lockdowns on a generation of children. He also explains why the media should not collude with the government in the face of a so-called emergency, why it's important for the media not to be complicit in government efforts to stifle critical reporting on COVID, and what the events surrounding it imply for children. Dr. Jordan B. Peterson has created a new series that could be a lifeline for those battling depression and anxiety. With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way and provides a roadmap towards healing, showing that, while the journey isn't easy, it's absolutely possible to find your way forward. If you're suffering, please know you are not alone; there's hope, and there's a path to feeling better. Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson's new series on depression and anxiety, and let it be the first step towards the brighter future you deserve.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Today I'm speaking with journalist and writer David Zweig about, among other topics, his involvement in the Twitter files,
00:01:15.720 the culture of silence and fear and suppression around discussing COVID-19-related stories,
00:01:22.260 and the detrimental effects of lockdowns on a generation of children.
00:01:28.000 So David, one of the things I really wanted to talk to you today,
00:01:32.380 I think it'd be interesting to go behind the scenes with regards to the Twitter files.
00:01:36.740 I mean, the Twitter, so-called Twitter files,
00:01:38.860 with Matt Taibbi particularly leading the charge, in my understanding,
00:01:44.360 didn't get a lot of legacy media coverage, surprise, surprise, and that story was downplayed,
00:01:49.180 but of course it was a viral occurrence online, particularly on Twitter,
00:01:55.640 and rightly so as far as I was concerned,
00:01:57.420 because my sense was that it indicated, illustrated, demonstrated a tremendous degree of behind-the-scenes collusion
00:02:07.140 between most worrisomely government officials and Twitter in particular,
00:02:12.380 but media in general, with regards to so-called crafting of the narrative around COVID.
00:02:17.760 And I'm not very impressed by government media collusion efforts to craft narratives.
00:02:23.660 That's certainly not the media's role, to craft narratives with the government.
00:02:25.640 That's 100% certain.
00:02:29.240 And I don't think that that's justified, even in the face of a so-called emergency.
00:02:35.180 In fact, that might be the time when it's most important for the media to not collude with the government
00:02:41.480 so that we can be sure that the response to the emergency isn't worse than the bloody emergency
00:02:47.040 or isn't ill-founded on some other grounds, because it often is.
00:02:52.520 It's not like we're necessarily going to respond to an emergency in the proper manner, even if we want to.
00:02:57.920 And critics need to abound in emergency situations, even more so than under normal circumstances.
00:03:04.040 So you were there.
00:03:05.820 Will you walk us through how you were there and what you saw?
00:03:09.020 Let's start with that, and then we'll branch out into all the other tentacles we can attack.
00:03:13.760 Sure.
00:03:14.600 Yeah.
00:03:14.940 So I was there basically at the request of Barry Weiss, who I'd known for a while,
00:03:21.800 and I'd written for her publication.
00:03:24.440 And I knew that Barry was one of the two journalists who had access to the Twitter files.
00:03:32.060 And just being friendly, and because I knew them, I'd emailed Barry and one of her editors saying,
00:03:38.640 hey, guys, in case you're looking up COVID stuff, as they knew, this was my sort of area of expertise for the past number of years.
00:03:46.460 If you want to look at COVID stuff, here's what you might want to look at, blah, dot, dot, dot, dot.
00:03:50.320 And then basically an hour later, I get a response.
00:03:53.540 Can you drop everything and get on a plane to San Francisco for us?
00:03:57.620 So I was not expecting that.
00:03:59.900 I was just trying to be helpful.
00:04:00.940 But when they asked me to go out there, I was like, of course, there was no doubt.
00:04:05.840 As a writer, as someone interested in this issue, for years, that was a no-brainer.
00:04:12.800 So I went out to San Francisco, basically under Barry's umbrella.
00:04:18.860 And there were a few other people, a few other journalists there while I was there.
00:04:23.320 Michael Shellenberger, Leighton Woodhouse, Lee Fang.
00:04:26.900 So the four of us, while I was there, it was the four of us.
00:04:30.500 And I was there for a few days.
00:04:32.980 And yeah, it was quite remarkable just to be at the center of this thing that I had been observing.
00:04:39.860 I had no personal interaction with Elon Musk.
00:04:42.820 I saw him while I was there, but I didn't communicate with him.
00:04:47.100 And I know a lot of people, I think this has been said a bunch of times, but worth saying again,
00:04:52.100 at least from, I can only speak to my personal experience, but I believe this is the case for everyone.
00:04:58.260 I had absolutely zero restraints at all on anything that I was able to report on or find.
00:05:07.340 You know, whatever I was able to find, I could report on it.
00:05:09.980 The only contingency was we had to report first on Twitter.
00:05:15.760 It had to, whatever information that you're publishing, publish it on Twitter first.
00:05:19.780 And then, you know, a short while later, you can publish it on, you know, whatever platform.
00:05:23.680 That was the only restraint, because there's been a lot of speculation or people saying that,
00:05:29.960 you know, we were given certain material and not given other material.
00:05:34.200 That was not the case.
00:05:35.380 Okay, let me ask you about that, because obviously, in some sense, you were given certain material,
00:05:41.800 not other material, because you're not going to get access to absolutely everything Twitter has.
00:05:45.940 So there's always, there's an element of pre-selection, but your sense was that that wasn't forced or manipulated.
00:05:52.240 What did you have access to?
00:05:54.180 And then what did you not have access to, nor need access to?
00:05:58.380 You know, how did you, and how did you make the determination of what material was relevant?
00:06:04.180 And what form was that material in?
00:06:06.920 And how, well, and how much of it was there?
00:06:09.140 Right, these are all good questions.
00:06:10.940 So there were basically two different channels of information that we could access.
00:06:15.760 One was the sort of, what I would call maybe, I guess, the back end of Twitter,
00:06:20.140 where there was an engineer in the room with us on a special laptop,
00:06:25.440 and we would tell this person what specific accounts we wanted to look up
00:06:31.660 to see if there were any special flags or marks on these accounts or on specific tweets.
00:06:38.500 This person could look that up.
00:06:40.460 And I was basically looking over this person's shoulder as they were performing these searches.
00:06:46.480 We had absolutely no access to, no visibility to any personal information on people,
00:06:51.680 like their private account information.
00:06:53.720 And actually, they were, the people at Twitter were very concerned about this.
00:06:57.940 So there were, I think there were a lot of lawyers involved and other things to make sure
00:07:00.860 that the journalists had no ability to view anything that was private or personal on anyone's account.
00:07:06.820 What we were able to view was the log files within Twitter that showed
00:07:12.460 if a specific tweet or specific accounts had certain flags on them.
00:07:17.580 So that was one thing that we could look up.
00:07:20.700 The other thing was we could have them perform searches for us
00:07:25.960 in the internal Slack channels and in emails on specific employees at Twitter.
00:07:33.760 And we would send in a request.
00:07:36.020 This was not performed in the room.
00:07:38.320 It was somewhere off-site.
00:07:40.080 I think they could have been next door, for all I know, where they perform that.
00:07:44.380 And then some time later, they would then bring back a different person
00:07:48.540 and come in a room with a different special laptop where there would be,
00:07:51.800 it could be thousands of emails for a particular employee.
00:07:54.940 And then we had a very limited window of time where we had to search through it.
00:07:58.920 So one of the things that I think people may not be aware of is that,
00:08:02.740 you know, I had a very limited amount of time to sift through an extraordinary amount of information.
00:08:11.180 So it was very challenging.
00:08:12.620 People ask, why didn't you find this or that?
00:08:14.380 Well, you know, it's hard.
00:08:16.020 This was not something where we could just go, you know,
00:08:18.620 digging into the files for weeks and weeks on end.
00:08:21.340 I think Matt Taibbi and some others had far more access.
00:08:24.440 They were there much longer than I was and on many trips.
00:08:27.160 But for me, it was quite limited.
00:08:29.260 So when I arrived, I came there with a list of people and things that I already knew that I wanted to look for.
00:08:36.420 Basically, for me, as someone who's been writing about, thinking about, and researching matters related to COVID,
00:08:44.400 you know, since the very beginning of the pandemic,
00:08:46.740 I had observed many things that took place on Twitter during those years.
00:08:51.220 And I basically wanted to kind of reverse engineer,
00:08:54.240 well, how did that happen when I saw this tweet or that tweet was labeled as misleading?
00:08:59.480 How did that happen?
00:09:00.720 So I went to Twitter with a list of accounts and tweets that I knew had,
00:09:07.080 that were flagged, that I observed.
00:09:08.600 And I wanted to, I wanted to deconstruct, well, how did we get to that place?
00:09:13.060 And that's what I tried to do.
00:09:15.680 And who were you, who were you particularly interested in?
00:09:19.560 I mean, there were people like Jay Bhattacharya,
00:09:21.500 who were particularly nailed during the COVID, you know, for spreading COVID misinformation.
00:09:26.460 I mean, so you had a list a priori.
00:09:28.900 And I guess that's also why Barry thought you would be a good person to throw into the mix.
00:09:33.540 So you contacted her.
00:09:35.400 Now, you said you had been working in the background on COVID-related material.
00:09:39.480 So why don't you tell everybody what made you the person that Barry decided to put in San Francisco?
00:09:45.760 And then tell us who you were particularly interested in tracking down, you know,
00:09:50.320 because you said you had a list.
00:09:51.600 And so walk us through that.
00:09:53.740 Like, why you and then who you were particularly interested in investigating, let's say.
00:09:59.520 So I am considered, I think, by a lot of people, the first journalist in America to write for a major publication, very early,
00:10:09.860 I think it was the first week of May, to question the idea that schools should remain closed in America.
00:10:18.180 And that kind of launched me on the path that I've been on until today speaking with you right now.
00:10:23.460 That was when?
00:10:24.220 When did you do that?
00:10:25.180 This was, I think, the first week in May I published this piece.
00:10:28.560 So in April, I started—
00:10:30.740 May of, sorry, May of what?
00:10:31.920 Oh, I'm sorry, 2020.
00:10:33.480 So the very beginning of—
00:10:34.420 May of 2020.
00:10:35.260 Correct, yeah.
00:10:36.660 I, you know, like anyone else, there was no such thing as a COVID beat prior to the pandemic.
00:10:41.600 But I observed very early on that something seemed off to me.
00:10:47.580 Initially, I was very nervous.
00:10:49.960 I wasn't cavalier about what was happening with the pandemic.
00:10:54.780 But by the middle of April, we started observing—I live right outside New York City.
00:11:01.320 We started observing that cases began to drop precipitously.
00:11:05.540 And we also began getting information from Europe that schools were beginning to open at the end of April,
00:11:11.780 and they were projected to open in the beginning of May in many locations.
00:11:15.140 Coupled with that, there was a lot of data coming out of China, out of Italy, and elsewhere,
00:11:20.300 and it was unanimous that children were at extraordinarily low risk.
00:11:25.520 So all these factors coming together, and I'm like, well, wait a minute.
00:11:29.260 Why are the schools still closed?
00:11:30.840 I'm trying to understand this, particularly once they were opening elsewhere.
00:11:34.840 And so I come from a background as a fact-checker.
00:11:38.860 This was before fact-checking became kind of politicized the way it is now,
00:11:42.160 where there are these special fact-checking websites.
00:11:44.240 But I worked for Condé Nast magazines for a number of years.
00:11:48.160 And that sort of training, I think, and also just my own personal disposition,
00:11:53.020 I'm always skeptical about things.
00:11:55.540 That's just both a blessing and a curse.
00:11:57.680 And you kept hearing about the experts are saying this, the experts are saying that.
00:12:01.880 And my go-to, because when you fact-check an article,
00:12:05.600 you always have to go to the source, or at least that's the ideal thing.
00:12:08.600 You never just take someone's word for it.
00:12:11.820 Or you don't take just, oh, something's printed in the New York Times.
00:12:15.060 That's never sufficient, or at least it's not ideal.
00:12:17.560 Well, that's particularly good.
00:12:18.600 Well, I don't want to pick on the Times.
00:12:20.100 But any publication, you want to go as close as you can.
00:12:23.440 You're never done, in a way.
00:12:24.920 So even if someone says something, first you have to fact-check,
00:12:27.100 well, did that person actually say it?
00:12:28.700 But then even if that's true, you then want to say,
00:12:30.860 well, is what they are saying true?
00:12:32.580 And then you have to go layers deeper.
00:12:34.300 So anyway, that kind of mindset, that's always been how I view the world.
00:12:39.180 And I started observing these things, and something seemed off to me.
00:12:43.560 And to my amazement, no one seemed to be writing about this,
00:12:47.700 at least not in any of the major publications that I typically read.
00:12:51.520 And I couldn't understand what was going on.
00:12:53.160 I've written for The Atlantic, The New York Times, New York Magazine,
00:12:56.400 a lot of, I guess what are termed, sort of, legacy media outlets, for many years.
00:13:02.240 So I had contacts at these places.
00:13:04.340 I knew editors, and I started reaching out to people saying,
00:13:07.840 hey, why aren't you writing about this?
00:13:11.120 I've put together this compendium of research.
00:13:14.420 I mean, it was like a bullet list, you know, a mile long of all the data about children,
00:13:18.900 all this stuff, and no one seemed to be writing about this.
00:13:21.680 I couldn't figure out what was going on.
00:13:23.900 I was turned down by every publication, just about,
00:13:28.740 except for one editor at Wired, who said, you know what?
00:13:32.500 Everything you wrote here, you know, in my pitch to him, this all checks out.
00:13:36.220 Like, I try to make, like everyone, I make mistakes.
00:13:40.220 I'm sure in every article there's something wrong, but I try to be very meticulous.
00:13:44.380 And I try to make my case almost like a lawyer, airtight.
00:13:47.780 And when I presented—
00:13:48.380 Why do you think you were turned down so universally, apart from—
00:13:52.480 I mean, look, it's not that—
00:13:54.140 The default for a suggestion for an article is to be turned down.
00:13:59.460 So we should start with the fact that it's a high baseline probability.
00:14:02.840 But you seem to be indicating that in this particular case,
00:14:05.940 the baseline was a little bit higher than normal,
00:14:08.060 even though you had quite a compendium of facts and it was a germane topic.
00:14:13.260 It's a very good point.
00:14:14.640 You know, for an independent writer, the default is to be turned down.
00:14:18.020 However, this was really solid.
00:14:23.380 Like, you know, you have a sense as a writer, at least I do, after a while,
00:14:26.800 like, oh, I'm going to—I could sell this like that.
00:14:29.480 I had a month earlier, a few weeks earlier,
00:14:32.400 I had written a piece that I think was the number one read piece in the New York Times.
00:14:36.460 It was about this newlywed couple who were stranded in the Maldives.
00:14:39.540 I have a good sense of when something's real and when it can hit.
00:14:42.920 And I was kind of astonished that I was like, I found this special thing.
00:14:48.720 I found a lane for myself.
00:14:50.540 No one seems to be writing about this.
00:14:52.580 All this stuff is true.
00:14:54.420 This is my lane.
00:14:55.500 Like, I'm definitely going to nail this.
00:14:57.020 Because that's what, you know, it's exciting sometimes about journalism.
00:15:00.100 When you find this thing that's important, that's true, that's interesting,
00:15:04.260 and no one else is doing it.
00:15:06.380 And that's what happened.
00:15:07.700 And so we could talk about it later.
00:15:09.860 I could speculate now about why I was turned down.
00:15:12.300 But ultimately what happened was I was able to write this piece for Wired,
00:15:15.820 and that kind of set me on a path where now I've been known as this, quote, contrarian,
00:15:22.520 which, you know, I don't even know what that means, these labels.
00:15:25.640 I've just been following—
00:15:27.620 It means journalist.
00:15:28.740 Right, right.
00:15:29.520 To my mind, journalism is supposed to generally be a very adversarial type of relationship
00:15:34.740 between me and the powers that be, or me and what's being said,
00:15:38.800 rather than working simply as an amplifier or a megaphone.
00:15:43.280 So when we were told all of this information, these models that they were putting out,
00:15:47.740 Imperial College and IHME, these places, I'm like, well, none of these models seem to be checking out.
00:15:53.620 They seem to be wrong.
00:15:54.700 Well, what are the inputs in these models?
00:15:56.580 How are they putting these things together?
00:15:58.120 No one seemed to be asking these questions, or at least not what I was observing.
00:16:01.020 Why are my kids and 50 million other children in America locked out of their school buildings
00:16:07.720 when kids are starting to go back in Europe?
00:16:10.180 So all of these were, to me, very interesting and incredibly important questions
00:16:14.200 that I didn't seem to be getting adequate answers to anywhere else.
00:16:18.500 So I said, okay, I'm going to have to do this myself.
00:16:21.580 It was a very strange feeling because I was in the middle of writing another book,
00:16:25.720 which I'm still in the process of writing at the time,
00:16:28.840 but I found I was unable to concentrate on the book.
00:16:32.700 I mean, I eventually had my agent contact my editor,
00:16:35.980 and they were kind enough to say, okay, he can put this aside for a while.
00:16:39.600 I mean, this was a pandemic.
00:16:41.360 The schools were closed.
00:16:42.280 I could not concentrate.
00:16:43.780 So that set me on my path.
00:16:46.280 And after that point, I just wrote a stream of articles, beginning in Wired,
00:16:51.700 and then I migrated to other places like The Atlantic and New York Magazine,
00:16:55.620 where I think most of what I was writing challenged a lot of the sort of,
00:17:01.800 what would we call it, mainstream narrative, the establishment narrative from both the media
00:17:06.640 as well as the public health establishment in America about what was real or what wasn't real.
00:17:13.160 And so everything from school closures, which has been my focus,
00:17:18.720 but I looked into myocarditis.
00:17:20.900 I think I was the first person or one of the first people to interview the lead scientist in Israel
00:17:25.940 who put out the very first report.
00:17:28.040 I don't even know how I did it, but I got this guy on the phone.
00:17:30.700 I said, send me the report.
00:17:32.720 I have to see this.
00:17:34.220 So I wrote about that very early on.
00:17:37.080 And just all of these things, there are so many of these areas where things seemed a little bit off.
00:17:43.420 And I want to say, I don't want to ascribe ill will to anyone.
00:17:47.520 I don't think there's something in like a nefarious conspiratorial thing happening,
00:17:53.440 or at least that's not how I approach it.
00:17:55.260 I simply approach everything of what is the truth here?
00:18:00.400 What is this sort of like empirical underlying data to support whatever we are being told?
00:18:05.500 And I just keep digging and digging and digging to see if it seems true or not.
00:18:09.800 Going online without ExpressVPN is like not paying attention to the safety demonstration on a flight.
00:18:16.160 Most of the time, you'll probably be fine.
00:18:18.140 But what if one day that weird yellow mask drops down from overhead and you have no idea what to do?
00:18:23.820 In our hyper-connected world, your digital privacy isn't just a luxury.
00:18:27.620 It's a fundamental right.
00:18:28.940 Every time you connect to an unsecured network in a cafe, hotel, or airport,
00:18:33.200 you're essentially broadcasting your personal information to anyone with the technical know-how to intercept it.
00:18:38.140 And let's be clear, it doesn't take a genius hacker to do this.
00:18:41.460 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords,
00:18:46.600 bank logins, and credit card details.
00:18:48.840 Now, you might think, what's the big deal?
00:18:50.960 Who'd want my data anyway?
00:18:52.500 Well, on the dark web, your personal information could fetch up to $1,000.
00:18:56.920 That's right, there's a whole underground economy built on stolen identities.
00:19:01.180 Enter ExpressVPN.
00:19:02.920 It's like a digital fortress, creating an encrypted tunnel between your device and the internet.
00:19:07.200 Their encryption is so robust that it would take a hacker with a supercomputer over a billion years to crack it.
00:19:13.280 But don't let its power fool you.
00:19:15.080 ExpressVPN is incredibly user-friendly.
00:19:17.440 With just one click, you're protected across all your devices.
00:19:20.460 Phones, laptops, tablets, you name it.
00:19:22.640 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:19:26.780 It gives me peace of mind knowing that my research, communications, and personal data are shielded from prying eyes.
00:19:32.500 Secure your online data today by visiting expressvpn.com slash jordan.
00:19:37.500 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
00:19:43.880 Expressvpn.com slash jordan.
00:19:45.760 And so that's the sort of both long and short of how I got to Twitter, ultimately, as someone that Barry thought was a good person for this in particular.
00:19:59.260 How did you get, how did you get in, how did you establish a relationship with Barry?
00:20:03.880 Had you known her at the New York Times?
00:20:05.580 How did you guys get put together?
00:20:06.380 You know, I had written a piece for Barry's website for her publication.
00:20:13.100 I forget how much earlier, so we knew each other through that.
00:20:16.920 I forget how I got in touch with Barry initially, but ultimately I wrote a piece.
00:20:20.060 I think it was about the vaccines and children, and I had interviewed a member of one of the advisory committees for the CDC.
00:20:33.040 And it was a pretty remarkable interview.
00:20:36.000 So I had written that for her.
00:20:37.300 So I knew Barry and I knew some of her editors from that experience.
00:20:42.180 And again, I was just trying to be helpful.
00:20:44.260 And I, you know, reached out to them just saying, you know, because I think at that point, Matt Taibbi had written one or several Twitter files and Barry had done one or two.
00:20:54.860 But I don't think anyone had really written about COVID-related material.
00:20:57.840 I didn't know what the story was with the Twitter files.
00:20:59.960 All I was saying, hey, I know you, I've done a lot of reporting on COVID.
00:21:04.180 I know all these people who are scientists who had their tweets mislabeled and, you know, labeled as misleading in some manner, or they were suspended from Twitter.
00:21:15.500 I had all this information that the average journalist just simply wouldn't have just because I've been so deep in this world.
00:21:21.820 I have a Rolodex, you know, a mile long of infectious disease specialists and others who I've been talking to for years now.
00:21:28.000 So I just had all these people and this information.
00:21:30.920 And I was just trying to be helpful saying, here are some things that you might want to look for in case you're sending someone there, you're going back.
00:21:36.880 And then they said, David, just get on a plane and please do it yourself.
00:21:40.800 So you had established this Rolodex and you'd been tracking scientists.
00:21:46.960 And so when you went to San Francisco, there was a set of accounts and tweets that you were particularly interested in investigating.
00:21:55.320 Who were some of the people, the cardinal people on that list and why did you focus on them?
00:22:00.440 Right. So, again, I had observed over the prior, you know, couple years, a number of tweets or accounts having their information, their content suppressed in some way.
00:22:13.780 And content that I knew as a writer and someone who had done lots of research on this and spoken to experts, content that I knew was perfectly legitimate.
00:22:21.600 There are things that experts can or even should disagree on.
00:22:25.460 That's different from saying it's, quote, misinformation.
00:22:27.540 But so someone like Martin Kulldorff, who wrote the Great Barrington Declaration with Jay Bhattacharya and Sunetra Gupta, I knew that Martin had a particular tweet that I saw that was flagged as misleading.
00:22:42.000 He was talking about saying, you know, I don't remember the precise language, but it was, children, it's not necessary to require the vaccine of children at this point.
00:22:52.400 I don't see why that's the, you know, he was giving his opinion.
00:22:54.520 This guy is one of the most renowned infectious disease experts in the world.
00:22:59.740 Perfectly reasonable for him to give his view on this matter.
00:23:03.480 And that was flagged as misleading.
00:23:05.320 And I wanted to find out why.
00:23:07.080 How did that happen?
00:23:08.100 There were some other people, Andrew Bostom and some other physicians and others who had tweets.
00:23:14.960 I know Andy had tweeted something about there was a study that found, I think there was a low sperm count following one or two of the doses of the vaccine.
00:23:23.960 This is published in a peer-reviewed journal.
00:23:26.620 Now, what is the quality of that study?
00:23:29.460 I have no idea.
00:23:30.520 But the bar should be, if something is published in a legitimate peer-reviewed medical journal and you're citing it in a tweet, that's reasonable in my mind to not be suppressed in any manner.
00:23:43.120 So there were a handful of things like that.
00:23:45.100 And I wanted to work backwards and understand.
00:23:47.280 And basically, I found there were...
00:23:49.280 Why do you think this stood out for you?
00:23:52.500 You know, you said you were occupied with a book at the time.
00:23:56.360 I mean, you had other things to do.
00:23:58.040 But this issue of particularly the suppression of this information made itself obvious to you as a problem.
00:24:07.800 Do you have any sense of what it was about what you were doing that made that particularly relevant to you and why you decided to pursue it?
00:24:15.460 Because you said it gripped you in some way, right?
00:24:17.540 It even dislodged you out of your book.
00:24:19.540 Right.
00:24:20.480 And then ultimately, and now I'm writing another book on the closures, the American school closures during the pandemic.
00:24:28.080 And I guess specifically with the Twitter suppression that gripped me, perhaps that's sort of emblematic of the broader topic that gripped me, which is the information environment that we all live within.
00:24:41.900 And trying to understand why some ideas were considered okay and other ideas were considered not okay.
00:24:51.720 And we saw that play out in the sort of mainstream media or legacy press.
00:24:57.200 And I'm someone who had written for a number of these publications that I guess would be considered, you know, prestige publications or left-wing or whatever terminology people want to use.
00:25:06.320 And all of a sudden, these places, so I have no, this is non-political for me.
00:25:12.620 I had no specific political affiliation to want one thing or another to be true or to not be true.
00:25:19.200 And I think my prior experience as writing for these places shows that.
00:25:23.120 Like, I'm not coming from a political angle.
00:25:25.460 Yet, nevertheless, I found myself suddenly sort of outside this group that, roughly, I'd been within professionally and personally for basically most of my adult life.
00:25:38.240 And it was a source of endless fascination and consternation to me.
00:25:43.460 Like, how is this possible?
00:25:45.020 Why is this happening?
00:25:46.480 Why am I viewing this differently from so many of my neighbors and other people?
00:25:50.620 Yet, I know from speaking to scientists around the world, I'm not crazy.
00:25:54.680 I wasn't talking to some, you know, lunatic in their basement.
00:25:58.220 These are people at the most prestigious institutions in the world.
00:26:02.060 And they were in agreement with what I was saying.
00:26:04.160 Who are also being censored.
00:26:05.600 Okay, so let me dig into that a little bit.
00:26:07.840 Let me dig into that a little bit.
00:26:09.100 Well, let's give the devil his due.
00:26:15.540 We have, before the pandemic, a substantial amount of public trust in vaccines.
00:26:21.640 And a general consensus that vaccines were miraculous in many ways.
00:26:26.360 And that they were particularly useful for the protection of children.
00:26:29.700 And that we could trust the public health authorities to insist upon what was best for children.
00:26:36.180 And that that's what they were doing.
00:26:37.640 And that they were a good intermediary between the pharmaceutical companies and their financial interest and the health of children.
00:26:45.420 And we trusted the public health authorities.
00:26:47.680 So there's a lot of goodwill towards the vaccine enterprise as such.
00:26:53.140 And so that would be the baseline.
00:26:54.600 And so people would assume, and rightly so, that if we were being told by public health authorities to vaccinate children, that they believed that that was actually in the best interest of the children.
00:27:08.840 And that that was reliable.
00:27:10.440 And so you could imagine there would be resistance to any counter-narrative that would question that fundamental set of presumptions.
00:27:17.640 Now, there were people who were beating the anti-vax drum before the pandemic, but not very many and generally ignored.
00:27:28.080 The problem here, it seems to me, please correct me if I'm wrong or if this isn't in accord with what happened to you.
00:27:34.800 The problem I had very early on was that I didn't see there was any evidence at all that children were actually at risk for any particularly serious consequences in relationship to COVID.
00:27:45.500 You could make a case perhaps that they could get vaccinated like they might get vaccinated for a flu, but the morbidity, mortality risk for children was no higher than it was for the flu.
00:27:59.440 So the first question that begged was, well, should children be getting vaccinated at all?
00:28:05.300 And the second question it certainly begged was, well, is there any reason whatsoever to make such vaccines mandatory for children, especially because that's so much in the financial, it's so egregiously in the financial interest of pharmaceutical companies to have that enforced, or at least in their short-term interest.
00:28:25.860 So, you know, it's that terrible combination of the trust that the public had in the public health authorities and the financial gain that was sitting there ready for the pharmaceuticals to capitalize on, I think, that made this such a toxic, let's say, a toxic brew.
00:28:42.400 And also why the narrative emerged that you had to push back against.
00:28:47.380 That's how it looked to me.
00:28:48.400 What do you think about that as a set of hypotheses?
00:28:50.820 Right, so I think that the way I think about, you know, the vaccine policy, in particular for children, to me, that's all of a piece.
00:29:00.260 It dovetails with the policy regarding school closures and a variety of other factors.
00:29:05.680 They're all part of the same idea, which is a very kind of myopic focus on the suppression or attempted suppression of the transmission of a virus.
00:29:17.440 But we were led to believe that there was this conflation that suppressing a virus is not the same thing as human flourishing.
00:29:27.720 It can be, or societal flourishing.
00:29:29.960 And it's reasonable in the very early stages, when no one knew or few people knew what was happening, or at least there was some degree of uncertainty and chaos,
00:29:38.720 that people want to be particularly careful to try to avoid transmission, to try to figure out what's happening.
00:29:47.120 And I was that way myself, personally.
00:29:49.700 But I think very early on, we needed to also acknowledge that there would be profound harms and damages from the mitigation efforts that were put into place.
00:30:02.860 Setting aside whether these mitigation efforts would be successful, that's a whole separate issue.
00:30:06.900 But even if they were successful, what are the downsides of this?
00:30:10.980 And very early, I think those were both not acknowledged and recognized by many of the authorities, number one.
00:30:18.120 And number two, the wildly disproportionate burden that working class people were going to absorb from those measures was not acknowledged.
00:30:30.720 So we had a, what we had, so there's a biological parallel here and a set of observations on cognitive oversimplification that are relevant.
00:30:42.240 So the biological parallel, which I think is a very good one, is that in a disease process, there are two risks.
00:30:50.080 There's the risk of the disease.
00:30:52.220 And then there's the risk of the overreaction of the immune system.
00:30:56.220 And so the immune system can overreact and cause all sorts of diseases.
00:31:00.880 So autoimmune diseases are like that.
00:31:03.040 Arthritis is like that.
00:31:04.340 And excess inflammation is like that.
00:31:06.900 And you can get a cascade of immunological responses that are fatal when the disease itself would be unlikely to be fatal.
00:31:16.960 And so the threat of immune overreaction is a real one.
00:31:21.340 Now, there is a set of behaviors known as the extended immune system, the behavioral immune system.
00:31:32.720 And that's the manifestation of the biological defenses against infection that manifest themselves behaviorally.
00:31:41.380 And so a couple of those are, well, disgust is one of those, the emotion of disgust, the sense of contamination, the gag reflex, the repulsion that we feel for things that are disgusting.
00:31:54.740 And that's the way the immune system, in some sense, has reached up into the higher stratosphere of cognitive and behavioral proclivity to protect us at the macro scale against pathogens.
00:32:09.420 And that's extremely important because pathogen transmission is extremely dangerous.
00:32:13.380 You may know, you likely know, that when the Europeans came to the Western Hemisphere, 95% of the Native Americans died within about 150 years of contact.
00:32:28.720 And they died because they had no resistance whatsoever to mumps, measles, and smallpox.
00:32:34.120 And that resistance had been bred in European cities where we were in close quarters with animals.
00:32:39.540 And so pathogen transmission is extremely deadly, obviously.
00:32:45.140 And we've evolved all sorts of mechanisms.
00:32:47.200 Now, at a political level, the behavioral immune system also extends itself.
00:32:52.940 And it extended itself, and you might say in this situation, into the entire panoply of authoritarian pandemic responses.
00:32:59.740 And that was spearheaded by China.
00:33:01.980 And the danger there is, it's a parallel danger, is that the response will be more pathological than the pathogen.
00:33:08.240 And the way it was more pathological, as far as I could tell, was that we hyper-focused on the potential danger posed by the pathogen.
00:33:20.140 And we eliminated all consideration whatsoever for the potential side effects of all of the amelioration strategies.
00:33:26.720 So the politicians abdicated their responsibility to so-called experts.
00:33:32.380 And the public health experts, who were concerned with pathogen control, had no idea how to contemplate all the other risks,
00:33:40.160 like the risks to the education of children, the risks to the working class, the risks to the bloody supply chain, the risks to fundamental liberties.
00:33:48.180 Like politicians should have been calculating the balance of risks there, instead of focusing maniacally and monomaniacally on a single problem.
00:33:58.600 Well, and also defaulting their damn responsibility to so-called public health experts, who aren't politicians or economists, who don't have a broad purview.
00:34:07.020 And so we stepped into a social behavioral immune over-response.
00:34:13.580 And I think some of that was also driven by the financial machinations of the pharmaceutical companies themselves.
00:34:20.660 And, you know, I mean, they were trying to make vaccines, and hypothetically we needed the vaccines.
00:34:25.560 But God, it was so much in their financial interest to push this narrative.
00:34:29.940 And they're so effective at lobbying.
00:34:32.580 I mean, and all things considered, as the left once knew, if you had to rank order globalist companies in terms of public corruption,
00:34:42.880 you'd have to put the bloody pharmaceutical companies near to the top.
00:34:46.360 If you use no other measure than size of lawsuits in the past,
00:34:51.720 they've had the biggest, the most and the biggest lawsuits for malfeasance levied against them,
00:34:57.600 I think, of any corporate entities ever in the history of capitalism.
00:35:02.300 And so, well, so that's the perfect storm.
00:35:05.980 Starting a business can be tough, but thanks to Shopify, running your online storefront is easier than ever.
00:35:11.980 Shopify is the global commerce platform that helps you sell at every stage of your business.
00:35:16.240 From the launch your online shop stage, all the way to the did we just hit a million orders stage,
00:35:21.200 Shopify is here to help you grow.
00:35:23.020 Our marketing team uses Shopify every day to sell our merchandise,
00:35:26.160 and we love how easy it is to add more items, ship products, and track conversions.
00:35:31.340 With Shopify, customize your online store to your style with flexible templates and powerful tools,
00:35:36.560 alongside an endless list of integrations and third-party apps like on-demand printing, accounting, and chatbots.
00:35:42.760 Shopify helps you turn browsers into buyers with the internet's best converting checkout,
00:35:47.140 up to 36% better compared to other leading e-commerce platforms.
00:35:50.600 No matter how big you want to grow, Shopify gives you everything you need to take control
00:35:55.060 and take your business to the next level.
00:35:57.680 Sign up for a $1 per month trial period at shopify.com slash jbp, all lowercase.
00:36:03.580 Go to shopify.com slash jbp now to grow your business, no matter what stage you're in.
00:36:08.900 That's shopify.com slash jbp.
00:36:10.600 Well, I mean, you made a bunch of very good salient points, and, you know, I would say that, you know,
00:36:20.280 the overreaction of the human immune system, I think, is a decent metaphor
00:36:25.100 for the overreaction of society, of what happened.
00:36:29.220 And the, as we were saying, this focus on this one thing does not take into account
00:36:34.700 all of these ancillary things, these sort of second-order effects that are going to happen.
00:36:39.180 And while public health professionals, Anthony Fauci on down, may have an expertise in a
00:36:46.660 particular lane, they are not experts on the world.
00:36:50.460 And one of the things that I've always found so irritating and ridiculous is when there's
00:36:57.280 been certain epidemiologists or others who say, you know, stay in your lane to whether
00:37:03.260 it's a journalist, an economist.
00:37:04.820 There's someone named Emily Oster, who's an economist out of Brown, who did a lot of early
00:37:08.960 research related to schools and other matters, and people just immediately dismissed her.
00:37:15.160 Well, she's an economist.
00:37:16.360 What does she know?
00:37:17.120 And I'm thinking, that's who we need to be looking at some of these things.
00:37:21.060 You need economists.
00:37:22.580 We need psychologists.
00:37:24.140 We need people who—educators.
00:37:27.120 We need people in a whole range of fields of human endeavor to try to understand and discuss
00:37:34.080 what are going to be the first, second, and third-order effects of all of these interventions
00:37:40.380 we are imposing.
00:37:42.300 So—and I met this New York Times reporter who's done some science reporting at a party
00:37:48.200 a while back.
00:37:49.380 And I remember speaking with her about this.
00:37:51.460 And she immediately dismissed Emily Oster as, well, she's just an economist.
00:37:56.220 And it gave me this window into—I mean, I had already observed this anyway, you know,
00:38:02.440 just by reading the media and news outlets.
00:38:06.860 But to speak to someone who's actually reporting on this, okay, this confirmed what we already
00:38:12.040 could see, that there is this viewpoint that unless you had an infectious disease physician
00:38:17.980 or an epidemiologist, that your view was somehow not relevant.
00:38:22.140 But this made no sense.
00:38:23.280 Someone who understands disease spread does not have an expertise in childhood, you know,
00:38:29.200 nutrition necessarily, or in education, or in psychology, or in economics, because all
00:38:34.540 of these things, of course, are interconnected.
00:38:36.360 If you have someone who's been running a mom-and-pop business, and then the business gets closed,
00:38:41.200 they lose their insurance, they're depressed and lonely, they're barred from seeing their
00:38:46.000 friends, all of these things, well, guess what?
00:38:47.940 That also has an effect on someone's health.
00:38:49.740 Obviously, that's not as bad as dying from a virus, but not everyone was necessarily
00:38:54.740 at extreme risk of dying from the virus.
00:38:56.960 And we certainly knew this after a few months as time wore on.
00:39:00.920 And to me, you know, one of the biggest problems is that there was never any sort of sunset clause
00:39:06.240 on any of these things.
00:39:07.540 There's—one of the things we know from implementation science is it's very hard to de-implement.
00:39:13.760 So once the wheels are in motion, it is very hard.
00:39:18.300 You know, physicians continue to prescribe an antibiotic prophylactically, even though
00:39:23.120 there's lots of studies that show, you know, post-op it's not necessarily beneficial in
00:39:26.700 certain circumstances, but they'll continue to do it anyway because it's just this force
00:39:30.040 of habit, you know.
00:39:31.120 And then on a much broader scale, when you have politicians involved, then it's not just
00:39:34.900 a clinician, you know, like a doctor.
00:39:37.860 But you have a politician who puts some sort of policy in place, or you have a school superintendent.
00:39:43.720 It is very hard to unwind these things.
00:39:47.400 And there was no mechanism in place early on saying, we need to have some sort of review
00:39:53.300 of what's happening.
00:39:54.740 Instead, there was a bunch of people basically making up arbitrary benchmarks for different—oh,
00:40:01.580 when it reaches 5%, we can do this.
00:40:03.560 When it reaches 3%, you can do that.
00:40:05.280 But even just a cursory review of the literature on this showed that these were hardly grounded
00:40:12.640 in any sort of, like, scientific reasoning, a lot of these benchmarks.
00:40:17.780 And besides, every city and every school system was doing different things anyway.
00:40:23.240 When people are presented with too much information or too many options, let's say, you could even
00:40:29.920 say, too much untrammeled so-called freedom, it's really chaos, they get anxious.
00:40:36.520 Because anxiety is a signal of pathway complexity, too many things to choose between.
00:40:44.040 So there's a real drive to desire a simple and unidimensional solution.
00:40:49.740 And so if you don't want to be bothered thinking, then if someone offers you a reduction of the
00:40:57.900 problem to a single dimension, and then a virtuous pathway forward, it's extremely tempting
00:41:04.220 psychologically to seize that.
00:41:06.380 Because then you can—well, you could have said, well, I can ignore all this COVID nonsense
00:41:10.600 and go back to my book, for example.
00:41:12.660 And people can think, well, the experts have it, and I don't have to think about it.
00:41:16.200 Okay, the problem is, as you've pointed out, and this could get us into the conversation
00:41:20.600 about the church in California and the issue of liberties.
00:41:24.340 The problem is, is that there is an irreducible amount of complexity in the world.
00:41:31.240 And if you oversimplify, you pay a price somewhere else, somewhere invisible, but somewhere else.
00:41:38.520 And so then you might ask yourself, well, what guarantees do we have against that temptation
00:41:44.660 to oversimplify?
00:41:46.380 You know, because we could say, well, every time there's a new illness, we'll just lock
00:41:49.500 everybody up.
00:41:51.400 Well, everyone who has any sense knows that that's a bad idea.
00:41:56.580 The reason it's a bad idea is because locking everybody up violates our fundamental liberties,
00:42:04.600 our natural rights, let's say.
00:42:06.500 And then you might say, well, why the hell do we have those natural rights to begin with?
00:42:10.400 And the answer is something like, societies have evolved and computed that there's a certain
00:42:18.220 set of inviolable freedoms that actually constitute the best solution to irreducible complexity.
00:42:26.580 So the idea is something like this, all things considered, and that would be all, even all
00:42:33.340 the things you couldn't even consider because you don't have enough time, all things considered,
00:42:37.880 it's better to let people say what they need to say, right?
00:42:43.080 In terms of total balance of risks, there's no better solution.
00:42:47.360 And maybe that's because of information dissemination.
00:42:49.900 All things considered, it's better to leave people to assemble freely, right?
00:42:56.540 And to own their own property and husband it according to their dictates.
00:43:02.400 And the reason for that is because even in emergencies, those are the best policies.
00:43:08.080 And that's why those rights are supposed to be inviolable.
00:43:10.920 And what we did instead, especially because we copied totalitarian China so rapidly,
00:43:16.300 which is extraordinarily interesting to me from a psychological perspective,
00:43:19.880 is we set all those intrinsic rights aside.
00:43:22.940 And we said, no, we're going to collapse this multidimensional problem to one dimension.
00:43:26.980 We're going to call virtue one pathway forward, which is don't transmit the disease.
00:43:31.420 And everything else is going to go by the wayside.
00:43:34.080 Well, we're still paying the price for violating those fundamental rights.
00:43:38.560 We've destroyed public trust in the public health enterprise.
00:43:42.240 We've compromised the supply chain terribly.
00:43:45.100 God only knows how many people we've killed.
00:43:47.440 There's a tremendous decrement in vaccine uptake now around the world,
00:43:53.220 because people are very skeptical about vaccines.
00:43:55.680 And the probability that we'll kill more kids by not vaccinating them with vaccines that actually work
00:44:02.580 than we saved with the COVID vaccine, which was unnecessary for children, is extremely high.
00:44:08.460 We're paying the invisible price for violating all of those rights.
00:44:11.900 Now, we talked a little bit before we started our conversation about a particular case in California
00:44:18.200 that you wanted to concentrate on, a case of a church that has just been levied an immense fine
00:44:24.700 by the civil authorities that wanted to stay open during COVID.
00:44:27.860 And we could probably use that as an example of this totalitarian tendency to undermine natural rights
00:44:35.580 in the hypothetical service of social cohesion and safety and dig in from there.
00:44:41.900 So do you want to lay out that case?
00:44:44.080 Sure.
00:44:44.760 Yeah.
00:44:45.080 And the church case is interesting to me because it echoes, to my mind,
00:44:51.820 I had been writing about schools and children for so long that what I observed happen with this church
00:44:57.340 seemed very, very familiar to me instantly, that the same sort of dynamic was in place.
00:45:03.300 And, you know, just as a sort of like macro framing on this,
00:45:07.700 when you're talking about whether it's a vaccine or school closures,
00:45:10.060 in medicine, the default is to not intervene unless you can prove, you know,
00:45:17.700 the saying from the FDA is, you know, safe and effective.
00:45:21.060 So the default in America is for children to be in school and to be able to go to school.
00:45:26.140 That's the default.
00:45:27.600 So in order to prevent them from going to school, that's the intervention, is preventing them.
00:45:34.060 That's not the default.
00:45:35.560 But yet something flipped where keeping them barred from school became the default.
00:45:40.780 But this is not how medicine typically functions.
00:45:43.880 So that, I think, is also broadly how we could think about all sorts of other civil liberties.
00:45:50.680 I should say, I don't think there's a scenario,
00:45:53.680 I don't think it's just never appropriate for an authority or public health authority
00:46:00.180 to infringe on some of our liberties.
00:46:03.720 But the bar to reach that should be quite high, of course.
00:46:08.580 And there should also be some sort of limits on that.
00:46:11.400 So this stuff with kids in schools and whether you want to talk about vaccines
00:46:15.220 or even the information environment on social media and elsewhere,
00:46:18.420 all of these things sort of flipped what we philosophically and ethically
00:46:23.820 tend to think of as the default.
00:46:26.080 That the default then became the intervention.
00:46:29.140 But the default should never be the intervention
00:46:31.200 without strong evidence that the intervention is going to provide a net benefit
00:46:36.480 rather than a net harm.
00:46:38.160 Well, one of the hallmarks for that, you know, there's a claim among scientists.
00:46:42.740 I think Carl Sagan first said this, at least formalized it,
00:46:46.280 although it was known implicitly,
00:46:47.740 is that extraordinary claims require extraordinary evidence.
00:46:54.140 And an extraordinary claim is one that violates a fundamental precept.
00:46:59.500 And the most fundamental precepts in our culture
00:47:01.920 are embedded in the domain of natural rights, right?
00:47:05.240 They're self-evident even outside the constitutional framework.
00:47:08.360 There is nothing more fundamental to the functioning of our culture
00:47:12.140 than that set of natural rights.
00:47:13.680 And so if your solution to a problem is,
00:47:17.140 well, we have to violate one or more natural rights,
00:47:19.900 then it should be incumbent on you to demonstrate that you have a body of proof
00:47:25.260 that is sufficient to justify moving something that is fundamentally immobile.
00:47:32.180 And certainly that standard of proof wasn't met in this particular case
00:47:36.020 because we didn't even have good data on, well,
00:47:39.360 how, what the actual mortality rate was for the COVID infection,
00:47:43.960 especially when it was segregated by different age groups.
00:47:47.640 I mean, it was obvious, it was early,
00:47:49.240 it was obvious pretty damn early on that the death rate among children was vanishingly small.
00:47:54.620 Well, look, I mean, yeah, more children die drowning every year
00:48:00.500 than they did from COVID in any individual year in America.
00:48:03.580 For example, we don't bar, we don't bar swimming.
00:48:07.600 And we had data very early on that, you know,
00:48:10.920 and some people legitimately say, well, what about the teachers?
00:48:14.600 Fine, maybe the kids are okay.
00:48:15.920 We had data early on from studies out of Sweden,
00:48:18.800 a country that did not close the lower schools,
00:48:20.960 showing their teachers were at no higher risk than other professionals.
00:48:25.000 They were far below the risk that they found in,
00:48:27.520 I think, like pizza bakers and bus drivers or taxi drivers.
00:48:31.840 So there was no elevated risk for teachers they found.
00:48:35.080 And this is a real place, and it's not, you know, this wasn't one little town.
00:48:38.440 We're talking about more than a million children who were in school there,
00:48:42.500 tens of thousands of teachers, and this is what they found.
00:48:45.600 That evidence was dismissed and ignored.
00:48:48.340 So, and so the thing with the schools, then to me.
00:48:53.620 Yeah, well, you put your finger on something there, too,
00:48:55.920 that's very much relevant, which is, well.
00:48:59.240 When a woman experiences an unplanned pregnancy,
00:49:02.200 she often feels alone and afraid.
00:49:04.540 Too often, her first response is to seek out an abortion,
00:49:07.440 because that's what left-leaning institutions have conditioned her to do.
00:49:11.380 But because of the generosity of listeners like you,
00:49:14.080 that search may lead her to a pre-born network clinic,
00:49:16.760 where, by the grace of God, she'll choose life,
00:49:19.480 not just for her baby, but for herself.
00:49:22.280 Pre-born offers God's love and compassion to hurting women
00:49:24.980 and provides a free ultrasound to introduce them to the life growing inside them.
00:49:29.400 This combination helps women to choose life,
00:49:32.020 and it's how Pre-born saves 200 babies every single day.
00:49:35.780 Thanks to the Daily Wire's partnership with Pre-born,
00:49:38.100 we're able to make our powerful documentary,
00:49:40.440 Choosing Life, available to all on Daily Wire+.
00:49:43.420 Join us in thanking Pre-born for bringing this important work out from behind our paywall,
00:49:48.660 and consider making a donation today to support their life-saving work.
00:49:52.540 You can sponsor one ultrasound for just $28.
00:49:55.400 If you have the means, you can sponsor Pre-born's entire network for a day for $5,000.
00:50:00.620 Make a donation today.
00:50:01.960 Just dial pound 250 and say the keyword baby.
00:50:04.820 That's pound 250 baby.
00:50:06.820 Or go to preborn.com slash Jordan.
00:50:09.260 That's preborn.com slash Jordan.
00:50:11.820 How much risk justifies intervention?
00:50:18.360 And one good rule of thumb is,
00:50:20.300 well, you obviously don't intervene if the risk that's posed is no greater a risk
00:50:25.700 than risk that people will voluntarily undertake
00:50:29.100 in many activities that are necessary to their daily life that are already factored in.
00:50:34.040 And I would say paramount above those, among those, would be driving.
00:50:38.720 Because there really isn't anything we ever do that's more dangerous than driving.
00:50:42.740 I mean, it's not that dangerous per unit of travel, but it's very dangerous.
00:50:48.140 It kills lots of people.
00:50:49.300 And so you could say, well, if a given enterprise poses no more risk than driving,
00:51:00.300 then it can be factored in as an acceptable level of risk.
00:51:00.300 And that was certainly the case with the COVID deaths, certainly among children.
00:51:04.760 And as you said, the Swedish data was there extraordinarily early on.
00:51:08.620 And so then, of course, that raises the question again: given that,
00:51:12.480 why the hell the school closures and lockdowns?
00:51:15.640 Because that was a very peculiar response.
00:51:18.640 Well, and I think driving is a useful example,
00:51:23.040 because it's applicable to school closures.
00:51:24.720 It's applicable to this church story, which we can talk about in a moment,
00:51:28.220 which is that one of the arguments that people would make is,
00:51:32.280 people who oppose what I'm saying, my viewpoint, they would say,
00:51:35.900 well, this isn't just about your personal risk.
00:51:38.480 This is about you putting other—we have a societal—
00:51:41.920 you know, we have an obligation to other people.
00:51:44.040 And I don't disagree.
00:51:45.020 However, in society, we balance our own personal freedoms with risk to others,
00:51:51.820 and we traditionally have a certain high tolerance for risk we all impose upon each other.
00:51:58.400 And driving is a perfect example, because there's been a lot of argument about masks,
00:52:02.640 and I've written a few very large investigative pieces about the evidence on masks,
00:52:07.860 and specifically for mandating them in schools.
00:52:10.980 And one of the things people would say is,
00:52:12.520 well, it's not just about your own kid.
00:52:14.300 It's about some other kid.
00:52:15.420 But here's the thing.
00:52:16.600 If we think about, in America, most of the highways have a speed limit of 55 or 65.
00:52:21.740 We could make all the highways speed limit at 35 miles per hour.
00:52:27.040 And there would definitely be fewer accidents, fewer serious injuries and deaths, most likely,
00:52:32.460 if everyone was forced to drive slower.
00:52:34.020 We know when you're driving fast that there is a greater risk of serious injury or death.
00:52:39.520 But we choose, as a society, to allow a higher speed limit because we value getting places faster more than whatever that risk—that's just—
00:52:48.900 Well, and that's partly—well, we do that partly.
00:52:51.480 We should point out, too.
00:52:53.040 Part of the reason we do that is so that people don't die other ways, right?
00:52:58.820 Because if you're more efficient, well, you can make more—
00:53:01.440 Well, so—and this issue of risk is an interesting one in that regard because it's no—there is no doubt whatsoever that human beings are dangerous to one another as potential carriers of pathogens.
00:53:13.780 But there's also no doubt that we're extremely valuable to each other as sources of cooperative enterprise and sources of information.
00:53:21.840 And there's a huge battle, biologically, between the risk posed by interpersonal communication.
00:53:28.140 You can think about this sexually.
00:53:29.620 Like, there's no people without sex, but there's no shortage of sexually transmitted diseases.
00:53:34.200 It's exactly the same problem.
00:53:35.820 And so one way of getting rid of sexually transmitted diseases is to forbid sex.
00:53:41.320 And that's the end of that problem.
00:53:42.900 But, well, then there aren't any people.
00:53:54.400 And, you know, that actually turns out to be a worse outcome, a worse pathogen, so to speak, than the sexually transmitted diseases themselves.
00:53:54.400 And so we are always faced—you know, it's really interesting, eh?
00:53:58.280 Because to some degree, the evidence suggests that the difference in political type—it's got twisted up in COVID—is actually a difference between pathogen restriction and information freedom.
00:54:12.900 So, classically, before whatever's happened in the last five years, the more liberal types were freedom of information advocates, like, we should move around, we should speak freely, we should transmit information, and we should accept the risks.
00:54:28.600 And conservatives, and conservatives, even temperamentally speaking, were the ones who would say, well, you have to be careful when you're freely interacting because pathogens of various sorts, biological but also ideological, can be transmitted, and that's an eternal risk.
00:54:42.900 And the political landscape is actually a battle between walls and doors.
00:54:49.740 And the liberals say doors, and the conservatives say walls, and the truth of the matter is that walls and doors are both necessary because things have to be let in and kept out.
00:54:58.940 And there's no way of ever getting that right, so you have to argue about it forever.
00:55:03.140 But in the COVID overreaction, we decided that it was going to be all walls.
00:55:10.140 And so oddly, the liberals in particular gravitated in that direction.
00:55:15.980 And that is really kind of a—it's a miracle of paradox.
00:55:20.820 I don't understand it yet.
00:55:22.460 Typically, people in the professional classes, and certainly journalists, at least, you know, that's the ideal, challenge these sorts of power structures within society: corporations, you know, big business, government, the military, religion, all these institutions.
00:55:43.900 Yet during the pandemic, in my view, there was this astonishing lack of curiosity from journalists and the broader public within, you know, this certain elite sphere, the influencer class or, you know, professional-class people, and that has blown my mind.
00:56:04.480 Again, I keep coming back to this thing where I'm like, what is the empirical evidence for X, Y, or Z, and let me try to find it.
00:56:12.860 But there is this lack of interest, this lack of curiosity.
00:56:17.140 Oh, well, the experts told me this.
00:56:18.640 If you look at evidence-based medicine, they have this pyramid, this hierarchy of—which you're probably familiar with—the hierarchy of evidence.
00:56:26.620 And in evidence-based medicine, expert opinion is at the bottom.
00:56:31.120 That's like the last thing you want to look at.
00:56:33.520 It's not irrelevant.
00:56:34.780 It's something we should consider.
00:56:35.980 But it's far more important to have actual observational evidence and then higher from that.
00:56:41.980 There's randomized controlled trials.
00:56:43.980 There are these mechanisms that we can use through scientific method to actually get real evidence that we can look at and try to ascertain what's going on.
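To make the hierarchy David is describing concrete, here is a minimal sketch in Python. The tier names and ordering below are a common textbook rendering of the evidence-based-medicine pyramid, added here for illustration rather than quoted from anything said in the conversation.

```python
# Rough sketch of the evidence-based-medicine hierarchy described above:
# expert opinion at the bottom, systematic reviews of randomized controlled
# trials at the top. Tier names vary by textbook; this ordering is
# illustrative, not authoritative.

EVIDENCE_HIERARCHY = [  # weakest -> strongest
    "expert opinion",
    "case reports / case series",
    "observational studies (cohort, case-control)",
    "randomized controlled trials",
    "systematic reviews / meta-analyses of RCTs",
]

def strongest_evidence(sources):
    """Return the highest-ranking evidence type present among `sources`."""
    ranked = [s for s in sources if s in EVIDENCE_HIERARCHY]
    return max(ranked, key=EVIDENCE_HIERARCHY.index) if ranked else None

# A claim backed only by expert opinion ranks below one backed by an
# observational study, no matter how prominent the expert.
print(strongest_evidence(["expert opinion"]))
print(strongest_evidence(["expert opinion",
                          "observational studies (cohort, case-control)"]))
```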
00:56:52.180 But what I found in most of the reporting is that there was merely, Anthony Fauci says X, or, the experts say, or they'll bring in an expert.
00:57:03.040 And oftentimes, the expert wasn't even an expert on this.
00:57:06.460 There's, you know, an emergency room physician who's quoted constantly in The New York Times and other news outlets who had no expertise necessarily related to infectious diseases.
00:57:17.220 Yet this person was repeatedly giving her opinion on things.
00:57:20.380 It went unchallenged.
00:57:22.200 It was just printed in, you know, in the newspaper or elsewhere.
00:57:25.540 This was fascinating to me.
00:57:27.160 How could this be happening?
00:57:28.440 And these are—you know, I'm just an independent journalist.
00:57:30.280 These are places with a machine behind them, enormous, you know, editorial staffs.
00:57:34.880 They have the resources to send people places.
00:57:36.920 You can get any expert you want if you write for some of these prestigious media outlets.
00:57:42.300 Yet they went to the same crew of people over and over, and they—not only did they go to the same people over and over, but it was the same lack of challenging these quotes.
00:57:52.700 This idea that an expert's view in and of itself—there's a reason why people get a second opinion when you go to the doctor.
00:57:59.360 Experts disagree on things oftentimes.
00:58:01.320 You know, it's so interesting that, in many ways, the same people who were so vehemently opposed to Trump's plan, let's say, to build a wall between the United States and Mexico, just a few years earlier, were absolutely 100% gung-ho to build walls absolutely everywhere throughout society in this particular instance.
00:58:22.600 You know, it's such a perverse flip.
00:58:24.200 Now, before we go on to the church issue, I have one remaining question from the Twitter files.
00:58:29.960 Oh, yeah, we got—
00:58:30.980 It's hanging.
00:58:31.740 Yeah, that's okay, but we covered most of it.
00:58:33.800 But you went to San Francisco armed with a dossier, let's say, of names and Twitter accounts that you wanted to look into.
00:58:42.400 But you also, while you were there, you looked into the identities of the people who were actually doing the censoring, right?
00:58:49.700 So there were Twitter accounts, but then there were the Twitter response and the people responsible for that.
00:58:54.560 And so there were people like Yoel Roth, whose name came up repeatedly in the Taibbi investigations.
00:59:02.040 And so let's just tie that off, and then we'll move into the California situation.
00:59:07.640 So what was your sense of the sophistication that characterized, and the forethought that went into, the censorial activity at Twitter?
00:59:22.640 Who was doing that?
00:59:23.660 How was it orchestrated?
00:59:25.560 Was there in any—was it professional and warranted in any manner?
00:59:30.020 What was your sense of that when you looked into it?
00:59:32.360 Right. So there were sort of three different avenues, as I frame it, about how the suppression and censorship took place.
00:59:43.580 And one of them was that they set up this—and I'll get to the people, but I'm going to work backwards to get there.
00:59:49.220 One of them was that there was a system of bots set up that, you know, where they essentially crawl through the system.
00:59:56.680 And the bots were given certain—they were trained.
00:59:59.700 They were given certain information, whether it's keywords or other things to look for.
01:00:03.120 And a bot would flag a certain tweet or a certain account based on it setting off certain triggers within its, you know, whatever they're teaching the bots.
01:00:13.940 They're also—so that's one avenue of how certain tweets were flagged.
01:00:17.720 Another one is they had independent contractors, oftentimes in places like the Philippines or elsewhere, where you have someone essentially sitting in a cube farm, some person who—they were given a decision tree.
01:00:31.460 And I show this in my reporting.
01:00:32.720 It's quite interesting that, you know, they—okay, this tweet involves myocarditis.
01:00:38.500 You check one thing, then a drop-down menu.
01:00:40.480 It says this, you know, and there's five different options.
01:00:42.700 Then you check that.
01:00:43.220 And there's this whole decision tree on how some random person—and the notion that some—
01:00:48.740 And those systems never work.
01:00:50.620 Expert systems like that never work.
01:00:52.920 Right.
01:00:54.100 Some guy sitting in a cube in the Philippines is going to adjudicate the validity of a tweet about myocarditis and whether that squares with, you know, what the results say about, you know, late gadolinium enhancement.
01:01:06.760 I mean, there's no way that they're going to be able to adjudicate this.
01:01:10.900 So you have that happening.
01:01:12.140 And then the third thing is you have people themselves at Twitter.
01:01:15.820 But all of it comes—but the other two areas, the independent contractor using a decision tree and the bots, those, of course, stem from the people.
01:01:25.740 All those things initiate with human beings making choices about what we value or what we think is or is not acceptable on our platform.
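To picture the two automated and semi-automated paths David describes above (keyword-triggered bots, and contractors clicking through a fixed decision tree), here is a minimal hypothetical sketch. The keywords, labels, and branches are invented for illustration; they are not Twitter's actual rules or tooling.

```python
# Hypothetical sketch of the moderation paths described above: (1) a bot
# that flags tweets on keywords, and (2) a fixed decision tree a contractor
# clicks through. All keywords, labels, and branches are invented.

FLAG_KEYWORDS = {"myocarditis", "vaccine", "lockdown"}  # assumed examples

def bot_flag(tweet_text: str) -> bool:
    """Crude keyword trigger: flag the tweet if any watched term appears."""
    words = {w.strip(".,!?").lower() for w in tweet_text.split()}
    return bool(words & FLAG_KEYWORDS)

# A decision tree as nested dicts: each node asks a question, each answer
# leads to another node or to a terminal action.
DECISION_TREE = {
    "question": "Does the tweet mention a vaccine side effect?",
    "yes": {
        "question": "Does it cite an official source (e.g. CDC data)?",
        "yes": {"action": "leave up"},
        "no": {"action": "label as misleading"},
    },
    "no": {"action": "leave up"},
}

def walk_tree(node, answers):
    """Follow the contractor's yes/no answers down to a terminal action."""
    while "action" not in node:
        node = node[answers.pop(0)]
    return node["action"]

tweet = "New study on myocarditis rates after vaccination"
if bot_flag(tweet):
    # A contractor then walks the decision tree with their answers.
    print(walk_tree(DECISION_TREE, ["yes", "no"]))  # -> "label as misleading"
```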
01:01:34.900 Now, I think most people would say they don't want to be on a platform that's just overrun, you know, with pornography and violence and crazy stuff.
01:01:43.880 I think it's not—and people may disagree with me.
01:01:46.460 I think it's not unreasonable for a platform like Twitter or others to put very hard limits on the type of content that's going to be on the platform.
01:01:54.280 Because there's a reason why, you know, not everyone goes on 4chan or whatever.
01:01:57.900 Because there are limits to what most people want to be exposed to.
01:02:01.960 Let me make a technical comment about that because there's been a fair bit of psychological research done on this.
01:02:08.420 Well, so psychologists have started to look into, like, you could call it troll behavior.
01:02:16.100 I called it troll demon behavior because the people who are on social media platforms aren't exactly humans.
01:02:24.540 They're human-machine hybrids, right?
01:02:26.920 As soon as we're interacting with a huge social network, we have a reach that far extends our biological reach.
01:02:33.680 So we're machine-human hybrids on social media.
01:02:37.020 And those machine-human hybrids are very bizarre creatures.
01:02:41.140 We don't know what to make of them.
01:02:42.520 But we do know something about the human beings behind the more troublemaking posters.
01:02:50.940 So psychologists have identified a constellation of traits.
01:02:55.720 Manipulativeness, Machiavellianism.
01:02:59.420 Narcissism.
01:03:01.280 That's their sort of desire for attention without merit.
01:03:05.040 Psychopathy.
01:03:06.200 That's predatory parasitism.
01:03:08.120 And sadism, which is added relatively recently, which is positive delight in the unnecessary suffering of others.
01:03:16.860 And those four characteristics make a set, you could say, called the dark tetrad.
01:03:22.680 And it would be associated with antisocial behavior, criminal behavior, exploitation of others, including on the sexual front, not least on the sexual front.
01:03:33.980 Now, people who are characterized by that constellation make up about 3% of the population stably across cultures and time.
01:03:45.360 And that's because there's a niche for predatory parasites, like a permanent biological niche.
01:03:52.720 Now, the problem is, so they're 3%.
01:03:55.760 It's not everyone.
01:03:57.720 It's a tiny minority.
01:03:58.980 And they don't tend to be very successful, although they're not entirely unsuccessful.
01:04:05.020 Right?
01:04:05.220 So, they propagate.
01:04:08.000 The problem is that any social enterprise of any sort can be and often will be destabilized by the dark tetrad types, despite their minority status.
01:04:19.960 And so, then when we set up a communicative system like Twitter, which is basically a cooperative system, the parasites, the predatory parasites, can invade it and demolish it.
01:04:30.040 And they can do the same thing to whole societies.
01:04:33.540 You know, the number of people who organized the Russian Revolution after the Tsarist period was infinitesimally small.
01:04:42.720 A tiny minority of people can cause a tremendous amount of trouble.
01:04:46.180 I talked to Andy Ngo, for example, about Antifa, you know, which is not exactly an organization.
01:04:52.620 It's more like a loose, quasi-terrorist cell phenomenon.
01:04:58.920 There's no centralized bureaucracy.
01:05:01.200 There's no full-time employees.
01:05:02.620 And so, it's easy even to dismiss its existence.
01:05:06.940 And I had talked to a lot of Democrats who had done exactly that.
01:05:09.760 So, I asked Andy at one point how many Antifa organizations he thought were extant in the United States.
01:05:18.800 And he said about 40.
01:05:20.620 And I said, well, how many full-time equivalent employees, so to speak, do they have?
01:05:25.460 And he thought, well, maybe 20 each.
01:05:27.680 It's 800 people.
01:05:29.200 It's one in 400,000.
01:05:32.560 That's all.
01:05:33.420 So, in a city of a million people, you're going to have like two people like that.
01:05:37.260 But then, so that's like no people, right?
01:05:39.440 It's like, well, they don't exist.
01:05:40.680 Two in a million.
01:05:41.440 Who cares?
01:05:42.560 Ah, there's the rub.
01:05:44.000 Two people in a million who are hell-bent on causing nothing but trouble, partly because they like trouble, partly because they like hurting people.
01:05:53.780 They can cause an awful lot of trouble.
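Jordan's back-of-the-envelope figures are easy to check. In the sketch below, the 40-organization and 20-member estimates are the ones quoted in the conversation, and the roughly 330 million US population figure is an assumption added for the calculation.

```python
# Back-of-envelope check of the figures quoted above. Organization and
# membership counts are the estimates from the conversation; the US
# population figure is an assumed round number.

organizations = 40
members_per_org = 20
us_population = 330_000_000  # assumption

total = organizations * members_per_org       # 800 people
rate = total / us_population                  # roughly 1 in 412,000
per_million_city = rate * 1_000_000           # roughly 2.4 people

print(total, round(1 / rate), round(per_million_city, 1))  # 800 412500 2.4
```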
01:05:55.640 And societies forever have wrestled with the problem of the free riders or the predatory parasites.
01:06:02.260 And then that brings up the terrible spectrum of censorship.
01:06:05.880 Like you said, well, nobody wants to go on Twitter if it's completely overrun by child porn distributing hyper-violent predators.
01:06:18.460 And fair enough.
01:06:19.440 So that has to be controlled.
01:06:20.800 But then the line of control becomes extraordinarily difficult to establish, right?
01:06:26.100 And that's a universal human problem of regulation of social environments.
01:06:30.400 It's not just a problem that's emerged in social media.
01:06:32.740 We don't know how to regulate that in social media.
01:06:35.980 So, you know, in normal life, if you meet someone like that, there are control mechanisms.
01:06:41.280 They tend to get beat up, for example.
01:06:43.540 They tend to get suppressed physically.
01:06:45.780 But online, zero.
01:06:48.800 There's zero.
01:06:51.620 There's almost zero ways of controlling.
01:06:54.700 And so, yeah.
01:06:56.200 Anyways, that's a little bit of a detour, but you get the point.
01:06:59.160 Well, I do.
01:07:01.020 And the thing is, people have asked me, I don't have the answer about where the line should
01:07:07.860 be drawn.
01:07:08.780 I don't think there's no line.
01:07:11.000 I, you know, as we were discussing, I think there should be some parameters.
01:07:15.360 What does—my job, as I saw it as a journalist going to Twitter was, I simply want to expose
01:07:24.280 to the public what happened.
01:07:26.900 There can be a broader conversation about where lines should or shouldn't be drawn, but my—that's
01:07:33.600 not my job.
01:07:34.520 My job is sunlight.
01:07:36.440 Let me just explain to people what happened.
01:07:39.200 They may not be aware.
01:07:40.200 And so—and the people—you mentioned Yoel Roth and others.
01:07:44.520 I have to say, reading through lots of these internal Slack channel communications and emails,
01:07:50.980 a lot of the people at Twitter really were trying their best to do what they felt was
01:07:58.320 right.
01:07:58.660 These were not, by and large, people who were, you know, seeking harm on others.
01:08:03.260 Um, sometimes they went overboard, um, for sure.
01:08:07.420 And I think there was—there seemed to be—I can't prove it because it's impossible to
01:08:11.720 do a fully systematic review of everything on Twitter, but there certainly was, to my
01:08:18.900 mind, an overly heavy hand on suppressing important information.
01:08:24.180 And that heavy hand always seemed to go in one direction.
01:08:27.500 You could say nothing was too extreme as far as locking down.
01:08:31.440 That's fine.
01:08:32.100 But if you have a prominent scientist from Harvard who says, you know, maybe we shouldn't
01:08:39.020 require the vaccine on kids, that was, you know, unacceptable.
01:08:42.540 There was a person who's, um, she's just a regular citizen who had quoted some, um, statistics
01:08:49.360 from the CDC that were unfavorable, you know, as far as an inconvenient narrative.
01:08:54.040 That tweet got labeled as misleading.
01:08:56.080 And this was data from the CDC itself.
01:08:58.320 So it was very clear that—now, some people could argue—
01:09:01.520 Do you think—do you think—here's a—one of the things I noticed in Toronto, which really
01:09:06.300 scared me, was that when the—and I should have known that this was the case because of
01:09:12.240 other things I knew, but it still unsettled me.
01:09:15.580 One of the terrible things that happened was that during the lockdown, there was an opportunity
01:09:24.300 for people to inform on their neighbors.
01:09:27.300 And I certainly saw people in my neighborhood take extraordinarily positive delight in that.
01:09:32.980 You know, and you remember in places like East Germany under the Soviets, a third of the people
01:09:37.200 were government informers.
01:09:38.700 And I think there was a pull—I don't exactly understand it—but there was a pull to that
01:09:44.640 desire for authoritarian control, especially if you could wield it yourself, that added a
01:09:50.040 degree of attractiveness.
01:09:51.860 Because here's your mystery, as you said: like, oh, there were lots of people at Twitter.
01:09:54.980 They were trying to do their best.
01:09:55.980 But when push came to shove, all the control seemed to be ceded to the players who wanted
01:10:04.700 control.
01:10:05.880 Like, they couldn't go too far, but everyone else could go too far.
01:10:09.460 And so that begs the question.
01:10:10.800 It's like, even though those people were trying hard to do the right thing, when they erred,
01:10:15.820 they erred on the side of the authoritarians, so to speak.
01:10:19.540 And so that's a real mystery.
01:10:21.180 — That's the fascinating point to me, is that they erred on the side of the government
01:10:27.560 and erred on the side of the public health establishment.
01:10:31.040 And there are plenty of people who would say, well, that's reasonable in a time of chaos
01:10:35.100 and, you know, lots of information.
01:10:37.340 But, I mean, if you just take a little bit of a step back, not even a full step, but just
01:10:42.080 a half step back again, these are the things that liberals traditionally are incredibly—they
01:10:48.680 have their antennae up.
01:10:49.640 They're very worried about.
01:10:51.180 Yet, all of a sudden, I think the head of New Zealand at one point, she had said, we
01:10:57.220 are the truth, or something to that effect, you know?
01:10:59.340 — Yeah, she did.
01:10:59.800 Absolutely.
01:11:00.200 — Right.
01:11:00.480 So, I mean, this is—these are astonishing statements.
01:11:04.180 We, of course, don't want, quote, misinformation to get out there with people.
01:11:08.860 And there are plenty of people with crazy QAnon, crazy stuff that Bill Gates is putting a
01:11:13.800 microchip in you with the—fine.
01:11:15.560 But I saw no evidence that that was overrun.
01:11:19.580 This sort of boogeyman, this idea of some—maybe it's part of the 3% you're talking about or
01:11:24.760 some other people who are prone to believing certain conspiratorial things or what have
01:11:29.900 you or things that are outlandish.
01:11:31.300 That did not drive all of this.
01:11:34.640 That was amplified by the media saying that this was the most dangerous thing.
01:11:38.660 But we have to have some degree of trust in people to be able to be given information.
01:11:45.880 And to me, it was incredibly dangerous when regular citizens, but especially fully
01:11:53.520 accredited scientists, people with medical degrees, nevertheless had what
01:11:59.640 they had to say suppressed in some manner, either specifically on a place like Twitter
01:12:03.580 or in a more indirect sense by not being given a voice in a lot of media outlets, where
01:12:10.640 it was always the kind of establishment people, whether within the government or those who
01:12:15.580 wanted to be in the government, many of them—what you could observe, Jordan, was there were
01:12:19.860 certain people who, on Twitter and elsewhere, very early on, were saying lots of things that
01:12:26.960 some people suspect they knew weren't quite true, but they were supportive of what the
01:12:31.100 administration liked.
01:12:33.140 And lo and behold, they ended up getting jobs in the administration later on.
01:12:37.980 And what's going to happen to them when they leave the administration?
01:12:41.200 Maybe they'll go on the board of Pfizer.
01:12:42.800 Who knows?
01:12:43.220 But there—so you could observe that these things going on where there's this establishment
01:12:49.480 of people saying something.
01:12:50.940 It doesn't mean the establishment's wrong automatically.
01:12:53.380 We should have enough confidence in the veracity of what some experts are telling us.
01:12:58.160 And they should have enough confidence that they are not intimidated by other experts and credentialed
01:13:03.500 people or regular people saying something different.
01:13:06.240 If the claim that a microchip is being inserted into me through, you know, a syringe is so
01:13:11.280 outlandish, they should not be frightened by it.
01:13:15.640 That, like, let people say crazy stuff.
01:13:17.720 We have to err on the side of letting people—
01:13:19.540 letting people—that's exactly right.
01:13:21.180 Because every once in a while, some of that crazy shit is true.
01:13:25.020 Yeah, that's the problem, man.
01:13:27.260 And some of the things—
01:13:27.800 And that's the thing.
01:13:29.040 And you have to—to my mind, you have to—there should be a line drawn somewhere, but you
01:13:33.920 have to err on the side of more leniency, more latitude for voices, not fewer voices.
01:13:42.320 Well, especially if you're liberal.
01:13:44.280 Especially if you're liberal.
01:13:46.220 Right.
01:13:46.400 And that's what happened on a platform like Twitter, that they weren't trying to, like,
01:13:50.740 crush everyone.
01:13:51.720 These were people who, for a whole number of complex reasons, said, we're going to go
01:13:56.840 with whatever the CDC says, that's the truth.
01:14:00.300 And if someone says something against what the CDC says, then that might be labeled as
01:14:05.660 misleading.
01:14:06.320 In fact, we might even—we might even suspend that account.
01:14:09.780 But again, getting—even though Twitter is a global platform, it gets back to that American-centric
01:14:14.680 idea.
01:14:15.420 There are many things, from pediatric vaccine policies, to school closures, to a whole host
01:14:20.380 of other things that other countries were doing very differently from the United States.
01:14:24.980 But yet, is that misinformation?
01:14:27.360 If the—is the head of some, you know, health department in some country in Western Europe,
01:14:33.980 are they automatically misinforming people?
01:14:37.020 Well, I didn't say Sweden, but yes, Sweden—I mean, people don't realize Iceland as well.
01:14:41.780 Those are different cultures.
01:14:43.300 I get it.
01:14:43.940 It doesn't mean everything's exactly the same.
01:14:45.940 In my book that I'm doing, I did an intensive analysis.
01:14:49.660 I worked with an epidemiologist on this, where we looked at it, when you go city by city, you
01:14:54.580 look at all of it, and the idea that this place was so foreign and so different is such an absurdity.
01:15:00.760 Again, because this was done without real evidence.
01:15:03.540 All of this was based on assumptions.
01:15:05.540 Assumptions built upon assumptions built upon assumptions.
01:15:08.260 That's how modeling works.
01:15:10.380 And there was this sort of epistemological confusion where we were
01:15:16.580 valuing assumptions over empirical evidence.
01:15:19.860 Well, see, the thing about models is that it's easy for
01:15:25.420 laypeople, and for scientists too, because most scientists really aren't scientists, you know,
01:15:32.560 to confuse a model with data.
01:15:37.340 Like, a model is a hypothesis.
01:15:39.320 And your point with models is extraordinarily well taken.
01:15:42.120 It's not only are models hypotheses.
01:15:44.340 They're multi-layered hypotheses.
01:15:46.580 And all of the layers—
01:15:48.420 Journalists often call them a study.
01:15:50.180 And I'm like, that's not a study.
01:15:51.640 But they'll all the time conflate the terms.
01:15:53.920 And the average reader has no clue what the difference is between this hierarchy of evidence.
01:15:58.420 If the model is generated by, you know, an extremely powerful computer, it's
01:16:03.560 even easier to succumb to the temptation of believing that the model is actually data.
01:16:09.280 It's like, no, the model is a hypothesis, a multi-layer hypothesis with many assumptions.
01:16:14.540 And that is a hard thing for people to grasp methodologically.
01:16:18.660 And so, you see that also with the climate models, the same problem obtains.
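The "assumptions built upon assumptions" point can be made concrete with a toy projection. Every number in the sketch below is invented; it is not any real epidemiological model. It just shows how a few stacked assumptions, each only modestly different, multiply into wildly different outputs.

```python
# Toy illustration of how stacked assumptions compound in a projection.
# Every number here is invented; this is not any real model.

def project_cases(initial_cases, reproduction_number, generations, detection_rate):
    """Project detected cases after `generations` rounds of spread."""
    true_cases = initial_cases * reproduction_number ** generations
    return true_cases * detection_rate

# Two sets of assumptions, each individually plausible-looking:
low = project_cases(initial_cases=100, reproduction_number=1.1,
                    generations=10, detection_rate=0.5)
high = project_cases(initial_cases=100, reproduction_number=1.5,
                     generations=10, detection_rate=0.8)

print(round(low), round(high))  # ~130 vs ~4,613, a ~35x spread
```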
01:16:23.660 I view that as almost this sort of techno-solutionism, as they call it, this
01:16:28.980 idea that, oh, well, the more high-tech our solution, the more accurate it must be.
01:16:34.160 But that's not the case.
01:16:35.260 That's a fallacy to view everything in that manner.
01:16:37.980 So, but the technology could be a supercomputer, or it could be, you know, using the word technology
01:16:44.640 a little more broadly, it could be this idea of these interventions.
01:16:48.840 A mask is a technology.
01:16:50.260 So the idea of doing something comes, I think, from a good place
01:16:57.040 in a lot of people, particularly people in public health or the medical field. There's
01:17:00.800 a whole compendium of literature about this, about the urge, the instinct
01:17:07.320 to do something.
01:17:08.680 But, you know, in emergency rooms, they have an expression that says, don't just do something,
01:17:12.980 stand there.
01:17:13.480 And sometimes it's wiser to not always intervene, but there's just this instinct
01:17:23.320 that's so powerful.
01:17:24.480 Yeah, well, it's dangerous to worship that technocratic solution.
01:17:28.080 You know, that old story of the Tower of Babel is exactly that, right?
01:17:31.780 It's a cardinal threat to erect a technological edifice and then assume that that can reach to
01:17:39.200 heaven.
01:17:39.800 That's a big mistake.
01:17:41.280 Everyone ends up unable to talk to each other.
01:17:45.380 Most interventions don't work.
01:17:47.520 That's the story of most medications.
01:17:49.260 Most things are not beneficial, or if they work, they're not a net gain.
01:17:53.260 It's quite—it is quite hard to get something approved, or it should be, typically, because
01:17:57.460 most things don't meet that bar.
01:17:59.360 Yet no one held the evidentiary basis up to that standard
01:18:05.280 for a lot of these interventions that we did.
01:18:07.840 And as I mentioned before, and then—I don't know if we have time for the church thing,
01:18:11.500 but it all wraps together that—
01:18:14.020 Well, let's try that.
01:18:14.860 Let's go there now.
01:18:15.740 Let's go to the church example, because it concretizes it.
01:18:18.880 That's right.
01:18:19.680 The people in charge who are making these decisions, like everyone, see the world through their own
01:18:25.220 lens.
01:18:26.120 They are not—no one is this omniscient, you know, being who understands the kaleidoscope
01:18:31.980 of society.
01:18:32.800 So I understand that they are individuals, but they have their own biases, as we all
01:18:37.480 do.
01:18:38.260 And so what happened in California, again, echoing what happened with kids with schools,
01:18:42.800 was churches in this particular county—this is in Santa Clara County, that's the heart
01:18:48.480 of Silicon Valley—and these people were not allowed to attend church.
01:18:53.300 And I could understand, it seems to me, others may disagree, that it was perfectly reasonable in spring of
01:18:58.480 2020, we're not sure what's happening, information's still coming in, there's a bit of chaos going
01:19:03.440 on.
01:19:04.060 But very quickly after that, things began opening.
01:19:07.020 Remember, some significant portion of society was always out and about working, because guess
01:19:11.980 what?
01:19:12.340 People still needed food delivered to their door, the slaughterhouses were open, people
01:19:17.260 were still working, the cashiers at various places, et cetera.
01:19:21.140 So it's not that societies—no one pulled a switch where everything was shut.
01:19:25.180 Things were still happening.
01:19:26.280 And once that happens, what we know from implementation science is that people's ability
01:19:33.900 to comply with interventions wanes over time, particularly things that are very uncomfortable,
01:19:41.060 like wearing a mask or not being able to see friends and relatives.
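The implementation-science point, that compliance and therefore any realized benefit decays the longer a restriction runs, can be sketched with a toy curve. The initial compliance, decay rate, and nominal efficacy below are all invented for illustration and are not drawn from any study cited here.

```python
# Toy sketch: an intervention's realized benefit falls as compliance wanes.
# The 90% initial compliance, the weekly decay rate, and the 30% nominal
# efficacy are all invented for illustration.

import math

def realized_benefit(week, nominal_efficacy=0.30,
                     initial_compliance=0.90, decay_per_week=0.08):
    """Benefit actually achieved = nominal efficacy x current compliance."""
    compliance = initial_compliance * math.exp(-decay_per_week * week)
    return nominal_efficacy * compliance

for week in (0, 4, 12, 28):  # roughly spring 2020 through month seven
    print(week, round(realized_benefit(week), 3))
# 0 -> 0.27, 4 -> 0.196, 12 -> 0.103, 28 -> 0.029
```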
01:19:45.240 And I'm not religious at all.
01:19:47.260 I don't go to church.
01:19:48.500 But I recognized that these people, when I interviewed them, they needed that.
01:19:53.360 And the church wasn't closed for a couple weeks.
01:19:55.260 It wasn't even closed.
01:19:56.260 We're talking about for seven months, people were not allowed to gather indoors and do
01:20:02.560 this thing that they did.
01:20:04.080 And again, this is something that I don't do myself.
01:20:07.200 I have no personal skin in the game, but I recognize—I get emotional thinking about
01:20:11.360 it.
01:20:11.680 These are people who suffer from addiction.
01:20:13.940 These are people who were profoundly lonely.
01:20:16.760 You might have some 70-year-old person who lives alone, and they were told, you are not
01:20:20.760 allowed to get out of your apartment.
01:20:22.300 You're not allowed to see anybody.
01:20:24.400 And maybe this person went to church every week for the past 20 years.
01:20:28.920 That's gone.
01:20:29.560 That's over.
01:20:30.100 You're not allowed to go there and have this experience that's meaningful to you.
01:20:33.720 And there were people who wanted to commit suicide because of the profound sadness and loneliness.
01:20:40.800 Everyone's different.
01:20:41.660 Some people may tolerate the idea of being home and watching Netflix.
01:20:45.340 That may be totally fine for them.
01:20:47.140 And the laptop class who were able to continue to make a good living while they could work
01:20:51.300 from home on their computer, that may have been fine.
01:20:53.620 And for many kids, that may have been fine.
01:20:55.460 But for some people, it wasn't fine.
01:20:58.360 And these people—but yet at the same time, while the church—while these people were unable
01:21:03.420 to go there, guess what?
01:21:04.700 The malls were open.
01:21:06.720 Museums were open.
01:21:07.860 They were open at varying capacities, depending on the time.
01:21:10.620 But it was always more than the capacity they allowed at the church.
01:21:13.960 What a profound statement that was on the values of the people who were pulling the levers of
01:21:21.100 society.
01:21:22.240 And again, so it's not like they can't even pretend that this was all about, well, we
01:21:26.880 need to mitigate the spread of the virus.
01:21:28.860 Well, if you were really that concerned about it, then why were the casinos open?
01:21:32.320 Why was someone able to go to a liquor store where you're going to have people circulating
01:21:36.100 in and out all day long?
01:21:37.980 That's not a low-impact environment either.
01:21:40.220 So you have all these things where they were picking and choosing what things could be open
01:21:44.500 and what things couldn't be open.
01:21:46.260 And to me, I recognize that for some of these people, they needed this badly.
01:21:51.420 So much so that they were attempting to commit suicide, some of these people.
01:21:55.740 Others fell back into addiction.
01:21:57.960 And it's not a place that I would go.
01:21:59.700 But the idea to not have the empathy and the understanding that we are a vast society where
01:22:09.080 we all have different needs and we all have different interests and things we like.
01:22:13.080 And even though that's not my need or my interest, I recognize the profound importance that that
01:22:19.220 place and that experience has for some people.
01:22:22.140 And maybe some 70-year-old lady said, you know what?
01:22:24.640 I don't want to get COVID.
01:22:26.140 I don't want to die.
01:22:27.120 But I'm willing to take that risk of an additional exposure to go to church because this is what's
01:22:32.060 meaningful to me.
01:22:33.640 And I think it's a, you know, and I don't know where the line gets drawn.
01:22:37.860 But the idea that people could go to a museum but not go to church inside and meet with people,
01:22:43.020 to me, seems wrong.
01:22:45.460 And I think some, what I hope...
01:22:47.700 It's wrong.
01:22:48.340 It's got to be wrong.
01:22:49.380 Look, if...
01:22:50.220 Both epidemiologically and ethically, wrong.
01:22:52.980 Well, if people have the right to do anything, they should have the right to do what they
01:22:59.620 regard as most fundamental.
01:23:01.360 And, you know, you said you came at this problem from the perspective of an atheist.
01:23:05.860 But an atheist who's observing this phenomenon, who's thinking coherently, would think, well,
01:23:12.980 these people are attempting to
01:23:16.380 establish and maintain contact with what they believe to be most fundamental.
01:23:20.780 It might be their community.
01:23:22.000 It might be their beliefs.
01:23:23.040 And there has to be extraordinary, extraordinary evidence before that can be forgone.
01:23:30.140 And maybe it should never be forgone.
01:23:32.000 I think you could make that case quite strongly.
01:23:34.160 Because it's also the case that historically, the churches have been veritable epicenters of
01:23:40.680 positive response to epidemic realities, for example.
01:23:44.700 And so you close them down, not only at the peril of the people who attend the churches,
01:23:49.120 but you close them down at broader social peril.
01:23:52.480 And so it's interesting that that, even though, as you said, you don't practice religious practice
01:23:57.500 yourself, at least not churchgoing, that this is something that really deeply struck you.
01:24:03.460 Instantly, because I've been writing about kids and I try not to get into what they would
01:24:07.900 call, like, victim porn, but these stories that I've been told by educators and by parents
01:24:13.480 about children, when you have autistic kids who were denied treatment and care from professionals
01:24:20.360 that they counted on and their families didn't have any money or support system to know how
01:24:25.020 to handle them.
01:24:25.860 The child abuse claims went up, kids being beaten with a wire at home, and the reports come
01:24:31.860 in because they rely on teachers and going to school; that's a main mechanism for reporting,
01:24:37.800 where teachers spot this.
01:24:39.380 The schools serve an enormously important role in people's lives, far beyond merely the education
01:24:45.860 from books.
01:24:47.420 And similarly, while it's not a place I attend, church fulfills that role for many people as
01:24:53.840 well.
01:24:54.300 And the idea that these people were denied something that was critical for their well-being, because
01:25:00.160 that's health too, not wanting to kill yourself, that's health, that matters.
01:25:05.040 And they're also, without getting into the weeds, there was no evidence that this intervention
01:25:09.580 had any benefit whatsoever.
01:25:11.720 Again, two weeks, something like that, if we pull the lever, that may work.
01:25:16.040 But what we see is over time, people aren't complying with staying home.
01:25:19.400 We can see that with the Google travel data, when they can look at the data of people moving
01:25:23.780 around, people were out and about regardless.
01:25:26.640 Towns where schools were in a hybrid model, or where schools were closed,
01:25:30.340 they had no fewer cases than other places, or no more cases.
01:25:34.180 It's all over the map, the data.
01:25:36.560 The society is complicated.
01:25:38.800 And as you said quite a while ago, this idea of taking something incredibly vast and complex
01:25:45.160 and distilling it down into this simple thing, do this and you are a virtuous person.
01:25:50.640 And if you don't do that, you're a piece of garbage.
01:25:52.840 For me to even question any of this stuff, I was a lunatic.
01:25:55.760 All of a sudden, I'm some right-wing crazy person because I'm looking at data and pointing
01:26:00.700 something out and saying, wait a minute, kids are back to school in such and such country.
01:26:04.740 Why aren't they here?
01:26:05.900 I'm talking to people.
01:26:06.920 All of a sudden, I am the villain.
01:26:09.140 And the path—well, you're actually, technically, you're the pathogen.
01:26:13.140 I am the pathogen, right.
01:26:14.500 You bet.
01:26:14.880 You're the pathogen that the behavioral immune system is now responding to.
01:26:18.340 And that's extraordinarily dangerous.
01:26:20.480 Because you eliminate pathogens.
01:26:22.540 Well, you know, people often say, well, you vilify someone because you're afraid of them.
01:26:27.440 It's like, no, you don't.
01:26:28.980 You vilify someone because you identify them as a pathogen.
01:26:32.960 And then you're motivated not to avoid them, which is what you do if you're afraid,
01:26:36.700 but to eradicate them because that's what you do with pathogens.
01:26:39.840 You know, I read Hitler's Table Talk, for example, which was a collection of his spontaneous speeches
01:26:45.440 at dinner times, collected over four years.
01:26:48.420 And all the language he used to describe the Jews was pathogen language.
01:26:54.920 Pure blood of the Aryans, the parasitical nature of the Jewish interlopers, the disease-spreading
01:27:01.980 capacity of the ghettos.
01:27:03.600 It was 100% public health pathogen language.
01:27:07.300 And it's way more toxic than mere fear because people say, well, Hitler was afraid of the Jews.
01:27:12.240 It's like, no, he wasn't.
01:27:13.500 It was disgust, contempt, and pathogen language.
01:27:17.180 And that's way worse because you burn out sources of infection.
01:27:21.560 That's, there's no, right?
01:27:24.080 What, you don't offer sympathy to a pathogen.
01:27:29.180 Right.
01:27:29.480 And there's the level of disgust and control where, you know, it reached a point where the
01:27:36.620 county health department hired special officers who spied on church members.
01:27:41.500 They surveilled them.
01:27:42.900 I mean, when I wrote it, in my mind
01:27:44.380 it almost was like this dark comedy.
01:27:46.100 They were peering at them through a chain link fence on an adjacent property watching church
01:27:51.340 members.
01:27:51.620 Then, once they got it, there was an order from the court that eventually allowed the
01:27:58.120 officers to enter the church.
01:27:59.880 So they were monitoring these very intimate, personal events.
01:28:04.240 They went in, there was a thing called Manna for Moms.
01:28:06.480 I remember it was one of the events that the church did.
01:28:08.760 This is like a really personal thing that the moms were doing together.
01:28:12.520 Here come the health inspectors to watch us doing this.
01:28:15.180 And you read, I wrote this in my article on my Substack where you can read about it, you
01:28:20.080 know: there were eight women present in the room.
01:28:22.500 Two of them embraced.
01:28:23.780 One woman was not wearing a mask.
01:28:25.560 I mean, just this like list, it was crazy making.
01:28:28.380 And then, to me, the ultimate thing was they also monitored their cellular phone
01:28:33.580 data.
01:28:34.040 There was a very sophisticated analysis done by a Stanford professor that
01:28:39.480 they included, that I found in the court documents, where you could see they drew
01:28:43.620 a geofence around the church.
01:28:45.320 It's this like digital border.
01:28:46.940 And you could see, they could monitor how many people were entering and leaving the property.
01:28:53.160 And even on a granular level, different buildings within the property, they could see how long
01:28:58.000 people were within each spot.
01:28:59.720 This was happening in America, like in the present tense, monitoring people.
01:29:05.700 That is something that has gone so far beyond anything that I think any regular person would
01:29:12.120 think is reasonable.
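The geofence analysis David found in the court documents, a digital border drawn around the property with entries and dwell time inferred from phone location data, can be pictured with a minimal sketch. The coordinates and pings below are invented; this is the general idea, not the actual method used in the case.

```python
# Minimal sketch of geofence-style analysis: given timestamped location
# pings from a phone, count entries into a rectangular fence and total
# dwell time inside it. Coordinates and pings are invented.

from datetime import datetime, timedelta

# Hypothetical rectangular geofence: (min_lat, min_lon, max_lat, max_lon)
FENCE = (37.3000, -121.9020, 37.3010, -121.9000)

def inside(lat, lon, fence=FENCE):
    min_lat, min_lon, max_lat, max_lon = fence
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def summarize(pings):
    """pings: list of (datetime, lat, lon) tuples, sorted by time."""
    entries, dwell = 0, timedelta(0)
    was_inside, prev_time = False, None
    for t, lat, lon in pings:
        now_inside = inside(lat, lon)
        if now_inside and not was_inside:
            entries += 1                      # crossed into the fence
        if now_inside and was_inside:
            dwell += t - prev_time            # time spent inside
        was_inside, prev_time = now_inside, t
    return entries, dwell

pings = [
    (datetime(2020, 10, 4, 9, 55), 37.2995, -121.9030),   # outside
    (datetime(2020, 10, 4, 10, 0), 37.3005, -121.9010),   # enters
    (datetime(2020, 10, 4, 11, 0), 37.3006, -121.9011),   # still inside
    (datetime(2020, 10, 4, 11, 30), 37.2990, -121.9040),  # leaves
]
print(summarize(pings))  # one entry, one hour of dwell time inside
```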
01:29:12.920 And what I would hope is that we have some sort of legislative mechanisms, I guess, put
01:29:19.660 in place to try to put some sort of brakes on something like this happening.
01:29:24.760 Again, whether it's for a pandemic or some other, because there's always an emergency,
01:29:28.620 whether there's some other thing that's going to happen, there needs to be some process in
01:29:33.740 place that was completely absent during the pandemic, in my view, from preventing this type
01:29:39.560 of behavior.
01:29:40.120 Is this the same church that has been fined?
01:29:43.820 So it's the same church.
01:29:44.860 Millions of dollars.
01:29:46.080 Unreal.
01:29:46.300 That's right.
01:29:46.860 Unreal.
01:29:47.120 So I wrote about the whole case because I had access to the court files.
01:29:51.120 And it's all in there.
01:29:52.220 And I do a lot of screenshots from the things that are in, that are a part of the discovery
01:29:56.760 and part of the court documents.
01:29:58.120 But millions of dollars of fines is, I have all the details where it was something like
01:30:02.720 up to $5,000 a day and it just kept accruing.
01:30:05.640 I mean, this was like, this would make a loan shark blush if they saw the fines that were
01:30:10.560 levied against the church.
01:30:11.900 And look, the church was, you know, thumbing their nose at the authorities.
01:30:15.860 And boy, did they not like it.
01:30:17.120 And I understand, again, in the beginning, how people may have said, hey, we need to
01:30:21.880 keep everything closed.
01:30:22.940 But once it was a few weeks and then a month and then two months and then three months,
01:30:28.000 the months just kept rolling by.
01:30:31.060 While remember, you could go to a liquor store.
01:30:33.400 You could go to a museum.
01:30:34.980 You could go to the mall.
01:30:36.420 You could go to a casino.
01:30:38.020 All these other things.
01:30:38.340 You could go to a Black Lives Matter march.
01:30:41.280 That's right.
01:30:42.200 Exactly.
01:30:43.000 So they were picking and choosing which elements of people's lives were worthy of doing and
01:30:50.060 which ones weren't.
01:30:51.120 And a lot of these decisions, not only is that an ethical choice that they were making
01:30:55.220 for regular citizens, but it was not grounded in any sort of real epidemiological evidence
01:31:01.380 to do these things.
01:31:02.500 There was no evidence in the end that people at this church had an elevated rate of cases
01:31:07.580 than anyone else in society.
01:31:09.040 Because, as you know from talking with Jay Bhattacharya, with a highly contagious respiratory
01:31:13.500 virus, you're not going to stop this thing.
01:31:16.060 There might be an effect early on, for a very transient period of time, with
01:31:21.340 everything closed, but society will grind to a halt if that happens.
01:31:25.720 So over time, the benefit of any of these interventions just goes down like that.
01:31:32.100 Because the longer time goes on, the more people need to circulate and mix with society.
01:31:36.460 And you know who needs to the most?
01:31:38.000 Who has to?
01:31:38.760 The people who are forced to.
01:31:40.100 The plumber, the cashier, the orderly in the hospital, while someone else can sit home
01:31:45.220 on their laptop, in their home.
01:31:47.220 Maybe their children have a beautiful room where they can learn, you know, do remote learning.
01:31:51.000 Maybe they hired some extra tutors or started a pod program.
01:31:54.380 That was fine for them.
01:31:55.700 And then that was virtuous as well.
01:31:57.440 It's very, very convenient.
01:31:58.880 It's nice that not only is this comfortable for us, I am also a more virtuous person.
01:32:03.880 You, you dirty person, you want to go to church?
01:32:06.440 That's bad.
01:32:07.040 You're a bad person.
01:32:08.020 So this wildly, overly simplified binary of good and bad, remarkable, because remember
01:32:14.420 with George Bush back in the day, you're either with us or against us, and liberals mocked him.
01:32:19.900 What an absurdly simplistic view of society.
01:32:23.720 But to my mind, there was very much that same type of dynamic, that same lens of this, you know,
01:32:30.340 bifurcated lens was the same view on a lot of the pandemic response.
01:32:35.260 It doesn't mean we should have done nothing at all.
01:32:37.620 It doesn't mean that we should not care about how we may infect other people.
01:32:42.980 All of those things are important.
01:32:44.140 What it does mean is the way that this was handled, the way that it was done, both with
01:32:49.720 the mechanics of the different interventions, but also with the ethical, the highly subjective,
01:32:56.200 ethical choices that were made about what people could and could not do were profoundly inequitable.
01:33:03.020 And the irony is that this is something that progressives profess to care about the most,
01:33:07.580 is, quote, equity, these sort of, you know, equality of outcome or even equality of opportunity.
01:33:12.740 Yet the children who were harmed the most were the ones who depend on school the most.
01:33:18.280 The rich kids, they suffered too.
01:33:19.900 And plenty of them, you know, I don't want to dismiss it for kids who are more well-off financially
01:33:24.820 or whatever, because everyone's different, has their own experience.
01:33:27.440 But the kids who really suffered were the ones who didn't have all of these resources.
01:33:31.480 Similar to some guy who I spoke with in California, the person who really suffered is someone who
01:33:36.540 suffers from addiction, relied on the church.
01:33:39.840 For him, that was his support system, and that was denied from him.
01:33:43.820 So you have all these people who were denied these support systems that they required, that
01:33:49.000 they needed to flourish as people.
01:33:50.700 And there was no evidence that this denial, in the end, in the end, had any benefit for
01:33:56.860 society.
01:33:59.300 That's a good place to end, but I want to ask you one more question anyways.
01:34:05.540 You've been looking into this pattern of unidimensional, virtue-signaling hyper-response
01:34:13.460 for a number of years.
01:34:14.660 And, you know, you said early on that that alienated you to some degree from the organizations and
01:34:22.100 maybe even the class of people that you had been easily and profitably and positively associating
01:34:28.300 with before.
01:34:29.520 What's changed for you personally and politically as a consequence of having waded through this?
01:34:36.160 Like, how do you view the world differently now, psychologically, but also politically and
01:34:42.180 socially than, let's say, six, seven years ago?
01:34:46.420 Profound, profound change.
01:34:48.240 And I'll tell you, the one positive part about this, I've chatted about this with a few friends
01:34:52.960 who've been similar to me, these sort of like, you know, politically homeless, so to speak,
01:34:57.680 people, is that I never thought that at my age, I'm 48, I never thought that I would have an
01:35:07.400 experience in my life that would, you know, completely alter how I view the world, how
01:35:14.860 I view myself, that would flip me off, you know, this axis that I was on.
01:35:19.660 You know, I had those experiences as a teenager in my early 20s when I'm reading the beat poets,
01:35:24.900 I'm doing these things, my mind was getting blown on a regular basis, I'm being turned
01:35:29.220 on to all these different ideas, I'm reading Marshall McLuhan, oh my God, the medium is
01:35:33.720 the message, all of these things were blowing my mind.
01:35:36.160 But as I got older, which I think is typical, that stopped happening.
01:35:40.160 There were things that still interested me, that I still was pursuing, but the idea of like
01:35:43.900 my whole frame of reference being changed, that was no longer happening.
01:35:48.960 And the one upside is that it's both unsettling, but also kind of amazing, that
01:35:54.880 that happened to me as, like, a middle-aged adult.
01:35:58.200 And I had always seen myself and my tribe as, if not more virtuous, then certainly at least
01:36:05.080 maybe more intelligent, more wise, more reasonable.
01:36:10.660 And what I observed has completely changed how I view people in different political tribes.
01:36:18.560 I have, I still think plenty of people on the right are completely insane and foolish.
01:36:22.720 And I still think people on the far left are completely foolish and insane in many circumstances.
01:36:27.900 But I no longer have this dismissive view of people, particularly on the right, that I
01:36:34.860 used to have.
01:36:36.060 And I don't even view myself, you know, I hate even using these terms left and right, because
01:36:40.180 like most people, I have a complex range of feelings and opinions on a complex range of
01:36:45.020 topics.
01:36:45.360 So I don't, I don't sit myself in one, you know, camp or another, but the, I've always
01:36:51.860 felt a little bit alienated as a person, just psychologically.
01:36:56.720 And this experience has made that sense of alienation profound, but in a way
01:37:07.320 that also enlivens me, because, wow, to learn something
01:37:14.420 new when you're getting older, it's quite remarkable.
01:37:18.620 So as destabilizing as this has been, it's also energizing.
01:37:23.540 And, you know, that's why this topic has been so interesting to me, because
01:37:27.760 now I see things differently. I used to find it absurd
01:37:33.200 when people complained about the elitist Democrats; I'd think, oh, be quiet, whatever.
01:37:37.840 Now the smugness and the dismissiveness in the way some people speak
01:37:43.540 is so obvious to me.
01:37:44.700 So it took this event to sort of pull the cloth back on certain things
01:37:52.740 that I wasn't able to see before.
01:37:54.880 Some of my views are the same as they always were,
01:37:57.760 and some of them have changed. But rather than my political views changing, I don't even know
01:38:02.480 what terminology to use, it's more my sort of social or societal view that has changed, as distinct
01:38:09.340 from my core political beliefs. There's a degree of equanimity that
01:38:15.060 I think people need to embody, that I still need to as well, but that I've been
01:38:21.340 able to turn the dial up on in a way that wasn't there before.
01:38:25.480 So, I don't know if that answers your question, but it's
01:38:29.320 just been a bizarre and remarkable experience.
01:38:32.960 The one nice thing is that there is this group; I was going to use the word
01:38:38.080 small, and it is small.
01:38:39.140 I don't know exactly how large or how small it is.
01:38:41.900 It's a minority, but there is a group of people I found who are kind of like me.
01:38:47.040 Now, maybe they were always liberals or progressives and then they found themselves
01:38:50.340 sort of adrift.
01:38:51.240 Maybe they were always conservatives. But among these people I've connected with, there were
01:38:55.160 all these doctors who, out of the blue, would email me; this was early on in the pandemic,
01:38:59.520 around spring 2020.
01:39:01.080 And the email would always start with, "I didn't vote for Trump, but..."
01:39:07.100 And these conversations would always start that way, where everyone has to sort of apologize.
01:39:11.240 You have to kind of clear your throat first. But all these people felt
01:39:15.220 homeless, felt afraid to say something out loud, or were even confused.
01:39:19.440 And with other people I met in the small town where I live, there was always
01:39:25.020 this sense that, while I lost some people and some friends along the way,
01:39:31.000 people who think I've gone crazy, I've made other friends, people
01:39:36.040 who also felt unmoored, who also said, hey, wait a minute, some of this stuff doesn't
01:39:40.280 seem quite right.
01:39:41.580 So that's been a wonderful thing as well. To use the cliché,
01:39:46.400 when a door closes, a window opens, whatever it may be; something
01:39:51.200 closed down, but maybe there's some sort of equilibrium, and something else
01:39:54.800 then opens up.
01:39:56.680 So, all right.
01:39:58.960 Well, that's a good place to end.
01:40:01.380 I would say there were lots of other things that we could talk about,
01:40:05.940 that I wanted to talk about, but we covered a lot of material,
01:40:09.080 and maybe we can get a chance to do that at some point in the future.
01:40:12.280 We didn't cover gain-of-function research, for example, which I would have loved to have
01:40:15.680 delved into, but that might be a topic for an entirely separate conversation.
01:40:20.360 Yeah.
01:40:20.800 Yeah.
01:40:21.120 That's part two.
01:40:22.340 Yeah.
01:40:22.600 Yeah, exactly.
01:40:23.360 But thank you very much for agreeing to talk to me today, and for the work that you've done
01:40:27.660 on this front; you were an early canary in the coal mine, so to speak.
01:40:32.420 And, you know, that's not an easy thing to do, even though it's an exciting thing to do,
01:40:36.380 and maybe too exciting. As you said, you're obviously a case in point
01:40:41.660 that an old dog can learn new tricks, and maybe then not be such an old dog for a while again,
01:40:47.540 too, because that's kind of an interesting consequence.
01:40:50.580 And so to everybody who's watching and listening, thank you very much.
01:40:54.220 As I say frequently, your time and attention aren't presumed or taken for granted.
01:41:00.600 I'm very happy that people are tuning into this podcast and watching and listening to these
01:41:04.980 complicated conversations.
01:41:06.260 And to the Daily Wire Plus people, thank you for facilitating the recording on both
01:41:11.620 ends, which you do.
01:41:13.040 I'm in Portugal today in Lisbon, and we have a nice studio set up here to make this possible.
01:41:18.600 And that's not an easy thing to arrange at a moment's notice.
01:41:21.820 And, David, thank you very much for agreeing to talk to me today.
01:41:25.560 Much appreciated.
01:41:27.400 Yeah, this is terrific.
01:41:28.660 Thanks, Jordan.
01:41:29.560 All right.
01:41:30.040 So just so everybody remembers, I'm going to talk to David for another half an hour behind
01:41:35.140 the Daily Wire Plus platform firewall.
01:41:38.900 We'll talk a little bit about the development of his journalistic interests across time and
01:41:42.700 maybe delve into some of the other things that we didn't have quite enough time to touch
01:41:46.320 on in this conversation.
01:41:47.340 If you're inclined to participate in that, and you'd like to support the Daily Wire Plus
01:41:51.920 endeavor because you find what they're doing worthwhile,
01:41:55.760 and they've certainly been useful for me, and fun to work with, which is quite interesting,
01:42:01.800 then, you know, join us over there.
01:42:03.660 And other than that, ciao, we'll join you all again on another podcast in the very near
01:42:09.040 future.
01:42:09.900 Thanks, David.
01:42:12.640 Hello, everyone.
01:42:13.580 I would encourage you to continue listening to my conversation with my guest on dailywireplus.com.