TRIGGERnometry - November 04, 2023


Trump & The Intellectual Dark Web - Sam Harris X Eric Weinstein


Episode Stats

Length

53 minutes

Words per Minute

155.23

Word Count

8,249

Sentence Count

471

Misogynist Sentences

1

Hate Speech Sentences

13


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.720 So what you're saying is you're pro-Trump.
00:00:03.000 Yeah, exactly.
00:00:04.280 Build the wall.
00:00:05.220 Build the wall, exactly.
00:00:07.180 You have recently made the decision not to speak to some people who disagree with you.
00:00:12.640 Why is that?
00:00:13.940 I think Brett is one of them.
00:00:16.260 Let's break that bubble.
00:00:17.160 Not because they disagree with me, but because I think they've behaved unethically.
00:00:21.460 I've always thought Brett was an extremely ethical person.
00:00:26.800 And I don't know how he got so turned around.
00:00:31.360 Hey guys, TRIGGERnometry needs your help.
00:00:34.340 We took a big risk creating the show.
00:00:36.640 And for us to keep doing the incredible work that you all love, we need your support.
00:00:42.380 That's the only way we're going to stay independent and create content that you won't be able to find anywhere else.
00:00:48.140 There is no other podcast where you'll hear interviews with Nigel Farage one week.
00:00:52.920 And the next week, you've got Aaron Bastani, the founder of left-wing show Novara Media, on the same platform.
00:00:58.480 You know the mainstream media aren't honest.
00:01:01.080 You know they've been caught lying again and again.
00:01:04.240 You know they can't be trusted.
00:01:06.100 The only way to change that is to make a stand and support independent content creators, like TRIGGERnometry, to produce better and more honest content.
00:01:16.300 We have big plans and we'll shortly be announcing exciting new shows and more terrific interviews with huge guests.
00:01:22.220 That isn't going to happen without your help.
00:01:24.960 When you support us, you also get incredible extra content, such as extended interviews with none of those irritating adverts.
00:01:34.680 And they'll be released 24 hours early just for you.
00:01:38.380 We'll have exclusive bonus interviews that only you get to hear.
00:01:41.820 Click the link on the podcast description or find the link on your podcast listening app to join us.
00:01:48.320 Support us and help change the way we have conversations and make the world saner.
00:01:54.560 This is from Sal Joe.
00:01:56.460 This is the most upvoted question.
00:01:59.220 Ah, okay.
00:02:00.080 We'll take the measure of your audience's mind here.
00:02:03.240 Would Hamas have attacked Israel if Trump had been in power?
00:02:06.080 Uh, I have no reason to think not.
00:02:14.800 Um, I mean, in some ways Trump looks like a more reliable ally, maybe.
00:02:22.220 But look at the speeches Biden and Trump gave in the aftermath of this thing.
00:02:29.920 I mean, there's...
00:02:31.500 I haven't seen them.
00:02:32.500 Oh, I mean, Biden's, I mean, correcting for, you know, the evidence of, you know, geriatric neurological problems.
00:02:40.420 Because Biden's speech was just a full embrace of our allyship with Israel.
00:02:48.800 And Trump's was this kind of deranged, self-focused criticism of Netanyahu.
00:02:55.860 And, I mean, it was just, it was all about Trump.
00:02:58.660 I mean, it's...
00:02:59.960 I mean, that is on brand.
00:03:01.140 But the argument, I think, might be, sorry to interrupt, that A, Trump was seen as tougher, perhaps more irrational, which is not unhelpful in this sort of situation.
00:03:13.120 And also, he worked very hard on the Abraham Accords.
00:03:16.520 Right.
00:03:17.180 Which were taking the Middle East, some people would argue, towards a healthier place.
00:03:22.620 Yeah.
00:03:23.200 And I would agree with that part, the Abraham Accords.
00:03:27.900 I think it's just a non-factor for me.
00:03:32.560 I think this is going to happen anyway.
00:03:35.360 This will continue to happen.
00:03:37.140 It was unsurprising that it happened.
00:03:38.540 The details of its happening were profoundly surprising, given what we thought the IDF was capable of and what was unlikely to, you know, the specifics were surprising.
00:03:50.020 But this will be a recurring story of our lives until, frankly, until the Muslim world has a civil war with itself, with its own extremists.
00:04:02.340 It's not a problem we're going to solve.
00:04:05.320 We need two billion Muslims to figure out how to deal with their problem of homegrown extremism in 100 countries.
00:04:11.240 Yes, I hear you.
00:04:14.180 I think the source—sorry to needle this, but I think it's important to flesh out.
00:04:18.600 I think the source of this question, I may be hallucinating, is you were vehemently anti-Trump.
00:04:26.020 And I guess what I think people are asking is, were you not wrong about that?
00:04:32.240 Because there's so many other liabilities with Trump.
00:04:34.900 I mean, in fact, it's the similarity to Trump and Trumpism that shed—that did much to shatter Israeli society and seemingly soften it up for this attack, right?
00:04:49.700 It's insofar as Netanyahu, who's a hell of a lot smarter than Trump and more sophisticated than Trump, insofar as he resembles Trump, insofar as he was engineering a similarly populist eruption of bullshittery in his own society.
00:05:03.760 The divisiveness of all of that, the divisiveness of all of that, the untenability of all of that, the lack of pragmatism, the taking the eye off the ball of the real threats to Israel.
00:05:17.500 You know, the similarities there are awful, and insofar as we have—Trump and Trumpism has a similar effect on our own society, it represents the same kind of opportunity cost.
00:05:33.960 I mean, you think of all the things we didn't get done and will not get done for every moment we spend dealing with the divisiveness of Trumpism and the counterreaction from the left.
00:05:45.820 I mean, it's just—it's just a massive opportunity cost.
00:05:48.220 So, no, I think it's—there's nothing good—even if you can find some local instance where Trump was right about something and the Democrats were wrong, I mean, he's totally right about the southern border, right?
00:06:00.760 I mean, so that's a—there's no—in my view, there's absolutely no defense of having an indefensible border.
00:06:08.200 We should know who's coming into the country.
00:06:09.820 We should let the people in we want in.
00:06:11.640 We should keep the people out we want to keep out.
00:06:13.580 We should have, obviously, a sane and compassionate policy with respect to refugees.
00:06:18.180 Build the wall.
00:06:19.100 I mean, yes, if the wall is the best way to secure that border, then build that wall, right?
00:06:22.780 So Trump's right about that, but he's such an awful human being, and his effect on our politics is so toxic that we can't even—that half of our society has to treat that completely sane project as a form of racism because they're reacting to the golem of Trump.
00:06:42.760 So we need other people—even where Trump is right, we need other people quarterbacking those specific decisions.
00:06:52.340 So what you're saying is you're pro-Trump.
00:06:54.180 Yeah, exactly.
00:06:55.560 Build the wall.
00:06:56.540 Build the wall, exactly.
00:06:57.740 Linz1, we've touched on this, but they ask, what is Eric's take on the deluge of pro-Hamas protests after the most recent attack in Western countries?
00:07:13.460 Well, I tried to talk about, I think when I was on Rogan recently, maybe on your show, I don't remember.
00:07:19.200 There's been a huge uptick in open anti-Semitism, and it's right on schedule.
00:07:25.220 I mean, I think that if you are monitoring what I'm monitoring, you're not that surprised by it.
00:07:32.280 You just—you know, what we knew is that it's to the bone.
00:07:38.080 And so when people talk about, you know, black and brown bodies or, you know, I don't feel safe, like there was a spate of postings, but I don't feel safe as a Muslim.
00:07:49.980 You know, like, oh, cut it out.
00:07:51.180 But it's a radical anti-Semitic ideology of revolution, and it's on schedule, and it's terrible that we've gotten here.
00:08:07.520 But in part, a lot of the well-meaning left did not understand that they'd gotten into the ocean with a revolutionary current pulling them along the beach, and they end up getting out of the water very far away from their beach towel.
00:08:25.820 They're not anywhere close to being liberal tolerant because in their attempts to say, well, Black Lives Matter and we care about this, they didn't notice who they were affiliating themselves with.
00:08:40.780 And hopefully, if you've seen a charred body or severed limbs or shot up porta-potties, you're making an intelligent decision about what really matters to you.
00:08:51.420 And if you think, you know, end the occupation, free Palestine, go paragliders, you're probably really deep in it.
00:09:03.080 You know, I hope that deprogramming exists for you.
00:09:07.100 I hope that you really think about whether you want to sanction rape, torture, and mass killing.
00:09:16.420 And if that's your thing, then I guess, well, you know where you are.
00:09:21.420 John Watson says, could the lads ask Sam what he thinks his old friend Christopher Hitchens would make of the current political landscape, particularly with regards to gender identity, which has essentially become a new religion?
00:09:37.060 I mean, I think he would make exactly what we have all made of it.
00:09:44.260 And it would be a lot of fun to hear him talk about it.
00:09:47.080 But I don't think there'd be any daylight between him and us on that point.
00:09:51.860 I don't imagine.
00:09:52.720 But he'd be funnier and more cutting at the same time.
00:09:55.700 It would be a lot of fun.
00:09:56.680 Yeah.
00:09:57.060 Yeah.
00:09:57.260 World by Wolf asks, Eric, pick your dream president and VP for our times.
00:10:11.340 Well, it's really hard because I think that the current political landscape deforms everyone who enters this.
00:10:19.120 I've watched this with the pressures put on Tulsi, Bernie, and RFK Jr.
00:10:28.900 We keep getting a fugu sort of situation where there are lots of parts of the fish that you like, and there's some portion of the fish that's deadly.
00:10:38.200 And so we haven't had that.
00:10:39.720 But I'm going to have to cheat on the answer to this, which is the optics of the first Obama candidacy is the substance we need in the White House.
00:10:49.080 But I haven't seen anyone with the ability to run this gauntlet who's viable.
00:10:55.040 And so I just don't think that we have a situation in which anyone is speaking in real terms.
00:11:05.660 I'm sorry to say that I can't see a single voice on the viable political landscape that excites me.
00:11:13.020 And the last thing I was excited about was Obama One.
00:11:15.400 And clearly I ended up buying a campaign that governed in a very different form than the one I expected.
00:11:27.700 BigBob86 says,
00:11:28.720 No, although I'm not sure the loss of religious subscription is the reason.
00:11:57.540 I'm sorry, what I hear in that question is that the moment you lose religious belief,
00:12:05.940 something, some religion-shaped object is going to fill that void.
00:12:09.560 And I don't actually think that's true, but...
00:12:11.660 You don't?
00:12:12.520 No.
00:12:13.220 You don't see wokeness as a kind of replacement religion?
00:12:15.860 No, I mean, I think that's...
00:12:20.280 So it has religious characteristics, right?
00:12:24.220 So it is...
00:12:25.020 But first of all, some of the people who are woke are also religious, obviously.
00:12:29.260 I mean, it's not a perfect substitute.
00:12:31.900 But I think...
00:12:35.220 It's hard to know what to make of the data, and I just know that I'm the existence proof,
00:12:48.500 and many people I know are the existence proof that it's possible to lose your belief in Santa Claus
00:12:54.020 and not replace it with something that does all the work Santa Claus did the day before, right?
00:12:59.740 So you can find your meaning and your aspiration and your motivation in other ways that run by a different logic.
00:13:11.800 You can.
00:13:12.500 But as we were talking about, people aren't the same.
00:13:14.960 But basically everyone I spend a lot of time with can, right?
00:13:19.140 Okay, would you agree with me that your experience and the experience of your highly selected friendship group, I imagine,
00:13:27.480 is not necessarily representative of the average person?
00:13:30.400 It's true, but so then we just have to admit what we're saying, which rarely gets done,
00:13:34.120 which is, I don't need these crazy ideas.
00:13:38.820 You don't need these crazy ideas.
00:13:40.620 But the morons over there are not going to figure out how to live meaningful lives.
00:13:44.500 So you've deliberately loaded it with emotion.
00:13:46.380 So let's take the emotion out.
00:13:47.640 Well, but that's the subtext of this being claimed.
00:13:51.300 No, not necessarily.
00:13:52.420 It could be that different people, because of many, many different things, genetics, cultural values,
00:14:02.660 their material circumstances, their locus of control over their own lives, right?
00:14:08.020 Which, as you get wealthier and more powerful and influential, changes.
00:14:11.640 Some people are able to deal with the things that religion traditionally has fulfilled, fear of death, meaning, purpose, etc.
00:14:21.640 That you, and to some extent I and Richard Dawkins, I put the same question to him when we had him on the show about this.
00:14:27.680 Some people are able to get it from things other than religion.
00:14:31.340 But if you take it away from other people who are not wired the way that you are, who do not have the resources that you do,
00:14:37.680 who, for whatever reason, they're not predisposed to that way of thinking, who have a greater fear of death than you do.
00:14:43.600 For those people, a set of traditional beliefs that center around something that you and I might agree isn't entirely true,
00:14:50.840 or is entirely untrue, you might say, is valuable in a way that it is not for you and your friends.
00:14:56.420 Well, so what I will admit is that there are certain cases where irrational dogmas cause people to behave better than they otherwise would.
00:15:07.480 There are certainly those cases, and that's good except for the fact that there are other ways to get people to behave in those same ways
00:15:17.020 that are more valid and scalable and have more integrity, right?
00:15:21.620 So, yeah, it's possible to go to sub-Saharan Africa and work in a refugee camp
00:15:26.620 because you believe the creator of the universe wants you to do it and you're saving souls for Christ.
00:15:31.260 But it's also possible to realize that you just actually care about a famine in Somalia or Ethiopia,
00:15:36.280 and you want to work with doctors without borders or whatever,
00:15:39.220 and you're doing that actually based on caring about human suffering in a global sense, right?
00:15:44.620 So one code is much better to be running on your brain than the other
00:15:48.620 and doesn't come with the attendant downside of suddenly throwing this moral error of like,
00:15:56.080 now we're Catholics in sub-Saharan Africa and we're teaching the sinfulness of condom use
00:16:01.740 even when people are dying from AIDS and condoms is the only way to prevent them from dying from AIDS, etc.
00:16:08.740 So there are better reasons to be good than religion tends to provide,
00:16:14.880 but I would grant you that religion does provide those reasons for some people some of the time.
00:16:19.540 My problem with religion is that it gets so much wrong that leads to unnecessary conflict and division
00:16:28.600 and opportunity cost in our society, certainly in the 21st century, with all the opportunities in front of us.
00:16:36.460 And it doesn't get the core parts right that I would agree is still our core,
00:16:41.120 like what are our real spiritual opportunities?
00:16:45.580 Just how good could human consciousness be moment to moment?
00:16:50.420 How good could human life be moment to moment?
00:16:52.960 And for that, I do think there are these core experiences at the heart of religion
00:17:00.340 that gave rise to our religions, perhaps some more than others.
00:17:04.420 I mean, again, our religions are different,
00:17:06.000 but I do think we need something like a modern Mysteries of Eleusis where,
00:17:14.440 I mean, I'm very mindful of the errors we made in the 60s around psychedelics,
00:17:20.720 but I'm very hopeful that a renaissance in research and interest in psychedelics
00:17:27.000 could allow us to put certain transformative experiences and states of consciousness
00:17:35.980 in reach for virtually everyone.
00:17:39.900 And we need a cultural context in which to absorb those epiphanies and talk about them
00:17:45.940 that doesn't make reference to our religious sectarianism,
00:17:51.480 or at least doesn't make sectarian reference to our legacy religions.
00:17:55.940 If you want to make eclectic, small-c, Catholic reference to just the storehouse of religious literature,
00:18:04.220 that's great.
00:18:05.120 There's no problem taking the good parts, even from the Koran,
00:18:08.700 to make sense of those experiences.
00:18:12.080 But real religious sectarianism, I think, has to be a deal-breaker at this point.
00:18:19.940 I mean, it's just too costly.
00:18:21.740 So I think it's good to get rid of it insofar as we are getting rid of it in the West.
00:18:28.620 That's good.
00:18:29.520 And the fact that these minor monstrosities of things like wokeness and social justice, moral panics,
00:18:40.700 even if they're drawing energy from the advent of more atheism and more secularism,
00:18:47.360 I think it's a price we should pay,
00:18:52.580 and we need to combat those forms of irrationality as well.
00:18:58.420 So I spent a lot...
00:18:59.500 In recent years, I've spent a lot more time complaining about wokeness than I've...
00:19:04.080 And leftist moral panics than I have organized religion.
00:19:08.180 And it's because it's really, in my immediate purview, it's been more costly.
00:19:17.280 You know?
00:19:17.640 I mean, it's been vitiated in our institutions.
00:19:23.480 Eric, what would you prefer, physics or music?
00:19:55.880 I have two questions.
00:20:04.040 Well, jeez, there's a tiny amount of physics that's more beautiful than all music.
00:20:10.720 Okay.
00:20:11.240 So I'm going to go with that.
00:20:12.860 Okay, so Mark Gregg Sputnik asks,
00:20:14.900 What are the fields away from quantum gravity that you feel are the most important paths for physicists to pursue?
00:20:21.960 The standard model of particle theory was revealed to be geometric in the mid-1970s
00:20:33.660 and has been made almost perfectly geometric by the end of the 80s.
00:20:37.940 The right thing to do is to ask why we have three generations of fundamental fermions with 16 particles per generation with a crazy Higgs sector with a quartic potential.
00:20:50.700 The accidents, that is, of why this particular set of objects is, without question in my mind, the most potent question in all of physics.
00:21:02.460 And we abandoned it more or less in 1983 when we switched to saying we're not looking for a unified theory.
00:21:10.380 We're looking to quantize gravity.
00:21:12.040 And so we reinterpreted, and in my opinion, clearly misinterpreted, what the charge of physics was.
00:21:18.860 And the correct thing to do is to ask why this particular universe that we see ourselves confronting,
00:21:24.360 rather than following the path to say what is the general nature of quantum field theory,
00:21:28.760 because quantum field theory was recently revealed, like linear algebra and calculus before it,
00:21:34.720 to not have anything particularly to do with physics.
00:21:37.460 You can't do physics without it, but it turns out that it's a general framework that governs many more things
00:21:42.880 that have nothing to do with the physical world in which we live.
00:21:45.900 And so getting out of that quantum field theory monomania and getting out of quantum gravity monomania
00:21:53.180 and substituting it with a question of why this particular beautiful universe is the one in which we live.
00:22:01.420 Charles C. Verbal says,
00:22:03.080 I would defer to the experts who have really been watching this on that.
00:22:16.140 I think it's still, I mean, you know, last week, notwithstanding,
00:22:21.140 I think it's still the opinion of most qualified people that the Cuban Missile Crisis was the closest moment.
00:22:30.240 But I'm not saying we're in a good spot either.
00:22:33.080 It's, it's, I think it's our number, it should be our number one existential concern,
00:22:38.600 followed closely by a pandemic, engineered or otherwise.
00:22:49.200 So Joe asks, Eric, are there any fundamental questions in physics which AI could help resolve?
00:22:55.300 Sure, you could search for all renormalizable field theories and try to do an exhaustive search.
00:23:11.820 I'm just going to make a cautionary point.
00:23:14.020 I think that the idea of asking AI to solve your problems with physics, which probably contains more leverage than any other discipline has ever had in terms of the power that can be wielded from simply understanding something.
00:23:30.100 It's as if you haven't seen a single science fiction movie to prepare you.
00:23:35.060 This is probably the dumbest consult I've ever heard in my life.
00:23:39.000 I cannot figure out what we're doing.
00:23:42.100 I actually don't understand that last point.
00:23:44.860 You're saying asking AI to do discoveries in physics is so dangerous we shouldn't be doing that?
00:23:50.560 Yes, I think that these are the last AIs that you could consider.
00:23:57.280 You know, the difference between ChatGPT and what is it, ChainGPT, where you have it create the prompts to give you the next.
00:24:12.140 You're playing around with things that you don't understand.
00:24:16.440 Before the "Attention Is All You Need" paper, you could make the case that maybe AI is very, very distant.
00:24:21.580 I was listening to Sam in my first meeting, the first day I met Sam, we did a podcast.
00:24:27.800 And Sam was talking about AI, and you can tell where this is going.
00:24:30.860 He didn't have a time frame.
00:24:32.440 I don't think anybody really understood the large language model revolution and the transformer architecture.
00:24:38.720 And I think you're now so close to something that could totally surprise you, particularly with emergent behavior,
00:24:45.640 where an AI will learn Bengali because it needs to, without being told you should learn Bengali now.
00:24:54.240 I would be very careful about the idea that we're going to rely on AI to advance our knowledge of physics.
00:25:01.260 We'll be back with Sam and Eric in a minute.
00:25:03.560 But first, we want to take a moment to talk about our partners, GiveSendGo.
00:25:07.220 GiveSendGo is a leading crowdfunding website where thousands of people around the world raise funds for business ventures,
00:25:14.620 medical expenses, personal needs, non-profits, churches, and funeral costs.
00:25:20.020 On GiveSendGo, you can raise money for whatever you need.
00:25:23.960 We've met the people at GiveSendGo, and we can tell you that they're absolutely aligned with TRIGGERnometry on our approach to free speech.
00:25:31.260 They don't just talk the talk, they walk the walk, unlike other big tech companies.
00:25:35.520 They, like us, believe that with openness and honesty, we'll create more understanding and ultimately more harmony in the world.
00:25:43.400 GiveSendGo is absolutely free to use.
00:25:45.900 With other crowdfunding sites, you'll pay between 5% and 10% of the money you raise.
00:25:50.440 GiveSendGo charge no money at all to use their platform.
00:25:53.740 They believe you should be able to keep all the money you raise.
00:25:57.160 On GiveSendGo, you can choose to raise funds for short or long-term campaigns,
00:26:02.400 whether you're in the USA, UK, Australia, or anywhere in the world.
00:26:07.440 GiveSendGo supports freedom of speech.
00:26:09.600 They won't cave to the mob, and that's why we are proud to partner with them.
00:26:13.800 Starting a campaign on GiveSendGo is easy and intuitive.
00:26:18.460 Go to GiveSendGo.com today.
00:26:21.180 Start raising money for whatever's important to you, and support the people who support freedom.
00:26:26.740 Now, back to the interview.
00:26:28.520 Ramsey says that it's great to have you back on the show, and respect for that, which is nice.
00:26:36.000 And I think alluding to the very same point, I'll Fight You Naked says,
00:26:40.820 you have recently made the decision not to speak to some people who disagree with you.
00:26:46.040 Why is that?
00:26:47.560 I'm not sure who he might be thinking of.
00:26:50.340 I mean, so there are people who I would not be inclined to talk to.
00:26:56.360 I think Brett is one of them.
00:26:57.780 Let's break that bubble.
00:26:59.680 Not because they disagree with me, but because I think they've behaved unethically.
00:27:04.180 It's like, there's, no, if you're thinking of your brother, that's conflating two things.
00:27:11.400 There are topics that I think it was irresponsible to debate.
00:27:19.600 Via podcast.
00:27:20.620 Certainly in the middle of a public health emergency, right?
00:27:26.320 And certain styles of conversation, certain participants that I thought was just never going to converge on a useful document to export to millions of people.
00:27:36.360 So I felt it was irresponsible to have certain conversations, certain ways during the pandemic.
00:27:43.320 There are certain other people who totally disagree with me on lots of controversial points who I will talk to just because I, you know, it's, there, there hasn't been any breach of ethics.
00:27:59.720 No matter how much we disagree, they've always dealt with my side fairly or my beliefs fairly.
00:28:03.900 Then there are people who have gratuitously misrepresented my beliefs, thinking they were going to be scoring some kind of points against me.
00:28:14.240 And they were scoring points against me for their audience, right?
00:28:17.380 And those are people I'm not inclined to talk to because I just don't, I don't want to reward that kind of behavior.
00:28:22.340 I mean, it's certainly behavior that I have been fairly careful not to engage in myself.
00:28:28.780 And whenever I, whenever I've done it inadvertently, I've apologized for it.
00:28:34.300 You know, I never, I never feel like I need to straw man my opponent or much less lie about their views in order to argue against them.
00:28:42.340 So, but there's been a lot of incoming of that sort toward me.
00:28:47.260 So there's, there's some people who used to be friends who, yeah, I would not be inclined to talk to, but there are other people who, again, I'm not, I don't know what I don't know.
00:28:59.640 But like there's, there are people who I disagree with a lot, but I'll talk to them.
00:29:05.480 I mean, like, you know, Megyn Kelly is somebody who like our politics are not aligned, but she's, to my eye, she's always treated me fairly to a point that has been uncomfortable for her in front of her own audience, knowing what her audience wants.
00:29:20.740 Certainly in the aftermath of the podcast we did and the clip that got exported to, you know, every, every mind in Trumpistan, you know, the way she dealt with that seemed totally fair and honest.
00:29:34.680 And so, yeah, there's no, I've got no hesitation talking to her.
00:29:37.280 And there's a lot of people on that list.
00:29:39.000 And there's a short list of people who consciously created clickbait for their audience, very often with my name in the title.
00:29:47.880 You know, Sam Harris just said something crazy.
00:29:50.940 And then they've used that clip or some other clip, you know, even more ridiculous clips.
00:29:57.000 There was one clip made by the same person who, where it seemed I was saying, or, you know, the audience was led to believe I was saying that I wished more kids died during COVID.
00:30:08.740 So I could have been proven right in my, in my paranoid views about COVID.
00:30:14.100 So if you're going to do that, of course, I'm not going to talk to you.
00:30:17.060 And if you're going to, if you're going to dunk on, if you're going to amplify that and dunk on that, I'm not, I'm not inclined to talk to you.
00:30:22.120 Sam, one question I have about that is, could you not, I didn't understand why you didn't try to do something with Brett that was closer to correspondence chess, where you had a right to consult with experts, given that you're not a biologist.
00:30:38.420 Right.
00:30:38.700 And, or medical doctors, and the idea is that you don't have it live so that you're, you know, you get broadsided, but just decide we're going to have a go back and forth of, you know, 15 or 20 go rounds.
00:30:52.320 And I'll be consulting people and I'll be consulting people and you can be consulting people.
00:30:56.220 Let's have the best version of that.
00:30:58.700 It seems to me that you didn't opt for an, a non-standard debate.
00:31:04.780 Well, largely because of the opportunity cost, because of how much, you know, I would have viewed that probably as a, a month of work to do responsibly.
00:31:14.940 And I see no reason to do it because your brother's disappearance down the, this particular rabbit hole is, is just, it's not something that I felt I needed to interact with.
00:31:29.580 I mean, I was, it was, it was getting thrust in my face so much that I had to, I had to interact with it enough to just say, I had to kind of disavow it.
00:31:36.220 But the fact that he was going to spend a hundred podcasts in a row worth of his energy on that topic didn't, shouldn't have meant that I needed to step in to debate whatever you wanted.
00:31:48.420 I mean, in other words, even if you didn't end up having a debate, at least if you did that, people would understand that you wanted to have the top line issues adjudicated.
00:32:02.440 And I very much agree that you don't get into a fight or a debate with somebody who's totally unethical, right?
00:32:09.000 Cause it's too dangerous.
00:32:10.600 Well, I just think, I mean, for me, I recently released a podcast, which I think you probably heard on this, like my post-mortem on COVID, you know, solo podcast.
00:32:22.140 And for me, the, you know, the, the airplane analogy I use there covers the sort of the, the framing that works for me, which is, you know, there are moments where you want to really drill down.
00:32:36.740 Yeah.
00:32:36.880 You want everyone to do their own research and sunlight is the best disinfectant and let's just talk about everything for as long as it takes.
00:32:42.960 And there are moments when you really don't want to do that, where it's dysfunctional to do that, where other harms follow upon doing that.
00:32:52.420 And, you know, the analogy for me that covers it is that it's like when you're getting on an airplane, you know, or much less when you're, when you're at 30,000 feet on an airplane, there are lots of things you don't want to talk about and debate.
00:33:05.040 You don't want, you don't want someone to take a poll of the passengers and, you know, ask questions like, do we still trust the pilots or, or, you know, how do we know those, those engines were engineered correctly?
00:33:16.480 And you don't want, you don't want someone pulling up an episode of the Joe Rogan podcast, which said, which has an engineer who says that he designed the engines on the plane that you're now on.
00:33:26.060 And it was always just, it's supposed to be a concept engine and it's irresponsible that they're being flown and, and, and it's like, just how much energy do you want to give that when the reality of your situation is you have to actually land this plane safely, right?
00:33:41.500 There's no alternative, but to do that, right?
00:33:44.900 And so this engine, you actually have to rely on this engine right now.
00:33:48.700 And what I, what I thought at the time, and I still think is that we were all essentially in a plane at 30,000 feet together.
00:34:00.020 And we, we, we were having to figure out how to land that particular plane, given the engines we had.
00:34:05.720 And we need, so we, you know, we needed a, we needed a CDC that we could trust.
00:34:11.080 We needed an FDA that we could trust.
00:34:12.540 We needed to trust the government messaging with respect to what was actually happening in the world.
00:34:19.400 Well, Sam, those are two very different points.
00:34:21.680 No, no, no.
00:34:22.120 I agree with you that we needed a CDC, et cetera, but we didn't have one.
00:34:26.420 Insofar as we didn't have that, that is something we need to deal with.
00:34:32.180 What I, what I feared and fear is that what has happened is there's been such an erosion of trust in, in institutions and the layer of conversation we're having about, about this in the aftermath on social media is so dysfunctional.
00:34:51.020 And so, and so, and so amplifying of mistrust, right.
00:34:54.600 And misinformation and disinformation that we're just, we're in a very bad spot to reboot from.
00:35:05.280 I mean, so, so yeah, yes.
00:35:06.640 I mean, absolutely.
00:35:08.060 We have to fix our institutions.
00:35:10.460 Are they as bad as your brother thinks they are?
00:35:13.620 No way.
00:35:14.840 Absolutely not.
00:35:15.740 Right.
00:35:15.980 So I think he is a, you know, he's just running a conspiratorial operation that I think is, is producing a lot of errors of thinking on his part.
00:35:28.020 Um, and that's why I felt like I didn't have to interact with it much, but is, is there a problem?
00:35:33.860 Is he, is he right about some things?
00:35:35.700 Of course.
00:35:36.980 Is he going to, is he going to detect real conspiracies and, and perverse incentives, uh, sooner than, than I will maybe because he's constantly looking for those things.
00:35:48.920 Um, but I would agree that, you know, yeah, stepping back from the precipice of, of, you know, it's all bullshit all the time.
00:35:56.920 All conspiracies are true all the time.
00:35:59.400 Everyone's evil.
00:36:00.340 He's not saying all conspiracies.
00:36:01.380 No, no, no, I'm just, um, that's, that's the cartoon version.
00:36:05.080 Stepping way back from there.
00:36:07.040 Yes, we have to, we have to, we have to align incentives.
00:36:11.040 We need a, you know, a drug discovery regime that actually discovers good drugs and doesn't release bad ones.
00:36:17.900 Um, we need all that.
00:36:19.980 And I didn't see us hashing it out in the middle of an emergency on podcasts with obviously weird people, like some of the people, your brother and, and, and Joe platformed.
00:36:36.520 I didn't see that as the method.
00:36:38.940 I understand.
00:36:40.000 And, uh, we actually said we would finish now.
00:36:41.940 So we'll finish.
00:36:42.500 All I'll say from my perspective is that I think that heated moment is now over.
00:36:47.900 And I always hate good people disagreeing and falling out.
00:36:52.320 I hated what happened.
00:36:53.880 I'll say this in front.
00:36:54.860 This is true of both me and Francis.
00:36:56.280 We hated what happened with you after the interview with us.
00:36:59.040 And I said this to you in an email.
00:37:01.400 Uh, the person who clipped that is not a good person.
00:37:04.700 Right.
00:37:05.200 Uh, so, but so they, just to drill down on this, the reason why I'm sitting with you here is because I perceived you guys to be totally ethical in how you handle that moment.
00:37:15.120 And it was a conflict.
00:37:16.440 I mean, that moment worked to the massive advantage of your podcast.
00:37:20.520 I mean, it brought immense attention to your podcast at the absolute reputational cost of me right of center.
00:37:29.900 Right.
00:37:30.040 So like a right of center, that was a complete just witch burning of me.
00:37:33.900 But I saw the way you guys handled it.
00:37:36.900 And I have no problem with you guys.
00:37:38.520 I've never had a problem with you guys.
00:37:40.580 So, um, so that's why we're talking.
00:37:44.940 But Brett amplified, I forget this guy's name, but whoever, the guy who made that clip, and I think he made the other clip I was talking about.
00:37:52.140 Is it Jack Posobiec?
00:37:53.060 No.
00:37:53.500 No, it's...
00:37:54.720 I don't even want to give this guy attention.
00:37:56.880 He is a pure, I mean, whoever he is in reality, Twitter has made him a psychopath, right?
00:38:03.620 And your brother locked arms with him just without caveat and amplified him as just this, just citizen journalist who has gone to school on all the important topics and found all the errors in my podcast.
00:38:19.780 And those clips were just, it was just, it was a pure smear campaign, which your brother energized.
00:38:27.640 I completely agree with you about this person.
00:38:29.500 And so that's, I mean, that's where we, that's where I am, you know, there's no, I never did the analogous thing to your brother, and I wouldn't.
00:38:37.160 And I'm not sure where your brother's head is now where he can rationalize what happened there.
00:38:42.080 But he's buddies with a complete lunatic.
00:38:45.240 And what was amazing, I mean, this is a much longer conversation about how weird social media has made everything, but, like, when that 20 megaton, you know, faux pas was going off in, you know, in your, very much in your world and blowing up your podcast, in my, like, literally nothing was happening in my world.
00:39:09.320 And in the places I care about, nothing was happening.
00:39:12.060 And yet, that was the first moment I could see how my interaction with Twitter was just giving me, you know, pure noise that was pretending to be signal, right?
00:39:24.100 Now, this, and I, it's not that there was no signal there, and this is something you and I have talked about a lot, you know, offline, which is, you know, there's, I, I basically, based on how I've designed my life and how I've designed my,
00:39:37.400 my business, I have the luxury of being, of not caring about certain things happening to me online.
00:39:45.020 But this was a case where it was just, like, I was, my life was, like, air-gapped from just, like, a mushroom cloud, you know, and it was, it was just so enormous and so non-existent where I live that I just thought, okay, this is a, this is a hallucination machine, you know?
00:40:06.780 And I've spent 12 years staring into it, and it's, it's not worth it.
00:40:14.580 So it's, like, so the, the outcome, you know, the proximate cause of my getting off Twitter was the aftermath of, you know, having gone on your podcast.
00:40:21.240 And it's been, you know, apart from the occasional feeling of, like, in this last week, being on the sidelines of a, of a necessary conversation, which when I look 15 minutes later, I realize, okay, this is still just a digital sewer that I don't need to be contributing to.
00:40:42.780 I have other places to, to make noise, like, you know, on this podcast.
00:40:48.900 It has been, like, I mean, as, you know, we were talking before we, we started rolling.
00:40:54.120 It has just been, I'm embarrassed at what an upgrade of my life it's been not to be on Twitter for the last 10 months or whatever it's been.
00:41:01.180 You're welcome.
00:41:02.740 I mean, no, I mean, it's, it's completely inadvertent, but it's, it's like, it's, it's been such a gift to not be segmenting my life.
00:41:12.580 If in, you know, dozens of times a day looking at, at the, at completely, I mean, it's, I mean, like, it's, it's like waking, it's like waking up from a dream where you just can't even figure out how you got that confused, you know, and, um, it's so, yeah, uh, yeah, thank you.
00:41:35.860 You gave me meditation.
00:41:37.320 Thank you for the absolute chaos you dropped into my life.
00:41:39.920 You gave me meditation and I got you off Twitter.
00:41:41.880 Yeah, yeah, yeah.
00:41:42.460 And it remains to be seen, which is a bigger influence because it was, it was enormous.
00:41:46.440 All I wanted to say, Sam, is I think it was a very heated moment.
00:41:50.480 The way people talk, especially online, encourages fighting, encourages misunderstanding, miscommunication, et cetera.
00:41:57.940 I think you're a great person.
00:41:59.620 I understand you have your concerns about the way Brett communicated about COVID.
00:42:04.000 He's also a great person as a human being.
00:42:07.420 He's, and I can tell you from personal experience because I gave Brett a piece of very unhelpful information that I gave to lots of other people that would require him to take an ethical, but very difficult decision.
00:42:21.660 And he was the only person out of dozens that did.
00:42:25.240 So, I hope that one day there is a world where you're sitting around a table, you're sitting on a show, like, or wherever it is, that there is understanding and reconciliation.
00:42:36.800 I really hope for that.
00:42:37.680 That's all I'm saying.
00:42:38.320 It may never happen and it's entirely up to you and Brett.
00:42:40.280 Well, to be clear, I think, I think Brett is, I've always thought Brett was an extremely ethical person.
00:42:49.140 I don't know how he got so turned around as to think that this jackass he's aligned with is a, that his ethics are well served by amplifying that kind of contribution to this conversation.
00:43:03.320 I think that you're not, if I may, because I have the same problem you have with this person.
00:43:08.140 But, but your brother, but strangely, your brother doesn't have the same problem with that, with that person.
00:43:13.420 Well, I wasn't going to get to that, which is, I think that Brett went through a catastrophe over COVID where he tried to make simple points, things like vaccines change adaptive landscapes with respect to fitness computations.
00:43:31.020 That is a non-controversial point.
00:43:33.300 And to become, I watched the system say, oh, we have a problem.
00:43:40.360 Somebody's disagreeing with the narrative.
00:43:43.360 Our beautiful public health narrative is above your pay grade.
00:43:48.420 The only way to get you out of this is to go after you as a human, to say you're, you're promoting junk science.
00:43:55.480 You're a charlatan and you're a grifter, et cetera, et cetera.
00:43:58.780 And Brett found himself suddenly, you know, because he misstructured the point, in my opinion, like whatever his actual scientific points were, they were misstructured as medical advice over the internet at the time that there was a public health narrative.
00:44:13.720 And even if he was exactly correct on every point, which I don't think he was, I knew that that was going to end in disaster by his structuring of it.
00:44:23.260 It did end in disaster.
00:44:24.460 And what you're watching is somebody trying to claw back their reputation after having the full force of the U.S. federal government find its way, you know, into the public sphere, as we saw in the Twitter files, that, you know, government is very keen that its master narratives not be taken down.
00:44:46.220 And so when you see, you know, I don't know, is it the CDC talking about horse, you know, come on, you're not a horse, don't take horse paste.
00:44:57.460 He probably felt that he had very few friends and allies who would fight shoulder to shoulder.
00:45:02.940 And so in part, as the guy who popularized audience capture as something we all need to worry about, I can also tell you that isolation and lack of friendship is a huge problem.
00:45:15.560 And one of the things that I've been really upset about is we have a very small number of people who are strong enough to try to think in public at the moment.
00:45:22.880 And we can't afford this infighting.
00:45:27.680 And I personally think that, you know, I've said to Brett, Heather should never have gone after you for Trump derangement syndrome.
00:45:36.640 Brett should never have joined in.
00:45:38.300 You shouldn't have returned fire, but you did.
00:45:40.660 He did.
00:45:41.360 Whatever.
00:45:41.880 That's the beginning of this thing.
00:45:43.340 And, you know, now you've got this super important thing that you should figure out a way of offering something so that if and when Brett declines to do your form of a structured debate, you're not the one who's denying them the opportunity.
00:46:01.920 I'll just be very clear about it.
00:46:03.120 There was a point where Cher re-recorded the song, I've Got You, Babe, with Beavis and Butthead.
00:46:08.400 And she was so angry at Sonny Bono that she stuck it to Sonny Bono in the re-recording of that song.
00:46:13.920 The only problem is that that song was many people's wedding songs.
00:46:17.280 That was people's song where they got together and they got engaged.
00:46:20.920 Who knows what?
00:46:21.580 But it wasn't her song necessarily to do with as she saw fit because it was part of too many people's lives.
00:46:28.480 And I believe that in part one of the things that we owe our world is that we work this shit out if we have the opportunity to work this shit out.
00:46:38.460 And, you know, quite honestly, Brett has gone down a hole, in my opinion, because he's been forced to go down a hole.
00:46:46.560 So if that's the question that you're asking, I think that the issue is isolation.
00:46:51.120 And just to close it out, I made this very clear statement that if I find that any of my friends are guilty of murder, mayhem, rape, who knows what, as bad as it can be, I'm not going to be abandoning my friends.
00:47:04.940 And why do I make that statement?
00:47:06.900 One, because isolated people are dangerous.
00:47:10.240 It's not safe to have isolated people.
00:47:12.660 So you need to have friends who keep showing up even when you're in prison.
00:47:16.560 The other thing about it is that then it becomes very easy to get us to move away from each other.
00:47:22.200 And, you know, the one thing, Sam, that I can tell you is that a lot of your friends on the sort of the right-hand side of whatever this milieu that we're a part of, it's pretty clear everybody wants to be in contact with you.
00:47:35.560 They want a better relationship.
00:47:38.040 You know, everybody says, I miss Sam.
00:47:40.480 And whatever confusions they have, you can probably find that people want to figure this out.
00:47:49.160 I mean, I think people view you as a very singular and important voice.
00:47:52.380 And I think it's a mistake to have all this integrity where we stop speaking to friends because they're all now beyond the pale.
00:47:59.320 Well, this time has been so confusing that everybody has made terrible, terrible mistakes.
00:48:04.320 And the best of us have made fewer mistakes than others.
00:48:06.820 But nobody has gotten away scot-free.
00:48:09.140 It's just the quality of information we've been lied to from too many different sources.
00:48:13.100 We don't have any institutions into which we can retreat.
00:48:15.500 And I just, I think it would be a good idea for what's coming, given what we've just seen in the last week, to try to figure out how to get back.
00:48:26.580 Yeah, I mean, not to air all of our dirty laundry in public, but to air some of it in public.
00:48:33.600 I mean, all of the, there's not a general algorithm for this.
00:48:37.480 It's just that this is dependent on the specifics of a relationship and the specifics of a person's behavior.
00:48:46.520 And there are people who I still don't understand where their heads are at.
00:48:54.140 And I can't follow them down the rabbit hole, but there's no, I don't perceive any interpersonal problem between me and that person.
00:49:04.020 And, again, I don't know what I don't know.
00:49:07.560 People are, again, they're captured by their own audience.
00:49:09.680 They've got thousands of hours of audio that they're putting out there.
00:49:12.440 And I don't know what they've said about me.
00:49:13.800 But, for instance, I said something about our friend Maajid on Josh Szeps's podcast that I regretted because I sort of got walked into it.
00:49:25.880 And I said, you know, like he was telling me how crazy Maajid was.
00:49:28.500 And, you know, I haven't, I mean, Maajid's gotten very conspiratorial and it's all plandemic.
00:49:33.120 And it's like I don't agree with any of it.
00:49:35.820 But I said something, you know, however subtly disparaging of Maajid's whole project.
00:49:43.920 And the problem, the ethical problem was I had never personally reached, I had never had a personal communication with Maajid trying to figure out where his head was at.
00:49:55.120 So, yeah, so I apologized to Maajid.
00:49:57.160 I went on Maajid's podcast and apologized to him, talked to him for an hour about a lot of stuff that I still don't agree with.
00:50:03.500 You know, I can't explain where Maajid's gone and it's kind of similar to where your brother went.
00:50:07.600 But I do perceive a difference between people who, like they're, listen, if I have misrepresented anyone's position about anything to their reputational detriment, I am just, I apologize for it.
00:50:28.300 And I do it with truly bad actors.
00:50:32.080 I mean, like Glenn Greenwald, who's just, you know, does not have an ethical bone in his body, though every bone is in fact sanctimonious and pretending to be ethical.
00:50:41.940 This is a guy who, when I was 10% wrong about whatever his stated view was, I apologized publicly on my podcast, right, where every word out of his mouth about me from him is a lie or a half-truth calculated to paint me as a dangerous maniac.
00:50:57.860 So it's like, you know, it's something that all I can do is be careful and honest on my side.
00:51:07.380 But I have to notice when people, even some former friends, are treating me with zero integrity and they see absolutely nothing, either they think it's a symmetrical relationship, like we both got this sort of bad and let's just bury the hatchet, but they're not seeing what they did.
00:51:27.120 So in the absence of Brett's attack poodle, like if the attack poodle went away?
00:51:34.000 His endorsement of the attack poodle went away, right?
00:51:36.620 Assume that his endorsement...
00:51:36.880 I don't care about the poodle.
00:51:38.400 I care about what he did with that.
00:51:39.000 Okay, assume that his endorsement of the attack poodle went away.
00:51:41.300 Totally different situation.
00:51:42.920 Interesting.
00:51:43.440 Totally different.
00:51:44.900 I'm glad we had this conversation.
00:51:46.840 I respect you both immensely and Brett.
00:51:51.600 I don't agree with everything any of you say a lot of the time, right?
00:51:55.560 That's how we are.
00:51:56.240 We're human beings.
00:51:57.120 So I really hope there's a world in which that particular thing gets patched up and lots of other things get patched up.
00:52:04.540 The pandemic was, it sent everybody fucking mental.
00:52:09.160 It just did.
00:52:10.060 We have to accept that.
00:52:11.620 Everybody in some way or another.
00:52:13.980 I hope we can get through it.
00:52:15.460 I thank you both for coming, for having this conversation.
00:52:19.180 We appreciate you both.
00:52:20.360 Yeah.
00:52:20.900 And let's hope this one goes viral for the right reasons.
00:52:24.360 Yeah.
00:52:25.180 Inshallah.
00:52:25.780 Yeah.
00:52:26.780 Wait for the clips.
00:52:28.400 Yeah.
00:52:28.820 Thanks, guys.
00:52:29.280 Thank you, guys.
00:52:29.820 Thank you, guys.
00:52:29.840 We'll be right back.