555. How the Internet Is Breaking Our Brains | Sam Harris
Summary
Join us as we attempt to clarify the catastrophe of infinite plurality and what it means for cultural incoherence, weakness, demoralization, self-deception, and our inability to understand one another. Today's guest is Sam Harris.
Transcript
00:00:00.000
Preborn's network of clinics is on the front lines nationwide, on standby for women deciding
00:00:04.480
between life and death for their babies. Preborn seeks these women out to help them choose life,
00:00:08.880
not just for their babies, but for themselves. By introducing mothers to the life growing
00:00:12.920
inside of them through ultrasound, her baby's chance at life doubles. $28 a month could just
00:00:18.240
be the difference between life and death of so many lives. To donate securely, go to
00:00:22.140
preborn.com slash dailywire. That's preborn.com slash dailywire. A single heartbeat can echo
00:00:28.120
across generations. I'm increasingly worried that we have effectively rendered ourselves
00:00:34.760
ungovernable based on the way we have shattered the information landscape. This is a consequence
00:00:40.180
of hyper-connectivity and stunning ease of communication. You can just go down a rabbit
00:00:47.040
hole and find endless confirmation that's fairly anonymized. We have to ground our perceptions
00:00:53.780
in an axiomatic framework. The old norms that the gatekeepers, I mean, for all their faults,
00:01:00.840
they had standards. I don't trust anything the New York Times prints at all. The gatekeeping
00:01:05.820
institutions have also revealed themselves as catastrophically flawed. The antidote to that,
00:01:13.300
to the failures of institutions, is not new standards. It's really to apply the old standards.
00:01:23.780
I've spent a lot of time over the years speaking with Sam Harris. We've spoken publicly half a dozen
00:01:41.800
times and privately far more than that. We're coming at the same problems, I would say, from quite
00:01:47.860
different perspectives and establishing some concordance over time. Today, we went down the
00:01:53.440
rabbit hole of rabbit holes, I suppose, discussing the fragmentation of the narrative landscape
00:02:00.080
on the social media front and what that means for cultural incoherence, weakness, demoralization,
00:02:08.140
deceit, self-deception, and inability to understand one another. And so join us as we attempt to clarify
00:02:17.560
the catastrophe of infinite plurality. Well, Mr. Harris, it looks like it's time for our
00:02:25.200
approximately annual conversation. Yeah, nice. You're the clock that ticks once a year.
00:02:32.000
Yeah, well, I suspect that's more than enough. So tell me what you're thinking about lately, Sam,
00:02:39.800
on the intellectual side and what you're doing. Well, it is actually relevant to the chaos in our
00:02:46.200
politics at the moment. I'm increasingly worried that we have effectively rendered ourselves
00:02:54.980
ungovernable based on the way we have shattered the information landscape. And I think independent
00:03:01.860
media of the sort that we're indulging now is part of that problem. I mean, I don't know if you're
00:03:07.880
aware of it or not, but I've been fairly vociferous in criticizing some of our mutual friends. And
00:03:13.640
in my case, some may be former friends, but fellow podcasters and people in independent media. And
00:03:21.420
I just think they've been part of this shattering. And it's been fairly obvious. And the cases are
00:03:30.900
different. But many people have been quite irresponsible in the way that they have
00:03:35.420
platformed people uncritically and let them spread truly divisive and dangerous misinformation.
00:03:43.240
I mean, I'm thinking especially of in the aftermath of October 7th and the global explosion of
00:03:49.180
anti-Semitism. We've had some very big podcasts, like Tucker's and Joe's, platforming Holocaust deniers and
00:03:57.840
revisionists. And it's been quite insane out there. And it's just, I mean, that's just one piece of it.
00:04:04.140
I mean, you can talk about COVID or Trump or Ukraine or, I mean, any, pick your ugly object
00:04:11.620
out there. There's just a radical divergence of opinion into these echo chambers we build for
00:04:20.300
ourselves. And it seems to be very difficult to cross political lines. I mean, it's somehow deeper than
00:04:30.300
politics, actually. So anyway, I'm increasingly worried about that. And, you know, I'm trying to
00:04:37.180
hold up my side of the conversation in ways that cross those lines. But I'm just noticing
00:04:42.180
that, in many cases, it's proving impossible. Yeah, okay. Well, I am aware
00:04:49.940
of that. It's actually part of the reason I thought it would be useful for us to talk today.
00:04:53.520
Okay. So I want to think about how to respond to that to begin with. Well, I think the first thing
00:05:01.360
that we should probably note is that this is a consequence of hyper-connectivity and stunning ease
00:05:11.140
of communication, right? So, I mean, it's, it's obviously the case that the
00:05:18.420
landscapes of communication that once held us together for better or worse are now so multiplicitous
00:05:26.660
that they're numberless. And so what does that mean? I think what it
00:05:34.100
means in part, and this is where I think our conversation might get particularly interesting, is that
00:05:39.460
we don't have a shared story anymore. And I think a culture is literally a shared
00:05:49.160
story. And a story is a structure. This has, you know, been part of our ongoing discussion for a
00:05:54.840
very long period of time, right? This, the relationship between the perceptual framing that
00:06:01.860
is constituted by a story and, let's say, the domain of objective facts, right? This is a very thorny
00:06:08.620
problem. But it seems to me that you have a culture when people share the same story or the
00:06:15.960
same stories. They have the same shared reference points. And with an infinite landscape of
00:06:23.240
communication, that fragments indefinitely. And then no one, see, Sam, let me tell you,
00:06:30.400
I might as well, just to annoy you, just to get the ball rolling. I spent a lot of time
00:06:36.980
thinking about the story of the Tower of Babel. And there's two stories in Genesis that describe how
00:06:43.840
things go wrong. And one story is the flood, and that's the consequence of absolute chaos bursting
00:06:52.360
forth, essentially. But the Tower of Babel is a story about both totalitarianism and fragmentation.
00:07:00.960
So what happens is the engineers get together, because that's who it is. It's the city builders,
00:07:07.560
the tool makers, those who create weapons of war. They get together
00:07:14.460
and they build these towers for the aggrandizement of the local potentates. So there was competition in
00:07:23.700
the Middle East of that time to build the highest tower for the glory of the local ruler. And that
00:07:31.600
presumption, so you can think about that as misaligned aim on the sociological front. The consequence of
00:07:39.620
this misaligned aim is a kind of, what? Because the aim of the culture is wrong. Words themselves lose
00:07:50.160
their meaning. That's what happens in the story, right? Everybody ends up speaking a different
00:07:54.820
language, and then the towers fall apart. So it's because the story that's being
00:08:02.760
told is one of human self-aggrandizement. That's part of it. And the culture pathologizes and then
00:08:12.140
disintegrates. And so I see that happening in our culture. There's a technological element of it,
00:08:20.500
obviously, that technological utopians are driving this. The transhumanists are driving this.
00:08:28.080
And we're aiming at the wrong goal. And the consequence of that is that our language is
00:08:34.100
falling apart and we don't share the same reference points. That's part of what's happening. So I'm
00:08:39.900
curious about what you think about that, you know, how that fits in with your concern, your emergent
00:08:45.880
concern. Like, when you say fragmentation, Sam, what is it that you think is fragmenting? Because
00:08:54.160
it's not the objective view of the world precisely, although the scientific enterprise even seems to
00:09:02.120
be shaky and corrupt and falling apart in many ways. Well, so I agree with that.
00:09:09.880
I think the analogy to Babel is quite apt. You know, I don't think bringing DOGE into Babel would
00:09:17.840
have helped much. I think it is technological. Yeah, I mean,
00:09:24.640
I think largely this is a story of social media, but it's really the internet
00:09:29.800
generally. Because of the information technology we have built, people can find endless confirmation
00:09:39.200
of whatever their cherished opinion is. And it's no longer, there's some cultural immune system that
00:09:52.500
has been lost, right? Like, if you had to go to the physical conference out in the real
00:09:58.420
world to meet the other people who were sure they'd been abducted by UFOs,
00:10:05.300
well, then you'd be meeting these people. You would see the obvious signs of dysfunction in
00:10:11.560
their lives. And there'd be more friction to the maintenance of this new conviction,
00:10:19.440
just based on the collision with other ancillary facts that have social relevance to you.
00:10:27.440
But online, again, this even precedes social media. This, this is true of the internet back in the late
00:10:33.520
nineties. You can just go down a rabbit hole and find endless confirmation that's, that's fairly
00:10:39.700
anonymized, right? You know, the 20-minute documentary that blew your mind and
00:10:46.520
convinced you that, that the World Trade Center towers were brought down by the, you know, the Bush
00:10:51.220
administration. Um, uh, you didn't know that that was made by some 18 year old in his mother's
00:10:59.860
basement and you didn't have to know that you were just looking at the product online. But if you had
00:11:04.540
had to meet this person, all of a sudden you'd realize that the maintenance
00:11:09.840
of this fiction becomes quite a bit harder. Um, so we're living now, I think in the second generation
00:11:16.220
of that moment, where it really is bottomless. I mean, the ocean of misinformation
00:11:23.400
and half truth, uh, and misunderstanding is bottomless and the tools we have built to rectify
00:11:31.420
misunderstandings and to spot lies and to, um, be, be better truth seekers are there, but
00:11:39.680
in some sense this is asymmetric warfare. They're
00:11:46.760
no match for the information waste product that can be produced
00:11:57.160
more quickly. Right. I mean, this is just the, the old problem.
00:12:00.380
Well, it's easier to produce noise than signal, obviously.
00:12:03.840
Yeah. Or, or pseudo signal. Yeah. I mean, there's so much that purports to be signal. Right. I mean,
00:12:09.560
and again, this is probably socially more inconvenient for you
00:12:16.600
than it is for me, but I mean, many of your bedfellows or former bedfellows, uh, are, are
00:12:23.100
the, the principal, uh, parts of this problem. I mean, they're the, they're the gods and goddesses
00:12:29.660
on this landscape. I'm thinking of someone like Candace Owens, who's, you know, quite literally
00:12:34.100
trafficking in blood libels now on her incredibly popular podcast. I mean, she's just gone berserk
00:12:40.700
as far as I can tell. And yet, what is the style of conversation
00:12:47.380
that would disconfirm all of that for her audience? At this point, I don't know, because I think what,
00:12:53.940
what's happened is we've trained up a culture of people, or cultures of people,
00:12:58.900
uh, they simply don't care about facts really. They, they, they want a story that aligns with their,
00:13:08.940
uh, on some, on, in some sense, their confirmation bias. I mean, they, they, they have certain things
00:13:13.840
they want to believe. There's certain ideas they like the taste of, and then they just want people
00:13:18.980
catering to that appetite. And, and there's a good business in that. Well, part of that, I think, is
00:13:26.840
the consequence of the fact that we have to ground our perceptions in an axiomatic framework. And
00:13:36.600
I mean, this has been my concern with the primacy of the story right from the beginning. And I think
00:13:42.900
the deeper question is, you know, is there some necessary
00:13:52.580
structure to that fundamental axiomatic framework? You know,
00:13:59.340
the fundamental postmodernist claim is that there is no uniting metanarrative,
00:14:05.260
right? We live in the postmodern world. Now, the postmodern world is a place of local truths,
00:14:10.480
and the French intellectuals not only decided that; they decided that
00:14:16.140
it was necessary and an improvement. And now we see the consequences of that. We're in a
00:14:22.260
landscape of infinite narratives. And the question is what, how do you, how do you, um,
00:14:29.240
how do you define a rank order of narratives such that some are valid and some are invalid?
00:14:36.620
You know, the idea of misinformation is obviously predicated on the notion that
00:14:40.180
certain narratives are invalid. And that seems self-evident to me. Um, I wouldn't exactly call
00:14:47.000
myself a fan of the direction that Candace Owens has decided to walk down, but I'm not going to say
00:14:51.720
anything more about her. Um, and, and so, you know, what I've been trying to struggle with is,
00:14:58.100
and this has been the basis of many of our discussions in, in the final analysis is what
00:15:04.980
is the proper grounding for a narrative framework? And I mean, you're, my understanding of your
00:15:11.740
position is that, um, that that's why you've turned right from the beginning to the world of
00:15:18.840
objective fact, so to speak. But the problem is that there are a lot of facts, and which ones to
00:15:26.980
prioritize and which ones to ignore is a very thorny question. And, you know, one of the things you
00:15:33.540
referred to obliquely was that, well, when you and I were young, cause we're about the same age,
00:15:39.560
I think you're four years younger than me. Um, we had narratives that united us as a culture.
00:15:45.800
There was a certain, well, there were fewer people, there was more ethnic homogeneity,
00:15:54.780
um, at least in the local environments in the world. There was, there were, um, information
00:16:01.980
brokers that were extraordinarily powerful. The universities, the newspapers, the, uh, the TV
00:16:10.340
stations, the radio stations, and they weren't very easy to get access to, and they had gatekeepers.
00:16:18.020
And at least some of the time, those gatekeepers seemed meritorious as well as arbitrary.
00:16:25.080
And, you know, it, it could easily be that the fragmentation of the landscape is a consequence
00:16:31.460
of technological revolution. And also perhaps of the, well, you, you had pointed to the
00:16:40.980
irresponsibility of the participants in that landscape.
00:16:45.280
What if I told you there's a tiny nutrient missing from your body that could potentially
00:16:48.960
change everything about how you feel? Well, if you've ever wondered why you're feeling sluggish,
00:16:53.020
sleeping poorly, or aging faster than you'd like, the answer might be simpler than you think.
00:16:56.920
That's where fatty 15 comes in. Our cells need essential nutrients to stay healthy. And most
00:17:01.380
of us are deficient in one critical one: C-15. Fatty 15 is a science-backed, award-winning,
00:17:06.640
pure vegan-friendly C-15 supplement with just one ingredient. And it has three times more cellular
00:17:11.800
benefits than omega-3 or fish oil. Plus it's free from flavors, allergens, and preservatives.
00:17:17.260
C-15 is the only ingredient in fatty 15, a hundred percent pure. And unlike fish oil supplements
00:17:22.280
that oxidize quickly, fatty 15 naturally resists breakdowns, both in bottle and in your body.
00:17:27.560
Fatty 15 works by replenishing your cells with C-15, which repairs cellular damage,
00:17:32.560
boosts energy production, and activates your body's natural repair mechanism for better sleep,
00:17:37.220
mood, metabolism, and heart health. Plus it comes in a beautiful, reusable glass and bamboo jar with
00:17:42.540
eco-friendly refills delivered quarterly. Fatty 15 is on a mission to replenish the C-15 levels in
00:17:47.520
your body and restore your long-term health. You can get an additional 15% off their 90-day
00:17:51.720
subscription starter kit by going to fatty15.com slash Peterson. Fatty 15,
00:17:56.400
essential nutrition for healthier cells and a healthier you.
00:18:01.440
I mean, I think it's also, or even more primarily that they're,
00:18:05.200
they're flooded with information and finding it very difficult to keep up.
00:18:10.860
Well, they're also just not disposed to function by the old norms of the gatekeepers, who, I mean,
00:18:16.860
for all their faults, had standards, right? I mean, the New York Times-
00:18:21.700
But Sam, I agree with you. But I also would say that those institutions,
00:18:29.480
the gatekeeping institutions have also revealed themselves as catastrophically flawed in the last
00:18:37.760
five to 10 years. I mean, I'm interested in your take on this. Like you brought up October 7th and
00:18:44.020
the rise of antisemitism. And I've been tracking that with a couple of friends of mine. And we've been
00:18:49.580
spending a lot of time fighting it off in all sorts of ways, some of which are public and some of which
00:18:58.200
aren't. And I'm appalled by it. What's happened in Canada on the antisemitic front since October 7th
00:19:06.660
is something I never thought I'd see in my lifetime. It embarrasses me to the core.
00:19:12.420
My goddamn government, those bloody liberals, came out the other day and talked in the
00:19:19.480
aftermath of October 7th about combating Islamophobia, as if that's Canada's problem, which it isn't.
00:19:26.740
And then, you know, you saw what happened across the United States and Canada
00:19:33.980
with regard to the universities, Columbia University in particular, and their absolute
00:19:40.380
silence and complicity while these terrible demonstrations were going on. Not that I think
00:19:47.980
that the demonstrations themselves should have been, well, we can talk about that.
00:19:53.020
Um, letting terrorist radicals take over the universities doesn't strike me as a very good
00:19:59.300
solution. So I'm curious about what you think about that, because, well, like,
00:20:06.360
I think the gatekeepers have abandoned the gates. Like, I don't trust anything
00:20:11.640
the New York Times prints at all. I think they're reprehensible. The universities, I think, are beyond
00:20:17.760
salvaging. I can't see how they can be fixed. Anyways, man, lay it out. Tell me what you think.
00:20:23.000
I think those, all the way up until those last two statements, I can sign on the dotted line. I
00:20:28.660
think all of these institutions have embarrassed themselves in recent years. And for
00:20:34.720
the reasons that I think you and I would fully agree about. You know, this became
00:20:40.040
most obvious during COVID, but, you know, October 7th is more of the same.
00:20:47.000
But I would just point out that the antidote to that, to the failures of institutions,
00:20:53.460
is not new standards. It's really to apply the old standards. I mean, we need the
00:21:01.160
institutions. Spoken like a true conservative. Yeah. Yeah. Yeah. Yeah. Fine. Well, I mean, so it's,
00:21:07.000
no, no, but the antidote to failures of science, say, you know,
00:21:12.060
or scientific fraud, is not something other than science. It's just more science,
00:21:17.700
real science, good science, science, scientific integrity. And so it is with journalism or any
00:21:22.900
academic discipline or any, anything that purports to be truth-seeking, we have standards
00:21:27.000
and there's nothing wrong with our standards. What's dangerous about
00:21:32.380
the current information landscape where we have just this, this contrarian universe where anything
00:21:38.780
that is outside the institutions is considered to have some kind of primacy, right? Where everyone
00:21:44.480
is kind of a citizen journalist, a citizen scientist, where you just, you just kind of flip the mics on
00:21:50.000
and talk for four hours. And that's good enough. What's, what that's selecting for are the people who
00:21:56.580
have no standards to even violate, right? I mean, these are, these people are, are incapable of
00:22:02.100
hypocrisy. I mean, that was one, the one thing that's good about the New York Times and Harvard
00:22:06.960
and any other institution you would point to that has, has, you know, obvious egg on its face
00:22:12.700
at the moment is that at a minimum, they're capable of, of, of being shamed by their own hypocrisy.
00:22:20.320
And then there are the people who aren't. I would agree with you that there's been some
00:22:23.980
institutional capture where we have people in those institutions who just shouldn't be there,
00:22:28.140
right? But there, but we would make that judgment again, by reference to these old
00:22:32.820
standards of, of academic or journalistic integrity. But Candace Owens just doesn't have
00:22:38.320
that, right? And I, you know, I'm sorry to beat up on her exclusively. I can, I can move to other
00:22:42.060
names if you want. But I mean,
00:22:48.280
the reason that I'm not inclined to discuss her isn't because I agree with what she's doing.
00:22:52.960
It's because I think the best way to deal with what she's doing is not to discuss her.
00:22:56.980
Notice her. Okay. But I could say the same thing about Tucker Carlson, right? And you might,
00:23:01.840
whether you agree with me or not, this is my view of him, that he's not in the truth-seeking
00:23:06.780
journalistic integrity business. He's got some other political project
00:23:12.000
that entails spreading a fair amount of misinformation quite cynically and, and, and consciously
00:23:17.680
and smearing lots of people. And in the case of, you know, I don't know how deep his anti-Semitism
00:23:23.600
runs, but in the case of, of that particular topic, midwifing a, a very misleading conversation
00:23:29.880
with an amateur historian who he considers the greatest historian working in America today,
00:23:36.140
Daryl Cooper, the podcaster. And, you know, the opinion expressed, again, this is like,
00:23:42.920
this is at the highest possible level in our information ecosystem to the largest audience.
00:23:47.680
You know, few historians in human history have ever had a bigger audience than Daryl Cooper had on
00:23:53.560
Tucker's podcast, and then quickly followed by his appearance on Joe Rogan's podcast, right?
00:23:58.700
And on that podcast, he spread the lie, you know, the recycled David Irving
00:24:04.500
point that, you know, the Holocaust is not at all what it seemed and, you know, and you wouldn't believe
00:24:11.560
it, but the, the Nazis really never intended to kill the Jews. They just, they just rounded up so many
00:24:16.740
prisoners in their concentration camps and found that, that they just didn't have enough food during
00:24:21.080
winter to feed them. And they just were put in this just impossible situation. And, and might,
00:24:26.560
might it not seem more compassionate to euthanize these starving prisoners in the end,
00:24:31.300
right? I mean, that's how they accidentally stumbled into the Final Solution,
00:24:35.700
right? That's, that's what he spread again to the largest possible audience. And in Tucker's case,
00:24:42.580
you had a very, I would say, you know, sinister midwifing of, of that conversation. In Joe's case,
00:24:49.920
he just doesn't know when he's in the presence of recycled David Irving and is, and is just happy
00:24:56.160
to have a conversation with a podcaster of whom he's a great fan. And, but yet he's still culpable
00:25:02.900
for not having done enough homework to adequately push back about what's being said to his, again,
00:25:08.960
to his audience, which is the largest podcast audience on earth. So it's, it's journalistically,
00:25:15.580
and I know Joe doesn't consider himself a journalist; he considers himself a comedian
00:25:19.260
who's just having fun conversations. Great. But what, what that is tantamount to at this moment,
00:25:25.080
especially in the context of the worst eruption of antisemitism we've ever seen in our lifetimes
00:25:30.840
globally, that's tantamount to taking absolutely no responsibility for the kind of
00:25:37.320
information that is flowing unrebutted into the ears of your audience, right? That's why I got angry
00:25:42.940
at, at Joe, right? And I love Joe. Joe is a great person. He's completely in over his head on topics
00:25:49.120
of that sort. And it has a consequence. It has an effect.
00:25:52.460
Well, you know, one of the, one of the problems, I suppose, in some ways, Sam, is that in this new
00:26:01.700
information landscape, we're all in over our heads. Yeah, but some of us are alert to that
00:26:09.160
possibility and worried about it and taking steps to course correct and, and notice our errors and
00:26:14.860
apologize for those errors. Okay. Okay. Well, let's also try to make a distinction
00:26:20.000
here, you know, I'm, I mean, there is a distinction that's important to make between
00:26:26.340
accidentally wandering into pathological territory, you know, and, and causing disruption
00:26:35.080
because of the magnification of your voice. And there's a big difference between that and exploiting
00:26:43.520
the fringe for your own self-aggrandizement. And there's plenty of the latter online. And I'm,
00:26:52.860
I've been concerned for some substantial amount of time that online anonymity also drives that. I mean,
00:27:00.880
you talked about the utility of embodied interaction in separating the wheat from the chaff,
00:27:08.620
right? So one of the things you see online is, as you pointed out, if you have a crazy idea,
00:27:13.960
you can find 300 other people who have an even crazier idea of the same sort, and you can get
00:27:19.980
together with them, which you couldn't have done 20 years ago, because there's only one of them per
00:27:24.500
hundred thousand scattered all around the world, but they can aggregate together quite quickly online.
00:27:30.400
The places where females gather online, for example, are rife with that kind of pathology, and
00:27:36.780
all sorts of psychogenic epidemics spread, um, without any barrier whatsoever in consequence,
00:27:44.460
because young women in particular are susceptible to psychogenic epidemics. And so that's a huge
00:27:50.760
problem. It's also the case that in real world conversation, um, if I'm talking to you, you know,
00:27:59.300
it's me and I have to live with the consequences of what I've said to you. Um, assuming we ever meet
00:28:06.020
again, and I have to live with the fact that other people hear about it as well, but if I'm, if I'm
00:28:10.760
anonymous, then I can say whatever the hell I want. I can gather the, um, the fruits of that, and I can
00:28:19.460
dispense with any of the responsibility. And so my sense is that online connectivity magnifies our voice
00:28:27.340
to a degree that it's virtually impossible to be responsible enough to conduct ourselves appropriately
00:28:34.640
because the reach is just so great. And anonymity, anonymity literally, um, gives the edge to the
00:28:44.920
psychopaths, the predators, and the parasites. And this is a huge problem. You know, we could
00:28:50.580
think about it as biologists for a moment, Sam. I mean, I would say two things. When the cost of
00:28:56.580
communication is zero, the parasites swarm the system, right? Because the communication is a resource
00:29:04.500
and abandoned resources attract parasites. And what is it now? 50% of internet communication is bots.
00:29:14.720
And a huge part of the reason for that is that communication is free, but it's not free, right?
00:29:19.720
Because you have to attend to it. It actually has a cost. So the price of free is the wrong price.
00:29:26.000
You know, let me give you an example of this. Just tell me what you think about this. You know,
00:29:31.080
one of the things I've done recently with my daughter and, and, and her husband, mostly,
00:29:37.980
and a bunch of professors is start this Peterson Academy. And we have an online social media
00:29:44.240
element to that, which tracks about 15,000 regular users. And we keep a pretty close eye on it.
00:29:53.480
And we refunded the money of 10 of our students, because they were causing trouble on the social
00:30:02.660
media platform. 10 out of 15,000. That's all. And it markedly improved in their absence.
00:30:10.900
And so, you know, there's, there's, there's an interesting dynamic there. You know,
00:30:15.960
we don't know what online anonymity does. We don't know what free communication does when the actual
00:30:22.600
price isn't zero. It certainly serves the parasites extraordinarily well. And
00:30:32.780
we, we are learning that bad information is easier to generate and spread than good information,
00:30:38.760
right? None of this is personal, right? I know we've already talked
00:30:44.620
about the fact that all of this, all of this, what would you say, edgy conversation can be monetized
00:30:53.380
and used to attract attention towards bad actors. Let's leave that aside. I agree with that completely.
00:31:00.000
I think it's appalling, but there are structural problems here that are even deeper.
00:31:03.620
You know, and I think, well, anonymity is a huge problem, but then also I think, well,
00:31:09.440
what kind of world would we rapidly define and live in if
00:31:15.580
every bloody thing that you had to say online was verified with a digital identity? I mean,
00:31:21.220
they've taken a lot of steps in that direction in China. That doesn't look very good to me.
00:31:25.680
Well, I think the structural problems run even deeper, because I agree with everything
00:31:31.140
you said about the effect of free and the effect of anonymity. And I, you know, I draw two lessons
00:31:35.880
from your experience with your online forum. One is that having it behind a paywall
00:31:42.700
made it, made it much cleaner than it otherwise would have been. You only found 10 people you had
00:31:48.120
to kick out to clean the whole thing up. But the other point is that those 10 people can have,
00:31:54.280
really have an, an outsized toxic influence on a, on a, a larger culture. So, I mean, that's,
00:32:01.080
I think we, we want social media platforms that, that draw that kind of lesson, but it's not just
00:32:07.540
anonymity and it's not just, uh, people who are grifting or, you know, are otherwise incentivized to
00:32:13.960
be liars or, or, um, uh, spreaders of misinformation. There are people who, with reputations,
00:32:23.080
you would think they would want to protect. I mean, people with, like, the biggest
00:32:28.200
possible reputations and the biggest possible careers who in the presence of social, social media
00:32:35.320
have gone properly nuts. And I, I would, you know, put as patient zero for this, uh, contagion,
00:32:43.960
uh, Elon Musk, right? I mean, Elon has, you know, I've witnessed a complete unraveling
00:32:49.720
of the person I knew, and I believe I knew him fairly well, uh, under the pressure of
00:32:57.080
extraordinary fame and wealth, but, but really, you know, kind of weaponized by his addictive
00:33:04.040
entanglement with Twitter. I mean, he was so addicted to Twitter that he needed to buy it
00:33:09.200
so that he could just live there. Right. I mean, Twitter was his whole life
00:33:14.480
before anyone heard about his impulse to buy it, or anyone heard about his concern about
00:33:20.400
the woke mind virus. I mean, before COVID, he had gone off the deep end into Twitter
00:33:26.840
being everything. How do you know this? Like,
00:33:32.880
I know this because I was his friend at the time, and I was there, you know, in his
00:33:43.360
very close social circle when, you know, Twitter was causing obvious problems for his life and his
00:33:49.360
businesses. When he would tweet, you know, you know, 420, you know, funding, funding secured,
00:33:55.120
right, and the SEC, you know, raided the offices of Tesla and seized
00:34:01.040
everyone's computers. Right. I mean, he was screwing up his life
00:34:05.960
through Twitter and yet it was unthinkable that he would get off of it. So, so potent a drug was it
00:34:14.160
for him. Let me ask you about that. Let's think about this biologically again. One of the ways
00:34:21.020
you could define addiction is as the pursuit of positive emotion that's bound to a very short
00:34:29.760
timeframe. So you get addicted when you optimize positive emotion over a very short timeframe.
00:34:35.980
So, so for example, um, the addictive propensity of cocaine is dependent on the dose, but also the
00:34:45.000
rate of administration. So the reason that snorted cocaine or injected cocaine is more potent than
00:34:52.420
the same dose of swallowed cocaine is that it crosses the blood-brain barrier faster
00:34:59.160
and raises the dopaminergic pitch quicker. So there's a rate effect. And also, the reward
00:35:07.580
component appears to correlate subjectively not with the peak in actual pleasure from the resulting
00:35:18.160
stimulus, but with the peak of the expectation that the pleasure is about to arrive.
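The short-timeframe point can be made with a toy calculation. The sketch below is purely illustrative, with invented numbers rather than anything measured here: under exponential discounting, shortening the effective horizon (a steeper per-second discount) flips the choice from a large, slow reward to a small, fast one, and a faster route of administration is simply a smaller delay.

```python
# Toy illustration (not a neuroscience model): exponential discounting of a
# delayed reward. With a short effective horizon (steep discounting), the
# fast, small reward dominates the slow, large one.

def discounted_value(reward: float, delay_s: float, gamma_per_s: float) -> float:
    """Value of a reward discounted exponentially by its delay in seconds."""
    return reward * (gamma_per_s ** delay_s)

# Hypothetical numbers, chosen only to show the shape of the argument.
fast_small = (1.0, 5)      # small hit, arrives in 5 seconds
slow_large = (10.0, 600)   # ten times bigger, arrives in 10 minutes

for gamma in (0.999, 0.99):   # longer vs. shorter effective horizon
    v_fast = discounted_value(*fast_small, gamma)
    v_slow = discounted_value(*slow_large, gamma)
    print(f"gamma={gamma}: fast hit={v_fast:.3f}, slow payoff={v_slow:.3f}")

# gamma=0.999: the slow, large payoff still wins (about 5.5 vs. 1.0)
# gamma=0.99:  the fast, small hit wins (about 0.95 vs. 0.02) -- the regime
#              the conversation describes as addictive, where delivery rate
#              matters more than the size of the reward
```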
00:35:23.700
Yeah. Yeah. Well, the dopaminergic system is an expectation system. Okay. So now,
00:35:29.180
here's what we have with social media, with the bots, with the AI algorithm
00:35:35.580
optimizers, right? This is what's happening, and you can see it happening to YouTube too:
00:35:39.940
the systems are optimized to grip attention, but the battle is for shorter and shorter,
00:35:49.960
what would you say? For shorter and shorter durations of attentional focus. So the battle
00:35:56.120
is not only for attention, but for the shortest possible amount of information that will grip the
00:36:01.620
maximum amount of attention. Now the AI systems are using reinforcement learning to determine how to
00:36:08.520
optimize that. And that's driving that fragmentation. Like you can see it on YouTube
00:36:12.760
because YouTube is tilted more and more towards shorts like TikTok, right? These fragmentary bursts
00:36:19.500
of maximally attractive information. And they can capitalize on rage, because rage has a positive
00:36:25.100
emotion element.
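What "using reinforcement learning to optimize attention" amounts to can be sketched in a few lines. This is a deliberately toy simulation rather than any platform's actual recommender, and the completion probabilities are invented for illustration: a bandit-style policy rewarded only for whether a viewer finishes a clip drifts toward recommending the shortest clips almost exclusively.

```python
import random

# Toy sketch of an engagement-optimizing recommender (illustrative only; the
# completion probabilities are assumed, not measured from any platform).
# Reward = "did the viewer finish the clip?", a short-horizon signal.

CLIP_LENGTHS = [15, 60, 600, 3600]                         # seconds
P_COMPLETE = {15: 0.9, 60: 0.6, 600: 0.2, 3600: 0.05}      # assumed audience behavior

value = {length: 0.0 for length in CLIP_LENGTHS}           # running reward estimates
count = {length: 0 for length in CLIP_LENGTHS}
EPSILON = 0.1                                              # exploration rate

for _ in range(20_000):
    if random.random() < EPSILON:
        choice = random.choice(CLIP_LENGTHS)               # explore
    else:
        choice = max(CLIP_LENGTHS, key=lambda c: value[c]) # exploit current best
    reward = 1.0 if random.random() < P_COMPLETE[choice] else 0.0
    count[choice] += 1
    value[choice] += (reward - value[choice]) / count[choice]   # incremental mean

print(count)   # the 15-second clip absorbs nearly all recommendations
```

Nothing in that loop can see the value of longer engagement; a reward signal defined over seconds selects for content that fits into seconds.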
00:36:29.780
Now I want to put this into the context of what you said about Twitter, and you and I could have a conversation about X and Twitter that's personal as well. So you said,
00:36:36.460
you know, Elon got hooked on X, enough to buy it. And so let's assess that
00:36:44.880
situationally and biologically. Now I've spent quite a bit of time on X. In fact, it's the social media
00:36:51.860
platform that I've used personally the most. It's the one I'm most familiar with. Um, and I would say
00:36:59.400
it's been a very, it's very complex platform for me. There's been some concerning research about the
00:37:07.360
true safety of the abortion pill. That's worth discussing. A recent report suggests that serious
00:37:11.820
adverse effects from the abortion pill may be more common than previously understood, potentially
00:37:16.220
affecting around 11% of patients, according to their findings. Given that the abortion pill now
00:37:21.020
accounts for about 60% of all abortions in the U.S., with roughly a million procedures annually,
00:37:26.060
this could impact tens of thousands of women each year. This raises important questions about how we
00:37:30.900
approach reproductive healthcare. Organizations like the Preborn network are taking a different
00:37:34.980
approach. They reported helping over 67,000 women last year by providing comprehensive support that
00:37:40.480
addresses both physical and emotional needs while also offering spiritual guidance through their faith
00:37:45.320
based perspective. What's interesting is they're finding that when women have the opportunity to see
00:37:49.580
their ultrasound and hear their baby's heartbeat, it increases the likelihood that they'll choose to
00:37:54.140
continue their pregnancy. They've structured their program so that a single ultrasound costs just
00:37:58.320
$28 and $140 can help support five women and their babies through their decision-making process.
00:38:04.380
To support Preborn's important work, you can donate by texting BABY to pound 250 or visit
00:38:08.940
pre-born.com slash Jordan. All contributions are tax deductible.
00:38:15.980
Um, yeah. Hasn't it at various points, uh, convinced you that you should no longer use it?
00:38:20.760
Haven't you gotten on and off? Multiple times, multiple, multiple times, multiple times.
00:38:25.620
I learned that lesson exactly once, but it really did stick. I have not looked back.
00:38:29.960
Yeah. Well, that's partly what I want to talk to
00:38:33.420
you about. I mean, so part of it is, you know, I get a lot of my podcast guests and my ideas for
00:38:39.860
podcast guests from X, because I follow about 2,000 people. But I'm very extroverted, and
00:38:49.000
there's an element of impulsivity that goes along with extroversion. I'm very verbally fluent. And so
00:38:54.520
I can think up new ideas in no time flat and I'm likely to say them. And so it's very easy for me
00:39:01.400
if I'm on X to react to a lot of things. Yeah. And so, foot meets mouth. Well, but it's
00:39:10.300
weird. It's a weird thing because some of the things that some of my impulsive moves, so to speak,
00:39:17.000
which have got me in quite a lot of trouble, I'm not the least bit unhappy about, you know, um, I got,
00:39:24.320
you cannot believe how much flack I got for, um, tweeting out something arguably careless on October
00:39:32.900
8th. What was that? Not being on Twitter, I never saw it. What was it?
00:39:39.940
I think I said, "Give him hell, Netanyahu." Yeah. Yeah. Right. So that took like eight months
00:39:48.220
of cleanup work to deal with. Seriously. And,
00:39:55.740
well, and I got kicked off X. Yeah. You're not going to get any dispute from me about that. I mean,
00:40:00.840
just to close the loop on that, Netanyahu is obviously a very polarizing
00:40:05.760
figure and probably a fairly corrupt figure. And he's, he's got lots of problems that have
00:40:11.400
implications for Israeli politics, but I'm not convinced that even the perfect prime minister
00:40:16.480
who has no optical problems, judged from our side, would have waged this war any differently. I mean,
00:40:23.140
I just don't know what they should have done differently at every stage
00:40:27.960
along the way. And I don't know that any other prime minister would have, uh, taken a different
00:40:31.980
path. Well, the situation to me looks like, and you tell me what you think about this and then
00:40:36.320
we'll go back to the problem of AI optimization of the grip on short-term attention and
00:40:44.760
the manner in which X in particular falls into that category. So my sense with, with the situation
00:40:50.660
Israel was, has been right from the beginning that Iran in particular would and has set up the
00:40:59.640
situation. So if every single Palestinian was sacrificed in the most torturous possible manner
00:41:06.160
to irritate, annoy and destroy Israel and agitate the Americans, that would be 100% all right with
00:41:14.380
Iran. I think someone once said that the, the mullahs in Iran will fight Israel to the last Arab.
00:41:21.620
I think that's the line that, yeah, yeah, captures it. Yeah. Well, okay.
00:41:25.240
That's exactly how it looks to me. And so I look at that situation and I say,
00:41:30.060
well, I think, well, like what, what do you do in a situation like that? That's moral. If you're
00:41:36.440
Israel anyways, I don't want to go down that rabbit hole too deeply, but that's, but that's, yeah. Yeah.
00:41:41.660
Yeah. Well, but that, but okay. But so I've had this like complex relationship with X and some of it's
00:41:48.960
been real useful because I follow a lot of people there and I keep an eye on the main streams of the
00:41:53.920
culture and I extract out my podcast guests and I can see where the real pathology is emerging
00:41:59.600
and I can keep an eye on it. And the price of that is that, you know, now and then I stick my foot in
00:42:05.020
it in a major way. And sometimes that's good and sometimes it's not. And, and now I've sort of built
00:42:11.560
a variety of fences around me that are part of my organization that, you know, there are, there,
00:42:18.980
they're kind of these intermediary structures that we've been talking about that put a lag in
00:42:26.740
between what I read and how I respond. Well, you know,
00:42:33.260
it's the destruction of those things that you and I are starting to
00:42:38.680
talk about here because, you know, it's, there's never been a time in human history where you could
00:42:44.760
publish your first pass opinion about anything to 20 million people in one second, right? No one
00:42:54.760
could ever do that. And we're not neurologically constructed to live in a
00:43:03.180
world where you can yell at 10 million people whenever you want about anything.
00:43:08.640
Yeah. The problem for me is that, so what's happened now going back to this, this core topic
00:43:14.260
of, of what, what in particular is wrong with X and the time course at which people are reacting
00:43:21.700
to information and producing information in turn. There's a lot wrong with that. And it's
00:43:30.740
what it's done to our culture, and it's what it's done to specific people. I mean, again,
00:43:34.700
Elon for me is the enormous, 800-pound canary in the coal mine. It has,
00:43:41.420
you know, effectively made them behave like psychopaths. I mean,
00:43:46.980
if you just look at X, and this is what convinced me to get off of it,
00:43:51.560
you would think there were many more psychopaths in the world than there are. In fact, I was seeing
00:43:56.980
people who I knew in any, every other context would be psychologically normal or at least normal
00:44:03.260
enough, behave like psychopaths toward me, in front of me. And in some cases,
00:44:10.820
these are people I actually knew. In some cases, these are people I had
00:44:14.000
had dinner with, and I knew that what I was seeing on X would have been impossible
00:44:20.680
across the table from me at dinner. Um, right, right, right. And so that's, that's an interesting,
00:44:27.860
interesting definition of, of a pathological sub-environment, isn't it? Like you can tell
00:44:33.620
a family is pathological when the rules that apply in the family don't generalize to the outside world.
00:44:41.380
And you're pointing out that the game dynamics of
00:44:51.360
It's that the game that's being played in Twitter doesn't suit the world well. It's not an iterable
00:44:58.000
game in the world. And it could easily be that the fact that it maximizes for short-term
00:45:04.140
emotional reactivity is exactly what gives it that psychopathic edge, because the definition of a
00:45:11.920
psychopath in many ways is the person who will sacrifice the future and you for immediate
00:45:18.280
gratification, right? That's the pathology of it. Psychopathy is a form of
00:45:25.280
extended immaturity. Yeah. Well, there, there, there's a lot of aggressive immaturity on display
00:45:31.460
on X. And again, Elon is one of the primary offenders. I mean, so I mean, the, the, the one
00:45:37.680
instance for me that made this especially clear and the, and the role played by X especially clear
00:45:44.640
was when he jumped up on stage during one of these campaign events,
00:45:49.960
or, I forget if it was a campaign event, or I guess the election had already been
00:45:55.660
won, but some, some event with Trump and Elon, you know, quite famously, quite infamously did
00:46:01.640
what appeared to be a Nazi salute, twice, to the crowd, and got a reaction from
00:46:09.640
much of the world of horror and insult. And now, honestly, you know,
00:46:17.380
as his former friend and as somebody who just imagines he, he, his worldview has not, you
00:46:22.980
know, fully disintegrated into a tissue of weird internet memes.
00:46:31.240
Um, it's, it was impossible for me to believe that he was sincerely announcing his, his solidarity
00:46:40.820
with the project of Nazism by making those salutes, right? So I didn't
00:46:46.220
view those as Nazi salutes, even though just ergonomically they were in fact Nazi salutes.
00:46:53.200
Um, I just thought, okay, I don't know what he's doing, but the idea that he's picking this
00:46:58.820
moment to say "I'm a Nazi" seems frankly impossible. So I was interested to see what
00:47:07.600
he was going to do in response to the controversy, what he did in response. I mean, and, and again,
00:47:12.660
this controversy is coming in a, in a context that doesn't look at all good for my very, um,
00:47:20.420
charitable interpretation of his behavior, because it's in a context where he's funding the far
00:47:25.100
right party in Germany, uh, assuring us that there's absolutely nothing wrong with that
00:47:30.200
party. Whereas the party does in fact contain whatever Nazis there are to be contained in
00:47:35.180
Germany. Uh, not that it's only a Nazi party, but it is in addition to everything else. It's
00:47:39.780
got the Nazis. Um, it's, uh, he's, he's playing footsie with lots of, you know, fairly, um, aggressive
00:47:49.380
anti-Semites on his own platform. With great fanfare, he brought back Nick Fuentes and Kanye,
00:47:55.160
and these people are, you know, anti-Semites, if not actual Nazis. Um, so he's, he, he is facilitating
00:48:03.960
a very unhappy, uh, recrudescence of anti-Semitism on the platform he owns. Uh, and now he's doing Nazi
00:48:13.640
salutes in public. So what does a genuinely non-anti-Semitic, well-intentioned
00:48:21.460
person who cares about his reputation and is still capable of embarrassment do in the aftermath of
00:48:27.600
this? Well, it would have been just trivially easy for him to have said something totally sensible,
00:48:35.100
uh, and, and apologetic that would, would have been honest and would have taken the sting out
00:48:42.780
of the moment perfectly. He could have said, listen, I know how that looked. I don't know what
00:48:47.480
I was doing up there. I was just, you know, captured by the energy of the moment. Obviously
00:48:51.840
I was not doing a Hitler salute. I'm not a Nazi. I've got no, uh, no interest in, in amplifying
00:48:59.520
their message on X or anywhere else. If you're a Nazi, please don't follow me. I hate your whole
00:49:04.560
project. Uh, you're completely, you're completely wrong about everything, right? End of tweet, right?
00:49:10.220
He did nothing like that. All he did was troll his audience, making Nazi jokes and puns on
00:49:18.540
X that I, so you can fault his character for that. But what, but what I also think we should
00:49:25.260
fault is the medium itself, right? This is the way his brain is conforming to the technology.
00:49:32.280
Yes. Well, look, you, you know, you know, the fundamental attribution error is like the
00:49:38.440
one thing social psychologists have discovered that's actually valid. That's a bit of an
00:49:43.240
exaggeration, but, yes, a dozen things. The fundamental attribution
00:49:50.020
error is the proclivity to attribute to character what's actually a consequence of
00:49:54.600
the situation. You know, we should be very careful, and I think we are at the moment being
00:50:00.240
very careful, to ensure that our first presumption is that it's the pathology of the technology.
00:50:06.080
That's the fundamental driver. And that people are swept along in it.
00:50:10.500
That's, that's my account of what has happened to Elon almost in its entirety. I think, I think,
00:50:17.260
you know, he is the greatest living casualty of what Twitter does
00:50:25.300
to someone who becomes properly engorged by it. And, yeah. So one of the
00:50:33.740
reasons why I got off, frankly, was apart from my own misadventures on the platform, which were
00:50:38.260
nothing like Elon's, I, I looked in the kind of the funhouse mirror of what was happening to him
00:50:47.020
in his life. And I thought, you know, here's a, here's a very smart guy who's got much better
00:50:52.140
things to do than fuck up his life in this way. And yet he can't seem to stop. How much, how much
00:50:59.020
am I like him? How much, how much is there this component of addiction and dysregulation and, and
00:51:05.280
failures of impulse control and a need to just, you know, get, get my thoughts out on a time course
00:51:13.260
of seconds rather than more carefully, you know, over the course of days?
00:51:19.960
And so then I yanked it, for that reason. And the one thing I found is that when you
00:51:24.260
don't have it as an outlet, right, when you literally can't publish that quickly, then things have to
00:51:31.320
survive a much larger informational half-life. So then there's this thing online that happened that
00:51:38.140
I'm tempted to react to, it has to survive until I do my next podcast, which might not be for three
00:51:44.020
or four days. Right. And so, and, and, and, you know, obviously 90% of the things I thought I had
00:51:49.120
to react to don't survive that, that time course. Yeah. You know, I made a deal with my wife
00:51:54.220
going sideways, I think with a fair degree of accuracy and that disrupts me emotionally now and
00:52:10.140
then. And I made a deal with my wife several years ago that I can't complain about anything I won't
00:52:17.460
write about. Right. Well, that's, well, it's the same thing and it, it bears on the same issue that
00:52:24.660
you're describing is that if it's not important enough to, to write about, then you should ignore
00:52:34.260
it. Right. It's not significant enough to sacrifice some
00:52:41.780
genuine time and thought. You, you shouldn't be commenting on it. And that, that's, that's,
00:52:49.380
that's kind of a maturity, but it's also, it's a, it's a weird thing because it's not exactly like,
00:52:57.500
it isn't something that people had to contend with previously because you couldn't publish
00:53:02.200
immediately. There was, there were barriers of cost and difficulty and gatekeepers and, and distribution.
00:53:09.320
And so that wasn't something you had to think up for yourself. Like, how do I put a lag in my life
00:53:16.900
before I communicate with a million people or 5 million people? And so you're, you're basically
00:53:22.360
building these inhibitory structures out of whole cloth. And, and now you, you pulled out of Twitter
00:53:31.300
quite a while ago now; it's a couple of years ago. Yeah. Right. Okay. So two and a half
00:53:37.220
years, something like that. Yeah. Well, it was actually, it was actually right when Elon took it
00:53:40.720
over, but it wasn't because he took it over. I mean, that, the timing there was, was fairly
00:53:45.360
accidental. I was, I was getting ready to pull the plug. And then I just saw how much chaos was being
00:53:52.920
introduced into his life around it. And I just thought, all right, this is, this is a sign.
00:53:56.680
And so I, I yanked it. And I mean, one of the benefits, apart from just this introducing this
00:54:06.080
different time course into my life by which I interact with information, is this.
00:54:13.300
You know, there's this phrase that Twitter isn't real life. And then
00:54:17.300
at a certain point, many of us realize, okay, that's, that's too sanguine a thought because
00:54:23.080
we're noticing people get, you know, losing their reputation so fully that, you know,
00:54:27.000
they get on an airplane, like the, I think it was the Justine Sacco incident where she got on an
00:54:31.020
airplane and then half the world was tweeting about her, and she arrived at her destination
00:54:35.620
only to find that she had been properly canceled and lost her job, et cetera, et cetera. Um, so,
00:54:41.020
so obviously Twitter can, you know, whether you're on it or not, it can, it can,
00:54:44.160
under the right circumstances or the wrong ones become real life. But the truth is given the
00:54:49.580
platform I've built, given the, the, I mean, I just frankly, how lucky I've been to find an
00:54:54.400
audience and to build up, you know, a readership and a podcast listenership, Twitter really isn't
00:54:59.660
real life for me. Right. And, like, Elon still attacks me on Twitter by name.
00:55:04.640
And I find out I'm trending on Twitter, you know, years after I've left and it matters not at all for
00:55:11.060
my life. It matters not at all for my business. Nothing happens. Right. And yet if I were on Twitter,
00:55:17.620
there would be this illusion of emergency, right? If I was on there looking at it and looking at the,
00:55:24.360
you know, looking at the biggest, literally the biggest bully on Twitter has just punched me in
00:55:28.680
the face and I'm seeing the aftermath of it, the temptation to respond to that and to make it to,
00:55:35.780
and to, and to feel that not only do I have to respond there, but I have to respond on my podcast.
00:55:40.020
And, and then now this is how I'm spending my week because this thing just happened on Twitter.
00:55:44.320
Um, it would be almost impossible not to be taken in by that and not to, not to be just convinced of
00:55:52.320
the necessity of it because all of this is really important. I mean, we're talking about millions of
00:55:57.420
people. I mean, literally there are videos denigrating me for things I've
00:56:05.360
never said or believed that Elon has amplified, and these videos have 50 million views. Right. And
00:56:13.580
I just happened to be lucky enough to have built a life and a career where that matters not at all.
00:56:19.100
Right. But for somebody else finding themselves in that situation, I can well imagine, all right,
00:56:25.700
this is the destruction of my reputation in a way that matters. And...
00:56:29.680
Well, that's what it looks like. Sure. And like you said, it's virtually impossible to resist
00:56:36.640
that temptation. I mean, who are you to deny the impact of the opinion of 50 million people? You know what I
00:56:44.960
mean? That looks like an insane pride, in a way, to ignore that. But the point that you're making
00:56:52.000
is that it's very difficult to, um... Getting the most out of life means being prepared for
00:56:59.400
whatever comes your way. But many of us don't realize that a simple will doesn't actually cover
00:57:03.500
all aspects of estate planning. There are crucial elements that need separate attention. That's
00:57:08.200
where trust and will steps in to help ensure your loved ones are fully protected in every situation.
00:57:12.920
Right now you can visit trustandwill.com slash Peterson to get 20% off their simple, secure,
00:57:17.620
and expert backed estate planning services that cover all your essential bases. The process is
00:57:22.320
straightforward and free of complicated legal jargon. So you can complete your estate planning
00:57:26.160
from the comfort of your own home, knowing your assets and final wishes are properly documented and
00:57:30.680
legally protected can give you peace of mind so that you can focus on living your life fully,
00:57:35.040
knowing your loved ones will be taken care of. And according to your exact wishes, plus their
00:57:39.000
website is incredibly user-friendly and simple to navigate, making the whole process super
00:57:42.880
straightforward. What's particularly reassuring is that your personal information and documents are
00:57:47.140
protected with bank level encryption for maximum security. Each will or trust they create is
00:57:51.440
tailored specifically to your state's laws and your individual needs, covering everything from
00:57:55.440
care wishes and guardian nominations to final arrangements and power of attorney documents.
00:58:00.040
It's no wonder they have an overall rating of excellent and thousands of five-star reviews on
00:58:04.080
Trustpilot. We can't control everything, but Trust and Will can help you take control of protecting
00:58:08.220
your family's future. Head over to trustandwill.com slash Peterson for 20% off. That's 20% off at trustandwill.com slash Peterson.
00:58:17.140
Well, it's very easy to ignore it when it actually isn't making contact with my views.
00:58:25.440
Right, but it's hard to see that it isn't, like, because it's so—it appears so powerful.
00:58:30.440
You know, we've found as a social media platform that Twitter is the worst of all social media.
00:58:41.920
That's because you're next to, you know, somebody getting beaten to death in a liquor
00:58:48.480
store. I mean, when I go on Twitter, since I don't have an account, you know, I
00:58:53.640
have a naive account. It's not following anyone, and I almost never click anything. So I really see
00:58:59.460
this pure algorithm when you just kind of just look at the homepage scroll and—or as pure as it gets.
00:59:05.980
I mean, maybe it's got some information on me based on my, you know, IP address or something. But
00:59:10.220
if I ask myself, what is this algorithm trying to get me to be or to believe?
00:59:17.180
Honestly, I can tell you that it is trying to get me to be a racist asshole, right? And a fan of
00:59:25.820
Elon's, right? So it's given me a lot of Elon, and then it's given me a lot of, like, black teenagers
00:59:32.060
beating up white—you know, a single white teenager or people of color robbing stores and getting shot
00:59:39.040
in the face. I mean, it's just, like, 4chan-level awfulness, and then the occasional, you know,
00:59:46.000
unlucky brand advertising to me in that context. I mean, it's just—it's a monstrosity of a platform
00:59:53.200
from which to actually try to sell things. So it's—but yes, if I were on Twitter following 2,000 smart
01:00:03.680
people as you are and feeling that they are curating for me, you know, the best of their
01:00:11.680
information diet, I would have a—I know what that experience is like, because that's what I was
01:00:15.680
doing. That's why I was on it for, whatever, 12 years and couldn't convince myself to get off it.
01:00:20.400
It seemed like a professional necessity. It seems—it seemed so good in the sense—the incoming stuff was
01:00:28.160
so good because, again, I had chosen who to follow, and all these people were reading great articles and
01:00:33.120
forwarding them and having great short takes on them. And it was—all that stuff was great, but I
01:00:38.400
have managed to get a surrogate of that in the way I find information otherwise. And what I don't have
01:00:49.360
is the emergency. Like, the ruined vacation where, you know, some genius
01:00:55.440
over at the New York Times has called me a racist, and now I have to, you know, spend the rest of my
01:01:02.080
vacation with my family trying to figure out how to respond to this. I've tweeted back at them and
01:01:09.760
blah, blah, blah, blah. It's escalated, and now we've just nuked each other. And—
01:01:17.040
Yeah, it looks real, but it feels real, and it is real if you spend your time that way. I mean,
01:01:22.160
that's the thing. If you spend your time that way, which I did for years, it is real. It is the
01:01:27.840
substance of your life. It is the manner in which you—it's the thing you bring back to the
01:01:33.040
conversation with your wife, you know, five minutes later, or five hours later, more likely.
01:01:38.480
And it's in your head. And just, it was a ghastly use of attention. That's what I finally realized.
01:01:47.200
Well, you made an allusion, when you were talking about what you regard as the unfortunate effect of
01:01:53.840
X on Elon and maybe on other users, so let's assume that you were afraid that the sort of things
01:02:03.040
that you were seeing happening to others, more than merely Elon, let's say, in your estimation,
01:02:10.160
were also happening to you. And so, in retrospect, what do you think it was doing
01:02:19.440
to you? You just talked about the effects on your family on vacations. I've experienced a fair bit
01:02:24.960
of that. I understand exactly what you're saying. And it does seem like the world's burning, and you
01:02:30.160
better do something about it right now. And it's no wonder it seems that way, because it's lots of
01:02:35.600
people, and generally in our normative ecosystems, if lots of people appear to be upset with you or
01:02:43.440
around you, you should pay attention. But Twitter isn't the real world. We don't know what the hell it is,
01:02:49.120
you know? It looks more and more like a world of demonic bots, and God only knows what that world is.
01:02:54.560
But what did you see, especially now that you've been away for a while, what elements of your
01:03:01.520
character do you think were pathologized and brought to the forefront because of this?
01:03:09.440
Yeah, I considered myself a fairly careful user of it. I mean, I was not at all like Elon. I was
01:03:17.520
not addicted to it in that way. I was not tweeting hundreds of times a day. I think I averaged something
01:03:24.480
like three tweets a day over the course of my use of it. And that would come in spurts. I mean,
01:03:30.880
so I would not tweet for three days and then send out a dozen tweets, you know, because it was some hot
01:03:36.720
topic. I was always fairly careful, so I honestly don't think I ever said anything on the
01:03:46.240
platform that I regretted, right? I mean, if I ever made a mistake, I apologized for it. But
01:03:51.520
you know, I treated it like writing. I was aware I was publishing in that channel,
01:03:58.320
however quickly and impulsively, you know, I'm enough of a writer and an academic
01:04:05.520
to feel like, okay, this is yet another occasion where embarrassment is possible and you don't want
01:04:10.160
that. So I don't remember ever really screwing up on the platform. And yet what
01:04:19.040
happened there was, I mean, I can honestly say that for a decade, the worst things in my life,
01:04:28.000
and in some sense, the only bad things in my life came from Twitter, came from my interaction with
01:04:34.320
Twitter. I mean, apart from, like, family illnesses, you know, leaving
01:04:39.040
that aside. My life was so good. And yet I had this, you know, digital serpent
01:04:47.520
in my pocket that I would consult a dozen times a day, 20 times a day, maybe 100 times a day. So I,
01:04:54.240
again, I might've only posted once or twice, but if the news cycle
01:05:00.320
was really churning, my consulting of this news feed,
01:05:06.480
effectively, was interrupting my day, you know, not just every hour, but maybe every five
01:05:14.960
minutes of many hours, right, or for 10 minutes of that hour. And so it was segmenting
01:05:22.080
my day. However good or productive that day was, or should have been, I was constantly chopping
01:05:29.440
it up by how I was engaging with this scroll. Again, mostly consuming, but, you know,
01:05:34.960
often in response to the one or two things I had put out. Yes, there was a dopaminergic
01:05:40.960
component to that. Obviously, you know, I said something that I thought was clever,
01:05:44.080
that was perceived as clever by my fans, and perhaps to the detriment of my enemies. And
01:05:49.360
all of that seemed exactly what I wanted in the moment,
01:05:53.840
but even when it was at its best, right, even when there was just good information coming to me and
01:06:01.520
I was responding happily with good information back, even the non-toxic version of it
01:06:09.920
was intrinsically fragmenting of my life. You know,
01:06:18.320
it's like, I don't read a book that way. I don't
01:06:21.520
have a book that I pick up for two and a half minutes and then put down, and then try to
01:06:27.200
have a conversation with my kid, and then say, okay, hold on one second, and pick up the book again.
01:06:31.680
That's not how anyone reads a book. Right. And yet Twitter,
01:06:38.800
far too often, became that sort of thing in my life. Right. Right. And it's like a parasite.
01:06:43.920
It parasitizes the exploratory instinct. It's something like that. Right.
01:06:50.160
Because, look, you know, for a long time, I didn't have a cell phone. I was
01:06:58.800
a late adopter of cell phones, and I didn't watch the news, probably really from like 1985 till about
01:07:06.480
2005. I had cut myself off from news sources. I didn't read newspapers. And the reason
01:07:13.920
was because I realized... A few things happened in there. Did you catch 9-11? Did that,
01:07:17.840
did you miss that? Well, you know, I used to read, for example, some credible
01:07:22.880
magazines like The Economist, when it was still credible, because I don't really think it is
01:07:27.520
anymore. But wasn't that amazing? Isn't it amazing to consider that magazines like Time and Newsweek
01:07:33.280
could wait a week, could expect that their audience would wait a week
01:07:39.360
to be informed about the news of that week? That just seems extraordinary to me now.
01:07:43.440
Well, my conclusion about that was that if it isn't important in a week,
01:07:49.520
Yeah. It's not important. Right. Yeah. Right. And so I substituted these longer
01:07:56.800
lag time news aggregators for TV in particular, or radio. It's like, if it's today's news, it's not news.
01:08:06.080
Maybe if it's not important in a month, it's not news. Right. And that's part of
01:08:10.800
that intelligent filtering. And I guess part of the reason that X is dangerous and social media is
01:08:19.040
dangerous, X in particular, is that, you know, that proclivity to forage for information
01:08:28.560
is in general an extremely useful instinct, right? It's the instinct to learn.
01:08:33.360
But you might say that the shorter the period of time over which the
01:08:40.480
information is relevant, the more like pseudo information it is. And so then any system that
01:08:48.160
optimizes for the grip of short term attention is going to parasitize your learning instinct with
01:08:55.840
pseudo information. Yeah. And the algorithms are going to maximize that.
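What Sam is describing is, at bottom, an objective function: a feed that ranks items purely by predicted short-term engagement will systematically favor content whose relevance decays fastest. The Python sketch below is only a toy illustration of that logic, not any real platform's ranking code; the item names, click estimates, and half-life numbers are invented for the example.

```python
# Toy illustration only: how a purely engagement-maximizing ranker differs from one
# that discounts items whose usefulness decays quickly. All numbers are invented.
from dataclasses import dataclass
import math

@dataclass
class Item:
    title: str
    predicted_clicks: float          # hypothetical short-term engagement estimate
    relevance_half_life_days: float  # how long the information stays useful

def short_term_score(item: Item) -> float:
    # Optimizes only for immediate grip on attention; half-life is ignored.
    return item.predicted_clicks

def long_term_score(item: Item, horizon_days: float = 30.0) -> float:
    # Contrasting score that discounts fast-decaying "pseudo information".
    decay = math.exp(-horizon_days / item.relevance_half_life_days)
    return item.predicted_clicks * decay

feed = [
    Item("Outrage clip of the hour", predicted_clicks=9.0, relevance_half_life_days=0.5),
    Item("Long investigative piece", predicted_clicks=3.0, relevance_half_life_days=90.0),
]

print(max(feed, key=short_term_score).title)  # the outrage clip wins this ranking
print(max(feed, key=long_term_score).title)   # the investigative piece wins this one
```

The same two items rank in opposite orders depending only on the scoring horizon, which is the sense in which the half-life of the information matters as much as its grip.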
01:09:00.720
The half-life is one thing, but also the culture that is informing these algorithms,
01:09:07.360
you know, the actual human behavior that the algorithms are, you know, skimming
01:09:13.280
and boosting, is increasingly a bad-faith style of conversation. I mean, so many people,
01:09:23.360
especially the anonymous people, are in the misinformation business. I mean,
01:09:29.760
they will just cut together a clip that is designed to mislead. And that is the clip that will get
01:09:37.040
spread to the ends of the earth. Well, maybe, is it designed to mislead, or is it designed to
01:09:43.280
optimize their particular grip on short-term attention for their own
01:09:49.200
aggrandizement? Like the psychopathic move. And let's say that it's facilitated by these
01:09:55.840
short-term attention aggregators that are driven by bots that are learning how to do
01:10:04.000
this. The psychopathic proclivity, the narcissistic proclivity, is going to say whatever puts
01:10:10.880
you at the center of attention, whatever it is. Now, if you're governed by some kind of ethos that is
01:10:18.960
outside of attention-seeking, then that's a different story. But if the game is that
01:10:29.040
the machine optimizes for short term attention, then it's going to reward all the players that are doing
01:10:35.200
whatever it takes to grip short-term attention. Yeah. But the thing is, you know,
01:10:41.520
whatever it takes, though, is to get somebody seeming to say something totally outrageous.
01:10:48.880
And in context, it might've made perfect sense, or at least be a very
01:10:55.520
different point than the one that's being advertised by the clip. But the clip, shorn of context,
01:11:02.560
is calculated to mislead, in that the person who has edited that clip knows that
01:11:09.840
the naive viewer can only draw one conclusion from the utterance as presented.
01:11:16.880
Right. And even if they're, you know, well-intentioned and
01:11:21.200
fairly alert to this problem, almost no one is going to go back to the original podcast and
01:11:27.920
look at the comment in context. I mean, this just happened to Rogan, I believe. I think
01:11:33.120
he had Bono, you know, the singer for U2, on his podcast. And Bono said something
01:11:43.840
critical of Elon, I believe. And this got chopped up into a clip that made it look like
01:11:50.800
Joe really disagreed with Bono and was critical of him. And so the clip just
01:11:57.760
got exported as, like, look at Bono getting owned by Joe Rogan, or whatever. But
01:12:03.360
that's not what the conversation was at all. Right. Joe conceded
01:12:07.920
most of the point that Bono was making. It was just false. It was
01:12:14.000
a false picture of what happened there. And the person who makes that clip just knows
01:12:20.240
that if they frame it as a smackdown, people are going to love to see that.
01:12:25.840
And it doesn't matter that they're lying about what happened and damaging people's reputations
01:12:30.720
in the process. Yeah. Well, and that's especially true if they're anonymous and their reputation bears
01:12:36.480
no consequence of their lies. You know, well, the other thing that's happening, I don't know how much
01:12:41.520
this is happening to you, but this is another example of the parasite problem. Increasingly,
01:12:49.440
my voice and my image are being used, not exactly in the way that you're describing,
01:12:58.240
although that's happening a lot. Yeah. I'm selling cognitive
01:13:01.920
enhancers somewhere as an AI version of myself. Okay. Well, that's happening a fair
01:13:08.240
bit too, and sometimes worse than cognitive enhancers. But the worst thing that's happening
01:13:13.520
now is that these sites that are operating under my name, using my image and my voice, are providing
01:13:26.160
pseudo-philosophical content and pseudo-psychological insight as if it's me. And so it's
01:13:37.120
like what I've said has been put through a filter of stupidity and reorganized in my voice.
01:13:47.280
And this is happening constantly. Like YouTube has already taken 65 channels down that are doing this.
01:13:56.320
And so this is another example of that parasite problem, right? You store up a reputation and then
01:14:03.120
the parasites swoop in and pull off the attention that the reputation has garnered and monetize it.
01:14:11.680
And they can escape into the ether because they do it anonymously. And so like this is going to become
01:14:17.280
a stunning problem. I mean, it's a big problem. I can see that, you know,
01:14:23.520
the perfect version of it is at most a year away. I mean, it might only be a couple of months
01:14:28.880
away. We've experimented with this on our side too. Just like, for instance, in my meditation app,
01:14:36.400
Waking Up, we're now experimenting with translation to other languages. And, you know,
01:14:41.440
AI has got me speaking 22 languages perfectly in my voice. And it really sounds like
01:14:47.920
me speaking those languages. And the translation from what we can tell so far is, is fairly impeccable.
01:14:54.400
So we're going to roll out a, you know, a Spanish version of the app, uh, in the not too distant
01:14:58.800
future just to see what happens. But it's getting too good.
01:15:04.240
So I think the lesson that consumers of information who care to have real
01:15:11.680
information are going to have to learn is that if you're looking at Jordan
01:15:16.960
Peterson on YouTube, you simply cannot trust that it really is Jordan Peterson, unless it's coming
01:15:26.000
through a channel that, you know, you can trust. So, ironically, we're back to the age of
01:15:31.600
gatekeepers, right? If it's not on your channel or Joe Rogan's
01:15:38.480
channel or, you know, Chris Williamson's channel, if it just purports to be them but is on somebody
01:15:44.800
else's YouTube account, you can't trust it. Did you know that over 85% of grass-fed beef sold
01:15:50.320
in US grocery stores is imported? That's why I buy all my meat from goodranchers.com instead.
01:15:56.320
Good Ranchers products are a hundred percent born, raised, and harvested right here in the USA by local
01:16:00.980
family farms. Plus there are no antibiotics ever, no added hormones, and no seed oils, just one simple
01:16:07.120
ingredient: meat. Best of all, Good Ranchers is tariff-proof due to their 100% American supply chain.
01:16:12.460
So while grocery prices fluctuate, Good Ranchers stays the same. Lock in a secure supply of American
01:16:17.920
meat today. Subscribe now at goodranchers.com and get free meat for life and $40 off with code
01:16:23.220
DailyWire. That's $40 off and free meat for life with code DailyWire. Good Ranchers, American meat
01:16:28.780
delivered. Yeah. Well, it might also be, Sam, that the real solution to that is payment. Like
01:16:38.400
maybe this is the rule: if it's free,
01:16:45.280
it's a lie. Right. Yeah. That's the world we're rapidly moving into. And, or if it's...
01:16:51.900
Except someone's going to be able to create, I mean, until you find them and stop them, someone will create
01:16:57.620
the fake Jordan Peterson Academy that has a paywall, right? That looks like you, sounds like you. And,
01:17:05.580
you know, it's only $5 a month. And so they'll monetize that way, and that'll
01:17:11.240
still be the problem. Has that been happening with your meditation app,
01:17:16.220
with your enterprise yet? Not that I'm aware of. No. I mean,
01:17:21.780
I'm just aware of seeing short clips of me seeming to hawk, you know, psychotropics
01:17:31.080
I've never heard of. And that's just an AI version of my voice. It's real footage
01:17:37.100
of me stolen from somebody's podcast, and then an AI work-over of that, you know, that turns
01:17:44.860
into, like, an Instagram ad. Yeah. Well, I talked to some lawmakers in DC about a year and a half ago
01:17:51.580
about the fact that this was going to happen, hoping that they would take notice and take action,
01:17:56.720
though it takes a long time. But, you know, it's essentially the digital
01:18:02.820
equivalent of kidnapping. Like, I think people should be put in prison for a long
01:18:07.500
time for stealing your digital identity and monetizing it. Like it is very much akin to
01:18:13.620
kidnapping because what they're doing is they're draining the value out of your reputation.
01:18:20.640
That's essentially the game, you know? And so, what's happened to your life? Well,
01:18:26.120
there's a couple of things I'd like to investigate
01:18:30.460
here first. You know, first, I'd like to return to something that you and I talked
01:18:35.000
about, that we wandered around a fair bit in our previous conversations. You know,
01:18:41.820
partly because you were concerned about the distinction between good and evil,
01:18:48.100
and don't let me put words into your mouth, you were hoping to find an objective basis for
01:18:55.340
morality, a way of grounding morality in the objective world. And I have a thought about
01:19:00.400
that that's relevant to our current conversation. You know, so tell me if you accept this proposition.
01:19:07.600
Part of the pathology of Twitter is that it operates by game rules that not only don't apply
01:19:15.260
in the real world, but that when exported to the real world, pathologize it. Is that fair?
01:19:20.660
Yeah. Yeah. Okay. Right. Okay. So here's a way, I think, of
01:19:28.020
bridging a gap between the way you've been thinking about the world
01:19:31.780
from the moral perspective and the way I've been thinking about it. So, you know,
01:19:40.220
I've understood that you had a very deep concern about moral judgment and that your attempt to
01:19:54.180
provide a scaffolding of objectivity for morality was grounded in that even deeper concern. And I
01:20:02.180
thought that I could understand why you did that. And I didn't agree with the conclusions that you
01:20:08.580
had drawn, but I agreed with the overall enterprise. And it struck me recently, and I think we've already
01:20:18.840
obliquely made reference to it in our conversation, that there's another way of
01:20:25.300
conceptualizing this relationship between morality and objective fact, and that it might be
01:20:34.640
more fruitful to look into the realm of something like, well, a theory of
01:20:42.140
iterability and generalizability. It's maybe a variant of something like game theory.
01:20:50.480
Like imagine that. So let me give you an example, Sam. And it's a pretty famous example. You know
01:21:02.180
those trading games where behavioral economists sit two people down and say,
01:21:07.960
I'll give you a hundred dollars; you have to make an offer to the other person. Okay. Yeah. So the finding,
01:21:16.060
cross-culturally, is that people generally approximate a 50-50 split, right? Yeah. And they're
01:21:22.440
highly, they're not game-theoretic with respect to unfair trades. Like,
01:21:27.160
they don't want to accept unfair trades, even when it would just narrowly be to their
01:21:33.440
advantage to accept them. Exactly. Exactly. Okay. And that's true even if they're poor. So if
01:21:39.080
you put a poor person in a situation where they have to accept an unfair trade that would be to
01:21:44.940
their immediate economic benefit, they seem even less likely to accept it. Now, I think the right
01:21:44.940
way to construe that is that if you and I engage in an economic trade, we're doing two things at the
01:21:50.680
same time. The first is what the classical economists would say: we're trying to maximize
01:21:58.400
our gain, let's say. But the problem with that notion is that we aren't playing one game; while
01:22:06.900
we're playing one game, we're also setting ourselves up to play a very large and unpredictable sequence
01:22:12.460
of games. Those are happening at the same time. And so we don't want to just optimize for gain in the
01:22:18.440
single game. We want to optimize our status as players in a large series of unpredictable
01:22:25.380
games. And so we want to put ourselves forward as fair players so that people line up to play other
01:22:31.920
games with us. Okay. So then imagine that the hallmark of morality is something like
01:22:41.300
generalizable iterability across contexts. Right? Because this would allow for... and so you could
01:22:51.620
think of a truly moral system as the most playable game. And an immoral system augers in.
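The trading-game result Jordan cites, usually called the ultimatum game, and the iterability point can be made concrete with a toy simulation. The sketch below is a minimal illustration under invented assumptions (a fixed rejection threshold, a crude reputation flag, made-up acceptance probabilities), not a model taken from the behavioral-economics literature; it only shows why a proposer who optimizes the single game loses once the game repeats and partners can refuse to play.

```python
# Toy sketch: a greedy split can win any one accepted round, but once responders
# reject unfair offers and stop playing with known unfair proposers, the fair
# split wins over repeated play. All thresholds and probabilities are invented.
import random

POT = 100
REJECTION_THRESHOLD = 30   # assumed: offers below this are usually rejected
ROUNDS = 1000

def play_round(offer: int, reputation_ok: bool) -> int:
    """Proposer's payoff for a single ultimatum game."""
    if not reputation_ok:
        return 0  # nobody lines up to play with you anymore
    accepted = offer >= REJECTION_THRESHOLD or random.random() < 0.2
    return POT - offer if accepted else 0

def simulate(offer: int) -> int:
    total, reputation_ok = 0, True
    for _ in range(ROUNDS):
        total += play_round(offer, reputation_ok)
        if offer < REJECTION_THRESHOLD and random.random() < 0.05:
            reputation_ok = False  # word gets around; future partners opt out
    return total

random.seed(0)
print("greedy proposer (offers 10):", simulate(10))  # a small early haul, then nothing
print("fair proposer (offers 50):  ", simulate(50))  # steady payoff every round
```

Under these assumptions the greedy strategy collects a little before its reputation collapses, while the fair strategy keeps being invited back, which is one way of cashing out optimizing your status across a long series of games rather than your gain in the single game.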
01:22:59.520
And we're talking about this to some degree with regard to X, because our
01:23:04.880
proposition is that fundamentally, because it's optimizing for short-term attention grip, and it
01:23:12.020
benefits the psychopaths and the short-term gain accruers, the parasites, and perhaps the predators,
01:23:20.000
that it's fundamentally a non-playable game. And that if its consequences generalize outside the world of
01:23:28.000
X, that it pathologizes the environment. And the reason for that is it's not optimally iterable.
01:23:33.920
And so the pattern of morality that would be grounded in the objective
01:23:39.000
world isn't in the world of objective fact. It's in the world of optimized iterability across people
01:23:46.180
and contexts. Well, I would just say that there are some set of objective facts that subsumes that
01:23:53.460
picture, right? I mean, the world is the way it is. The social world of social primates such as
01:24:00.780
ourselves is the way it is. It admits of certain possibilities, and certain other things are
01:24:06.020
impossible, given the kinds of minds we have. Our minds could change in all kinds of ways. They
01:24:10.900
could change by being integrated with technology. They could change by, you know, genetically being
01:24:16.800
manipulated at some point in the future. There's this landscape of possible experience that the right
01:24:24.620
sort of minds could navigate. And we're someplace on that landscape, and we're trying to find our way.
01:24:31.300
And so I view morality as a, at bottom, a navigation problem, right? And it's got this iterative quality
01:24:37.360
that you describe. It's, the question is, it's always, you know, where can we go from here? Where
01:24:45.720
should we go from here? Where should we go from here, given all the possible places we might go from
01:24:51.760
here, both individually and collectively? Okay. Well, you know, the reason that I got obsessed with
01:24:57.900
stories to begin with, Sam, was because I realized 30 years ago that a story was a description of a
01:25:08.940
navigation strategy. That's what a story is. And so then the question is, okay, let's see if we can
01:25:18.420
formalize this a bit more. Let's say an optimized story has to iterate and improve.
01:25:28.220
So for example, if you construe your marriage properly, it exists stably, but that's not as
01:25:37.360
good as it could get. It could exist stably and improve as it iterates. And then you can imagine
01:25:43.720
that there's a small world of games that are playable in the actual natural and social world
01:25:51.440
that improve as they iterate. And those games, pointers to that kind of game, are moral
01:25:59.720
pointers. And I think that that's what the core of the religious enterprise dives into and elaborates
01:26:09.200
upon. I think that's what makes it the religious enterprise: that it deeply assesses this. So, I mean,
01:26:17.680
imagine that the proposition you laid out is accurate, that
01:26:23.780
the fundamental concern is navigation. How do we get from point A to point B? Well, a story, you can
01:26:33.600
think about this and tell me what you think, but I believe that a story is a description of a navigation
01:26:38.220
strategy. If you go see a movie, you infer the aim of the protagonist and you adopt his perceptual
01:26:45.940
frame and his emotional perspective. That's how perception works. And then you can imagine that there
01:26:51.620
are depths of games. Games that maximize for short-term gain and to hell
01:27:01.320
with everything else are shallow. And games that are sophisticated can be played in many situations
01:27:07.840
with many players. They take the future into account and they improve as you play them. And there's a
01:27:15.180
hierarchy of value in consequence of that, which is obliquely associated with the world of
01:27:22.280
fact, because it has to operate in the world of fact, but it isn't fundamentally derived from, like, data
01:27:30.000
that's directly associated with the facts. Well, not operationally, but potentially so. It's
01:27:38.000
just not, in fact. I'm never claiming, when I say that there are
01:27:44.380
objective truths to all of these questions, that those objective truths will be
01:27:49.460
delivered by some guy holding a clipboard, wearing a white lab coat. But there are things we just
01:27:55.120
know to be true. And it would take a lot of explaining to get to the bottom
01:28:01.740
of how we know them to be true. But I mean, they're very simple claims. We just know
01:28:09.000
that life in, you know, the best and most refined and most ethically, you know,
01:28:20.440
positive-sum developed-world context, right, you know, you and me and our most conscientious
01:28:27.800
friends at the nicest resort, after having done a great day's work, we're enjoying a
01:28:34.840
great meal and talking creatively and positively about how to improve the world.
01:28:40.480
We know that's a better game than, you know, trying to find some child soldiers to torture
01:28:47.380
the neighbors in some malarial hellhole, you know, in sub-Saharan Africa,
01:28:54.420
so that we can extract, you know, some heavy metals, you
01:29:02.940
know, the extraction of which is polluting the environment and causing
01:29:07.840
the life expectancy to be 30 years lower than it is where we
01:29:14.060
live. Right. I mean, so there are fundamentally discordant
01:29:18.480
human projects that are available to some very lucky people and unavailable to others.
01:29:23.480
And luck is by no means evenly distributed in this world. So there are better
01:29:31.820
and worse games, right? By any measure of better you want, you know,
01:29:37.400
ethically better, artistically better, entrepreneurially better, economically better,
01:29:43.820
you know, better with respect to health outcomes, et cetera, et cetera.
01:29:48.440
So we're all trying to play the best game we can be a part of. We're all trying. I mean,
01:29:53.820
some people... I take that back. Many of us are trying to play the best game
01:30:00.820
we can think of as best. But one of the consequences of my argument is
01:30:10.000
that it's possible to be wrong. It's possible to actually have false beliefs about what is in fact
01:30:15.840
better or worse. Yeah. Like, you can be confused. Well, you can be confused, but I also think
01:30:19.760
you're insufficiently pessimistic too, Sam, I think, because I don't think everyone
01:30:25.480
is trying to play the best possible game. I think that there are truly negative games where,
01:30:32.120
well, no, but people are being rewarded in some way, you know, like the sadist
01:30:37.820
whose favorite game is to cause suffering in others and enjoy that suffering.
01:30:45.580
The fact that he enjoys their suffering, right? That's a problem with him, right? He's,
01:30:51.720
you know, a neurological monster of a sort. And he's confined to being the sort of
01:31:00.060
mind that finds that very low-level game more rewarding than the game I just
01:31:09.540
advertised at the resort, with us being creative and productive and, you know, positive-sum.
01:31:15.300
Yeah. Well, that's the man who wants to rule over hell, Sam. Right. Right. Yeah. So I'm not saying
01:31:21.020
that doesn't exist. Yeah. Okay. Fine, fine, fine. But my point is that
01:31:26.760
we're obviously living in a realm where there are better and worse outcomes by any
01:31:33.920
definition of better and worse that makes sense. Even from within the confines of the games
01:31:40.260
that you're describing. Yeah. Right. Because one of the ways of deciding that a game is
01:31:44.900
counterproductive is that if you play it, it doesn't produce the result that it intends.
01:31:49.760
Right. Right. So, so that's another kind of universal hallmark of moral judgment. Like
01:31:54.920
if you're aiming at something and your strategy doesn't get you there, either your strategy is
01:32:00.100
wrong or your aim is off by your own definition. Right. There's no relativizing your way out of that.
01:32:06.520
And then we can say, well, there's a hierarchy of games that, that expand and improve as you play
01:32:14.220
them. And there's a hierarchy of games that degenerate as you play them, even by your own
01:32:19.800
standards of degeneration. Yeah. And the more refined games actually
01:32:26.280
refine you as a player. I mean, you get changed by the game you play,
01:32:32.420
you know, to your advantage or to your disadvantage. And it makes you more or less
01:32:37.700
capable of playing any specific game. So, I mean, this is what learning is, this is what
01:32:43.800
education is. This is what skill learning is. This is what, you know, interpersonal skill learning
01:32:49.140
amounts to. This is the difference between having good relationships versus bad relationships,
01:32:54.180
being in a good culture where its institutions incentivize you effortlessly to be
01:33:03.060
the best possible version of yourself, as opposed to, you know, having to be some kind of moral
01:33:07.260
hero just to not be a psychopath. I mean, this is what's so important about incentives
01:33:14.340
and about contexts like Twitter that incentivize the wrong things. What we want,
01:33:20.660
I mean, we don't want to have to take on the burden of rebooting civilization ourselves based on our
01:33:30.180
own native moral intuitions every single hour of every single day. That's for sure, Sam. That's for
01:33:36.980
sure. We need systems that make it easy for strangers to collaborate effortlessly in high trust
01:33:46.100
environments, right? I mean, this is like, we need to offload all of our moral wisdom into institutions
01:33:53.540
and to systems of incentives such that you would have to be a very bad person indeed, not to see the
01:34:00.500
wisdom of being a peaceful, honest collaborator with the next person you meet, right? In this,
01:34:06.420
given the nature of the system, you know. Whereas, I mean,
01:34:10.020
just to sharpen this up, because that can sound very abstract: if you take an actually normal,
01:34:16.260
decent person who just wants to be good and have positive-sum relationships with everyone
01:34:22.260
he meets, and you put that person in a maximum security prison in the United States, that person will be
01:34:28.820
highly incentivized to join a gang that has, you know, the requisite color of his skin, right? And to be
01:34:37.300
essentially a monster, because that's the only way to survive in that context, right?
01:34:42.980
To not join a gang, to not join a racist gang, is to be the victim of everyone, right? So what you
01:34:48.900
have in a maximum security prison is a system of terrible incentives, where you have to be
01:34:54.980
some kind of, you know, self-sacrificing saint to opt out of ramifying this awful system of
01:35:02.580
incentives further. We want the opposite of that in situations that we control and in
01:35:11.700
institutions that we build. And, you know, the thing that's so disturbing to me about this
01:35:18.100
contrarian moment is that so many people have gotten the message, and this is really most explicit since
01:35:27.220
COVID, they've gotten the message that we don't need institutions. We don't want institutions.
01:35:32.500
We just need to burn it all down, and we're just going to navigate by Substack newsletter
01:35:40.340
and podcast. And that's just not going to work, right? We can't be all contrarian all
01:35:48.100
the time. We need, we need institutional knowledge. Intermediary institutions. Yeah.
01:35:54.260
That work. Yeah. So whether we have to build new ones or perform exorcisms on our old ones,
01:35:59.780
that might, you know, be a different answer depending on the case, but there's no
01:36:04.260
question we need institutions that are better than most individuals, and that make
01:36:10.740
most individuals live up to norms that they themselves didn't invent and, you know,
01:36:20.100
under another system of incentives would struggle to emulate.
01:36:25.220
All right. I'm going to bring it in to land, Sam. I think what we're going to do on the Daily Wire
01:36:29.540
side, I want to talk to you, I think for half an hour about the anti-Semitic landscape on the left
01:36:37.940
and the right. And I want to go down those rabbit holes and explore them with you. So that's for
01:36:42.820
everybody watching and listening. I think that's what we're going to do on the Daily Wire side.
01:36:46.020
And because you made some comments earlier about your concerns about the right-wing parties
01:36:52.660
in Europe, for example, and the Nazis that are hiding there. And I've seen no shortage of
01:36:58.340
right-wing anti-Semitism rear its ugly head, let's say, on X, for example. But I also want to
01:37:05.380
talk to you about the same pathology emerging on the left, because there's no shortage of
01:37:10.900
unbelievable anti-Semitism on the left. And we should sort that out a little bit. And so that's
01:37:16.260
what we'll do on the Daily Wire side. Sam, every time we talk, I think we get a little bit,
01:37:24.340
well, we understand each other a little bit better. You know, I think there's something
01:37:28.660
very fruitful for us to continue discussing in relationship, well, to a number of the things
01:37:34.020
you discussed today about the necessity for intermediary institutions. That's the principle
01:37:39.780
of subsidiarity. It's an ancient principle of Catholic social, what would you say, social
01:37:45.300
philosophy. You have to have intermediary institutions. They're the alternative to tyranny and slavery.
01:37:51.620
Then the idea that there's a harmony between individual development and proper institutions
01:37:57.220
that has to be established. You know, it's very difficult to be a good person in an
01:38:02.020
entirely pathological social situation. And then this idea that there's a hierarchy of games,
01:38:09.940
because part of what got me interested in the religious world to begin with, let's say,
01:38:16.740
was that I started to understand what constituted the religious as the structure of the depth of games.
01:38:25.700
It's by definition. I'm not talking about what people think about as superstitious
01:38:31.940
belief. I have that. That's not the issue. The issue is that there's a hierarchy of games,
01:38:40.420
from shallow to deep, from counterproductive to productive, from unplayable to iterative,
01:38:48.260
and that that's a real world. And there's a reason for that, that I think is allied with your
01:38:54.500
desire, your lifelong desire, to investigate the objective grounds of the moral world.
01:39:04.980
One thing I would add to that is that, also by definition on my account, whatever's true
01:39:11.140
there, whatever's truly sacred, you know, the true spiritual possibility, has to be deeper
01:39:19.060
than culture. And it certainly has to be deeper than the accidents of ancient
01:39:27.300
cultures being separated from one another based on linguistic and geographical barriers, right?
01:39:39.540
Christianity being the real answer versus Hinduism being, you know, the real answer. Because,
01:39:45.220
I mean, one, they're incompatible answers at the surface level. Whatever
01:39:51.220
deep truth they may be in touch with, that is something we have to understand in a 21st-century
01:39:57.140
context that is deeper than provincialism. That's my argument against
01:40:04.980
We definitely have much to discuss the next time we talk.
01:40:10.420
All right. So for everybody watching and listening, join us on the Daily Wire side,
01:40:14.900
because we'll go down the anti-Semitic rabbit hole. And that'll give Sam and me a little
01:40:20.580
bit of time as well to discuss the political, which we haven't, you know,
01:40:25.940
which we've conveniently circumvented in a sense, but we had other things to talk about. So join us there.
01:40:32.420
Thank you to the film crew here today in Scottsdale. Thanks, Sam. It's always a pleasure to talk to you.
01:40:36.820
Yeah. I'm glad you're doing well. It's real good to see you, man. Yep.